Apple’s researchers continue to focus on LLMs, with studies detailing the use of AI in UI prototype creation and a new dataset for image safety rating.

Apple’s latest AI research explores how vibe-coding UI designs can be made easier.

With Xcode 26.3, Apple introduced support for agentic coding tools to help developers plan, execute, and iterate on projects with the help of AI. In practice, this means Xcode offers built-in compatibility with popular AI coding agents, such as Anthropic’s Claude Agent and OpenAI’s Codex.
And that looks to be only the start of Apple’s vibe coding-related endeavors, as its latest research offers a new twist on generating UI designs with the help of AI. Apple is also exploring the use of AI in assessing the safety of image content, among other things.