Apple Researchers Reveal New AI System That Can Beat GPT-4
Apple researchers have developed an artificial intelligence system named ReALM (Reference Resolution as Language Modeling) that aims to radically enhance how voice assistants understand and respond to commands.

In a research paper (via VentureBeat), Apple outlines a new system for how large language models tackle reference resolution, which involves deciphering ambiguous references to on-screen entities, as well as understanding conversational and background context. As a result, ReALM could lead to more intuitive and natural interactions with devices.
Reference resolution is an important part of natural language understanding, enabling users to use pronouns and other indirect references in conversation without confusion. For digital assistants, this capability has historically been a significant challenge, limited by the need to interpret a wide range of verbal cues and visual information. Apple's ReALM system seeks to address this by converting the complex process of reference resolution into a pure language modeling problem. In doing so, it can comprehend references to visual elements displayed on a screen and integrate this understanding into the conversational flow.
ReALM reconstructs the visual layout of a screen using textual representations. This involves parsing on-screen entities and their locations to generate a textual format that captures the screen's content and structure. Apple researchers found that this strategy, combined with fine-tuning language models specifically for reference resolution tasks, significantly outperforms traditional methods, including OpenAI's GPT-4.
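To illustrate the idea of serializing a screen into text, here is a minimal sketch of one plausible encoding: entities with pixel coordinates are grouped into lines by vertical position and ordered left to right. The function name, entity labels, and tolerance value are illustrative assumptions, not details from Apple's paper.

```python
def screen_to_text(entities, line_tolerance=10):
    """Convert (label, x, y) entities into a plain-text layout.

    Entities whose vertical centers fall within `line_tolerance`
    pixels of each other are grouped into the same line, then
    ordered left to right within that line. This is a hypothetical
    encoding, loosely inspired by the approach described above.
    """
    # Sort top-to-bottom first, then left-to-right.
    ordered = sorted(entities, key=lambda e: (e[2], e[1]))
    lines, current, last_y = [], [], None
    for label, x, y in ordered:
        # Start a new line when the vertical gap is too large.
        if last_y is not None and abs(y - last_y) > line_tolerance:
            lines.append(current)
            current = []
        current.append((x, label))
        last_y = y
    if current:
        lines.append(current)
    # Join each line's labels in horizontal order, tab-separated.
    return "\n".join(
        "\t".join(label for _, label in sorted(line))
        for line in lines
    )

# Hypothetical screen: a contact card with two action buttons.
entities = [
    ("Contact: Adam", 10, 12),
    ("Call", 10, 40),
    ("Message", 80, 42),
    ("555-0199", 120, 11),
]
print(screen_to_text(entities))
# → Contact: Adam	555-0199
#   Call	Message
```

A text dump like this lets a language model resolve a spoken reference such as "call that number" against the serialized screen content, without the model ever seeing pixels.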
ReALM could enable users to interact with digital assistants far more efficiently by referring to whatever is currently displayed on their screen, without needing to give precise, detailed instructions. This has the potential to make voice assistants much more useful in a variety of settings, such as helping drivers navigate infotainment systems hands-free or giving users with disabilities an easier and more accurate means of indirect interaction.
Apple has now published several AI research papers. Last month, the company revealed a new method for training large language models that seamlessly integrates both text and visual information. Apple is widely expected to unveil an array of AI features at WWDC in June.