Apple's Voice-Powered Virtual "Assistant" Coming to iOS
A future version of iOS will include an extensive voice recognition service called 'Assistant', according to a report from 9to5Mac.
Apple has been rumored to be planning voice-related features based on its acquisition of personal assistant company Siri and a partnership with speech-to-text specialist Nuance. Apple, however, didn't mention any such features when it first demonstrated iOS 5 at WWDC in June.
This new "Assistant" feature in iOS 5 is claimed to take voice input along with other user-specific information, such as location and contacts, to provide a powerful service to the user:
We can imagine a user asking their iPhone “Assistant” to setup a movie with one of their friends. The user might say “setup movie with Mark” and based on Mark’s contact info and the user’s location data, will be able to offer tickets to a local theater and send Mark the information.
9to5Mac notes that development of the feature is ongoing and may not be finished in time for iOS 5, but the site did find a mention of 'Assistant' buried in the iOS SDK:
"ASSISTANT_ENABLE_WARNING" = "Assistant uses your voice input and other information like your contact names, song names, and location to understand your requests. This data will be sent to Apple to process your request and to improve Apple products and services."
The feature sounds just like what Siri had been working on prior to its acquisition. Its iOS app, Siri Assistant, remains available for download on the App Store. Siri focused on a concept called "Virtual Personal Assistants" (VPAs) that would accomplish tasks for the user:
Virtual Personal Assistants (VPAs) represent the next generation interaction paradigm for the Internet. In today's paradigm, we follow links on search results. With a VPA, we interact by having a conversation. We tell the assistant what we want to do, and it applies multiple services and information sources to help accomplish our task. Like a real assistant, a VPA is personal; it uses information about an individual's preferences and interaction history to help solve specific tasks, and it gets better with experience.
Siri's implementation used Nuance's voice recognition engine to translate spoken requests to text, which was then processed. Apple has been rumored to be working closely with Nuance to provide the same transcription service to iOS users.
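To make the described flow concrete, here is a minimal, purely illustrative Swift sketch of such a pipeline: audio is transcribed to text, and the text is then interpreted against user-specific data like contact names and location. None of the type or function names below are Apple's or Nuance's; they are assumptions for illustration only.

```swift
import Foundation

// Stand-in for a speech-to-text engine (the role Nuance's recognizer
// reportedly plays). Hypothetical protocol, not a real API.
protocol SpeechRecognizer {
    func transcribe(audio: Data) -> String
}

struct Contact {
    let name: String
    let email: String
}

// A structured request built from the transcription plus user data.
struct AssistantRequest {
    let transcript: String                      // e.g. "setup movie with Mark"
    let contact: Contact?                       // resolved from the address book
    let location: (lat: Double, lon: Double)?   // current user location, if known
}

struct Assistant {
    let recognizer: SpeechRecognizer
    let contacts: [Contact]
    let currentLocation: (lat: Double, lon: Double)?

    // Turns raw audio into a structured request by combining the
    // transcription with contact names and location data.
    func handle(audio: Data) -> AssistantRequest {
        let text = recognizer.transcribe(audio: audio)
        let matched = contacts.first { text.lowercased().contains($0.name.lowercased()) }
        return AssistantRequest(transcript: text, contact: matched, location: currentLocation)
    }
}
```

A downstream service could then use the resolved contact and location to, for example, look up nearby showtimes and send the details along, roughly as the 9to5Mac scenario above describes.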