Apple's 'Hey Siri' Feature in iOS 9 Uses Individualized Voice Recognition
Following yesterday's release of the first public beta of iOS 9.1, along with Wednesday's iOS 9 GM, some testers have come across a new feature introduced in the update. In the Settings app, Apple appears to have quietly added a set-up process for the new "Hey Siri" feature coming to the iPhone 6s and iPhone 6s Plus, whose built-in M9 motion coprocessor enables the phones' always-on functionality.
Although unconfirmed by Apple, the discovery in iOS 9.1 suggests that Siri will be able to detect specific user voices and determine whether or not the owner of the iPhone in question is speaking to her. In a similar vein to the way Touch ID was designed to work better the more you unlocked an iPhone with the fingerprint sensor, it seems the set-up process will guide users through stating words or phrases to familiarize Siri with each iPhone owner's voice.
Found in General > Siri > Allow 'Hey Siri', the new always-on feature is the next step up for Apple's voice technology, allowing users to ask Siri questions or make changes within the iPhone's apps by simply saying "Hey Siri" near the device. The set-up process discovered today could also simply be a way for Siri to better detect voices in general, rather than being specific to each user. With the iPhone 6s and iPhone 6s Plus launching in just two weeks, it won't be long until everyone can find out for themselves.
Thanks Alan and Daniel!
Top Rated Comments
If I have two devices with Hey Siri activated in the same area, both react... A possible solution would be that if two devices (on the same iCloud account) are activated by the voice command, each would share that information with the other devices before Siri reacts, then determine the nearest device by the voice level each one receives. Only the nearest would respond.
Another option, instead of voice-level detection, would be to have Siri ask on every device simultaneously which one was meant, by asking for the device type (iPad, iPhone, etc.): "On what device do you want to ask me something?" - "iPad"
A last idea would be to have Siri ask first from the nearest device, "Did you mean me?" If the user answers "yes," they can continue with further commands on that device; if they answer "no," the next device asks the same question, and so on...
Just a thought, but maybe I am the only one with this "problem" :)
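The commenter's first suggestion amounts to a simple arbitration rule: every device that hears the trigger phrase reports how loudly it heard it, and only the loudest (presumably nearest) device responds. Here is a minimal sketch of that idea; the function, device names, and decibel values are all invented for illustration and have nothing to do with Apple's actual protocol:

```python
# Hypothetical nearest-device arbitration, as described in the comment
# above: devices on the same account exchange the voice level each one
# measured, and only the device with the strongest signal responds.

def choose_responder(detections):
    """detections: list of (device_name, voice_level_db) tuples from
    devices that heard the trigger phrase. Returns the name of the
    device that should respond, or None if nothing heard it."""
    if not detections:
        return None
    # The device that measured the loudest voice (highest dB value,
    # i.e. closest to zero) is assumed to be nearest to the speaker.
    return max(detections, key=lambda d: d[1])[0]

# Example: three devices hear "Hey Siri" at different loudness levels.
heard = [("iPhone", -32.5), ("iPad", -41.0), ("Mac", -48.2)]
print(choose_responder(heard))  # prints "iPhone"
```

The second and third suggestions trade this automatic selection for an extra round-trip with the user, which avoids mis-ranking devices when loudness alone is ambiguous.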