Following the release of the first public beta for iOS 9.1 yesterday, along with the GM version on Wednesday, some testers have come across a new feature introduced in the update. Within the Settings app, Apple appears to have quietly added a set-up process for the new "Hey Siri" feature coming to the iPhone 6s and iPhone 6s Plus, whose built-in M9 motion coprocessor enables the phones' always-on functionality.
Although unconfirmed by Apple, the discovery in iOS 9.1 suggests that Siri will be able to detect specific users' voices and determine whether the owner of the iPhone in question is the one speaking to her. In a similar vein to the way Touch ID was designed to improve the more often an iPhone is unlocked with the fingerprint sensor, the set-up process appears to walk users through speaking a few words or phrases so Siri can become better acquainted with each iPhone owner's voice.
Found in General > Siri > Allow 'Hey Siri', the new always-on feature is the next step up for the technology, allowing users to ask Siri questions or make changes within the iPhone's apps simply by saying "Hey Siri" near the iPhone. The set-up process discovered today could also just be a way for Siri to become better at detecting voices in general, rather than being tuned to each specific user. With the iPhone 6s and iPhone 6s Plus launching in just two weeks, it won't be long until everyone can find out for themselves.
Thanks Alan and Daniel!
Top Rated Comments
If I have two devices with Hey Siri activated in the same area, both react... A possible solution would be for devices on the same iCloud account that are triggered by the voice command to share that information with each other before Siri reacts, then determine the nearest device by the voice level each one receives. Only the nearest device would respond.
Another option, instead of voice-level detection, could be to have Siri ask on all devices simultaneously which one was meant, by asking for the device type (iPad, iPhone, etc.): "On which device do you want to ask me something?" - "iPad"
A last idea would be to have Siri ask first from the nearest device, "Did you mean me?" If the user answers "yes", they could continue with further commands on that device; if they answer "no", the next device would ask the same question, and so on...
Just a thought, but maybe I am the only one with this "problem" :)
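The commenter's first idea amounts to a simple arbitration rule: every device that hears "Hey Siri" reports how loudly it detected the phrase, and only the loudest (presumably nearest) one answers. The minimal Swift sketch below illustrates that rule; the types and values (`HeySiriDevice`, `detectionLevel`, the sample numbers) are invented for illustration, and Apple has not documented any such cross-device arbitration.

```swift
import Foundation

// Hypothetical model of the commenter's first suggestion: devices on the same
// iCloud account share the loudness at which each detected "Hey Siri", and only
// the device that heard it loudest responds. Names and values are assumptions.

struct HeySiriDevice {
    let name: String            // e.g. "iPhone", "iPad"
    let detectionLevel: Double  // relative loudness of the detected phrase, 0.0–1.0
}

/// Pick the device that should answer: the one reporting the highest
/// detection level, i.e. presumably the nearest to the speaker.
func deviceThatShouldRespond(to detections: [HeySiriDevice]) -> HeySiriDevice? {
    return detections.max(by: { $0.detectionLevel < $1.detectionLevel })
}

// Example: an iPhone on the desk and an iPad across the room both hear "Hey Siri".
let detections = [
    HeySiriDevice(name: "iPhone", detectionLevel: 0.82),
    HeySiriDevice(name: "iPad",   detectionLevel: 0.35),
]

if let responder = deviceThatShouldRespond(to: detections) {
    print("\(responder.name) responds; the other devices stay silent.")
}
```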