Apple's AR/VR Headset Could Use Iris Scanning for Apple Pay

Apple's long-rumored mixed-reality headset may feature iris scanning for authentication, according to reliable analyst Ming-Chi Kuo.

In a recent note to investors, seen by MacRumors, Kuo explained that, based on the hardware currently understood to be inside the device, he has concluded that Apple's mixed-reality headset may support iris recognition.

We are still unsure if the Apple HMD can support iris recognition, but the hardware specifications suggest that the HMD's eye-tracking system can support this function.

If the Apple HMD can support iris recognition, it could provide a more intuitive way for users to use Apple Pay when using the HMD.

Kuo has previously said that Apple's headset will contain 15 camera modules in total: eight will be used for see-through augmented reality experiences, one for environmental detection, and six for "innovative biometrics." These biometrics could presumably include the iris scanning technology Kuo is referring to, as well as directional eye-tracking.

One practical application presented for iris recognition in the headset is authentication for Apple Pay. Much like Touch ID or Face ID can provide Apple Pay authentication on other Apple devices, iris scanning could serve the same role on the headset when purchasing digital content. Apple purportedly wants to create an App Store for the headset, with a focus on gaming, streaming video content, and video conferencing, within which Apple Pay would likely be integral.
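Apple has not published any developer framework for the headset, so it is unknown how iris authentication would be exposed. For context, the sketch below shows how biometric confirmation is gated on current Apple platforms through the LocalAuthentication framework, where the system picks whichever biometric is available (Face ID or Touch ID today); the idea that headset iris recognition would surface through a similar policy, and the function name itself, are assumptions.

```swift
import Foundation
import LocalAuthentication

// Sketch of biometric gating on current Apple platforms via LocalAuthentication.
// On iPhone this resolves to Face ID or Touch ID; whether headset iris
// recognition would be surfaced through the same policy is an assumption,
// not a published API.
func authorizePurchase(reason: String, completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Check whether a biometric is enrolled and available on this device.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        completion(false)
        return
    }

    // Prompt the user; the system decides which biometric to present.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: reason) { success, _ in
        DispatchQueue.main.async {
            completion(success)
        }
    }
}

// Example: gate a digital-content purchase behind biometric confirmation.
// authorizePurchase(reason: "Confirm your purchase") { ok in
//     if ok { /* proceed with the payment sheet */ }
// }
```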

Beyond Apple Pay, iris recognition could also be used to unlock the device, preventing unauthorized wearers from using the headset.

Apple has heavily researched eye-tracking technology, having filed a number of patents around systems to track a user's gaze within a head-mounted display using reflected infrared light.

Kuo's understanding of Apple's eye-tracking technology is strikingly similar to the system outlined in Apple's patents: a transmitter and receiver detect and analyze eye movement information, which the headset then uses to decide what images and information to show the user.

Apple's eye-tracking system includes a transmitter and a receiver. The transmitting end provides one or several different wavelengths of invisible light, and the receiving end detects the change of the invisible light reflected by the eyeball, and judges the eyeball movement based on the change.
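Apple's actual algorithm is not public, but the description in Kuo's note and in Apple's patents matches a long-established approach to reflection-based gaze tracking, in which the offset between the pupil and the corneal reflection ("glint") of an infrared emitter is mapped to a gaze direction. The sketch below is purely illustrative; the EyeFrame type, the estimateGaze function, and the calibration gains are all hypothetical.

```swift
import simd

// Illustrative only: shows the general idea behind reflection-based gaze
// tracking (pupil-centre / corneal-reflection tracking). An IR emitter
// produces a bright reflection ("glint") on the cornea, a sensor images the
// eye, and the offset between the pupil centre and the glint is mapped to a
// gaze direction. Names and calibration constants are hypothetical.
struct EyeFrame {
    var pupilCenter: SIMD2<Float>   // pupil centre in sensor coordinates
    var glintCenter: SIMD2<Float>   // corneal reflection of the IR emitter
}

func estimateGaze(from frame: EyeFrame,
                  gainX: Float = 4.0,    // hypothetical calibration gains
                  gainY: Float = 4.0) -> SIMD2<Float> {
    // The pupil moves relative to the (nearly stationary) glint as the eye
    // rotates, so the pupil-minus-glint vector tracks gaze direction.
    let offset = frame.pupilCenter - frame.glintCenter
    // Map the sensor-space offset to an angular gaze estimate (degrees).
    return SIMD2<Float>(offset.x * gainX, offset.y * gainY)
}
```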

Kuo says that most head-mounted devices are operated by handheld controllers, which can't provide a smooth user experience. He believes an eye-tracking system like the one Apple will use has several advantages: an intuitive visual experience that interacts seamlessly with the external environment, more intuitive operation that can be controlled with eye movements, and a reduced computational burden, since the display only needs to be rendered at full resolution where the user is actually looking.
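That last advantage describes foveated rendering: rendering at full resolution only around the gaze point and at lower resolution in the periphery. As a rough illustration of the idea, the hypothetical sketch below picks a render scale from the angular distance to the gaze point; the tier boundaries and scale factors are made up for illustration.

```swift
// A minimal sketch of the "lower resolution where the user isn't looking"
// idea (foveated rendering). The tier boundaries and scale factors here are
// hypothetical; real systems (for example, Metal's rasterization rate maps)
// vary shading rate more smoothly across the frame.
func renderScale(forAngleFromGaze degrees: Float) -> Float {
    switch degrees {
    case ..<10.0: return 1.0   // foveal region: full resolution
    case ..<25.0: return 0.5   // near periphery: half resolution
    default:      return 0.25  // far periphery: quarter resolution
    }
}

// Example: a tile centred 30 degrees away from the gaze point would be
// rendered at a quarter of full resolution, cutting pixel-shading cost.
// let scale = renderScale(forAngleFromGaze: 30)   // 0.25
```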

The Information previously said that Apple's headset will feature advanced eye-tracking capabilities along with more than a dozen cameras for tracking hand movements, while Bloomberg explained that the headset will be a "mostly virtual reality device" offering a 3D environment for gaming, watching videos, and communicating. AR functionality will be limited, and Apple plans to include powerful processors to handle the gaming features.

Kuo earlier this month said that Apple would release its mixed-reality headset in "mid-2022," followed by augmented reality glasses in 2025.
