iOS 16 Brings New Personalized Spatial Audio Feature That Uses TrueDepth Camera

Apple in iOS 16 is enhancing the spatial audio experience with a new personalization feature. Personalized Spatial Audio uses the TrueDepth camera on an iPhone running iOS 16 to capture the geometry of your head and ears, delivering a listening experience that's tuned specifically to you.

The feature received only a brief mention during the keynote, but it promises a noticeably improved listening experience on AirPods, AirPods Pro, AirPods Max, and other devices that support spatial audio.

Apple says the personalized tuning delivers an even more precise listening experience.
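Apple hasn't detailed how the ear-scan data is processed, but the general technique behind this kind of tuning is a head-related transfer function (HRTF): a per-listener filter that models how sound reaching each ear is shaped by the head and ear geometry. The personalized profile itself is applied by the system renderer rather than through any public API, so the sketch below is only illustrative: it uses the long-standing AVAudioEnvironmentNode API to binaurally place a mono source with Apple's generic (non-personalized) HRTF. The function name and file URL are placeholders.

```swift
import AVFoundation

// Illustrative sketch: HRTF-style binaural rendering with AVAudioEngine.
// This is NOT Apple's personalization pipeline -- the profile captured by the
// TrueDepth ear scan is applied by the system. This only shows the kind of
// HRTF rendering that spatial audio builds on, using a generic HRTF.
func playSpatializedDemo(fileURL: URL) throws {
    let engine = AVAudioEngine()
    let environment = AVAudioEnvironmentNode()
    let player = AVAudioPlayerNode()

    engine.attach(environment)
    engine.attach(player)

    // 3D spatialization applies to mono sources feeding the environment node.
    let monoFormat = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 1)
    engine.connect(player, to: environment, format: monoFormat)
    engine.connect(environment, to: engine.mainMixerNode, format: nil)

    // Ask for HRTF-based binaural rendering (intended for headphone playback).
    player.renderingAlgorithm = .HRTFHQ

    // Place the source one meter to the listener's left, listener at the origin.
    player.position = AVAudio3DPoint(x: -1.0, y: 0.0, z: 0.0)
    environment.listenerPosition = AVAudio3DPoint(x: 0.0, y: 0.0, z: 0.0)

    // fileURL is a placeholder pointing at a mono audio asset.
    let file = try AVAudioFile(forReading: fileURL)
    player.scheduleFile(file, at: nil)

    try engine.start()
    player.play()
}
```

In this sketch, the .HRTFHQ algorithm uses Apple's built-in generic head model; the iOS 16 feature effectively replaces that generic model with one derived from your own head and ear scan, which is why Apple describes the result as more precise.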


Top Rated Comments

GubbyMan, 25 months ago:
No no no. This doesn't scan the room. Craig never said that during the keynote. I thought this would scan your ear geometry to better simulate how you hear sound. Basically generating an HRTF. Or am I wrong?
Score: 10 Votes

ignatius345, 25 months ago:
No exaggeration, Spatial Audio on AirPods Max has been the biggest upgrade in sound I've ever experienced.
Score: 8 Votes

reklex, 25 months ago:
I just got the beta and set this up, and the whole process looks like some futuristic stuff, I'm not gonna lie.
Score: 5 Votes

bobmans, 25 months ago:
When did they ever say this scans the room? I thought it scanned your head/ears proportions or w/e.
Score: 4 Votes

NeoSe7en, 25 months ago:
I've never understood Spatial Audio. It just makes it seem like the sound is coming directly from the source device rather than your headphones. How does that improve the sound experience?
Score: 3 Votes

reklex, 25 months ago:
Imagine if the Siri team was 1/5th as competent as the Audio team.
Score: 3 Votes
