Apple Previews New Door Detection, Apple Watch Mirroring, and Live Captions Accessibility Features - MacRumors
Apple today previewed a range of new accessibility features, including Door Detection, Apple Watch Mirroring, Live Captions, and more.

Door Detection will allow individuals who are blind or have low vision to use their iPhone or iPad to locate a door upon arriving at a new destination, learn how far away it is, and hear a description of the door's attributes, including how it can be opened and any nearby signs or symbols. The feature will be part of a new "Detection Mode" in Magnifier, alongside People Detection and Image Descriptions. Door Detection will only be available on iPhones and iPads with a LiDAR scanner.

Users with physical disabilities will be able to fully control their Apple Watch Series 6 or Series 7 from their iPhone with Apple Watch Mirroring, which works via AirPlay. This makes the watch accessible through assistive features like Voice Control and Switch Control, and through inputs such as voice commands, sound actions, and head tracking.

New Quick Actions on the Apple Watch will let users answer or end a phone call, dismiss a notification, take a photo, play or pause media in the Now Playing app, and start, pause, or resume a workout, all with a double-pinch gesture.

Deaf and hard-of-hearing users will be able to follow Live Captions across the iPhone, iPad, and Mac, making it easier to follow any audio content, such as a phone call or a video. Users can adjust the font size, see Live Captions for all participants in a group FaceTime call, and type responses that are spoken aloud. English Live Captions will be available in beta on the iPhone 11 and later, iPad models with the A12 Bionic and later, and Macs with Apple silicon later this year.

Apple will expand support for VoiceOver, its screen reader for blind and low-vision users, with 20 new languages and locales, including Bengali, Bulgarian, Catalan, Ukrainian, and Vietnamese. Users will also be able to select from dozens of new optimized voices across languages, and a new Text Checker tool will help find formatting issues in text.

There will also be Sound Recognition for a home's unique doorbells and appliances, adjustable response times for Siri, new themes and customization options in Apple Books, and sound and haptic feedback in Apple Maps to help VoiceOver users find the starting point for walking directions.

The new accessibility features will be released later this year via software updates. For more information, see Apple's full press release.

To celebrate Global Accessibility Awareness Day, Apple also announced plans to:

- Launch SignTime in Canada on May 19 to support customers with American Sign Language (ASL) interpreters.
- Hold live sessions in Apple Stores and share social media posts to help users discover accessibility features.
- Expand the Accessibility Assistant shortcut to the Mac and Apple Watch.
- Highlight accessibility features in Apple Fitness+, such as Audio Hints.
- Release a Park Access for All guide in Apple Maps.
- Flag accessibility-focused content in the App Store, Apple Books, the TV app, and Apple Music.


Top Rated Comments

52 months ago

Apple again leads in accessibility. Love the Live captions and door detection.
To be fair, Android already has this Live Captions feature, as does Google Chrome. I had to rely on it on all platforms.

Microsoft announced Live Captions and has been testing it in Windows 11 Insider builds for a few months now.

Apple is late as usual, but I'm sure theirs will be the best-implemented version, as that is just them.

Regardless, everyone wins here. We need more accessibility support across the industry.
Score: 8 Votes
52 months ago

I think the difference is that Google does all processing on their servers, Apple's implementation is on-device only and works offline. (not to mention your conversation stays private)
Actually, Google’s live caption is all done on-device and does not require an internet connection to function. They have been moving more and more voice request processing to on-device the past few years.
Score: 6 Votes
52 months ago

Actually, Google’s live caption is all done on-device and does not require an internet connection to function. They have been moving more and more voice request processing to on-device the past few years.
This is correct. Taken from the Android Accessibility Help page (https://support.google.com/accessibility/android/answer/9350862?hl=en): "All captions are processed locally, never stored, and never leave your device."

When it comes to accessibility, users need anything that can help them now. They can't sit around and wait for something else, so I would say Apple is late to the game here. A co-worker of mine switched to Android several years ago so he could use the live caption feature for meetings. Previously, he used a captioning service over the phone, but he was not a fan of having another live person listening in on the meetings.
Score: 5 Votes
surfzen21
52 months ago

Apple again leads in accessibility. Love the Live captions and door detection.
Agreed. A lot of their accessibility features seem to get overlooked, but they actually are life-changing for folks in need.
Score: 4 Votes
Apple$
52 months ago
Better late than never, Apple. As a CI Android user, I love the live captions feature so much! It's just so handy when you're watching a YouTube video that doesn't have captions. Instead of skipping it as I did in the past, I just turn on live captions.
Score: 3 Votes
52 months ago

To be fair, Android already has this Live Captions feature, as does Google Chrome. I had to rely on it on all platforms.

Microsoft announced Live Captions and has been testing it in Windows 11 Insider builds for a few months now.

Apple is late as usual, but I'm sure theirs will be the best-implemented version, as that is just them.

Regardless, everyone wins here. We need more accessibility support across the industry.
I think the difference is that Google does all processing on their servers, Apple's implementation is on-device only and works offline. (not to mention your conversation stays private)
Score: 3 Votes