Hand and Body Pose Detection in iOS 14 Will Provide New Ways to Interact With Your iPhone Without Touching the Display - MacRumors

Starting in iOS 14 and macOS Big Sur, developers will be able to add human body and hand pose detection in photos and videos to their apps using Apple's updated Vision framework, as explained in this WWDC 2020 session.

[Image: Vision framework detecting a human body pose during a jumping jack]
This functionality will allow apps to analyze the poses, movements, and gestures of people, enabling a wide variety of potential features. Apple provides some examples, including a fitness app that could automatically track the exercise a user performs, a safety-training app that could help employees use correct ergonomics, and a media-editing app that could find photos or videos based on pose similarity.

Hand pose detection in particular promises to deliver a new form of interaction with apps. Apple's demonstration showed a person holding their thumb and index finger together and then being able to draw in an iPhone app without touching the display.
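
In Vision's API, a pinch like the one in Apple's demo could be detected by running a `VNDetectHumanHandPoseRequest` on each camera frame and comparing the `.thumbTip` and `.indexTip` points of the resulting `VNHumanHandPoseObservation`. The sketch below shows only the platform-independent decision step; the `NormalizedPoint` type, the `isPinching` helper, and the threshold values are illustrative assumptions, not Apple API:

```swift
// Vision reports joint locations as normalized coordinates (0...1)
// with a per-point confidence. This hypothetical helper decides whether
// thumb tip and index tip are close enough to count as a "pinch".
struct NormalizedPoint {
    var x: Double
    var y: Double
}

func isPinching(thumbTip: NormalizedPoint,
                indexTip: NormalizedPoint,
                thumbConfidence: Double,
                indexConfidence: Double,
                minConfidence: Double = 0.3,   // ignore low-confidence joints
                maxDistance: Double = 0.05) -> Bool {  // threshold in normalized units
    // Discard frames where either fingertip was not detected reliably.
    guard thumbConfidence >= minConfidence, indexConfidence >= minConfidence else {
        return false
    }
    // Euclidean distance between the two fingertips.
    let dx = thumbTip.x - indexTip.x
    let dy = thumbTip.y - indexTip.y
    return (dx * dx + dy * dy).squareRoot() <= maxDistance
}
```

On device, the two fingertip points would come from `observation.recognizedPoint(.thumbTip)` and `observation.recognizedPoint(.indexTip)`, each of which carries its own confidence value.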

[Image: Vision framework hand pose detection]
Additionally, apps could use the framework to overlay emoji or graphics on a user's hands that mirror the specific gesture, such as a peace sign.

[Image: Emoji overlaid on a detected hand gesture]
Another example is a camera app that automatically triggers photo capture when it detects the user making a specific hand gesture in the air.
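
One plausible way to build such a trigger is to require the gesture to be held across several consecutive frames before firing, so a single noisy detection doesn't capture a photo. A minimal sketch, assuming per-frame gesture results from Vision; the `GestureTrigger` class and its frame count are hypothetical, not part of the framework:

```swift
// Debounces per-frame gesture detections: fires once when the gesture
// has been held for `requiredFrames` consecutive frames, then stays
// quiet until the gesture is released and made again.
final class GestureTrigger {
    private let requiredFrames: Int
    private var consecutive = 0
    private var fired = false

    init(requiredFrames: Int = 5) {
        self.requiredFrames = requiredFrames
    }

    /// Feed one detection result per camera frame.
    /// Returns true exactly once per held gesture.
    func update(gestureDetected: Bool) -> Bool {
        if gestureDetected {
            consecutive += 1
            if consecutive >= requiredFrames && !fired {
                fired = true
                return true  // trigger the photo capture here
            }
        } else {
            // Gesture lost: reset so the next hold can fire again.
            consecutive = 0
            fired = false
        }
        return false
    }
}
```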

The framework can detect multiple hands or bodies in one scene, but the algorithms may not work as well with people who are wearing gloves, bent over, upside down, or wearing flowing or robe-like clothing. The algorithms can also struggle when a person is close to the edge of the frame or partially obstructed.

Similar functionality is already available through ARKit, but it is limited to augmented reality sessions and only works with the rear-facing camera on compatible iPhone and iPad models. With the updated Vision framework, developers have many more possibilities.



Top Rated Comments

76 months ago
When these phones can snap a photo right at the optimum height of a group jump, we will truly be in the future.
Score: 13 Votes

76 months ago
That’s great. I have a hand gesture I’ve been giving to Siri for years. Now maybe she’ll get the message.
Score: 12 Votes

luvbug
76 months ago
> Seems a lot like what Xbox was able to do with the Kinect 10 years ago
That's funny, I didn't think the Xbox was a mobile phone??
Score: 6 Votes

Appleman3546
76 months ago
Seems a lot like what Xbox was able to do with the Kinect 10 years ago
Score: 5 Votes

AngerDanger
76 months ago
Honestly, this seems like the kinda stuff that'd make Apple AR compelling—being able to draw in midair means you’d also be able to navigate an interface in midair with just your hands.

Using AR/VR without bringing a controller everywhere seems analogous to what set the iPhone apart from other touchscreen phones in 2007; you didn’t need a stylus.
Score: 4 Votes

76 months ago
> The Kinect required expensive 3D scanning hardware, which ultimately Microsoft couldn't afford and discontinued. (Kinect games even attracted an additional royalty which I recall was rumored at $10) This is all done with computer vision.
> That's funny, I didn't think the Xbox was a mobile phone??
How is that product and technology doing, now?
Are none of you aware the TrueDepth Camera is made with PrimeSense technology, the same tech in Kinect? It works the same way, just miniaturized.

https://www.theverge.com/circuitbreaker/2017/9/17/16315510/iphone-x-notch-kinect-apple-primesense-microsoft

And, MS continues to use that technology in Hololens.

> Which is a shame, I still have mine around.
Same here.
Score: 3 Votes