Starting in iOS 14 and macOS Big Sur, developers will be able to add human body and hand pose detection in photos and videos to their apps using Apple's updated Vision framework, as explained in this WWDC 2020 session.
This functionality will allow apps to analyze the poses, movements, and gestures of people, enabling a wide variety of potential features. Apple provides some examples, including a fitness app that could automatically track the exercise a user performs, a safety-training app that could help employees use correct ergonomics, and a media-editing app that could find photos or videos based on pose similarity.
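For developers, running the detection looks roughly like any other Vision request. Here is a minimal sketch using the body-pose request the session introduces; the `CGImage` input and the function wrapper are illustrative assumptions:

```swift
import Vision

// Sketch: run body-pose detection on a still image.
// The framework also supports video frames via CMSampleBuffer handlers.
func detectBodyPoses(in image: CGImage) throws -> [VNHumanBodyPoseObservation] {
    let request = VNDetectHumanBodyPoseRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    return request.results as? [VNHumanBodyPoseObservation] ?? []
}
```

Each returned observation exposes normalized joint locations (shoulders, elbows, wrists, and so on) through `recognizedPoints(_:)`, which is what a fitness or media-editing app would compare across frames.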
Hand pose detection in particular promises to deliver a new form of interaction with apps. Apple's demonstration showed a person holding their thumb and index finger together and then being able to draw in an iPhone app without touching the display.
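A pinch gesture like the one in the demo can be approximated by checking the distance between the thumb tip and index fingertip in a hand-pose observation. This is a sketch of the idea, not Apple's demo code; the 0.05 normalized-distance threshold and 0.3 confidence cutoff are illustrative assumptions:

```swift
import Vision

// Sketch: treat the hand as "pinching" when thumb tip and index tip
// are close together in normalized image coordinates.
func isPinching(_ observation: VNHumanHandPoseObservation) -> Bool {
    guard let thumb = try? observation.recognizedPoint(.thumbTip),
          let index = try? observation.recognizedPoint(.indexTip),
          thumb.confidence > 0.3, index.confidence > 0.3 else {
        return false
    }
    let dx = thumb.location.x - index.location.x
    let dy = thumb.location.y - index.location.y
    return (dx * dx + dy * dy).squareRoot() < 0.05
}
```

An app could start recording stroke points while `isPinching` returns true and stop when the fingers separate, which is essentially the touch-free drawing interaction Apple demonstrated.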
Additionally, apps could use the framework to overlay emoji or graphics on a user's hands that mirror the specific gesture, such as a peace sign.
Another example is a camera app that automatically triggers photo capture when it detects the user making a specific hand gesture in the air.
The framework is capable of detecting multiple hands or bodies in one scene, but the algorithms might not work as well with people who are wearing gloves, bent over, facing upside down, or wearing overflowing or robe-like clothing. The algorithm can also experience difficulties if a person is close to the edge of the screen or partially obstructed.
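When tracking multiple people, the hand-pose request lets developers cap how many hands are returned, which trades coverage for performance. A brief sketch (the count of 2 is an illustrative choice):

```swift
import Vision

// Sketch: limit detection to at most two hands per frame to keep
// per-frame processing cost predictable.
let handRequest = VNDetectHumanHandPoseRequest()
handRequest.maximumHandCount = 2
```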
Similar functionality is already available through ARKit, but it is limited to augmented reality sessions and only works with the rear-facing camera on compatible iPhone and iPad models. With the updated Vision framework, developers have many more possibilities.
Honestly, this seems like the kinda stuff that'd make Apple AR compelling—being able to draw in midair means you’d also be able to navigate an interface in midair with just your hands.
Using AR/VR without bringing a controller everywhere seems analogous to what set the iPhone apart from other touchscreen phones in 2007; you didn’t need a stylus.
The framework is capable of detecting multiple hands or bodies in one scene, but the algorithms might not work as well with people who are wearing gloves, bent over, facing upside down, or wearing overflowing or robe-like clothing.
I don't think the wizarding community is going to be too happy about this...