Starting in iOS 14 and macOS Big Sur, developers will be able to use Apple's updated Vision framework to detect human body and hand poses in photos and videos in their apps, as explained in this WWDC 2020 session.
This functionality will allow apps to analyze the poses, movements, and gestures of people, enabling a wide variety of potential features. Apple provides some examples, including a fitness app that could automatically track the exercise a user performs, a safety-training app that could help employees use correct ergonomics, and a media-editing app that could find photos or videos based on pose similarity.
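To illustrate how an app might access this pose data, the updated framework exposes `VNDetectHumanBodyPoseRequest`, which returns recognized body joints as points in normalized image coordinates. The sketch below is illustrative only; the 0.3 confidence cutoff is an arbitrary choice, not an Apple recommendation.

```swift
import Vision

// Sketch: run body pose detection on a single CGImage and collect
// the recognized joints. Joint locations are normalized (0–1)
// image coordinates; the 0.3 confidence cutoff is illustrative.
func bodyJoints(in image: CGImage) throws -> [VNHumanBodyPoseObservation.JointName: CGPoint] {
    let request = VNDetectHumanBodyPoseRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    // Take the first detected person, if any.
    guard let body = request.results?.first else { return [:] }

    var joints: [VNHumanBodyPoseObservation.JointName: CGPoint] = [:]
    for (name, point) in try body.recognizedPoints(.all) where point.confidence > 0.3 {
        joints[name] = point.location
    }
    return joints
}
```

A fitness app could feed joint positions like these, frame by frame, into its own exercise-classification logic.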
Hand pose detection in particular promises to deliver a new form of interaction with apps. Apple's demonstration showed a person holding their thumb and index finger together and then being able to draw in an iPhone app without touching the display.
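A pinch gesture like the one in Apple's demo can be approximated by measuring the distance between the thumb and index fingertip points returned by `VNDetectHumanHandPoseRequest`. This is a minimal sketch, assuming a `CGImage` input; the 0.05 distance threshold and 0.3 confidence cutoff are arbitrary illustrative values.

```swift
import Vision

// Sketch: detect a "pinch" (thumb and index fingertips close
// together) in a single image. Distances are in normalized image
// coordinates, so the 0.05 threshold is resolution-independent.
func detectPinch(in frame: CGImage) throws -> Bool {
    let request = VNDetectHumanHandPoseRequest()
    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try handler.perform([request])

    guard let hand = request.results?.first else { return false }

    let thumbTip = try hand.recognizedPoint(.thumbTip)
    let indexTip = try hand.recognizedPoint(.indexTip)

    // Ignore low-confidence detections.
    guard thumbTip.confidence > 0.3, indexTip.confidence > 0.3 else { return false }

    let dx = thumbTip.location.x - indexTip.location.x
    let dy = thumbTip.location.y - indexTip.location.y
    return (dx * dx + dy * dy).squareRoot() < 0.05
}
```

A drawing app could run this per video frame and treat "pinched" as pen-down, "released" as pen-up.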
Additionally, apps could use the framework to overlay emoji or graphics on a user's hands that mirror the specific gesture, such as a peace sign.
Another example is a camera app that automatically triggers photo capture when it detects the user making a specific hand gesture in the air.
The framework is capable of detecting multiple hands or bodies in one scene, but the algorithms might not work as well with people who are wearing gloves, bent over, facing upside down, or wearing overflowing or robe-like clothing. The algorithm can also experience difficulties if a person is close to the edge of the frame or partially obstructed.
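Multi-hand detection is controlled by the request's `maximumHandCount` property; a minimal sketch:

```swift
import Vision

// Sketch: ask Vision to track up to two hands in a scene.
// Higher values increase processing cost per frame.
let request = VNDetectHumanHandPoseRequest()
request.maximumHandCount = 2
```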
Similar functionality is already available through ARKit, but it is limited to augmented reality sessions and only works with the rear-facing camera on compatible iPhone and iPad models. With the updated Vision framework, developers have many more possibilities.
Honestly, this seems like the kinda stuff that'd make Apple AR compelling—being able to draw in midair means you’d also be able to navigate an interface in midair with just your hands.
Using AR/VR without bringing a controller everywhere seems analogous to what set the iPhone apart from other touchscreen phones in 2007; you didn’t need a stylus.
The framework is capable of detecting multiple hands or bodies in one scene, but the algorithms might not work as well with people who are wearing gloves, bent over, facing upside down, or wearing overflowing or robe-like clothing.
I don't think the wizarding community is going to be too happy about this...