Apple Releases ARKit 3.5 for Developers With Support for iPad Pro's LiDAR Scanner

Apple today informed developers that ARKit 3.5 is now available, adding support for the LiDAR Scanner and depth-sensing system in the new 11-inch and 12.9-inch iPad Pro models.

According to Apple, the new LiDAR Scanner will allow for a "new generation of AR apps" that use Scene Geometry for enhanced scene understanding and object occlusion.
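
In practice, Scene Geometry is surfaced through ARKit's scene reconstruction option. Here is a minimal Swift sketch of what opting in might look like, assuming a RealityKit ARView named arView (the function name is illustrative, not from Apple's sample code):

    import ARKit
    import RealityKit

    // Sketch: opting into Scene Geometry (scene reconstruction) on a LiDAR-equipped iPad Pro.
    // `arView` is assumed to be a RealityKit ARView already in the view hierarchy.
    func enableSceneGeometry(on arView: ARView) {
        let config = ARWorldTrackingConfiguration()

        // Scene reconstruction is only supported on devices with the LiDAR Scanner.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh
        }

        // Let RealityKit use the reconstructed mesh to hide virtual content
        // behind real-world geometry (object occlusion).
        arView.environment.sceneUnderstanding.options.insert(.occlusion)

        arView.session.run(config)
    }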

Existing AR experiences on ‌iPad Pro‌ can also be improved with instant AR placement and improved Motion Capture and People Occlusion.
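
People Occlusion is an opt-in frame semantic, and object placement generally goes through ARKit's raycasting API; on LiDAR hardware those raycasts resolve almost immediately, which is what makes instant placement possible. A rough Swift sketch follows, with illustrative names like configureSession and placeObject and a placeholder sphere standing in for real content:

    import ARKit
    import RealityKit

    // Sketch: enabling People Occlusion and placing an object with a raycast.
    func configureSession(for arView: ARView) {
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]

        // People Occlusion: virtual content is hidden behind people in the camera feed.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            config.frameSemantics.insert(.personSegmentationWithDepth)
        }

        arView.session.run(config)
    }

    // On LiDAR devices, raycasts against estimated planes return almost instantly,
    // so content can be anchored without a lengthy scanning step.
    func placeObject(at screenPoint: CGPoint, in arView: ARView) {
        guard let result = arView.raycast(from: screenPoint,
                                          allowing: .estimatedPlane,
                                          alignment: .any).first else { return }
        let anchor = AnchorEntity(world: result.worldTransform)
        anchor.addChild(ModelEntity(mesh: .generateSphere(radius: 0.05)))
        arView.scene.addAnchor(anchor)
    }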

The LiDAR Scanner uses reflected light to measure the distance from the sensor to surrounding objects up to five meters away, indoors and outdoors. Depth frameworks in iPadOS combine depth points measured by the LiDAR Scanner, data from the two cameras, and data from motion sensors with computer vision algorithms handled by the A12Z Bionic to create a detailed and complete understanding of a scene.
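
For developers, the output of that sensor fusion is exposed as mesh anchors describing the reconstructed scene. A brief sketch of reading them from an ARSession delegate is below; the MeshObserver class name is illustrative, and an instance would need to be assigned as the session's delegate:

    import ARKit

    // Sketch: reading the mesh ARKit builds from the fused LiDAR, camera, and motion data.
    // Assumes scene reconstruction has been enabled on the session's configuration.
    class MeshObserver: NSObject, ARSessionDelegate {
        func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
            for case let meshAnchor as ARMeshAnchor in anchors {
                let geometry = meshAnchor.geometry
                // Each ARMeshAnchor carries one chunk of the reconstructed scene:
                // a buffer of vertices plus the triangle faces that connect them.
                print("Mesh chunk: \(geometry.vertices.count) vertices, \(geometry.faces.count) faces")
            }
        }
    }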

For now, the LiDAR Scanner and its enhanced augmented reality capabilities are limited to the new iPad Pro models, but Apple is also expected to bring the feature to the 2020 iPhone models set to be released this fall.


Top Rated Comments

az431
42 months ago

> I take Gimmicks the world doesn't need for $100, Alex
Always fun when people pass judgment based on their own internal subjective beliefs. But at least you didn't say that you polled all 8 of your friends and none were interested in it.
Score: 8 Votes
chucker23n1
42 months ago
Quoting Thomas:
> But I guess the LiDAR should still help in normal AR applications as well, e.g. when it comes to detecting a surface. Currently this is done using the cameras and AI only. So, the LiDAR should be able to transparently assist that AI in current AR applications as well, right?
> The question there would be: will it remove lag in AR applications? The last time I tried an AR app, the stabilization of AR objects was imho too slow. For objects in space it doesn't really matter, but for objects supposed to be attached to or sitting on a surface it didn't really meet my expectations...
Yes. The calibration step goes away.

> Instant AR
> The LiDAR Scanner on iPad Pro enables incredibly quick plane detection, allowing for the instant placement of AR objects in the real world without scanning. Instant AR placement is automatically enabled on iPad Pro for all apps built with ARKit, without any code changes.
Score: 5 Votes
Defender2010
42 months ago
This is all in preparation for their glasses. To gather real life data. iPhone next when devs have caught up.
Score: 3 Votes
calstanford
42 months ago
I take Gimmicks the world doesn't need for $100, Alex
Score: 2 Votes
citysnaps
42 months ago

> I take Gimmicks the world doesn't need for $100, Alex
Nineteen years ago that would have been: I'll take Who Needs 1,000 Songs in Your Pocket for $100, Alex.
Score: 2 Votes
farewelwilliams
42 months ago
People who say "but the A12Z chip is awful!" have no clue what it means to include an expensive LiDAR feature.
Score: 2 Votes
