
ARKit 2.0 Will Let Two iPhones See the Same Virtual Object

iOS 12, set to be unveiled at the Worldwide Developers Conference on Monday, will include ARKit 2.0, an upgrade to the existing ARKit 1.5 SDK that developers use to build augmented reality experiences into their apps.

ARKit 2.0, according to a new report from Reuters, will include a feature that's designed to let two iPhone users share an augmented reality experience, with the same objects displayed on multiple screens.


This is in line with previous rumors from Bloomberg that have said Apple is working on both multiplayer augmented reality gameplay and object permanence, which would allow a virtual object to remain in place across multiple app sessions.

Apple is aiming to let two people share positioning data so that each device displays the same virtual object in the same physical space, and the company has designed the feature in a privacy-friendly way.

Apple's multiplayer system works phone-to-phone and, unlike similar offerings from Google, does not require users to share scans of their homes and personal spaces. From the Reuters report:
Apple designed its two-player system to work phone-to-phone in part because of those privacy concerns, one of the people familiar with the matter said. The approach, which has not been previously reported, differs from Google's, which requires scans of a player's environment to be sent to, and stored in, the cloud.
Full details on how Apple's multiplayer augmented reality system will work are unknown, and it's not yet clear if it works with three or more players. Apple will share more information on the feature on Monday.
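The privacy-friendly approach described above boils down to this: each phone tracks and renders the scene entirely on-device, and the only data exchanged phone-to-phone is the pose of the shared virtual object. As a rough illustration of why that payload is so small (all names here are hypothetical; Apple had not published the actual ARKit 2.0 API at the time of this article):

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class SharedAnchor:
    """Illustrative stand-in for a shared AR anchor: just a pose."""
    position: tuple      # (x, y, z) in the agreed world space, in meters
    orientation: tuple   # quaternion (x, y, z, w)

    def encode(self) -> bytes:
        # Serialize the pose for transmission over a peer-to-peer link.
        return json.dumps(asdict(self)).encode()

    @classmethod
    def decode(cls, payload: bytes) -> "SharedAnchor":
        d = json.loads(payload)
        return cls(tuple(d["position"]), tuple(d["orientation"]))

# Phone A places an object half a meter in front of it and sends the pose.
anchor = SharedAnchor(position=(0.0, 0.0, -0.5),
                      orientation=(0.0, 0.0, 0.0, 1.0))
payload = anchor.encode()

# Phone B reconstructs the same anchor and renders the object with its
# own local AR session -- no scan of either room ever leaves a device.
received = SharedAnchor.decode(payload)
assert received == anchor
print(len(payload))  # tens of bytes, nothing like a full room scan
```

The hard part in practice is not the payload but agreeing on a common coordinate frame between the two phones; this sketch assumes that alignment has already happened.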

Augmented reality has been a major focus for Apple over the course of the last two years, with Apple CEO Tim Cook calling AR "big and profound." "We're high on AR in the long run," Cook said in 2016.

Apple unveiled its first augmented reality product, ARKit, with iOS 11 at WWDC, and has since improved the feature with the launch of ARKit 1.5 in March as part of iOS 11.3. ARKit 1.5 brought mapping for irregularly shaped surfaces, vertical surface placement, and object and image recognition. With the additional changes coming in iOS 12, developers should be able to do a whole lot more with augmented reality.



Top Rated Comments


27 weeks ago
Great tech demo. Meanwhile, still trying to find practical uses for AR in the real world.
Rating: 10 Votes
27 weeks ago

The sheer processing power required to render the same object, with the same vector, on two different phones, with different perspectives, is going to be insane. I wouldn't be surprised if they end up having to require a nearby Mac for processing, acting as a LAN server.

What's difficult in terms of processing about that? It seems like they just have to agree on where the object should be, then they can do their own thing.
Rating: 8 Votes
27 weeks ago
I want to place a huge Pikachu 3D model on top of my building, and I want everyone to see it.

Can't wait
Rating: 6 Votes
27 weeks ago

Great tech demo. Meanwhile, still trying to find practical uses for AR in the real world.

You aren’t thinking nearly deeply enough. The uses are countless.
Rating: 6 Votes
27 weeks ago

If only someone could invent something even better than this. Something tangible, in the real world, without requiring special hardware and software. Maybe we could take a board or paper and write those things on it? Then somehow attach it to the building with nails or adhesive. We could call it a "sign" and anyone could use it! Will probably never work though because people's phones are always blocking their view of the world.

Adding signs for every piece of information one might want to know about a business is highly impractical. With such an app, there is no limit to the amount of information that can be exposed, by simply pointing the camera at the business.
Rating: 4 Votes
27 weeks ago

Pretty straightforward to me. Walk down the street, see a business, point your phone at it and see the business's info, website, hours, phone number, etc, without even having to do a google search. Lets you "business shop" in the blink of an eye. I'd really like this, tbh. :| Could probably simply integrate with Yelp, and use image recognition for business logos/names, as well as use location to aid in finding the business, if it's a small local thing.

If only someone could invent something even better than this. Something tangible, in the real world, without requiring special hardware and software. Maybe we could take a board or paper and write those things on it? Then somehow attach it to the building with nails or adhesive. We could call it a "sign" and anyone could use it! Will probably never work though because people's phones are always blocking their view of the world.
Rating: 4 Votes
27 weeks ago

The sheer processing power required to render the same object, with the same vector, on two different phones, with different perspectives, is going to be insane. I wouldn't be surprised if they end up having to require a nearby Mac for processing, acting as a LAN server.


Why would it need drastically increased processing power? Maybe I'm missing something, but seeing as all the processing will be occurring locally (as it is currently), surely the only change would be transmitting the 'anchor point' of the 3D models?
Rating: 3 Votes
27 weeks ago

After all of this “AR is the future” hype, it has turned out to be pretty useless. I guess kids are having some fun but for the most part it’s pretty lame. I guess some of those apps that let you “furnish” your home are cool.


It’s not Apple’s fault, it’s the fact that the developers are not fully utilizing the ideas behind the software.

For example, imagine a diet program that could show you the number of calories in a certain food when you focus the camera on it. Or walking around town looking at historical landmarks, and being able to see tons of information about them in AR. Or perhaps a picture of what a site looked like hundreds of years ago, shown in comparison. What about a live retail application that would let you compare prices across multiple vendors by just showing it a product? Or when you show it a peach at a supermarket, it tells you what to look for to ensure it's properly ripened? Or a plant or wildlife recognition program? Or a drug identification program in case you find pills in your children's pockets at home?

There are a lot more ideas than that, of course. But I agree right now we're mostly just seeing play toys.
Rating: 3 Votes
27 weeks ago

Great tech demo. Meanwhile, still trying to find practical uses for AR in the real world.

Pretty straightforward to me. Walk down the street, see a business, point your phone at it and see the business's info, website, hours, phone number, etc, without even having to do a google search. Lets you "business shop" in the blink of an eye. I'd really like this, tbh. :| Could probably simply integrate with Yelp, and use image recognition and OCR for business logos/names, as well as use location to aid in finding the business, if it's a small local thing.
Rating: 3 Votes
27 weeks ago

I did. It’s just funny that I specifically said you need to track location and orientation, and you respond “not just location, but orientation!” as if you just discovered that.


Not reading you anymore. Go talk to yourself.

Go away.

My fearless predictions for WWDC '18:

Lotsa new emojis and animojis. Not a hell of a lot else.

Great job, Tim.


Want applause for a 2016 meme? Maybe try harder.
Rating: 1 Vote
