Introduced last year, the U1 chip in iPhone 11 models adds Ultra Wideband support for improved spatial awareness, but so far Apple has used the technology only to power a directional AirDrop feature in iOS 13.
In the coming months, that should change: Apple has announced that it is opening up the U1 chip to developers with a new "Nearby Interaction" framework in iOS 14. The framework can stream the distance and relative direction between U1-equipped devices, paving the way for interesting new spatially aware user experiences.
Apple provides some examples:
- A multiuser AR experience places virtual water balloons in the hands of its participants
- A taxi or rideshare app employs a peer user's direction in real time to identify the relative locations of a driver and a customer
- A game app enables a user to control a paddle with their device and respond to a moving ball on the peer user's screen
"Nearby Interaction" will function on an opt-in basis, with users having to grant permission for their iPhone to find and be found by nearby devices each time they open an app that incorporates the framework. Random identifiers are generated for each session.
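That per-session model is visible in the API itself. A minimal sketch of starting a session with the NISession class Apple introduced for iOS 14 might look like the following; how the discovery token actually travels to the peer is left to the app (MultipeerConnectivity is one option), so the `send` closure here is purely hypothetical plumbing:

```swift
import NearbyInteraction

// Sketch of Nearby Interaction session setup (assumed flow, not a
// complete app). NISessionDelegate's methods are optional, so the
// conformance here is intentionally empty.
final class InteractionController: NSObject, NISessionDelegate {
    let session = NISession()

    override init() {
        super.init()
        session.delegate = self
    }

    // Each NISession generates a fresh, random discovery token,
    // which is why identifiers do not persist across sessions.
    func shareToken(using send: (Data) -> Void) throws {
        guard let token = session.discoveryToken else { return }
        let data = try NSKeyedArchiver.archivedData(withRootObject: token,
                                                    requiringSecureCoding: true)
        send(data)  // deliver over the app's own transport (hypothetical)
    }

    // Call once the peer's token arrives over that same transport.
    func receivedToken(_ data: Data) throws {
        guard let token = try NSKeyedUnarchiver.unarchivedObject(
            ofClass: NIDiscoveryToken.self, from: data) else { return }
        session.run(NINearbyPeerConfiguration(peerToken: token))
    }
}
```

Because the token is minted per session, two apps (or two launches of the same app) cannot correlate a device across sessions, which lines up with the opt-in privacy model described above.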
In its video overview of the framework, Apple noted that both iPhones should be in portrait orientation to ensure accurate measurement of distance and direction. If one iPhone is in portrait orientation and the other is in landscape, this could limit the measurement capabilities, as can walls, people, pets, objects, and other obstacles between the devices.
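Those physical limitations surface in the API as optionals: a nearby object's distance and direction can each be absent when the chip cannot produce a reliable measurement. A sketch of the delegate callback, using a hypothetical `PeerObserver` class, might handle that like so:

```swift
import NearbyInteraction

// Sketch of handling measurement updates (assumed handler, not from
// the article). distance and direction on NINearbyObject are
// optionals: direction in particular can drop out when a device is
// held in landscape or an obstacle sits between the two phones.
final class PeerObserver: NSObject, NISessionDelegate {
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let peer = nearbyObjects.first else { return }

        if let distance = peer.distance {
            print("Peer is \(distance) meters away")
        }
        if let direction = peer.direction {
            // A unit vector in the device's coordinate space.
            print("Direction vector: \(direction)")
        }
        // When either value is nil, a robust app falls back gracefully
        // rather than assuming the peer has disappeared.
    }
}
```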
Apple has added "Nearby Interaction" support to its Simulator tool within Xcode 12 so that developers can experiment with the framework.
Top Rated Comments
The fact that Xcode and the other apps have been rewritten for A-series chips (the ARM architecture) hints that they can finally come to the iPad.
Edit: More wishes:
* No notch on iPhone
* Multitasking on iPhone (side-by-side and top-bottom apps)
* Full Finder-like Files app on iPad
* Multiplatform iMessage
* iMessage + FaceTime in a single app
"Find My" works pretty well to get me within 30m/100ft, even though it ignores what floor we're on and doesn't integrate with Apple's maps of the mall or the compass (so I need to work out my own bearings). But it's not quite good enough.
Here’s hoping!
Wonder what the practical advantages are and how developers will make use of it.