AirPods Studio Rumored to Come With U1 Chip, Ultra-Wideband Said to Be Vital to Future Apple Ecosystem
The proven leaker known as "L0vetodream" today shared a range of information about the ultra-wideband U1 chip in Apple's upcoming AirTags item trackers and AirPods Studio headphones.
The first of a series of tweets shared today simply stated that AirPods Studio will contain an ultra-wideband U1 chip. It seems likely that the U1 chip would be used in AirPods Studio to track the location of the headphones in the Find My app, but it could also have other functions such as directional detection of other in-range U1 devices.
u1 for studio — 有没有搞措 (@L0vetodream) September 20, 2020
Apple says that the U1 chip will "enable short-range wireless location to support new experiences, such as next-generation digital car keys," but other than directional AirDrop, much of its functionality has yet to be realized.
The distance between two ultra-wideband devices can be measured precisely by timing how long a radio wave takes to travel between them, with far greater accuracy than Bluetooth LE or Wi-Fi. The leaker went on to state that "The Internet of Everything starts with U1."
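The arithmetic behind time-of-flight ranging is straightforward: distance is the speed of light multiplied by the one-way travel time. The sketch below illustrates the principle only; it is not Apple's U1 implementation, and the function name, timing figures, and reply delay are illustrative assumptions.

```python
# Illustrative sketch of time-of-flight ranging, the principle behind
# ultra-wideband distance measurement. NOT Apple's actual U1 implementation;
# all names and numbers here are made up for demonstration.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(round_trip_s: float, reply_delay_s: float) -> float:
    """Estimate the one-way distance between two UWB radios.

    round_trip_s: total time from sending a ping to receiving the reply.
    reply_delay_s: known processing delay inside the responding device.
    The signal covers the distance twice, hence the division by two.
    """
    one_way_time = (round_trip_s - reply_delay_s) / 2
    return SPEED_OF_LIGHT * one_way_time

# A 20-nanosecond round trip with zero reply delay corresponds to roughly 3 m:
print(round(distance_from_round_trip(20e-9, 0.0), 2))  # ~3.0
```

Because light travels about 30 cm per nanosecond, resolving the travel time at nanosecond scale yields centimetre-level precision, which is why time-of-flight ranging outperforms the signal-strength estimates used by Bluetooth LE.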
The U1 chip had previously appeared only in the iPhone 11 and iPhone 11 Pro. The lack of a U1 in the 2020 iPad Pro and the second-generation iPhone SE cast doubt over the future of the chip until its recent inclusion in the Apple Watch Series 6.
A further tweet, translated from Chinese, explained more about the utility of the U1 chip in AirPods Studio and its wider significance. The leaker believes that the expansion of the U1 chip to the Apple Watch Series 6 proves Apple's ongoing commitment to the technology and serves as an indication that the chip will go on to be much more important within Apple's ecosystem in the future.
I made this point a few months ago. The launch of the S6 with the U1 chip confirms my prediction: the future ecosystem will use U1 to determine distance and direction, similar to AirPods' spatial audio feature in iOS 14. The new headphones with the U1 chip will likely be able to automatically recognize their left and right orientation.
The U1 chip will reportedly facilitate automatic recognition of the headphones' orientation, meaning that it would not matter which way around users wear them. There would be no fixed left or right side; users could simply put on the headphones, and the audio channels would switch as needed without user intervention.
With the release of spatial audio for AirPods Pro and the introduction of directional AirDrop for iPhone 11 with iOS 13, Apple appears to be increasingly interested in directional and location-centric technology.
A final translated tweet described how Apple's upcoming AirTags will have more nuanced importance than simply item tracking.
The tag is a node that interconnects everything. The node acts as a bridge to connect various devices, and U1 is the most important part of realizing this bridge. The privacy features of iOS 14 were created so that the tag can be used safely in a private environment, offering a good solution to the privacy concerns its use raises.
The idea of AirTags being a key part of a larger U1 network to "bridge" different devices, with privacy at the forefront, may explain what makes AirTags different from existing item trackers and why Apple has seemingly waited so long to unveil them. Not only does this have potential for more private, accurate, and widespread item tracking capabilities, but also close-range data transfer between devices with supplementary directional information.
AirTags are believed to be arriving alongside the iPhone 12 in October. AirTags and AirPods Studio have reportedly been in production for some time.
With supposed renders, images, and videos of both products shared over the last week, it seems that the announcement of both products is not far away.
Top Rated Comments
No one seems to have noticed that Spatial Audio is, in fact, AR. Sure, it's audio-only AR, but that is step one: augmenting reality based on the reference frame of the device. It knows where you are relative to the device. If you add a fixed reference point, like a tile, and you use a 3D scanner, like the sensor on the iPhone, to anchor that to the physical space, now you know where you are in that space.
Brilliant, beautiful evolution of technology. Well done, Apple.
Imagine you, wearing the Apple Glasses or whatever, get a notification from your Apple Watch. You glance at it, and emerging from the face of the watch is a 3D interface that you can interact with directly. If anyone has seen the older Final Fantasy: The Spirits Within movie, many of the interfaces are like this, including the one the lead interacts with on her wrist computer.
We shall see. Apple is testing and developing and teasing all of this stuff over a several year period, pretty much out in the open (ARKit, U1, LiDAR, etc.) and I think that whether it’s in 2021 or 2022, we are getting closer to those Apple AR specs.