Apple's Secret Augmented and Virtual Reality Project
Apple has been exploring virtual reality and augmented reality technologies for more than 10 years, based on patent filings. With virtual and augmented reality exploding in popularity following the launch of ARKit, Apple's dabbling is growing more serious and could lead to an actual dedicated AR/VR product in the not-too-distant future.
Apple is rumored to have a secret research unit with hundreds of employees working on AR and VR and exploring ways the emerging technologies could be used in future Apple products. VR/AR hiring has ramped up over the last several years, and Apple has acquired multiple AR/VR companies as it furthers its work in the AR/VR space.
Apple is said to be working on multiple virtual and augmented reality headset prototypes as engineers search for the "most compelling application" for such a device, and current rumors indicate that Apple's first product will be an augmented reality headset and/or glasses.
According to The Information, Apple is working on two AR projects that include an augmented reality headset set to be released in 2022 followed by a sleeker pair of augmented reality glasses coming in 2023. Many rumors have focused solely on the glasses, however, leading to some confusion about Apple's plans.
AR Smart Glasses
Apple is working on a set of augmented reality glasses, which leaker Jon Prosser has suggested Apple will call "Apple Glass." That name would be an unusual choice given the similarity to the name of Google Glass, a product that existed long before Apple's work on AR glasses came to light, so it might not be accurate.
The glasses are said to look similar to regular glasses, with both lenses featuring displays that can be interacted with using gestures. There will be an option to get the glasses with non-prescription lenses at a possible starting price of $499, with prescription lenses available at an additional cost.
Apple is allegedly planning to use "cutting edge" OLED microdisplays supplied by Sony for its rumored augmented reality glasses. Sony's OLED microdisplays feature an ultra-fast response rate, ultra-high contrast, a wide color gamut, high luminance, low reflectance, and integrated drivers for a thin and light design. The glasses are said to feature a 0.5-inch display with a 1280x960 resolution.
Well-respected Apple analyst Ming-Chi Kuo expects the AR glasses to be marketed as an iPhone accessory, primarily taking a display role while offloading computing, networking, and positioning to the iPhone. Offering the AR glasses as an iPhone accessory will allow Apple to keep them slim and lightweight. Prosser says that the glasses will look similar to Ray-Ban Wayfarers or the glasses that Tim Cook wears.
Bloomberg has said the Apple glasses will run "rOS" or reality operating system. rOS is said to be based on iOS, the operating system that runs on the iPhone. For the AR headset, Apple is developing a "system-on-a-package" chip similar to what's in the Apple Watch, though it will rely on the iPhone as mentioned above.
Over the course of developing an AR headset, Apple has considered touch panels, voice activation, and head gestures as input methods, and a range of applications from mapping to texting are being prototyped. Virtual meeting rooms and 360-degree video playback are also concepts that are being explored.
Though there were initial rumors of a 2020 launch, Bloomberg believes an AR/VR headset could come as early as 2021. A report from DigiTimes suggests Apple's AR glasses will launch in 2021, and Apple analyst Ming-Chi Kuo expects a 2022 launch at the earliest.
Leaker Jon Prosser believes Apple will unveil its AR glasses in March or June of 2021, which could mean the glasses will be shown off in 2021 and then released later in 2022.
Prosser also says that Apple is working on a limited-edition "Steve Jobs Heritage" version of the smart glasses that are designed to look like the round, frameless glasses that Steve Jobs used to wear, but Bloomberg's Mark Gurman has called this rumor "complete fiction."
The Information and Bloomberg have both said that Apple is working on smart glasses AND an AR headset, with the headset to come out in 2021 followed by the glasses in 2022.
Details on Apple's work on an AR headset were allegedly shared by Apple in an internal employee meeting, where Apple execs are said to have highlighted features like 3D scanning and advanced human detection.
The headset is rumored to be similar to Facebook's Oculus Quest virtual reality headset, but with a sleeker design that uses fabrics and lightweight materials to ensure the headset is comfortable.
It's said to feature a high-resolution display and cameras that will let users "read small type" and "see other people standing in front of and behind virtual objects." The headset will be able to map surfaces, edges, and dimensions of rooms with "greater accuracy than existing devices on the market."
Apple wants to create an App Store for the headset, with a focus on gaming, streaming video content, and video conferencing. It will be controlled via Siri, though Apple is also testing a physical remote.
Apple and Foxconn are developing semitransparent lenses for an AR headset, and the lenses have moved from the prototyping stage to trial production, the final step before mass production. At the trial production stage, the design is typically locked down, which suggests the product is in the final stages of development.
A Powerful AR/VR Headset
Along with augmented reality smart glasses of some kind, rumors have suggested that Apple is working on an incredibly powerful AR/VR headset that's not quite like anything else on the market. It is said to feature an 8K display for each eye, would be untethered from both computer and smartphone, and would work with both virtual and augmented reality applications.
Rather than relying on a connection to a smartphone or a computer, the headset would connect to a "dedicated box" using a high-speed short-range wireless technology called 60GHz WiGig. The box would be powered by a custom 5-nanometer Apple processor that's "more powerful than anything currently available." The box apparently resembles a PC tower, but it "won't be an actual Mac computer." To use the headset, users will not need to install special cameras in a room to detect their location as with some available VR headsets. All of the technology will be built into the headset and the box.
Internal disagreements have shaped and changed Apple's goals for its AR headset over time. Apple was initially aiming for an ultra-powerful system that came with a hub to house the processor, but Jony Ive, who has since departed the company, did not want to sell a device that would require a separate, stationary device for full functionality.
Ive instead wanted a headset with less powerful technology that could be embedded directly in the device, but the leader of the AR/VR team, Mike Rockwell, wanted the more powerful device. It was a standoff that lasted for months, and Tim Cook ultimately sided with Ive. For this reason, the ultra powerful headset described in some rumors may have been abandoned.
iOS 14 AR Leaks
Code and images found in iOS 14 confirm Apple's work on an AR or VR headset, with a photo discovered that depicts a generic-looking controller for a headset that's similar in design to the HTC Vive Focus headset. Apple has been using HTC Vive hardware for internal testing purposes.
Apple is testing its AR equipment with an iOS 14 app called Gobi along with QR codes to test augmented reality experiences. One of these augmented reality experiences is a crosswalk bowling game triggered at a specific crosswalk in Sunnyvale, California.
AR in Xcode
Code in Xcode 11 confirms Apple's work on some kind of AR headset. There are references to codenamed test devices plus frameworks and a system shell related to these devices. The references suggest Apple is developing support for a face-mounted AR experience that's similar to Google's Daydream.
According to Taiwanese site DigiTimes, Apple is partnering with game developer Valve for its rumored AR headset. Valve released its first VR headset, Valve Index, in April 2019.
Valve previously worked with Apple to bring native VR headset support to macOS High Sierra, leveraging the eGPU support with a Mac version of the SteamVR software.
In November of 2017, Apple purchased Vrvana, a company that developed a mixed reality headset called Totem. Vrvana's technology could potentially be used in a future Apple headset. Apple followed its purchase of Vrvana with an acquisition of Akonia Holographics, a company that makes lenses for AR smart glasses.
Rumors have also suggested Apple could incorporate its augmented reality research into its ongoing car project as part of an in-car software system that could include a heads-up display or other features.
Augmented Reality vs. Virtual Reality
Augmented reality (AR) and virtual reality (VR) are similar technologies, but there's a major difference between them and their potential applications vary widely. Virtual reality refers to a full immersive experience in a virtual world, while augmented reality refers to a modified view of the real world.
The difference can perhaps be best summed up by a comparison between two products, one AR and one VR. Google Glass, Google's now-defunct set of smart glasses, is an example of augmented reality. The eye-worn Google Glass let users view the world as it is, but it offered a heads-up display that overlaid relevant computer-provided information over that real world view, such as local weather, maps, and notifications.
This is similar to what Apple is said to be working on with its "smart glasses" or augmented reality headset.
In comparison, Facebook's Oculus Rift is a virtual reality headset offering an immersive visual experience that doesn't augment the real world with additional sensory information -- it fully replaces the real world with a simulated world.
Augmented reality provides computer-generated context and information about the world around us while allowing us to interact with our surroundings, while virtual reality is designed to isolate us from our surroundings so we can interact with fictitious worlds.
Potential applications for the two are vastly different. Virtual reality is singularly focused on immersive content consumption because it makes the wearer feel as if they're actually experiencing what's going on in the simulated world through visual, tactile, and audio feedback. Virtual reality is largely linked to gaming right now, but it also has the potential to recreate real world experiences for educational or training purposes.
Augmented reality doesn't hinge on immersive content, and while it may be less exciting because it augments reality instead of replacing it, it has a wider range of potential applications. In fact, augmented reality apps and games are readily available in iOS 11 thanks to ARKit.
With ARKit, an iOS device is able to identify a surface like a table, and then virtual objects can be added to it. Because of the computing power of the iPhone and the iPad, ARKit's augmented reality capabilities are impressive. ARKit is already being used to create a huge range of apps and games, blending digital objects with the real world.
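To make the surface-detection flow concrete, here is a minimal sketch of how an ARKit app detects a horizontal plane like a table and anchors a virtual object to it. The class and node names are illustrative, not from any Apple sample code, and a real app would also handle session interruptions and camera permissions.

```swift
import ARKit
import SceneKit
import UIKit

// Minimal ARKit plane-detection sketch: the session looks for horizontal
// surfaces, and a small virtual cube is placed on each surface it finds.
class ARPlaneViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        // Ask ARKit to track the world and detect horizontal planes
        // (tables, floors).
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called when ARKit adds a node for a newly detected anchor.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        // Only react to detected planes, not other anchor types.
        guard anchor is ARPlaneAnchor else { return }

        // Place a 10 cm virtual cube resting on the detected surface.
        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
        let boxNode = SCNNode(geometry: box)
        boxNode.position = SCNVector3(0, 0.05, 0)
        node.addChildNode(boxNode)
    }
}
```

The key design point is that ARKit does the heavy lifting: the app only declares what to look for (`planeDetection = .horizontal`) and responds when the framework reports an anchor, which is what makes the iPhone's camera-plus-motion-sensor hardware feel like a general AR platform.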
Apple may aim for a product that works with both augmented and virtual reality, as suggested by recent rumors indicating work on an AR/VR headset that would connect via Wi-Fi to a PC-like box that features a powerful processor. Such a product could be used for serious virtual reality applications as well as augmented reality experiences.
In macOS High Sierra, Apple introduced support for VR in Metal 2 and partnered with Valve, Unity, and Unreal to bring VR creation tools to the Mac, which could hint at some kind of future hardware. At the very least, Apple's latest Macs and upcoming iMac Pro will support existing virtual reality hardware and VR content creation.
KGI Securities analyst Ming-Chi Kuo believes Apple's AirPods and its rumored high-end over-ear headphones are the future of Apple's AI and augmented reality ambitions. He believes the two audio devices will complement Apple's rumored augmented reality glasses.
Apple's VR/AR Team
Apple's work on virtual and augmented reality dates back multiple years, but rumors picked up starting in March of 2015 when news hit that Apple had a small team of people working on augmented reality. In 2015 and into early 2016, Apple's team grew as the company hired employees with expertise in AR/VR technology and made multiple related acquisitions.
Apple's AR/VR team includes several hundred engineers from across Apple, all of whom have expertise in virtual and augmented reality. The team works across office parks in both Cupertino and Sunnyvale, and Apple is exploring several hardware and software projects under the code name "T288."
Apple's augmented reality team combines "the strengths of its hardware and software veterans," and is led by Mike Rockwell, who came from Dolby. Former employees of companies like Oculus, HoloLens, Amazon (from the VR team), 3D animation company Weta Digital, and Lucasfilm are working on AR at Apple.
There are also many camera and optical lens engineers along with hardware talent who are sourcing raw materials for projects like Apple's AR glasses. And while Apple seems to be focusing on augmented reality, at least some of the team is working on virtual reality projects.
Apple developers have also joined the WebVR community group, which is dedicated to making VR content viewable on any device and on the web.
One of Apple's most prominent AR/VR hires was computer science professor Doug Bowman, who previously led Virginia Tech's Center for Human-Computer Interaction. He specializes in three-dimensional user interface design and has written a book on the subject covering 3D interfaces and the benefits of immersive virtual environments. He has expertise with both virtual and augmented reality.
Apple has also hired employees who have worked on virtual or augmented reality products at Microsoft and Lytro. Some recent hires are said to be from Microsoft's HoloLens team, while others worked at Lytro, a company working on a camera able to blend live action and computer graphics for a live action VR experience. Employees coming from the HoloLens team would have experience creating an advanced augmented reality headset.
Zeyu Li, who served as a principal computer vision engineer at Magic Leap (a startup developing a head-mounted AR/VR display), is now working at Apple as a "Senior Computer Vision Algorithm Engineer."
Yury Petrov, a former research scientist at Facebook-owned Oculus, is now serving as a "research scientist" at Apple. According to his LinkedIn profile, Petrov studied virtual reality experiences, prototyped optics, and developed computer simulation software.
Augmented reality expert Jeff Norris joined Apple in April 2017 as a senior manager working on the company's augmented reality team. Norris founded the Mission Operations Innovation Office and JPL Ops Lab at NASA. He led multiple projects focused on human-system interaction with an emphasis on virtual and augmented reality.
Apple in May 2018 hired Sterling Crispin, who developed a painting app for mobile VR headsets. "Cyber Paint" let VR headset wearers create 2D 360-degree pictures on Oculus Go, Daydream, GearVR, and Vive Focus. Crispin's LinkedIn page says he is working as a "prototyping researcher," suggesting he has joined the team rumored to be working on VR/AR headset technology.
Apple in December 2018 hired former senior Tesla and Microsoft HoloLens designer Andrew Kim, and given his history, he could be working on Apple's rumored AR glasses project or its upcoming Apple car that's said to be in development.
Jaunt VR founder Arthur van Hoff joined Apple as a senior architect in April 2019. Prior to working at Apple, his company created VR hardware, including a $100,000 3D VR camera called the Jaunt One. Before van Hoff left the company, Jaunt had struggled and pivoted to AR experiences.
With Apple's team encompassing hundreds of employees, there are many other virtual reality expert hires that have gone under the radar. On LinkedIn, there are multiple software engineers with virtual reality experience that are employed by Apple, but it is unclear if they work on the secret AR/VR team.
Apple in July 2019 moved Kim Vorrath, one of its software executives, over to the augmented reality headset division to "bring some order" to the team. Vorrath has overseen program management on the software development team for 15 years, and has been described as a "powerful force" making sure employees meet deadlines while also sussing out bugs.
Many members of Apple's AR/VR team may have joined the company through acquisitions. Since 2015, Apple has purchased several companies that created AR/VR-related products, and some of its AR/VR acquisitions even date back several years.
Akonia Holographics
Apple in August 2018 bought Akonia Holographics, a startup that makes lenses for augmented reality glasses. Akonia Holographics advertises the "world's first commercially available volume holographic reflective and waveguide optics for transparent display elements in smart glasses."
The displays that it makes are said to use the company's HoloMirror technology for "ultra-clear, full-color performance" to enable the "thinnest, lightest head worn displays in the world."
Vrvana
In November of 2017, Apple purchased Vrvana, a company that developed a mixed reality headset called Totem. Totem, which was never released to the public, was designed to combine both augmented and virtual reality technologies in a single headset, merging full VR capabilities with pass-through cameras to enable screen-based augmented reality features.
Totem essentially used a set of cameras to project real world images into its built-in 1440p OLED display, a somewhat unique approach that set it apart from competing products like Microsoft's HoloLens, which uses a transparent display to combine virtual and augmented reality. Apple could be planning to use some of Totem's technology in a future product.
PrimeSense
Apple purchased Israel-based 3D body sensing firm PrimeSense in 2013, sparking speculation that motion-based capabilities would be implemented into the Apple TV. PrimeSense's 3D depth technology and motion sensing capabilities were used in Microsoft's initial Kinect platform.
PrimeSense used near-IR light to project an invisible light into a room or a scene, which is then read by a CMOS image sensor to create a virtual image of an object or person. This enables motion-based controls for software interfaces, but it's also able to do things like measure virtual objects and provide relative distances or sizes, useful for augmented reality applications like interactive gaming, indoor mapping, and more. PrimeSense technology can also create highly accurate 360-degree scans of people and objects, potentially useful for virtual reality applications.
Metaio
Apple acquired augmented reality startup Metaio in May of 2015. Metaio built a product called the Metaio Creator, which could be used to create augmented reality scenarios in just a few minutes. Prior to being purchased by Apple, Metaio's software was used by companies like Ferrari, which created an augmented reality showroom.
Metaio technology was also used in Berlin to allow people visiting the site of the Berlin Wall to use a smartphone or tablet to see what the area looked like when the Berlin Wall was still standing. Metaio's technology is one that could potentially be used to implement augmented reality capabilities into Apple apps like Maps.
Faceshift
Apple acquired Faceshift in August of 2015, marking its second augmented reality purchase in 2015. Before being acquired by Apple, Faceshift worked with game and animation studios on technology designed to quickly and accurately capture facial expressions using 3D sensors, transforming them into animated faces in real time. Faceshift was also working on a consumer-oriented product that would allow people to morph their faces into cartoon or monster faces in real time in Skype.
Faceshift's technology has a wide range of possible use cases, and Apple appears to be using the feature to power Animoji in the iPhone X.
Emotient
Emotient, a company that built tools for facial expression analysis, was acquired by Apple in January of 2016. Emotient's technology uses artificial intelligence and machine learning to read human emotion, features that have been used in the real world by advertisers to determine emotional reactions to advertisements.
There are dozens of things Apple could do with Emotient, ranging from better facial detection in the Photos app to analyzing customer feelings in Apple retail stores to unlocking iOS devices, but it also has potential AR/VR uses. Like Faceshift, Emotient's technology could be used to analyze and transform facial expressions for the creation of virtual avatars, useful for social media purposes and games. Emotient technology was likely used for Animoji.
Flyby Media
Purchased in early 2016, Flyby Media is another company that worked on augmented reality. Flyby created an app that worked with Google's 3D sensor-equipped "Project Tango" smartphone, allowing messages to be attached to real world objects and viewed by others with one of Google's devices.
For example, a person could "scan" a landmark like San Francisco's Golden Gate Bridge and write a message attached to it. A person visiting the bridge later would then be able to scan the bridge with the Flyby app to see the message. The Flyby app likely drew the attention of Apple because it was able to recognize and understand different objects that were scanned, technology that could be used by Apple in a number of ways in apps like Photos and Maps.
RealFace
In February of 2017, Apple purchased RealFace, a cybersecurity and machine learning company that specializes in facial recognition technology, which could potentially be used for future augmented reality features.
RealFace developed facial recognition technology integrating artificial intelligence for frictionless face recognition. RealFace technology was likely employed in the iPhone X, Apple's first smartphone with facial recognition capabilities in the form of Face ID.
NextVR
Apple in May 2020 acquired NextVR, a California-based company that combined virtual reality with sports, music, and entertainment, offering VR experiences for watching live events on VR headsets from PlayStation, HTC, Oculus, Google, Microsoft, and other manufacturers.
Spaces
Apple in August 2020 purchased VR startup Spaces, a company that designed virtual reality experiences that people could experience in malls and other locations, such as "Terminator Salvation: Fight for the Future." Spaces also created virtual reality experiences for video communication apps like Zoom, which is something that Apple could potentially incorporate into a future AR/VR product.
Apple's AR/VR Patents
Apple has filed multiple patents that relate directly to a virtual reality headset, all dating back several years. While Apple's technology has likely advanced somewhat beyond these, they provide an interesting look at the ideas Apple has explored in the past.
A 2008 patent application covered a fairly basic "personal display system" designed to mimic the experience of being in a movie theater when watching video.
A second patent described a "Head Mounted Display System" with a "laser engine" that projected images onto a clear glass display worn over the eyes, similar to glasses. In this configuration, the headset connected to a handheld video player such as an iPod to provide processing power.
A third patent originally filed for in 2008 was similar in design, covering a goggle-like video headset designed to let users watch movies and other content. It outlined two adjustable optical modules lined up with the user's eye, which could provide vision correction and allow for the viewing of 3D content. Apple described this as offering a personal media viewing experience.
A fourth patent from 2008 covered a video headset frame similar to Google Glass, which would allow a user to slide their iPhone or iPod into the headset to provide video. The headset was described as an augmented reality product that would let users do things like watch a video or check email while keeping an eye on their surroundings.
Beyond headset-related patents, Apple has also filed for patents describing other ways virtual and augmented reality features could be implemented into its devices. A 2009 patent application, for example, covered camera-equipped 3D displays that would shift in perspective based on a user's relative position.
Such a display would detect head movement, allowing a user to move their head around to look at a 3D image from different angles while also incorporating elements of a user's environment.
2010 and 2012 patents described the use of motion sensors to create a 3D interface for iOS devices using augmented reality techniques. Apple described the interface as a "virtual room" navigated by manipulating the orientation of the device through built-in sensors or through gestures.
In 2011, Apple filed a patent for an augmented reality feature in the Maps app related to mapping the distance to notable landmarks. With the camera, a user could look at the area around them and get real-time estimations of the distance between two points along with overlays of relevant information.
A patent filed in 2014 and granted in 2017 covers a mobile augmented reality system able to detect objects in the environment and overlay them with virtual information through the use of cameras, a screen, and a user interface. Apple describes the system as ideal for a head-mounted display, but it also shows it being used in smartphones.
Apple has been working on virtual reality technology that could be used within autonomous vehicles. Several Apple patents describe a system that includes an in-car virtual reality system with a VR headset worn to provide entertainment and to mitigate carsickness from tasks like reading and working while a vehicle is in motion.
A July 2020 patent application covers possible input methods with Apple Glasses, describing a system where the glasses use infrared heat sensing to detect when someone touches a real world object, allowing the glasses to then project controls onto a real world surface.
With this method, the Apple Glasses could project an AR control interface onto any actual object in the real world for a mixed reality overlay kind of effect.
Apple has been working on some kind of AR headset since 2016 and early rumors indicated Apple wanted to launch a product by 2019. That didn't happen, and current rumors indicate we could see the first augmented reality glasses or headset-type product in late 2021 or early 2022.