
Apple Readies 3D Sensing Rear Camera Component Supplies for 2020 iPhones

Apple has asked one of its manufacturing partners to ready components for use in rear Time-of-Flight (ToF) camera lenses said to be coming to next year's iPhone lineup, according to DigiTimes.

Apple has reportedly asked its supply chain partner to supply VCSEL components for use in rear ToF camera lenses in its mobile devices to be released in 2020, according to supply chain sources.
Multiple sources, including Bloomberg's Mark Gurman and Apple analyst Ming-Chi Kuo, have claimed that Apple's 2020 iPhones will include a laser-powered time-of-flight 3D rear camera that will result in significant improvements to AR experiences.

VCSELs, or vertical-cavity surface-emitting lasers, are a key component of Apple's TrueDepth camera in the iPhone XR, XS, and XS Max, and power several flagship features like Face ID, Animoji, and Portrait mode selfies, as well as the proximity-sensing capabilities of AirPods. However, a ToF camera system is a major step up from TrueDepth due to its more advanced use of lasers.

TrueDepth relies on a structured-light technique that projects a pattern of 30,000 laser dots onto a user's face and measures the distortion to generate an accurate 3D image for authentication. By contrast, ToF calculates the time it takes for a laser to bounce off surrounding objects to create a 3D image of the environment. This allows for more accurate depth perception and better placement of virtual objects, and should also result in photos better able to capture depth.
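
To make the round-trip idea concrete, here is a minimal sketch in Swift of how a measured pulse time maps to a depth reading. It is purely illustrative and not tied to any Apple hardware or API; the function name and the example timing value are assumptions.

// Illustrative only: per-pixel depth from a time-of-flight measurement.
// The laser pulse travels out to the object and back, so halve the round trip.
let speedOfLight = 299_792_458.0  // metres per second

func depth(roundTripTime seconds: Double) -> Double {
    return speedOfLight * seconds / 2.0
}

// A ~30-nanosecond round trip works out to roughly 4.5 metres,
// close to the ~15-foot range Bloomberg describes for the rear sensor.
print(depth(roundTripTime: 30e-9))  // ≈ 4.5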

Bloomberg says that the rear camera in Apple's 2020 iPhones will be able to scan areas up to 15 feet from the device. Apple's front-facing TrueDepth camera uses 3D technology but its structured-light system only works at distances of 25 to 50 centimeters.

Apple is said to have gained a two-year lead over its rivals in the smartphone industry in the area of 3D sensing technology, having secured the necessary hardware well in advance of competitors. Sony could be the ToF supplier that DigiTimes is referring to in the paywalled article, as Apple has reportedly been in talks with Sony over ToF sensor tests. That said, in December 2017 Apple said it planned to invest $390 million in Finisar Corp, which currently supplies VCSEL components.

At the time, Apple said its Finisar investment would enable the supplier to exponentially increase its R&D spending and high-volume production of VCSELs. Apple initially sourced VCSELs for 2017's iPhone X chiefly from California-based Lumentum, but it was bottlenecks in production there that helped spur the $390 million deal with Finisar.

Lumentum subsequently ramped up additional manufacturing capacity for VCSELs and edge-emitting lasers in the first half of fiscal 2019. Another producer, Austria-based Ams, also makes VCSEL chips, and in March 2018 said it had won a large deal with an unnamed smartphone maker, so there are a few potential suppliers that Apple could be leaning on.

There were originally some rumors suggesting Apple would introduce a rear 3D camera system in its 2019 iPhones, but Kuo said that wouldn't happen because Apple needs 5G connectivity, augmented reality glasses, and a more powerful Apple Maps database to truly take advantage of the AR capabilities afforded by a ToF camera.

Bloomberg has since confirmed that Apple was aiming to put the 3D rear camera system in this year's iPhones, but ultimately had to delay its plans. Whether that decision is related to rumors that Apple has had to temporarily stop developing AR/VR headsets remains unclear.


Top Rated Comments


18 weeks ago

I wish I had the imagination to think of all the ways this will be able to be used, but I’m sure it’ll be a lot.



Wait, there’s one more thing.

3D Emojis/Animojis!
Rating: 10 Votes
18 weeks ago

I volunteer to be the downer here:
This is another feature that has nothing to do with the main purpose of the iPhone (phone, simple camera and internet/email).

I volunteer to provide you with a reality check:
Thanks for defining what the main purpose of the "iPhone" is for the whole world.:rolleyes:
Rating: 10 Votes
18 weeks ago


Haha that mockup was definitely created before the iPhone X launched. I remember everyone thinking of how they’ll replace the home button via the software and most people thought it’d be just a big white button with two extra options on each side.
Rating: 4 Votes
18 weeks ago
This is very significant news. And expected. It speaks to Apple's huge push into AR.

It also aligns with Tim Cook introducing the iPhone X a couple of years ago, where he said (paraphrased) that the X is a demonstrator/testbed of sorts for the technologies needed to push Apple into AR, with the front-facing camera used for Face ID (and its software) being instrumental to that push. As an aside, it's why Face ID will not be going away.

The level of innovation to now bring backside 3D time-of-flight camera sensing technology in miniaturized form to a consumer device is nothing short of astounding.
Rating: 4 Votes
18 weeks ago
Maybe they will use the iPhone as a test bed for the glasses' 3D sensor? When the ToF software/hardware has been successfully implemented, they can move on to the glasses... seems logical.
Rating: 3 Votes
18 weeks ago
I volunteer to be the downer here:
This is another feature that has nothing to do with the main purpose of the iPhone (phone, simple camera and internet/email).
Nevertheless this will drive up the prices and make the phones more complex than they need to be.
I hope there will be a strong backlash on prices and Apple will be forced to release a more basic line of iPhones and then a high priced line of phones for people who want to do all the fancy AR, VR, Gaming, high end video things.
Rating: 3 Votes
18 weeks ago
...Yet another reason to skip the 2019 round.
Rating: 3 Votes
17 weeks ago

Isn't Samsung the only vendor with phones that have a variable-aperture lens? That's quite a fundamental advantage. No amount of AI trickery can compensate for that. Same with the multiple lenses. The main reason to have them is to be able to use different focal lengths. This is critical. Apple phone cameras have been decent lately but not really special. Also, Samsung's camera software has been offering more features than what the iPhone offers. You might not be familiar with the fact that Samsung actually designed and manufactured real digital cameras for quite a while, and their models were very innovative. They also produce and use their own image sensors (in addition to Sony sensors). I am not aware of a single feature in the iPhone camera that is not available on Samsung phones.

Is dynamic video HDR available on Samsung phones? (But even then, XS Max videos are about the best around. And Samsung has a faster shutter with slow-mo, but the time it runs is limited :mad:) The usefulness of variable-aperture lenses can be debated on lenses with short focal lengths and tiny sensors.

Thank you. As a cinematography and photography nerd... I really pay attention to what the cameras do, along with the image they produce. The iPhone XS/XS Max is the only phone right now with an ISP that produces actual HDR video (and in real time). They do this WITHOUT a dual aperture or third lens. Camera tests from the S9, Note 9, and S10 all show VERY little difference in quality when switching between apertures. That seemed like it should've MAJORLY improved quality in low-light conditions, but with sensors so small and such a short focal length, smartphone cameras will never truly be able to benefit from that. However, Apple was able to make LEAPS in quality between the X and XS by 1. giving the cameras a bigger sensor that lets more light in to paint the picture, and 2. beefing up the processor and ISP with a real-time neural engine that allows for zero shutter lag AND more accurate, higher dynamic range in both photos and video. This higher dynamic range is really being ignored, but it allows for a post-editing experience that's a lot closer to working with RAW footage due to all the HDR data captured. Samsung is a company that produces hardware. Apple is all about achieving their goals through software and AI manipulation.
As for the SUPER SLO-MO gimmick that the Samsung devices have... it's not practical for filming. It's unreliable because you don't get to control it; the processor has to work so hard that the machine has to decide WHEN it slows the motion AND for how long. Also, doesn't the Note 9 and below have a cap on 4K recording time (specifically 4K 60fps)?
Rating: 2 Votes
18 weeks ago
So don’t buy a phone to replace my X this year. Got it.
Rating: 2 Votes
18 weeks ago
Keep these upgrades coming. I’m so ready to upgrade in 2020 from my 8 Plus lol. The iPhone render on the article is not too bad actually
Rating: 2 Votes
