Apple Releasing New iOS 13 Developer Beta Today With Deep Fusion for New iPhones [Update: Delayed]

Apple will today release the first beta of an upcoming iOS 13 update, presumably iOS 13.2, which will introduce a feature that Apple promised at its iPhone 11 and iPhone 11 Pro event: Deep Fusion.

According to The Verge, today's update is aimed at adding Deep Fusion to Apple's newest iPhones.


Deep Fusion is a new image-processing system that uses the A13 Bionic and the Neural Engine. It takes advantage of machine learning techniques to do pixel-by-pixel processing of photos, optimizing for texture, detail, and noise in each part of the image.

The feature is aimed at improving indoor photos and photos taken in medium lighting, and it activates automatically based on the lens in use and the light level in the scene. The wide-angle lens uses Smart HDR by default for bright scenes, with Deep Fusion kicking in for medium or low light and Night mode taking over for darker scenes.

The telephoto lens uses Deep Fusion primarily, but Smart HDR activates instead when the lighting is very bright, and Night mode activates when the lighting is dark. The ultra wide-angle lens uses Smart HDR only and does not support Deep Fusion (or Night mode).
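Pieced together, the per-lens behavior amounts to a simple decision table. Here's a minimal sketch in Swift; the type names and the light-level buckets are illustrative assumptions, since Apple hasn't published the actual cutoffs Deep Fusion uses:

```swift
// Hypothetical model of the per-lens pipeline selection described above.
// The enum names and light-level buckets are illustrative; Apple has not
// documented the real thresholds.

enum Lens { case ultraWide, wide, telephoto }
enum LightLevel { case bright, medium, low, dark }
enum CaptureMode { case smartHDR, deepFusion, nightMode }

func captureMode(for lens: Lens, in light: LightLevel) -> CaptureMode {
    switch lens {
    case .ultraWide:
        // The ultra wide supports neither Deep Fusion nor Night mode.
        return .smartHDR
    case .wide:
        switch light {
        case .bright:       return .smartHDR    // the default for bright scenes
        case .medium, .low: return .deepFusion
        case .dark:         return .nightMode
        }
    case .telephoto:
        switch light {
        case .bright:       return .smartHDR    // only when very bright
        case .medium, .low: return .deepFusion  // the usual mode for this lens
        case .dark:         return .nightMode
        }
    }
}
```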

The Verge has a rundown on how Deep Fusion works, with info sourced from Apple. Deep Fusion runs entirely in the background, and unlike Night mode, there's no option to toggle it on or off.

Deep Fusion is a complex process, with the iPhone's hardware performing several actions when a photo is taken. Before the shutter button is pressed, the camera captures three frames at a fast shutter speed to freeze motion. When the shutter is pressed, three additional photos are captured, followed by one longer exposure to preserve detail.
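As a rough model of that capture sequence: the frame counts below come from the article, while the types and the shutter-speed values are assumptions made for the sake of the sketch.

```swift
import Foundation

// Illustrative model of the Deep Fusion capture sequence described above.
// Frame counts match the article; everything else is assumed.

struct Frame {
    let exposure: TimeInterval  // shutter speed, in seconds
    let capturedAt: Date
}

final class DeepFusionCaptureBuffer {
    private(set) var preShutterFrames: [Frame] = []  // rolling buffer of 3
    private(set) var shortFrames: [Frame] = []       // 3 taken at shutter press
    private(set) var longFrame: Frame?               // 1 longer exposure

    /// Before the shutter is pressed, the camera continuously grabs fast
    /// frames to freeze motion, keeping only the most recent three.
    func buffer(_ frame: Frame) {
        preShutterFrames.append(frame)
        if preShutterFrames.count > 3 { preShutterFrames.removeFirst() }
    }

    /// On shutter press: three more photos, then one longer exposure to
    /// preserve detail. (The exposure values here are made up.)
    func shutterPressed(capture: (TimeInterval) -> Frame) {
        shortFrames = (0..<3).map { _ in capture(1.0 / 120.0) }
        longFrame = capture(1.0 / 4.0)
    }
}
```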

The three regular photos and the long-exposure shot are merged into what Apple calls a "synthetic long," a step that differs from Smart HDR. Deep Fusion then chooses the short-exposure image with the most detail and merges it with the synthetic long exposure; unlike Smart HDR, only these two frames are merged.
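In code terms, that selection-and-merge step might look something like the sketch below. The sharpness metric and the naive averaging merge are stand-ins, since Apple hasn't described how either is actually computed:

```swift
// Sketch of the selection-and-merge step. `Image` is a grayscale stand-in
// for a real pixel buffer; the sharpness metric and the per-pixel average
// are placeholders, not Apple's actual math.

struct Image { var pixels: [Float] }

/// A crude detail metric: total local contrast between neighboring pixels.
func sharpness(_ image: Image) -> Float {
    zip(image.pixels, image.pixels.dropFirst())
        .reduce(0) { $0 + abs($1.0 - $1.1) }
}

/// Merge the three regular shots with the long exposure into the
/// "synthetic long" (a naive per-pixel average here).
func syntheticLong(regular: [Image], long: Image) -> Image {
    var out = long
    for image in regular {
        for i in out.pixels.indices { out.pixels[i] += image.pixels[i] }
    }
    for i in out.pixels.indices { out.pixels[i] /= Float(regular.count + 1) }
    return out
}

/// Produce the two frames Deep Fusion actually merges: the sharpest of the
/// fast motion-freezing frames, plus the synthetic long.
/// Assumes `fastFrames` is non-empty and all images are the same size.
func deepFusionInputs(fastFrames: [Image], regular: [Image], long: Image)
    -> (detail: Image, syntheticLong: Image) {
    let best = fastFrames.max { sharpness($0) < sharpness($1) }!
    return (best, syntheticLong(regular: regular, long: long))
}
```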

The images are then run through a four-step processing procedure, pixel by pixel, that increases detail and tells the A13 chip how the two images should be blended together for detail, tone, color, luminance, and more.
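The article doesn't spell out what the four steps are, but the overall shape of the operation (a per-pixel weighted blend driven by image content) can be sketched. The weighting rule below is a guess at the structure, not Apple's actual procedure:

```swift
// Illustrative per-pixel blend of the two inputs. The weight function is a
// placeholder: in the real pipeline, machine learning models on the Neural
// Engine decide how the frames blend, and the four steps are not public.

struct Pixel { var r, g, b: Float }

/// Blend the detail frame and the synthetic long, pixel by pixel.
/// `weight(i)` is in 0...1: 1 takes the detail frame, 0 the synthetic long.
func blend(detail: [Pixel], tone: [Pixel], weight: (Int) -> Float) -> [Pixel] {
    precondition(detail.count == tone.count)
    return detail.indices.map { i in
        let w = weight(i)
        return Pixel(r: detail[i].r * w + tone[i].r * (1 - w),
                     g: detail[i].g * w + tone[i].g * (1 - w),
                     b: detail[i].b * w + tone[i].b * (1 - w))
    }
}
```

A plausible weighting rule would favor the detail frame on high-frequency regions like hair and fabric and the synthetic long on smooth regions like sky, which matches the reporting that the system optimizes texture, detail, and noise differently in different parts of the image.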

Taking a Deep Fusion shot takes just a bit longer than a normal Smart HDR image, right around a second. If you tap into the Photos app immediately after shooting, Apple initially shows a proxy image, which is quickly replaced with the full Deep Fusion image.
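That proxy-then-final behavior mirrors how PhotoKit already hands apps images: with opportunistic delivery, the result handler can fire twice, first with a degraded preview and later with the full-quality asset. A minimal example using the real PhotoKit API (fetching the PHAsset itself is assumed to have happened elsewhere):

```swift
import Photos
import UIKit

// Requesting a just-taken photo with opportunistic delivery: the handler may
// be called twice, first with a low-quality proxy, then with the final image.

func loadPhoto(for asset: PHAsset, into imageView: UIImageView) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .opportunistic  // proxy first, full quality later
    options.isNetworkAccessAllowed = true

    PHImageManager.default().requestImage(
        for: asset,
        targetSize: PHImageManagerMaximumSize,
        contentMode: .aspectFit,
        options: options
    ) { image, info in
        // PHImageResultIsDegradedKey is true for the proxy delivery.
        let isProxy = (info?[PHImageResultIsDegradedKey] as? NSNumber)?.boolValue ?? false
        imageView.image = image
        if !isProxy {
            // The full-quality (e.g. finished Deep Fusion) image has arrived.
        }
    }
}
```

The PHImageResultIsDegradedKey check is how an app tells the quick proxy apart from the finished image.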

Update: According to TechCrunch's Matthew Panzarino, the developer beta featuring Deep Fusion has been delayed and will not be available today. There is no official word on when it will launch, but John Gruber of Daring Fireball says it's now set to arrive tomorrow.




Top Rated Comments


11 weeks ago

> Really wish this could be turned off. The Smart HDR has pretty terrible skin tones, especially for video, turning everyone into a "Trump-orange" color in many situations. I wouldn't trust the new version.

You haven't even seen it in action yet and you already wish it could be turned off? :oops:
Rating: 43 Votes
11 weeks ago

> You haven't even seen it in action yet and you already wish it could be turned off? :oops:

Isn’t that how it goes here? Complain about features before they’re even released??
Rating: 36 Votes
11 weeks ago
Finally Deep Fusion will make me a professional photographer
Rating: 26 Votes
11 weeks ago
Very nice indeed! I’m loving the camera on the iPhone 11 Pro already.
Rating: 9 Votes
11 weeks ago

> You haven't even seen it in action yet and you already wish it could be turned off? :oops:

I wish it could be turned off so someone would post a with and a without photo. For all we know this update simply adds sweaters to everyone in the frame.
Rating: 7 Votes
11 weeks ago
A lot of jargon. How about they show us before and after comparisons instead of just a photo of some guy wearing a quilt?
Rating: 7 Votes
11 weeks ago

> Much slower? The 11 Pro's CPU is some 20% faster than the A12. This is not much slower.
> Apple doesn't want to give those features to older phones in order to sell the latest models.

> Last year's top phone has a cheap autofocus system... Okkkk?!!!

It has nothing to do with CPU speed. Images are processed by the ISP, then fed to the GPU, CPU, and Neural Engine. The A12 has a completely different ISP than the A13. The A13 also has an ML coordinator block, and each large core has an additional matrix math (AMX) block. Combined, the AMX blocks do one trillion operations per second.
Rating: 6 Votes
11 weeks ago
Hope Apple releases 13.2 today so the upgrade police go crazy.
Rating: 6 Votes
11 weeks ago
Next year’s models will be able to show the patterns on teabags. Deep Infusion.
Rating: 5 Votes
11 weeks ago

> The hardware may handle it, but it will process much slower and the results won't look as good as the iPhone 11. Apple doesn't want people with the older generation to get inconsistent results, so they will just release it to the newest ones.
>
> Also, money.

Much slower? The 11 Pro's CPU is some 20% faster than the A12. This is not much slower.
Apple doesn't want to give those features to older phones in order to sell the latest models.

> The XS has a cheap autofocus system, not the 100% focus pixels Deep Fusion needs.

Last year's top phone has a cheap autofocus system... Okkkk?!!!
Rating: 5 Votes
