Apple Releasing New iOS 13 Developer Beta Today With Deep Fusion for New iPhones
Apple will today release the first beta of an upcoming iOS 13 update, presumably iOS 13.2, which will introduce a feature that Apple promised at its iPhone 11 and iPhone 11 Pro event: Deep Fusion.
According to The Verge, today’s update is aimed at adding Deep Fusion to Apple’s newest iPhones.
Deep Fusion is a new image processing system that uses the A13 Bionic chip and the Neural Engine. It takes advantage of machine learning techniques to do pixel-by-pixel processing of photos, optimizing for texture, details, and noise in each part of the image.
The feature is aimed at improving indoor photos and photos taken in medium lighting, and it activates automatically based on the lens being used and the light level in the scene. The wide-angle lens will use Smart HDR by default for bright scenes, with Deep Fusion activating in medium or low light and Night mode activating for darker scenes.
The telephoto lens will use Deep Fusion primarily, but Smart HDR will activate instead when the lighting is very bright and Night mode will activate when the lighting is dark. The ultra wide-angle lens uses Smart HDR only and does not support Deep Fusion (or Night mode).
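To make those lens and lighting rules concrete, here's a minimal sketch of the decision logic in Swift. The enum cases, function name, and brightness flags are hypothetical, since Apple does not expose any API for choosing between Smart HDR, Deep Fusion, and Night mode; the selection happens automatically inside the Camera app.

```swift
import Foundation

// Hypothetical sketch of the mode-selection behavior described above.
// These types are illustrative; Apple does not expose this in a public API.
enum CameraLens { case ultraWide, wide, telephoto }
enum CaptureMode { case smartHDR, deepFusion, nightMode }

func captureMode(for lens: CameraLens, sceneIsBright: Bool, sceneIsDark: Bool) -> CaptureMode {
    switch lens {
    case .ultraWide:
        // The ultra wide-angle lens only supports Smart HDR.
        return .smartHDR
    case .wide:
        // Bright scenes use Smart HDR, medium or low light uses Deep Fusion,
        // and very dark scenes fall back to Night mode.
        if sceneIsBright { return .smartHDR }
        if sceneIsDark { return .nightMode }
        return .deepFusion
    case .telephoto:
        // Deep Fusion is the default, with Smart HDR for very bright scenes
        // and Night mode when the lighting is dark.
        if sceneIsBright { return .smartHDR }
        if sceneIsDark { return .nightMode }
        return .deepFusion
    }
}
```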
The Verge has a rundown on how Deep Fusion works, with info sourced from Apple. Deep Fusion runs entirely in the background, and unlike Night mode, there’s no option to toggle it on or off.
Deep Fusion is a complex process, with the hardware in the iPhone performing several actions when a photo is taken. Before the shutter button is even pressed, the camera captures three frames at a fast shutter speed to freeze motion. When the shutter button is pressed, three additional photos are captured, followed by one longer exposure to preserve detail.
The three regular photos and the long-exposure shot are merged into what Apple is calling a "synthetic long," which is a key difference from Smart HDR. Deep Fusion then chooses the short-exposure image with the most detail and merges it with the synthetic long exposure; unlike Smart HDR, only those two frames are merged.
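As a rough illustration of that capture-and-merge sequence, here is a sketch in Swift. The Frame type, the sharpness score, and the merge helper are hypothetical stand-ins; Apple's actual pipeline runs on the A13's Neural Engine and is not publicly documented.

```swift
import Foundation
import CoreImage

// Illustrative model of the Deep Fusion inputs described above.
struct Frame {
    let image: CIImage
    let exposureDuration: TimeInterval
    let sharpnessScore: Double   // hypothetical per-frame detail metric
}

// Assumes the arrays are non-empty: three fast frames buffered before the
// shutter press, three more at the press, and one long exposure.
func deepFusionInputs(preShutter: [Frame],
                      atShutter: [Frame],
                      longExposure: Frame) -> (reference: Frame, syntheticLong: Frame) {
    // Merge the regular shots with the long exposure into a "synthetic long"
    // (here just a placeholder; real processing aligns and accumulates frames).
    let syntheticLong = mergeIntoSyntheticLong(atShutter + [longExposure])

    // Pick the short-exposure frame with the most detail as the reference.
    // Only these two frames are blended in the next stage.
    let reference = (preShutter + atShutter).max { $0.sharpnessScore < $1.sharpnessScore }!

    return (reference, syntheticLong)
}

func mergeIntoSyntheticLong(_ frames: [Frame]) -> Frame {
    // Placeholder merge that just carries the first image forward.
    return Frame(image: frames[0].image,
                 exposureDuration: frames.reduce(0) { $0 + $1.exposureDuration },
                 sharpnessScore: 0)
}
```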
The images are then run through a four-step processing procedure, pixel by pixel, aimed at increasing detail and providing instructions to the A13 chip on how the two images should be blended together, drawing detail from one and tone, color, and luminance from the other.
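Here is a minimal sketch of what such a per-pixel blend could look like, assuming a simple weighted combination. The real weighting comes from Apple's machine-learned models, so the Pixel type and detailWeight parameter are purely illustrative.

```swift
// Hypothetical per-pixel blend: detail comes from the sharp reference frame,
// tone, color, and luminance come from the synthetic long exposure.
struct Pixel {
    var r: Double, g: Double, b: Double
}

func blend(referencePixel: Pixel,       // from the sharpest short exposure
           syntheticLongPixel: Pixel,   // from the synthetic long exposure
           detailWeight: Double) -> Pixel {
    // detailWeight would vary pixel by pixel: higher for fine textures like
    // hair or fabric, lower for smooth areas like skies and walls.
    let w = min(max(detailWeight, 0), 1)
    return Pixel(
        r: referencePixel.r * w + syntheticLongPixel.r * (1 - w),
        g: referencePixel.g * w + syntheticLongPixel.g * (1 - w),
        b: referencePixel.b * w + syntheticLongPixel.b * (1 - w)
    )
}
```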
Taking a Deep Fusion shot takes a bit longer than a normal Smart HDR image, right around a second. If you tap over into Photos right after taking a Deep Fusion shot, Apple will initially show a proxy image, which is quickly replaced with the full Deep Fusion image once processing finishes.
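That proxy-then-replace behavior follows a familiar pattern on iOS, sketched below. The view controller and the deepFusionImageReady notification are hypothetical; the Photos app handles this swap internally.

```swift
import UIKit

// Rough sketch of showing a proxy image and swapping in the final one later.
final class PhotoViewController: UIViewController {
    private let imageView = UIImageView()
    private var observer: NSObjectProtocol?

    func show(proxy: UIImage) {
        // Display the quickly available proxy image immediately...
        imageView.image = proxy

        // ...then swap in the full Deep Fusion image once the roughly
        // one-second processing pass has finished.
        observer = NotificationCenter.default.addObserver(forName: .deepFusionImageReady,
                                                          object: nil,
                                                          queue: .main) { [weak self] note in
            self?.imageView.image = note.object as? UIImage
        }
    }
}

extension Notification.Name {
    // Hypothetical notification posted when the processed image is ready.
    static let deepFusionImageReady = Notification.Name("DeepFusionImageReady")
}
```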
There are no specific details on when iOS 13.2 (and presumably iPadOS 13.2) will be released to the public, but the update could arrive at any time.