Here's what Apple says about the Deep Fusion update:
iOS 13.2 introduces Deep Fusion, an advanced image processing system that uses the A13 Bionic Neural Engine to capture images with dramatically better texture, detail and reduced noise in lower light on iPhone 11, iPhone 11 Pro and iPhone 11 Pro Max.
(Additional features include updated and additional emoji, Announce Messages for AirPods, support for AirPods Pro, HomeKit Secure Video, HomeKit-enabled routers and new Siri privacy settings. This update also contains bug fixes and improvements.)
Plus, by way of more explanation:
Pixel-by-pixel, the shot is run through an additional four steps, all in the background and essentially instantaneously, all working to eke out the most detail. The sky and walls in a given shot are processed on the lowest band; hair, skin, fabrics and other fine elements are run through the highest band. Deep Fusion then selects details from each of the exposures provided to it, picking out the best detail, color, luminance, and tone for the final shot.
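To make that per-pixel selection idea concrete, here's a minimal sketch in Swift. This isn't Apple's actual pipeline (which runs on the Neural Engine with machine-learned weighting); it's an illustrative toy, assuming a crude gradient-based "detail" measure, with all names invented for the example. The principle is the same, though: flat regions (sky, walls) end up averaging across frames for noise reduction, while detailed regions (hair, fabric) pull mostly from the sharpest exposure.

```swift
import Foundation

typealias Frame = [[Double]]  // one grayscale exposure, row-major

// Crude local-detail measure: absolute gradient magnitude at (x, y).
// Apple's real system is far more sophisticated; this is illustrative only.
func detail(_ f: Frame, _ y: Int, _ x: Int) -> Double {
    let h = f.count, w = f[0].count
    let dx = f[y][min(x + 1, w - 1)] - f[y][max(x - 1, 0)]
    let dy = f[min(y + 1, h - 1)][x] - f[max(y - 1, 0)][x]
    return abs(dx) + abs(dy)
}

// Fuse the exposures pixel-by-pixel: each frame's contribution is
// weighted by its local detail, so the most detailed exposure dominates
// in textured areas while flat areas blend evenly (suppressing noise).
func fuse(_ frames: [Frame]) -> Frame {
    let h = frames[0].count, w = frames[0][0].count
    var out = Frame(repeating: [Double](repeating: 0, count: w), count: h)
    for y in 0..<h {
        for x in 0..<w {
            var num = 0.0, den = 0.0
            for f in frames {
                let weight = detail(f, y, x) + 1e-6  // avoid divide-by-zero
                num += weight * f[y][x]
                den += weight
            }
            out[y][x] = num / den
        }
    }
    return out
}
```

Run `fuse` over a handful of bracketed exposures and you get the basic Deep Fusion flavour: one output frame assembled from whichever input had the most to offer at each pixel.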
The iPhone's A13 chipset represents the most processing power ever thrown at image capture in real time; I think we've now gone beyond what the old (2012) Nokia 808 PureView could do in hardware, in its dedicated, hard-wired ISP. But it's all being done in software now, which can be improved and updated, plus it's done with multiple exposures and with modern advances in OIS and sensor sensitivity.
Exciting, eh? As I say, watch this space!!