Recent hardware and software upgrades from Apple, along with software updates from Adobe, have enabled some interesting new techniques in the realm of depth mapping in photos. They're not totally obvious, so in this article I'll take you through some sneaky techniques and explore how to get the best out of your fancy new iPhone. For everything to work, you'll need the new High Sierra operating system on your Mac, iOS 11 on your iPhone 7 Plus, 8 Plus, or X, and the latest release of Photoshop CC 2018. Let's get into it.
We've all been using JPEG for a very long time. It saves a lot of space over lossless compression methods and looks pretty good. However, just as H.265 (AKA HEVC) is trying to replace H.264 for video, HEIF is positioning itself as a JPEG replacement for still images. It's actually based on the same underlying technology as H.265, and claims to take potentially just half the space of an equivalent JPEG. Such a big change can't happen overnight, however, and you'll have to enable the “more efficient” preference, hidden deep in the camera settings, before your new iPhone will abandon JPEG.
Besides the potential space savings, HEIC can record more than 8 bits of color info, helpful in the move to wide gamut HDR displays. Another big reason to move to this new format is that it can contain more than just a single image. Apple's default extension is actually HEIC, which stands for High Efficiency Image Container, and they’re including a depth map as well as the photo itself.
If you want to record depth information, you need an iPhone with two rear cameras from 2016 or later. Currently that means a 7 Plus, an 8 Plus, or the iPhone X. The iPhone X can also use its front-facing sensor array to give a depth map with the selfie camera. The information you get from the rear cameras is calculated from the difference between the normal and telephoto lenses, while the selfie camera on the X gets the information directly, but the result is similar.
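The maths behind that rear-camera calculation is simple triangulation: a subject's apparent shift (its disparity) between the two lenses is inversely proportional to its distance. Here's a minimal sketch, with a made-up focal length and lens separation purely for illustration:

```python
# Illustrative sketch of stereo depth (not Apple's actual pipeline).
# A point at depth Z, seen by two cameras a baseline b apart, shifts by
# disparity d = f * b / Z between the two images, so Z = f * b / d.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Recover depth in metres from the pixel disparity between two cameras."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers: a nearby subject shifts far more than the background.
near = depth_from_disparity(focal_px=2800, baseline_m=0.01, disparity_px=28)   # about 1 m
far = depth_from_disparity(focal_px=2800, baseline_m=0.01, disparity_px=2.8)   # about 10 m
```

Because disparity shrinks quickly with distance, the map is precise for close subjects and increasingly vague for the background, which is part of why these depth maps look patchy.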
In iOS 11, in the Camera app’s Portrait Mode, the depth map is used to digitally blur out the background behind your subject, emulating the “bokeh” from a traditional camera. It's difficult to blur out the background with the tiny camera sensor in an iPhone, and if you want that blur, but can’t get a larger sensor camera in your pocket, this is the next best thing.
The depth information is used to figure out where the foreground subject is, and also to determine how far away each of the background elements are. Because the depth map is a little patchy, the camera app will make some educated guesses as to exactly which parts of the picture should be in the foreground and in the background. Sometimes it guesses correctly — and to be fair, it has improved — but often, it just fails.
(Incidentally, iOS 11 also included some new lighting methods you can use to adjust a photograph, both while you're taking it and after the fact, but these mostly use face detection to perform their magic. Stage Light uses depth information as well, but is very hard to get right.)
Because HEIC is brand new, you'll need the latest release of Photoshop on macOS High Sierra if you want to view it. To avoid any potential recompression from the (excellent) new High Sierra Photos app, I've used the Image Capture app to copy the files directly from my iPhone. That app shows that two files are created for a Portrait mode shot: a HEIC containing the original image plus the depth map, and a JPEG that includes the blur effect.
It’s really important to note that the HEIC file does not include any blur. You’ll have to add that yourself, or open the JPEG instead.
As long-time Photoshop users might expect, the HEIC's depth map is included as a channel, found in the Channels panel under the name Depth Map. Clicking its name shows that it's not a terribly high-fidelity image. That's a problem, but the primary issue you're going to have with this channel is that it's all grey. For it to be of maximum use, you'll want to use Levels to redistribute its dynamic range between black and white.
To keep the original Depth Map intact, right-click and duplicate it, then name the duplicate “Depth Map fixed”. Press Command-L for old-school Levels (adjustment layers can't be used with channels) and move the black and white points to the ends of the actual data. In the next step, the black areas won't be blurred and the white areas will receive maximum blur, so if you want to push these points further, go ahead. Press OK.
(Note: You can skip this step, but adjusting the Blur Focal Distance in the next step will be both more important, and more difficult to get right.)
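For intuition, that Levels move is just a linear remap of the channel's values so its darkest pixel becomes pure black and its brightest becomes pure white. A rough pure-Python sketch, with hypothetical pixel values:

```python
# A sketch of what the Levels step does to the depth channel: stretch the
# channel's actual black and white points out to the full 0-255 range so
# the blur filter has the whole scale to work with. Values are made up.

def apply_levels(pixels, black_point, white_point):
    """Stretch 8-bit values so black_point maps to 0 and white_point to 255."""
    scale = 255.0 / (white_point - black_point)
    return [min(255, max(0, round((p - black_point) * scale))) for p in pixels]

# An all-grey depth map spanning only 96..160 gets its full contrast back:
depth_row = [96, 112, 128, 144, 160]
print(apply_levels(depth_row, 96, 160))  # [0, 64, 128, 191, 255]
```

Pushing the black or white point past the ends of the data clips more pixels to fully-sharp or fully-blurred, which is exactly the effect of being more aggressive with the sliders.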
If you can see any obvious flaws in your depth map, you can try to fix them by painting with the appropriate grey now — but it’ll be tricky to get right.
Click back on the RGB to see the image again, then click back to the Layers panel. The filter we’re about to use is one of the few that doesn't work with adjustment layers, so duplicate the Background layer with a quick Command-J for safety.
Apply Filter > Blur > Lens Blur, and you should see the “Depth Map fixed” channel already selected as the source. The options here let you simulate different kinds of camera iris and change the strength with Radius; you can also shift the focal point somewhat with Blur Focal Distance, though if you push it too far it's going to look a little wrong.
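To see why the remapped channel matters, here's a toy sketch of depth-weighted blurring. This is just the core idea, not Photoshop's actual algorithm, which also simulates the iris shape; for simplicity it works on a single row of pixels:

```python
# Toy depth-dependent blur: blend each pixel between the sharp image and a
# blurred copy, weighted by the depth channel (0 = in focus, 255 = max blur).

def box_blur(row, radius):
    """Simple 1-D box blur with edge clamping."""
    out = []
    for i in range(len(row)):
        lo, hi = max(0, i - radius), min(len(row) - 1, i + radius)
        window = row[lo:hi + 1]
        out.append(sum(window) / len(window))
    return out

def lens_blur_1d(row, depth, radius=2):
    """Blend sharp and blurred values per pixel according to the depth map."""
    blurred = box_blur(row, radius)
    return [s + (b - s) * d / 255.0 for s, b, d in zip(row, blurred, depth)]

sharp = [10, 200, 10, 200, 10]
depth = [0, 0, 255, 255, 255]  # foreground stays sharp, background blurs fully
result = lens_blur_1d(sharp, depth)
```

Notice that mid-grey depth values produce partial blur, which is why a channel squashed into the middle of the range (before the Levels fix) gives a weak, muddy result.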
If you're comfortable playing with channels, you can try tweaking the depth map so that it can be used as a layer mask. However, I found that the depth map isn't quite accurate enough to make this work – and the frequent failure of Apple’s own Stage Light feature shows this is hard. The Depth Map might be useful as a starting point, but I suspect the old standby Color Range will give cleaner edges in most cases.
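If you do want to experiment with that, the crudest starting point is simply thresholding the channel: near (dark) pixels become the selection, far (light) pixels don't. This is an illustrative sketch only, not what Color Range does:

```python
# Crude depth-to-mask conversion (illustration only): select everything
# nearer than a threshold. In this depth map convention, dark = near.

def depth_to_mask(depth_pixels, threshold=128):
    """Binary mask: 255 (selected) for near pixels, 0 for far ones."""
    return [255 if d < threshold else 0 for d in depth_pixels]

print(depth_to_mask([10, 60, 200, 240]))  # [255, 255, 0, 0]
```

The hard threshold is exactly where this falls apart in practice: the patchy depth map puts stray near values in the background and vice versa, so the edges need manual cleanup.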
While it’s great to see support for new image formats in the latest version of Photoshop, for this particular task it’s going to be a lot of effort for a minor improvement. Depth detection just isn't accurate enough yet, and if you want an out-of-focus background, you’ll save a huge amount of time just by using a larger sensor camera with a fast lens. Sometimes you can fake it in post, but blur is often too tricky, and the flaws all too obvious. Rumour has it that a future iPhone will include a dedicated depth sensor, and if that produces a higher quality depth map, this technique will become much more useful.