'High dynamic range' – or HDR – photography takes multiple exposures of the same scene, then stitches them together. Both iPhones and Android devices come with HDR options.
Over the past year, data from the photo-sharing network Flickr has revealed a massive shift in how people take pictures. Digital photographs come with a snippet of code that identifies what kind of camera took each shot. Flickr tracks this information across its 75 million users, showing the rise and fall of popular cameras. Last year, the site revealed that the most popular camera isn't a camera at all. It's a phone.
Smartphone cameras have improved greatly year over year, gaining higher megapixel counts, better lenses, and an often-misunderstood feature called "high dynamic range," or HDR.
What is HDR?
This camera setting tries to tackle one of the weaknesses of phone photography. Few phones offer manual exposure settings. That leads to problems in scenes that mix harsh light and heavy shadow.
Take a look at the images above. The left photo was taken without HDR. The brick walkway looks great, but the camera lost a lot of detail in the bright and dark areas. The sky looks washed out. The flowers seem lost in shadow.
A professional could repair this image with editing. But HDR tries to fix the image as it's being taken.
With HDR on, a phone will snap three photos in rapid succession. Each image uses a different exposure level. One shot is tuned for dark areas, such as the flowers in the image above. One captures the bright spots, such as the sky. The third goes for the mid-tones, much like the non-HDR photo.
After it snaps all three photos, the phone tries to identify the best aspects of each shot and stitch them into a single image.
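For the technically curious, the merging step can be sketched in a few lines of code. Phone makers don't publish their exact methods, so this is only a toy illustration: it assumes three aligned exposures stored as arrays of brightness values between 0 and 1, and it uses a simple "well-exposedness" weight (pixels closest to mid-gray count the most), in the spirit of exposure-fusion techniques. The function name and the `sigma` parameter are invented for this example.

```python
import numpy as np

def merge_exposures(dark, mid, bright, sigma=0.2):
    """Blend three exposures of the same scene into one image.

    Each input is a float array with pixel values in [0, 1].
    Pixels closest to mid-gray (0.5) in a given exposure get the
    highest weight, so each region of the final image comes mostly
    from the shot that exposed it best.
    """
    stack = np.stack([dark, mid, bright])  # shape: (3, height, width)
    # Well-exposedness weight: peaks at 0.5, falls toward 0 near
    # pure black (lost shadows) and pure white (blown highlights).
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0)  # normalize weights per pixel
    return (weights * stack).sum(axis=0)
```

A real camera pipeline does far more work than this sketch: it aligns the frames to correct for hand shake, avoids "ghosting" from moving subjects, and applies tone mapping so the result looks natural on screen.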
While the final product is rarely perfect, it's often much better than the original – with no extra work required. If you're unhappy with the results, the iPhone actually saves two images: the HDR version and the shot it would have taken without the feature.