How Google's Pixel phone builds a better photo

The new Google Pixel phone has a super camera, with snappy performance and image quality that beats Apple's iPhone 7 Plus overall. So how did Google do it?

A lot of the success comes from a technology Google calls HDR+ that blends multiple photos into one to overcome common problems with mobile phone photography. Knowing that photography is a top-three item for people buying a phone, Google invested heavily in the Pixel camera. HDR+ is a key part of that effort.
You may prefer Samsung Galaxy phones or Apple iPhones, but if you're a shutterbug looking for a new phone, HDR+ is a good reason to put Google's Pixel on your short list.
HDR+ is an example of computational photography, a fast-moving field in which the creation of a photo doesn't stop when an image sensor turns light from a scene into digital data. Instead, computer chips add extra steps of processing. That's useful for reducing noise, correcting lenses' optical shortcomings and stitching a camera sweep into a single panoramic shot.
But HDR+ improves what's arguably a more noticeable part of image quality called dynamic range, the ability to photograph both dim shadows and bright highlights. Expensive cameras with large sensors are better at handling both, ensuring that pine needles don't disappear into a swath of black and details on a wedding dress don't blow out into a blaze of white. But because of the way small image sensors work, phone cameras struggle with dynamic range.
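To put rough numbers on that: photographers measure dynamic range in stops, each stop a doubling of light. Here's a quick illustrative calculation -- the contrast ratios are hypothetical examples, not measurements of any particular sensor:

```python
# Illustrative only: dynamic range expressed in "stops", i.e. doublings
# of light. The contrast ratios below are hypothetical, not measured
# figures for any specific camera or phone sensor.
import math

def dynamic_range_stops(brightest: float, darkest: float) -> float:
    """Contrast ratio between the brightest and darkest usable signal
    levels, expressed in stops (log base 2)."""
    return math.log2(brightest / darkest)

# A large-sensor camera that records a 4000:1 contrast ratio covers
# about 12 stops; a small sensor limited to 500:1 covers only about 9.
print(dynamic_range_stops(4000, 1))  # ~11.97 stops
print(dynamic_range_stops(500, 1))   # ~8.97 stops
```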




Google's HDR+ technology, used in the image at right, blends several underexposed frames into one final photo to brighten dim areas while keeping unpleasant glare out of bright patches.
Enter high dynamic range photography, or HDR. Plenty of cameras these days employ HDR techniques -- Apple has since the iPhone 4 back in 2010 -- but Google's HDR+ does so particularly well. The result is something that looks more like what your own eyes see.
HDR+ starts with the Pixel's ability to cycle a constant stream of photos through the phone's memory whenever the camera app is open -- a circular buffer that fills at 30 frames per second when it's bright and 15 per second when it's dim. When you tap the shutter button, the phone grabs raw image data from the last 5 to 10 frames and gets to work, according to Tim Knight, leader of Google's Android camera team.
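As a rough sketch of that buffering idea -- illustrative Python, not Google's implementation; only the frame counts come from the article:

```python
# A minimal sketch, assuming a simple fixed-size circular buffer: keep
# the most recent frames, then grab the last several when the shutter
# fires. The capacity of 10 follows the article's "last 5 to 10 frames".
from collections import deque

BUFFER_SIZE = 10  # assumed capacity

frame_buffer = deque(maxlen=BUFFER_SIZE)  # old frames drop off automatically

def on_new_frame(raw_frame):
    """Called for every frame the sensor streams while the app is open."""
    frame_buffer.append(raw_frame)

def on_shutter_press():
    """Hand the buffered raw frames off for merging."""
    return list(frame_buffer)

# Simulate a stream of frames, then a shutter press.
for i in range(42):
    on_new_frame(f"raw_frame_{i}")
burst = on_shutter_press()
print(burst[0], "...", burst[-1])  # raw_frame_32 ... raw_frame_41
```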
The key part of HDR+ is making sure highlights don't blow out into a featureless whitewash, a common problem with clouds in the sky and cheeks in sunlight.
"HDR+ wants to maintain highlights," Knight said "We're capturing all the data underexposed -- sometimes 3 to 4 stops underexposed," meaning that each frame is actually up to 16 times darker than it ought to look in a final photo. By stacking up these shots into a single photo, though, HDR+ can brighten dark areas without destroying the photo with noise speckles. And it can protect those highlights from washing out.
HDR+ predates the Pixel, but special-purpose hardware -- Qualcomm's Hexagon digital signal processor -- lets Google accelerate it on the Pixel. "Our goal was to maintain quality but improve speed," Knight said. "We met that goal."
Specifically, Google uses Halide, an open-source programming language designed for fast image processing. It took Google two years to adapt Halide so it would run on the Hexagon technology.
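Halide's central idea is separating what an image pipeline computes from how that computation is scheduled onto hardware. The snippet below is plain NumPy, not Halide, but it shows the flavor of the multi-stage pipelines Halide expresses -- here, a blur split into two separable passes -- which the language can then retarget to processors like the Hexagon DSP:

```python
# Not Halide code: a plain-NumPy sketch of the kind of multi-stage image
# pipeline Halide is designed for. Halide separates the algorithm (the
# two blur stages below) from the schedule (how stages are tiled,
# vectorized and mapped onto hardware), which this sketch leaves implicit.
import numpy as np

def blur_3x3(image: np.ndarray) -> np.ndarray:
    """A 3x3 box blur written as two separable passes.
    Edges wrap around, for simplicity."""
    # Stage 1: horizontal blur.
    blur_x = (np.roll(image, 1, axis=1) + image + np.roll(image, -1, axis=1)) / 3
    # Stage 2: vertical blur of stage 1's output.
    blur_y = (np.roll(blur_x, 1, axis=0) + blur_x + np.roll(blur_x, -1, axis=0)) / 3
    return blur_y

img = np.random.default_rng(1).random((8, 8))
print(blur_3x3(img).shape)  # (8, 8)
```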
HDR in general works better when it starts with good raw material. Google chose a high-end 12-megapixel Sony IMX378 sensor with large pixels, which are better able to distinguish bright from dark and to avoid image noise in the first place.

An example of the halo artifact HDR+ can produce along high-contrast edges.
Another general HDR problem is ghosting, artifacts stemming from differences in the frames caused by moving subjects like running children or trembling tree leaves. Blurring from camera shake also can be a problem. Using artificial intelligence techniques, Google's HDR+ quickly analyzes the burst of photos to pick a "lucky shot" that serves as the basis for the final photo.
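One plausible way to pick such a frame is sketched below, using a simple sharpness score (the variance of a Laplacian filter). Google hasn't published its selection logic at this level of detail, so treat the metric and the helper names as assumptions for illustration:

```python
# A hedged sketch of "lucky shot" selection, assuming a basic sharpness
# metric; the function names here are hypothetical, not Google's.
import numpy as np

def laplacian_variance(gray: np.ndarray) -> float:
    """Higher values mean more fine detail, i.e. a likely sharper frame."""
    # Discrete Laplacian via a 4-neighbor stencil (edges wrap around).
    lap = (np.roll(gray, 1, 0) + np.roll(gray, -1, 0) +
           np.roll(gray, 1, 1) + np.roll(gray, -1, 1) - 4 * gray)
    return float(lap.var())

def pick_lucky_shot(frames: list) -> np.ndarray:
    """Pick the sharpest frame of the burst as the merge reference."""
    return max(frames, key=laplacian_variance)

# Demo: a high-detail frame vs. a blurred copy of it.
rng = np.random.default_rng(2)
sharp = rng.random((64, 64))
blurred = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, 1, 1)) / 3
best = pick_lucky_shot([blurred, sharp])
print(best is sharp)  # True
```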
It's not perfect. In my testing, HDR+ can sometimes leave photos looking underexposed, some naturally bright colors can come out muted, and high-contrast edges sometimes suffer halos that can, for example, make a tree look like it's glowing against a darker blue sky.
But in general HDR+ on the Pixel does well with overcast skies, backlit faces, harsh sunlight and other challenges. And because it's software, Google can update its camera app to improve HDR+, something it's done with earlier Nexus phones.
Blending multiple shots into one, done right, is a good recipe for success.
