Look at Google’s best camera and how computational photography is the next BIG…

UNDEREXPOSING EACH SHOT PRODUCES BETTER LOW-LIGHT RESULTS, COUNTER-INTUITIVELY

Google also claims that, counter-intuitively, underexposing each HDR shot actually frees the camera up to produce better low-light results. “Because we can denoise very well by taking multiple images and aligning them, we can afford to keep the colors saturated in low light,” says Levoy. “Most other manufacturers don’t trust their colors in low light, and so they desaturate, and you’ll see that very clearly on a lot of phones — the colors will be muted in low light, and our colors will not be as muted.” But the aim isn’t to get rid of noise entirely at the expense of detail; Levoy says “we like preserving texture, and we’re willing to accept a little bit of noise in order to preserve texture.”
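Levoy's point about denoising by merging rests on simple statistics: averaging N aligned measurements of the same scene cuts random noise by roughly a factor of the square root of N, which is why each frame can stay underexposed (and keep its color) without the merged result looking grainy. The sketch below is not Google's HDR+ pipeline; it is a minimal numpy illustration of that averaging effect, with made-up frame sizes and noise levels.

```python
import numpy as np

# Illustrative sketch only (not HDR+): averaging several aligned,
# deliberately underexposed frames reduces random sensor noise by
# roughly sqrt(N), leaving headroom to keep colors saturated.
# The scene, frame size, and noise level below are invented.

rng = np.random.default_rng(0)

true_scene = np.full((100, 100), 0.05)   # dim, underexposed "signal"
noise_sigma = 0.02                       # per-frame sensor noise

def capture_frame():
    """Simulate one short, noisy exposure (assumed already aligned)."""
    return true_scene + rng.normal(0.0, noise_sigma, true_scene.shape)

n_frames = 9
burst = np.stack([capture_frame() for _ in range(n_frames)])

single = burst[0]
merged = burst.mean(axis=0)              # naive merge: per-pixel average

print(f"noise in a single frame:   {single.std():.4f}")
print(f"noise in {n_frames}-frame merge:    {merged.std():.4f}  (~1/sqrt({n_frames}) lower)")
```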

As Levoy alludes to, mobile image processing is a matter of taste. Some people will like the Pixel’s results, others may not. But if you’re the kind of person who follows phone announcements and scours spec sheets, you’ll probably wonder whether the Pixel’s lack of optical image stabilization sets it back. Not so, says Levoy. “HDR+ needs that less than other techniques because we don’t have to take a single long exposure, we can take a number of shorter exposures and merge them… it’s less important to have optical image stabilization if you’re taking shorter exposures. We have had it in some years and not in other years. The decisions are complicated — they have to do with the build materials and other things that we’re trying to optimize on the platform.”
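Levoy's argument about optical image stabilization comes down to exposure time: hand shake smears an image in proportion to how long the shutter stays open, so a burst of short frames that are aligned in software needs far less mechanical stabilization than a single long frame gathering the same total light. The back-of-the-envelope sketch below uses assumed numbers for illustration, not Pixel specifications.

```python
# Illustrative arithmetic (assumed numbers, not Pixel measurements):
# splitting one long exposure into N short frames keeps per-frame motion
# blur small, while the merged burst still collects the same total light.

shake_px_per_sec = 60.0        # assumed hand-shake drift, in pixels per second
total_exposure_s = 0.30        # total light-gathering time the scene needs
n_frames = 10

long_blur = shake_px_per_sec * total_exposure_s
short_blur = shake_px_per_sec * (total_exposure_s / n_frames)

print(f"single {total_exposure_s:.2f}s exposure: ~{long_blur:.0f} px of blur (wants OIS)")
print(f"{n_frames} x {total_exposure_s / n_frames:.2f}s frames: ~{short_blur:.0f} px each (align in software instead)")
```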

“WE DEFINITELY WANT TO TAKE OVER MORE OF THE CAMERA STACK.”

The Pixel phones are the first to be fully designed by Google, which means such decisions can be made with a more holistic view toward the final product. “There’s now this hardware org headed by Rick Osterloh, and one of the goals of that was to pivot to a more premium experience for our phones and also to have more vertical integration,” says Levoy. “[Our team is] a part of that effort, so we definitely want to take over more of the camera stack.”

What might that involve in the future? “The notion of a software-defined camera or computational photography camera is a very promising direction and I think we’re just beginning to scratch the surface,” says Levoy, citing experimental research he’s conducted into extreme low-light photography. “I think the excitement is actually just starting in this area, as we move away from single-shot hardware-dominated photography to this new area of software-defined computational photography.”

Original Article Written by Sam Byford
