If you look at the top cameras used on the Flickr photo-sharing site, you’ll see four out of five of them are Apple iPhones. But with new camera technology debuting in its Pixel and Pixel XL phones on Tuesday, Google hopes to find a place in the hearts of photographers.
In an exclusive in-depth look at the new phones’ cameras, the company offered a number of reasons it thinks its phones will attract creative types who’ve largely ignored the company’s earlier Nexus line of phones.
The Pixel camera app is faster to launch and shoot, gives you better control and uses AI to find the best in a rapid-fire photo burst. A new 12-megapixel Sony image sensor locks focus sooner. Special-purpose Qualcomm processor hardware turns raw sensor data into a finished photo faster while lowering battery consumption.
But there’s a bigger reason that more people could appreciate Pixel photography. This time, Google fully controlled the Pixel’s design and now aims to make it a mass-market phone you’ll see in stores and on the streets. Moving beyond Nexus’ niche success with developers and Google fanboys is what Google needs to climb the Flickr ranks.
“The ambition is to build the best smartphone in the world,” said Dave Burke, vice president of engineering for Android. “When we go from Nexus to Pixel, we want to have that much broader audience and hit the premium high end.”
It’s an immense change for Google, which until now relied on companies like Samsung, HTC and LG to make its Android phone software a marketplace success. Now Google designs the phone, buys its components, puts a “G” logo on the back, sells it and supports it. In short, Google is more like Apple, a single company that delivers it all.
Starting at $649 for a 32GB model, Pixels are priced the same as the world’s highest-profile phones, Apple’s iPhone 7 and Samsung’s Galaxy S7. The 128GB Pixel costs $749, and the corresponding Pixel XL models cost $769 and $869. If you like Google’s sales pitch, you can order now, with the phones arriving in retail October 20.
One independent assessment, DxO Labs, gave the Pixel camera its highest-ever DxOMark Mobile score, 89, a point ahead of the Samsung Galaxy S7 and the HTC 10 and three points ahead of the iPhone 7. (DxO hasn’t published iPhone 7 Plus results.)
Google’s hardware link
A single electronic connection embodies Google’s new approach. It links the image sensor with the motion and tilt sensors so the phone knows exactly how the camera is moving, updated 200 times each second.
That’s crucial for one of the phone’s big improvements, digital image stabilization that’s now good enough to shoot smooth video while walking. That’s in contrast to Apple’s approach of stabilizing the actual physical camera component.
“Having full control over the whole stack made this possible,” said Tim Knight, leader of Google’s Android camera team, who started working with Sony in 2014.
The phone’s image stabilization constantly crops and warps the image and repairs rolling-shutter artifacts commonly called the Jell-O effect. “If I shake the phone, it’s rock solid,” Burke said.
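The constant crop-and-warp correction can be sketched with a toy one-dimensional model. This is an illustration only, not Google’s pipeline: the real system warps full two-dimensional frames using the 200-times-a-second gyro stream, while the sketch below just smooths a camera path and crops each frame to cancel the jitter.

```python
# Illustrative sketch of electronic image stabilization (EIS), not Google's
# actual algorithm: gyro samples tell us how far the camera drifted by each
# frame, and each frame is cropped at an offset that cancels the shake
# around a smoothed "intended" camera path.

def stabilize_offsets(gyro_positions, smoothing=0.9):
    """Return per-frame crop offsets (pixels) that cancel high-frequency jitter.

    gyro_positions: cumulative camera displacement per frame, as would be
    obtained by integrating gyro angular-velocity samples.
    """
    offsets = []
    smoothed = gyro_positions[0]
    for pos in gyro_positions:
        # Exponential moving average approximates the path the user intended.
        smoothed = smoothing * smoothed + (1 - smoothing) * pos
        # Crop offset = intended position minus actual position.
        offsets.append(round(smoothed - pos))
    return offsets

# A shaky walk: the camera drifts right while wobbling up and down.
shake = [0, 5, -3, 6, -4, 7, -2, 8]
print(stabilize_offsets(shake))
```

The rolling-shutter repair Burke mentions works on the same information: because the sensor reads out line by line, each row of a frame is warped by a slightly different correction rather than the whole frame sharing one offset.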
The iPhone 7 has the same 12-megapixel resolution as the Pixel, but Google’s sensor is physically larger so each pixel — 1.55 millionths of a meter across — can gather more light. That helps in dim situations and compensates for the fact that the Pixel’s f2.0 lens aperture doesn’t let in as much light as the iPhone 7’s f1.8 lens.
Last year’s Nexus 6P and Nexus 5X had good image sensors but were too slow to compete. The Pixel, though, beats the Apple iPhone 6S, iPhone 7 and Samsung Galaxy S7, Burke said.
Launching the camera app is 40 percent faster than with the higher-end Nexus 6P, partly because Android 7.1 software loads the camera app into memory in advance. Taking the photo is 74 percent faster, Knight said.
New chips help. Sony’s new IMX378 adds autofocus technology so Pixels lock focus 70 percent faster than a Nexus 6P, Knight said. And Google spent two years tailoring its Halide image-processing software to exploit Qualcomm’s Hexagon technology.
Android 7.1 also brings camera app improvements. There’s no more stutter if you rotate the camera from portrait to landscape orientation. Pressing and holding on the screen locks exposure and focus on the point you picked, and tapping again resets it. Sliding your finger up and down after setting focus brightens or darkens exposure.
Another performance boost comes from the fact that the camera continuously shoots 30 frames per second. When you tap the shutter button, the camera app grabs up to 10 frames out of memory and combines them into a single shot. It’s a refined version of the HDR+ (high dynamic range) technology that Google already ships.
“The goal was to maintain quality but improve speed,” Knight said.
By circulating images through memory constantly, the Pixel also lets you take rapid-fire shots. Once it’s got a focus lock, it’ll take pictures about as fast as you can tap the shutter button.
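The circulating-frames design above can be sketched as a ring buffer. The averaging step here is an assumption for illustration, a crude stand-in for the real HDR+ merge, which aligns frames and combines them in a much more sophisticated way:

```python
from collections import deque

# Illustrative zero-shutter-lag ring buffer, assuming frames are simple
# lists of pixel values: the sensor writes frames continuously at 30 fps,
# and pressing the shutter grabs the most recent frames already in memory
# and merges them into a single shot.

class FrameRingBuffer:
    def __init__(self, capacity=30):
        self.frames = deque(maxlen=capacity)  # oldest frames drop off automatically

    def on_frame(self, frame):
        self.frames.append(frame)             # called for every sensor frame

    def capture(self, n=10):
        """Merge the newest n frames by averaging (a stand-in for HDR+)."""
        recent = list(self.frames)[-n:]
        return [sum(px) / len(recent) for px in zip(*recent)]

buf = FrameRingBuffer()
for shade in range(30):                # sensor streams 30 frames
    buf.on_frame([shade, shade + 1])   # tiny two-pixel "frames"
photo = buf.capture(n=10)
print(photo)  # → [24.5, 25.5], the average of frames 20..29
```

Because the buffer is always full, the shot the user gets can even predate the shutter tap, which is why there is effectively no shutter lag and why rapid-fire taps each get a ready-made set of frames.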
Smart burst and AI
When you capture a sequence, a feature called smart burst presents you with what it thinks are the best options to make it easier to delete the duds.
Shooting so much could fill your phone’s storage. But if you max out your Pixel, Google will shuffle the oldest photos and videos, in original quality, to its cloud storage for free for the life of the device.
Smart burst uses a type of artificial intelligence technology to decide which shots are best. More such AI effectively stabilizes low-light photos by picking the sharpest “lucky shot” frames taken when you happened to be between wobbles. AI also sets the exposure, Knight said.
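The “lucky shot” idea can be illustrated with a toy sharpness score. The metric below (gradient energy between neighboring pixels) is an assumption chosen for simplicity; Google’s actual selection uses machine-learned models, not this heuristic:

```python
# Illustrative "lucky shot" picker, not Google's model: score each frame of
# a burst by a simple sharpness metric and keep the sharpest one, on the
# theory that frames captured between hand wobbles have crisper edges.

def sharpness(frame):
    # Sum of squared differences between neighboring pixels: higher
    # gradient energy means more fine detail, i.e. less motion blur.
    return sum((b - a) ** 2 for a, b in zip(frame, frame[1:]))

def pick_lucky_shot(burst):
    return max(burst, key=sharpness)

sharp = [0, 10, 0, 10, 0, 10]   # strong edges: captured between wobbles
blurry = [4, 6, 5, 6, 4, 5]     # motion blur smears the edges away
print(pick_lucky_shot([blurry, sharp]) is sharp)  # → True
```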
It’s all possible because Google now fully controls all the hardware and software. “It’s our phone, our product,” Burke said.