
The Google Pixel's best feature came from Google's biggest failure

Every time you buy a Pixel, you're buying a little piece of Google Glass.

Google Pixel
Credit: Reviewed.com / Michael Desjardin

Products are chosen independently by our editors. Purchases made through our links may earn us a commission.

It's hard to believe now, but once upon a time, Google Glass was one of the most exciting prospects in tech. Unfortunately, the spectacle(s) met a swift end; it turns out people weren't ready to live in a world where everyone straps a camera to their head and occasionally speaks to it. Is that person looking at me, or are they clandestinely filming my every move? I guess there's no way to tell!

But this sad tale ends on a happy note, and that note involves one of our favorite smartphones ever: the Google Pixel.

Let's start at the beginning. For those who don't remember, Google Glass was a short-lived project in which you paid Google over $1,000 to wear camera-equipped augmented reality glasses all day long. As we already mentioned, it didn't go so well. But while the social issues remain unfixable, a blog post from Google's secretive X lab details how the team overcame one of its toughest engineering challenges: the camera.

In the product's early stages, Glass's research-and-development team at Google X had a problem: the camera in the glasses needed to be at least as good as the ones found on people's smartphones for anyone to use it regularly, but also light enough that long-term wear wasn't... well, a headache.


A teeny-tiny camera allows for lightweight, easy-to-wear glasses, but small sensors struggle to take in enough light for dimly lit or high-contrast pictures. There's only so much you can do with the hardware at that point.
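To put a rough number on that hardware limit, here's a tiny Python simulation (our illustration, not from Google's post) of photon shot noise, which follows a Poisson distribution: a pixel that collects a quarter of the light ends up with roughly half the signal-to-noise ratio.

```python
import numpy as np

# Photon arrivals are Poisson-distributed: collecting N photons gives
# noise of about sqrt(N), so the signal-to-noise ratio (SNR) scales as sqrt(N).
rng = np.random.default_rng(0)

for photons in (10_000, 2_500):  # large phone pixel vs. tiny wearable pixel
    samples = rng.poisson(photons, size=100_000)
    snr = samples.mean() / samples.std()
    print(f"{photons:>6} photons per pixel -> SNR ~ {snr:.0f}")
# Prints roughly 100 and 50: a quarter of the light, half the SNR.
```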

So the engineers at Google X decided to address the problem with software. They spun out a team called "Gcam" to focus on mobile photography, specifically on maximizing the quality of mobile photos through software. By using a technique called image fusion, the Gcam team not only solved its camera problem but also changed the way certain mobile cameras process low-light images.


In a nutshell, image fusion takes a series of rapid-fire pictures and then processes them as a single image. Dimly lit scenes (or high-contrast scenes with varying degrees of light) end up looking better, with higher clarity and more detail. Image fusion saved the Google Glass, at least until the Glass met an unceremonious end.
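To make the mechanism concrete, here's a minimal sketch in Python (with NumPy) of the averaging idea at the heart of image fusion. The function name and numbers are ours for illustration; the real HDR+ pipeline does far more, including aligning frames and merging them in a way that's robust to motion.

```python
import numpy as np

def fuse_burst(frames):
    """Average a burst of aligned frames into a single low-noise image."""
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
    fused = stack.mean(axis=0)  # random sensor noise cancels across frames
    return np.clip(fused, 0, 255).astype(np.uint8)

# Simulate a dim scene captured as an 8-frame burst.
rng = np.random.default_rng(0)
scene = np.full((4, 4), 40.0)                    # dim scene: pixel value 40 of 255
burst = [scene + rng.normal(0, 10, scene.shape)  # each exposure is noisy
         for _ in range(8)]
print(fuse_burst(burst))  # fused values cluster tightly around 40
```

Averaging N frames cuts random noise by roughly a factor of sqrt(N), which is why a burst of eight dim exposures can look cleaner than any single one.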

Google Pixel Camera In Use
Credit: Reviewed.com / Michael Desjardin

The HDR+ software originally developed for Google Glass found its way into the Google Pixel.

But even though the world wasn't ready for Google Glass, the smartphone industry was totally ready for its game-changing camera software. That's right—Google did what it does best and picked apart the carcass of the Glass like a congress of vultures consuming one of its own.

Gcam's image processing software, which eventually came to be known as HDR+, found its way into Google's line of Nexus phones, beginning with the Nexus 5. The Nexus 6 followed shortly thereafter.


Recently, HDR+ helped give the Google Pixel one of the best (maybe even the best) smartphone cameras ever made. In our lab, the Pixel passed our low-light camera tests with flying colors, putting up results we've only recently learned are possible for a smartphone camera.

So the Google Glass is more or less dead, but a piece of its soul lives on in the Pixel—and possibly future Android phones, as well.

