Search engine giant Google on Tuesday unveiled the much-awaited Pixel 6 and 6 Pro during the virtual Pixel Fall Launch event.
Google's new Pixel 6 series phones come with the company's first-ever proprietary Tensor chipset. It should be noted that the Pixel series started off great with photography hardware, but after the Pixel 4 series, Google rather lost its way compared to Apple's iPhones.
During the event, Google revealed (read: blamed) that using a third-party chipset (read: Qualcomm) had actually held its engineers back from fully capitalising on their advanced computational photography software.
But with the Tensor chipset, Google says it has built new camera capabilities into the Pixel 6 and 6 Pro that were impossible in previous iterations.
Google Pixel 6 series: Key camera features you should know
Photography hardware: Both the Pixel 6 and 6 Pro come with a massive 50MP primary wide camera with a 1/1.3-inch sensor on the back. It can capture up to 150% more light compared to the Pixel 5's primary camera, and promises photos and videos with even greater detail and richer colour. Both phones also have a completely new 12MP ultrawide (114-degree) lens with a larger sensor, so photos look great when you want to fit more in your shot.
In the Pixel 6 Pro, Google has incorporated an additional 48MP telephoto lens. It enables 4x optical zoom and up to 20x zoom with an improved version of Pixel's Super Res Zoom.
Magic Eraser: Having seen it in the presentation, the Magic Eraser feature is seriously advanced and can certainly serve as the USP to market the Pixel 6 series. It can literally wipe objects and people out of the background without affecting the sanctity of the picture. It would definitely come in handy while taking pictures at popular landmarks like Hampi or the Taj Mahal in India: Pixel 6 series owners could keep their loved ones in the frame and erase the crowd. Unfortunately, the Pixel 6 series is not coming to India. Can you please explain India's exclusion, Google?
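Google hasn't detailed how Magic Eraser works internally (it leans on Tensor's on-device ML), but the basic idea, marking unwanted pixels and filling them in from the surrounding scene, can be sketched with classical inpainting. Here is a minimal, illustrative Python/OpenCV sketch; the file names, the hand-drawn mask, and the function itself are hypothetical stand-ins, not Google's method:

    import cv2
    import numpy as np

    # Illustrative only: classical (non-ML) inpainting as a stand-in for
    # Magic Eraser. The mask is assumed to mark pixels to erase in white.
    def erase_region(image_path, mask_path):
        image = cv2.imread(image_path)                      # the photo (BGR)
        mask = cv2.imread(mask_path, cv2.IMREAD_GRAYSCALE)  # white = erase
        # Dilate the mask slightly so object edges are fully covered.
        mask = cv2.dilate(mask, np.ones((5, 5), np.uint8), iterations=2)
        # Fill the masked area by propagating surrounding texture inward.
        return cv2.inpaint(image, mask, inpaintRadius=7,
                           flags=cv2.INPAINT_TELEA)

    result = erase_region("crowded_landmark.jpg", "crowd_mask.png")
    cv2.imwrite("clean_landmark.jpg", result)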
Face Unblur: This feature is built right into the camera app, exclusively for the Pixel 6 series. It can automatically unblur a human subject's face with sharper detail. The credit goes to Tensor's on-device machine learning (ML) capability, which makes the phone take two pictures at once: one from the main 50MP wide sensor and another from the 12MP ultrawide sensor. The main image sensor uses a normal exposure to reduce noise, while the ultrawide sensor uses a faster exposure to minimise blur. The camera app's ML then fuses the sharper face from the ultrawide shot with the low-noise shot from the main camera to get the best of both into one image.
As the last step, the Pixel camera app takes one final look at the image to see if there is any remaining blur in the fused result, estimates its level and direction, and removes it. In total, Google says it takes four ML models and two camera sensors to produce blur-free images. It particularly comes in handy when photographing a moving child indoors under constrained lighting conditions.
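The actual models run on Tensor and aren't public, but the fusion step described above translates roughly into the following simplified sketch, assuming the two frames are already aligned and using OpenCV's stock face detector; the feathered-mask blending here is an illustrative stand-in for Google's learned fusion:

    import cv2
    import numpy as np

    # Rough sketch of dual-capture face fusion, NOT Google's pipeline
    # (which uses four on-device ML models). Assumes aligned frames:
    # `low_noise` from the main sensor, `sharp` from the ultrawide.
    def fuse_sharp_face(low_noise, sharp):
        detector = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        gray = cv2.cvtColor(sharp, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1,
                                          minNeighbors=5)
        fused = low_noise.copy()
        for (x, y, w, h) in faces:
            # Feathered elliptical mask so the swap blends in smoothly.
            mask = np.zeros(low_noise.shape[:2], np.float32)
            cv2.ellipse(mask, (x + w // 2, y + h // 2), (w // 2, h // 2),
                        0, 0, 360, 1.0, -1)
            mask = cv2.GaussianBlur(mask, (31, 31), 0)[..., None]
            # Sharp face from the ultrawide, clean pixels everywhere else.
            fused = (sharp * mask + fused * (1.0 - mask)).astype(np.uint8)
        return fused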
Motion Mode: It comes in two options: Action Pan and Long Exposure.
Users can use Action Pan to take photos of, say, kids riding their bicycles against a stylishly blurred background. With Action Pan, the phone captures multiple images and combines them using computational photography and an on-device ML model to identify the subject of the photo, figure out what's moving, and add blur, which improves the visual appeal of the photo (see the sketch below).
With Long Exposure, users can create beautiful long exposure shots where the subject is moving, like waterfalls or vibrant city scenes.
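Neither mode's pipeline is public; as a rough illustration of the Action Pan compositing mentioned above (sharp subject, streaked background), here is a simplified Python/OpenCV sketch that assumes the subject mask is supplied rather than produced by an ML model as on the Pixel:

    import cv2
    import numpy as np

    # Illustrative approximation of Action Pan: keep the masked subject
    # sharp and streak the background with a horizontal motion-blur
    # kernel. Google instead averages multiple frames and segments the
    # subject on-device; this is only a sketch of the idea.
    def action_pan(photo, subject_mask, streak_length=41):
        # Horizontal motion-blur kernel: a normalized row of ones.
        kernel = np.zeros((streak_length, streak_length), np.float32)
        kernel[streak_length // 2, :] = 1.0 / streak_length
        blurred = cv2.filter2D(photo, -1, kernel)
        # Feather the mask, then composite subject over the streaks.
        mask = cv2.GaussianBlur(subject_mask.astype(np.float32) / 255.0,
                                (21, 21), 0)[..., None]
        return (photo * mask + blurred * (1.0 - mask)).astype(np.uint8)

Long Exposure is conceptually the inverse: average the captured frames so the moving subject itself trails, instead of compositing a sharp subject back in.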
Real Tone: A lot of research studies have shown that photos taken with mobile camera apps tend to favour human subjects with lighter skin tones; people with darker skin tones come off looking unnatural in many scenarios. The bias is said to lie in the photography algorithms found in most phones today. Here, Google seems to have addressed the issue.
"Going back decades, cameras have been designed to photograph light skin — a bias that’s crept into many of our modern digital imaging products and algorithms. Our teams have been working directly with photographers, cinematographers, and colorists who are celebrated for their beautiful and accurate imagery of communities of color. We asked them to test our cameras and editing tools and provide honest feedback, which helped make our camera and auto enhancement features more equitable," said Brian Rakowski, VP, Product Management, Pixel, Google
Based on feedback from these experts, Google engineers developed auto-exposure models that help determine the brightness of an image. The camera promises to show people as they really are, not unnaturally darker or brighter.
Stray light also tends to disproportionately wash out darker skin tones; this too has been addressed in the new Pixel 6 series camera.