The iPhone 15 series will bring us even closer to the dream of an all-screen phone by shrinking the bezels even more thanks to a new OLED panel. Only the Pro models will get the smaller bezels, and they’ll still have the Dynamic Island at the top. But Apple is getting closer to placing the selfie camera below the screen for a perfect all-screen phone design.
Not surprisingly, Google is also working on this technology, and we might see Pixel phones with under-display cameras in the near future. And while I’m unlikely to switch from an iPhone to a Pixel phone anytime soon, I’m excited for Google to continue with this upgrade.
The problem with under-display cameras
When Apple and Google deliver their under-display camera technology, they won’t be the first to do so. ZTE did it with a traditional phone years ago. Then Samsung put an Under Display Camera (UDC) on the Galaxy Z Fold. Other suppliers have also been investigating the technology for years.
The problem with under-display cameras is that we need further technological innovations for them to become mainstream. There’s a reason why Samsung doesn’t use UDC selfie cameras for its Galaxy S series. On the Fold, the UDC selfie camera is a backup. You have two other ways to take selfies, so you don’t have to worry about the camera quality.
Placing the camera under multiple layers of material degrades image quality. Light has to pass not only through the glass that covers the lens, but also through the OLED panel itself, which must be partially transparent.
But you don’t get perfect transparency, as the OLED section covering the camera still has to function as a display. You need software algorithms to compensate for the hardware’s limitations and produce decent selfies. And the selfie is a big part of our smartphone experience.
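To illustrate the kind of computational cleanup involved, here is a minimal sketch of unsharp masking, a basic sharpening pass. This is purely illustrative: real under-display pipelines use far more sophisticated, often learned, deconvolution, and the function name and parameters below are my own invention.

```python
import numpy as np

def unsharp_mask(img, radius=1, amount=1.5):
    """Crude sharpening pass of the sort a camera pipeline might apply.

    Illustrative only; real UDC pipelines correct diffraction blur with
    learned deconvolution, not a box-blur unsharp mask.
    img: 2-D float array with values in [0, 1].
    """
    # Box blur via a small mean filter (edge padding keeps the shape).
    k = 2 * radius + 1
    padded = np.pad(img, radius, mode="edge")
    blurred = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            blurred += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    blurred /= k * k

    # Add back the high-frequency detail the blur removed.
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)
```

On a flat region the filter does nothing; at edges it boosts local contrast, which is the basic idea behind recovering detail that the display stack smears away.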
Google Pixel phones aren’t necessarily the best iPhone alternatives out there, but the Pixels have great cameras. And that’s why I’m excited to see Google working on under-display camera technology. Google can deliver the kind of camera innovations that others will try to copy. And that can put pressure on Samsung and Apple. That’s assuming Google’s under-screen camera can actually excel, of course.
It is unclear when Google will launch such a Pixel. That’s not happening with the Pixel 8. If anything, Google may let Apple’s iPhone go first, as Google’s strategy has been to copy Apple while criticizing the iPhone. And rumor has it that we’re not that far off from iPhones with under-screen selfie cameras.
Google’s idea for a selfie camera under the screen
Google is studying this technology, and Forbes found a patent application titled “System and apparatus of under-display camera.”
The patent application tells half the story of Pixel phones with cameras under the screen. It focuses on the Pixel producing high-quality selfies by using not one, but two different under-display cameras with different abilities to capture image data.
One lens could be monochrome, while the other would be a color sensor. Each sensor would sit beneath a display section with special light-blocking elements. Each camera could focus on a specific characteristic, for example sharpness for the monochrome sensor and color for the RGB one:
 In an example implementation, camera 165 may include an RGB sensor and first display area 160 may have a circular LED or transistor layout structure.
Furthermore, camera 175 may include a monochrome sensor, and the second display area 170 may have a rectangular LED or transistor layout structure. The combination of an RGB sensor and a circular layout structure can generate signals that cause an image to have one or more first characteristics (e.g., quality, sharpness, distortion, and the like) based on light distortion and camera sensitivity. The combination of a monochrome sensor and a rectangular layout structure can generate signals that cause an image to have a different characteristic(s) (e.g., quality, sharpness, distortion, and the like) based on light distortion and camera sensitivity.
For example, the rectangular layout structure allows more light to pass through and generates images with better sharpness, and the monochrome sensor is better suited than the RGB sensor for capturing these characteristics. Overall, the first display-sensor combination and the second display-sensor combination may be designed to capture first and second characteristics that complement each other. The two images can then be fused so that the resulting image has both complementary properties.
The software would then merge the two images and deliver the best possible selfie: a photo with great sharpness and accurate color reproduction, drawing information from both images.
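One simple way to picture this kind of fusion is luminance-chroma swapping: keep the color sensor’s chroma but replace its luminance with the sharper monochrome frame. This is a minimal sketch of that general idea, not Google’s actual method; the function name, the BT.601-style luma weights, and the assumption that both frames are already aligned are all mine.

```python
import numpy as np

def fuse_mono_rgb(mono, rgb):
    """Fuse a sharp monochrome frame with a color frame.

    Illustrative sketch only: swaps the color image's luminance for the
    monochrome sensor's sharper luminance while keeping the RGB chroma.
    mono: (H, W) floats in [0, 1]; rgb: (H, W, 3) floats in [0, 1].
    Assumes the two frames are already registered (aligned).
    """
    # Split RGB into luma and chroma differences (BT.601 weights).
    y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    cr = rgb[..., 0] - y  # red-difference chroma
    cb = rgb[..., 2] - y  # blue-difference chroma

    # Swap in the monochrome luminance, keep the RGB chroma.
    r = mono + cr
    b = mono + cb
    g = (mono - 0.299 * r - 0.114 * b) / 0.587
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)
```

In this toy model, detail comes from the monochrome channel and color comes from the RGB channel, which mirrors the complementary-characteristics idea described in the patent application.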
Current under-display cameras use only one camera sensor located below the display.
That said, there’s no telling if Google will move forward with this idea. This might just be a patent that helps the company cover all its bases. Placing two cameras under the screen also means Google has to worry about two different parts of the OLED panel acting as seamless display areas when the cameras are not in use. Presumably, a separate patent application will cover that technology.
As with any technology that appears in patents and patent applications, there’s no telling if it will ever make it to a consumer product. Or how long it will take for that to happen.
Separately, I can’t help but wonder if placing two cameras under the display on a Pixel phone would enable 3D facial recognition like the iPhone’s Face ID.
Meanwhile, we’ll get a Pixel 8 lineup in October that will look similar to its predecessors, including the punch-hole selfie camera design.