
Wednesday 4 March 2020

Google explains how the second camera makes portrait photos better

Google has never been afraid to be the black sheep. Nowhere is this clearer than in the company's own smartphones, which have always stood apart from competitors' designs. While everyone else poured effort into hardware, Google focused on software. The result was a range of unique features available only to users of the Google Pixel line. Most of them centered on the camera, yet for three generations that camera consisted of a single module, capable of capturing both portrait shots and photographs of the starry sky. At some point, though, Google decided it was time to add a second one.

How the dual camera made the Pixel 4 better
While most manufacturers rush to equip their smartphones with ultra-wide-angle modules, Google went the other way and gave the Pixel 4 a telephoto camera. Even though its earlier devices took high-quality portrait shots with a single camera, the company's engineers concluded that a telephoto lens would improve such photos further. According to Google, it lets the smartphone estimate the depth of the scene more accurately and thus produce a picture with a more natural bokeh effect. Here's how it works.

How Pixel takes portrait photos


On the left, a shot produced by software algorithms alone; on the right, one assisted by the telephoto lens
Previously, Pixel smartphones estimated depth with a dual-pixel autofocus system, which simulates shooting with two cameras. Each pixel on the sensor is effectively split in two, so the sensor captures the scene from two slightly different viewpoints. Overlaying these two views produces a parallax signal from which the smartphone infers depth, determining which areas of the image should be blurred and which, on the contrary, should stay sharp. The output was already a perfectly respectable portrait photograph.
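The idea behind any two-view depth cue, whether dual-pixel or dual-camera, is that nearby objects shift more between the two views than distant ones. The sketch below is an illustration of that principle, not Google's pipeline: it estimates the per-pixel shift for a 1-D signal using simple block matching.

```python
import numpy as np

def disparity_1d(left, right, patch=3, max_shift=4):
    """Estimate per-pixel shift between two views by block matching
    (sum of absolute differences). Purely illustrative."""
    half = patch // 2
    disp = np.zeros(len(left), dtype=int)
    for i in range(half, len(left) - half):
        ref = left[i - half:i + half + 1]
        best, best_err = 0, np.inf
        for s in range(max_shift + 1):
            j = i - s
            if j - half < 0:
                break
            err = np.abs(ref - right[j - half:j + half + 1]).sum()
            if err < best_err:
                best, best_err = s, err
        disp[i] = best
    return disp

# A scene feature shifted by 2 pixels between the views shows up as a
# disparity of 2; a larger shift would indicate a closer object.
left = np.array([0, 0, 1, 3, 7, 3, 1, 0, 0, 0, 0, 0], dtype=float)
right = np.roll(left, -2)  # second view: the same scene shifted by 2
print(disparity_1d(left, right)[4])  # → 2 (disparity at the feature's peak)
```

Depth is then inferred as inversely proportional to disparity, and the blur strength applied to each pixel follows from that depth.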
In the Pixel 4, however, an additional telephoto camera is used alongside the proven dual-pixel system. It sits 13 mm away from the main wide-angle camera, which lets the smartphone measure depth not only vertically but also along this much wider horizontal baseline. Combining the two yields more accurate data on where the subject is: by comparing the horizontal and vertical cues, the phone produces a more natural background blur, one that doesn't creep onto the subject's hair, ears, or clothing, and that softens background objects without turning them into an indistinct mess, so we can still make out what is back there.
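Merging the two cues can be thought of as a confidence-weighted average: wherever one cue is unreliable (for example, pixels the telephoto camera cannot see), the other takes over. The function names and weighting scheme below are assumptions for illustration, not Google's actual fusion step.

```python
import numpy as np

def fuse_depth(depth_a, conf_a, depth_b, conf_b):
    """Confidence-weighted average of two depth maps.
    Illustrative sketch; Google's real pipeline is more elaborate."""
    total = conf_a + conf_b
    total = np.where(total == 0, 1.0, total)  # avoid division by zero
    return (conf_a * depth_a + conf_b * depth_b) / total

dp   = np.array([1.0, 2.0, 3.0])   # dual-pixel (vertical) depth estimate
tele = np.array([1.2, 2.0, 5.0])   # telephoto (horizontal) depth estimate
c_dp   = np.array([1.0, 1.0, 1.0])
c_tele = np.array([1.0, 1.0, 0.0])  # no telephoto data at the last pixel
print(fuse_depth(dp, c_dp, tele, c_tele))
```

Where both cues agree the result is simply their average; where the telephoto cue is absent, the dual-pixel estimate is used unchanged.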

How the telephoto lens works in Pixel 4

The pair of photographs above shows very clearly how the old and new systems differ. Viewed individually, each shot looks high quality, but a direct comparison makes the difference obvious. Even though it may seem that more blur is always better, it should not be excessive, as in the first picture. The second is far better in this respect: the smartphone doesn't overdo the bokeh but keeps it rich, yet transparent enough that background details aren't lost entirely.
That said, the telephoto lens doesn't always kick in, according to Google. It is used only when the subject is at least 20 centimeters away, the minimum distance at which the telephoto can focus and deliver a high-quality result. If the user wants to take something like a macro shot with a blurred background, the photo is captured using only the dual-pixel system described above. In that case the highest quality isn't guaranteed, because software algorithms alone are responsible for the depth estimate.
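The selection rule above boils down to a single distance check. The 20 cm threshold is taken from the article; the function and its return labels are illustrative, not an actual Pixel API.

```python
# Minimum focus distance of the telephoto camera, per the article.
MIN_TELEPHOTO_FOCUS_CM = 20

def depth_sources(subject_distance_cm):
    """Return which depth cues a portrait shot can use at a given
    subject distance, following the rule described above."""
    if subject_distance_cm >= MIN_TELEPHOTO_FOCUS_CM:
        return ["dual-pixel", "telephoto"]
    return ["dual-pixel"]  # macro range: software-only depth

print(depth_sources(50))  # → ['dual-pixel', 'telephoto']
print(depth_sources(10))  # → ['dual-pixel']
```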
