When Google introduced Night Sight on its Pixel smartphones, it was a game changer: the Pixel phones at the time had only a single camera, yet they were better at capturing photos in low light than rivals with multiple cameras. This was thanks to the use of AI, which allowed Google to achieve such stunning results.
With the Pixel 4, Google introduced a new low-light feature that makes the phone's camera capable of astrophotography. In a post on its blog, Google details how it achieved this, and once again it is largely thanks to AI and machine learning, which make the camera smart enough to know which parts of a shot to brighten and which to leave alone.
For example, one way of boosting the dynamic range of a photo is to capture multiple exposures of the same scene, a technique also known as HDR. The downside is that this can leave certain parts of the photo looking overexposed, but with the use of AI, the camera can detect which parts need to be dimmed, resulting in a more natural-looking image. According to Google:
"Sky detection also makes it possible to perform sky-specific noise reduction, and to selectively increase contrast to make features like clouds, color gradients, or the Milky Way more prominent."
That being said, this is by no means a perfect system, due to a number of factors such as sensor size and the type of lens involved, but for a smartphone to be capable of achieving these results is truly impressive.
Source: Google
from Phandroid https://ift.tt/2pUjMUA