
Google explains the Pixel 2's super-stable video recording

From Engadget - November 12, 2017

The system starts by collecting motion data from both the OIS module and the phone's gyroscope, keeping it in "perfect" sync with the image. But it's what happens next that matters most: Google's "lookahead" filtering algorithm pushes incoming frames into a deferred queue and uses machine learning to predict where you're likely to move the phone next. This corrects a wider range of movement than OIS alone, and can counteract common video quirks like wobble, rolling shutter (the distortion effect where parts of the frame appear to lag behind) and focus hunting. The algorithm even introduces virtual motion to mask wild variations in sharpness when you move the phone quickly.
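To make the deferred-queue idea concrete, here's a minimal sketch in Python of a lookahead stabilizer. Frames are buffered so that each one's correction can be computed from both past and *future* motion samples; a plain Gaussian smoother stands in for Google's machine-learned predictor, and the class name, window size and 1-D angle model are all illustrative assumptions, not the Pixel 2's actual implementation.

```python
import numpy as np
from collections import deque

# Sketch of "lookahead" stabilization: frames sit in a deferred queue
# so the filter can see future motion before warping each frame.
# This uses a simple Gaussian low-pass smoother, not Google's ML
# predictor; all names and parameters here are assumptions.

LOOKAHEAD = 15   # frames buffered ahead of output (adds latency)
SIGMA = 5.0      # smoothing strength over the +/-LOOKAHEAD window

class LookaheadStabilizer:
    def __init__(self, lookahead=LOOKAHEAD, sigma=SIGMA):
        self.lookahead = lookahead
        # Sliding window of (frame, angle) pairs; angle is the fused
        # OIS + gyroscope estimate for that frame (1-D here; a real
        # pipeline tracks a 3-D rotation, often per scanline).
        self.window = deque(maxlen=2 * lookahead + 1)
        offsets = np.arange(-lookahead, lookahead + 1)
        w = np.exp(-offsets**2 / (2 * sigma**2))
        self.weights = w / w.sum()

    def push(self, frame, angle):
        """Feed one frame plus its measured camera angle.
        Returns (frame, correction) once the queue is full, else None."""
        self.window.append((frame, angle))
        if len(self.window) < self.window.maxlen:
            return None  # still filling the deferred queue
        angles = np.array([a for _, a in self.window])
        smoothed = float(self.weights @ angles)  # virtual camera path
        frame_out, actual = self.window[self.lookahead]
        # A warp stage would apply this correction to cancel the gap
        # between the real and virtual camera motion.
        return frame_out, smoothed - actual
```

The deferred queue is the key trade-off: holding 15 frames adds roughly half a second of latency at 30 fps, which is acceptable for recorded video but is why this kind of filtering can't be applied to a live viewfinder.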

This is not to say that Google's approach is flawless. As others have noted, the Pixel 2 can crop the frame in unexpected ways and blur low-light footage more than it should. On balance, though, this shows just how much AI-related technology can help with video. It can erase typical errors that EIS or OIS might not catch on their own, producing footage so smooth it can look like it was captured with the help of a gimbal.


Continue reading at Engadget »