After nearly a year of testing, Google has finally launched Lookout, its AI-powered app created to help blind and visually impaired people by identifying the objects around them.

Google says the app won't always work with 100 percent accuracy, and that it will continue to develop it as more feedback comes in from users. The app is built on Google's machine learning algorithms. Lookout was originally announced at the I/O conference back in May 2018, and the time since has gone toward "testing and improving the quality" of its results.
To use Lookout, Google recommends that users wear their Pixel phone on a lanyard around their neck or in the front pocket of their shirt. Google Lookout joins the company's growing list of accessibility apps.
Lookout is primarily created to work in "situations where people might typically have to ask for help"; Google cites examples like "learning about a new space for the first time, reading text or documents" and daily tasks like "cooking, cleaning, and shopping".
I can't imagine having to navigate today's world while visually impaired.
Apart from identifying objects through AI, the app can also read text and labels, scan barcodes, and more. The app won't swarm the user with unnecessary information, though; it only tells them about the things it deems important. Share your thoughts in the comment box below, and stay tuned to PhoneRadar for more such interesting updates.