Google Lookout uses AI to describe surroundings for the visually impaired

Google Lookout App for Visually Challenged Now Available for Pixel Phones in US

Well, the app is now officially available for users of Google's Pixel smartphones to install and explore.

Google says that the app won't always work with 100 percent accuracy, and that it will continue to improve the app as it gets more feedback from users. The app was designed for visually impaired people and is built on Google's machine learning algorithms. Lookout was originally announced at the I/O conference back in May 2018, and has since required a lot of time for "testing and improving the quality" of its results.

After nearly a year of testing, Google has finally launched its AI-powered Lookout app, created to help visually impaired and blind people by identifying the objects around them.

To use Lookout, Google recommends that users wear their Pixel phone on a lanyard around their neck or in the front pocket of their shirt. Google Lookout joins the company's growing list of accessibility apps.

Lookout is primarily designed for "situations where people might typically have to ask for help"; Google cites examples like "learning about a new space for the first time, reading text or documents" and daily tasks like "cooking, cleaning, and shopping".

It is now available on the Google Play Store for all Pixel devices running Android 8.0 Oreo, and is currently only available in English.

I can't imagine having to navigate today's world while visually impaired.

Apart from identifying objects with AI, the app can also read text and labels, scan barcodes, and more. The app won't swarm the user with unnecessary information, though; it only mentions the things it deems important. Share your thoughts in the comment box below, and stay tuned to PhoneRadar for more such interesting updates.