Google has now extended the availability of its vision-assistive AI application Lookout to several additional flagships, enabling more users than ever to accomplish a wider range of tasks without asking for help. Google hasn't detailed exactly which models are supported, but the search giant says the app should work, at a minimum, on the latest Samsung and LG flagships following the latest update.
In the meantime, a new video advertisement has launched, showcasing the features and functionality of the application and providing a rundown of its use-case scenarios. No longer exclusive to Google's own Pixel handsets, Lookout uses machine vision to help users see, or see more clearly.
The company's advertised examples of scenarios where that might be useful include handling money. For most people, sorting through bills or coins to pay for an item, or simply to count them, is an easy task, but it becomes difficult if the money can't be seen. The AI in Lookout can help by identifying the value of each bill or coin and reading it back to the user.
Spreading out from there, Google shows off general use cases. In one instance, a user takes advantage of the AI to quickly identify objects in their surrounding environment — for example, a coffee pot or dinnerware. Barcodes, text, and similar items can be scanned as well.
Rapidly improving AI
Ideally, Google says, a tool like Lookout will eventually be able to provide users with a true sense of independence and freedom that many don't otherwise have. The AI is already beginning to approach that level as Google and others build out further identifiable objects and recognition algorithms, but it still seems to have a long way to go.
That doesn't mean development hasn't already come far, especially in terms of machine-driven voice recognition. At Google I/O 2019, Google took to the stage to show off a new generation of Assistant — undoubtedly tied to Lookout via work on Google Lens — that's able to navigate in a much more intuitive and conversational way.
The new Assistant essentially eliminates the need for keywords to be spoken between commands and integrates more deeply across Android, the web, and other areas where it has been available. In at least a few cases, the update also includes image and object recognition improvements that could ultimately lead to a better Lookout app.
More importantly, the understanding of contextual relevance shown in the new Assistant may help machine vision become more capable of differentiating between objects and the circumstances they exist in.
This could spread
Another big announcement at this week's event that could impact Lookout is the news that Google is moving AI closer to an on-device solution rather than one based in the cloud. That's driven by reductions in the overall size of the models and code required to run the AI, among other things.
Speculatively, with more of the processing moved on-device and made more efficient, Lookout could feasibly find its way to a greater variety of Android handsets across a diverse price bracket in the near future. Given that the free app was only announced in March, and only for Pixel-branded flagships at that, this democratization could also happen very quickly for Lookout and for other AI-driven accessibility apps currently under development.