Google I/O Keynote Highlights


Google has shown, once again, that it is pushing hard on artificial intelligence. For example, when you hold down the Home button of an Android device while a subject is framed in the camera app, Google's servers try to identify what is on screen and return useful information from the web. This feature is called Screen search and was formerly known as Now on Tap; it descends from the early experiments of the now-defunct Google Goggles project, which was maintained until around 2014.

Recently, Google introduced a new version of the Photos service that can recognize the objects depicted in a photo, showing their price and linking to e-commerce sites.

What Google Lens is and how it works

During Google I/O 2017, the company unveiled Google Lens, a feature that opens access to new computer-vision tools. Google Lens is essentially a camera tied to Google Assistant that can extract relevant information from an image and put it into context. With Lens, you can quickly find the information you need or automate certain operations.
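Google has not published a client API for Lens itself, but the kind of image analysis involved can be approximated with the publicly available Google Cloud Vision API. The sketch below is only an illustration of that idea: the file name photo.jpg and the choice of label and web detection are assumptions for the example, not part of Lens.

```python
# A minimal sketch of server-side image understanding, using the public
# Google Cloud Vision API as a stand-in for what Lens does internally.
# Assumes the google-cloud-vision package is installed and application
# credentials are configured; "photo.jpg" is a placeholder file name.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# Generic labels: what objects appear in the picture.
labels = client.label_detection(image=image).label_annotations
for label in labels:
    print(f"label: {label.description} ({label.score:.2f})")

# Web entities: pages and entities matching the image, roughly the kind
# of signal a price/e-commerce lookup would build on.
web = client.web_detection(image=image).web_detection
for entity in web.web_entities:
    print(f"web entity: {entity.description}")
```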

Suppose, for example, that you frame the Wi-Fi login details printed on the back of a router: Lens will configure the wireless connection for you. Google Lens can also identify businesses from their logos, extract the details printed on a business card and add them automatically to your phone's contacts, translate text from other languages, and so on. The potential fields of application are virtually endless. A rough sketch of the router-label scenario follows below.
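As an illustration of that router-label use case, the following sketch runs OCR on a photo with the Cloud Vision text_detection call and then pulls an SSID and password out of the recognized text with a simple regular expression. The "SSID:" / "Password:" label format and the file name are assumptions made for this example; a real label could be formatted quite differently.

```python
# Hedged sketch: extract Wi-Fi credentials from a photographed router label.
# OCR is done with the public Cloud Vision text_detection endpoint; the
# "SSID:" / "Password:" label format is assumed for this example only.
import re
from typing import Optional

from google.cloud import vision


def extract_wifi_credentials(path: str) -> Optional[dict]:
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    if not response.text_annotations:
        return None

    # The first annotation holds the full recognized text block.
    full_text = response.text_annotations[0].description

    ssid = re.search(r"SSID[:\s]+(\S+)", full_text, re.IGNORECASE)
    password = re.search(r"Password[:\s]+(\S+)", full_text, re.IGNORECASE)
    if not (ssid and password):
        return None

    return {"ssid": ssid.group(1), "password": password.group(1)}


if __name__ == "__main__":
    # "router_label.jpg" is a placeholder photo of the router's sticker.
    print(extract_wifi_credentials("router_label.jpg"))
```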

Google Lens will be tightly integrated with both Google Assistant and Google Photos, with all the consequences that follow. Google Photos is growing rapidly and will be enriched with new tools, much like Chrome, YouTube and Gmail. That is why Google decided to integrate Lens into the Photos app.

Google Photos moves beyond the traditional concept of a photo gallery: images are no longer searchable only by the time they were taken, and chronological order becomes secondary. What matters more is finding the photos of interest based on the people who took part in an event or on when and where the pictures were taken.

From now on, Google Photos will be able to suggest the people with whom an image should be shared. Shared libraries will also be introduced, enabling continuous sharing with specific people.