In addition to the launch of ARCore 1.0, Google is making many users happy by rolling out Google Lens to all English-language Google Photos users. This is the awesome technology that, through software, essentially builds the Google Assistant into your Android or iOS smartphone's camera.
This means that within the Google Assistant on a Pixel 2 XL you can turn on the camera, show it a page of a book, highlight a word, and get a definition. Google Lens was previewed earlier, and I got a demo of it at the October 4th, 2017 Made by Google event. But Lens doesn't stop with words; it can also identify objects in traditional photos. For instance, if you have a photo of a flower or a classic car, it can pull up information on it.
It is similar to how the Google Assistant functions on the Pixelbook with the Pixelbook Pen: you can circle an image and it will identify it. I don't know if this writing will do it justice, so if you get the chance to use it in real life, I highly recommend it.
Google Lens is rolling out today to English-language Google Photos users (UK, US, Canada, Germany, Australia, India, France, Italy, Spain, and Singapore) with the latest version of the app on Android and iOS. Yes, that is right, it is coming to iOS as well as Android; one might call this a cross-platform dream. You can also expect the live camera-based Lens experience within the Google Assistant to appear on upcoming devices from Samsung (perhaps the Galaxy S9 or S9+), Huawei, LG, Motorola, Sony, and Nokia in the coming weeks.