Category: Google Lens

Google Camera Update Brings Google Lens Integration on Pixel Phones

A new update to the Google Camera app is making its way out in the wild today, bringing integration with Google Lens to Google Nexus and Pixel phones.  The updated version is 5.2.025 for those keeping score at home.

At Google I/O, the Mountain View company indicated that devices compatible with the Google Camera app would be getting Google Lens integration in the app.  To date, that has rolled out for several manufacturers like Sony, Motorola and OnePlus but not on Google’s own devices.

Google Lens Standalone Shortcut App Released in The Play Store

If you like Google Lens but hate the fact that you have to open up Google Assistant on your phone to get to it, then today’s your lucky day.  Google has just dropped a Google Lens standalone app in the Play Store that you can download.

By all indications, this is a shortcut app.  That means it simply opens up Lens on your phone when you press the app’s icon, instead of having to long press your Home button then tap the Lens icon within Google Assistant.  There is nothing new or special that the app does itself.

Google Lens Rolling Out Real Time Information Update

In today’s chapter of “It’s cool living in the future”, Google Lens is now broadly rolling out real-time information and text selection features via Google Assistant.  The new features, which were highlighted at Google I/O a few weeks back, trickled out to some users last week, but now it appears that Google has pushed the cloud-side big button and they are going out to everyone.

There are a few different elements rolling out today that are pretty magical.  First, there is the real-time information element.  When you enable Google Lens (long press your Home button to activate Google Assistant and then tap the Lens icon in the lower right corner), you can point your camera at different objects.  If it recognizes an object, you will get a colored bubble over that item which you can then tap, and Assistant will give you information about it.

Google Lens Will Soon Work in Real Time and be Native In Other Camera Apps

If there is one app outside of Google Photos that shows the power of Artificial Intelligence in Googleville, it is Google Lens.  The AI-driven camera feature allows you to quickly identify objects simply by pointing your camera at them and tapping that object.  At Google I/O yesterday, Google announced some of the next steps for Lens, and they are going to be impressive when they land over the next few months.

First, and perhaps the biggest news, is that Lens will be available in the native camera app for several manufacturers.  That means the feature will be available to a lot more people right out-of-the-box.  If you have a device from LGE, Motorola, Xiaomi, Sony Mobile, HMD/Nokia, Transsion, TCL, OnePlus, BQ, or Asus, you can expect an update to the native camera app on your phone to support Google Lens.

Google Lens Can Now Identify Dog and Cat Breeds in Assistant and Google Photos

Google has begun rolling out a new cloud-side update to Google Lens, the applet inside of Google Assistant and Google Photos, enabling it to identify dog and cat breeds you see through your camera or in your photos.  The new feature is making its way out broadly to users, so it is likely that you already have it so long as you have the latest Google app on your phone.

Previously, when you used Google Lens on a photo of a dog or cat, either in Assistant or in Photos, it would simply come up with a generic “looks like a cat” or “a cat” but didn’t provide any breed information.

Google Lens Now Broadly Available in Google Assistant

Google Lens, the AI-driven feature that allows for objects to be identified, has now begun broadly rolling out to Google Assistant on Android.  The feature, once available for your phone, can be accessed by opening Google Assistant (long press the Home button) and then tapping the Lens icon in the lower right.

Using Google’s Artificial Intelligence, Google Lens allows you to tap an object on your display while it is active and that object will be identified.  It works on everyday things but increasingly can identify plants, trees and soon flowers too.

Google Lens Could Soon Help with Shopping and Plant Identification

Google Lens began a general rollout less than a month ago, and at the time I said it felt futuristic and had a huge amount of potential.  Indeed, looking back on 2017, I think that Lens could be one of the most significant product releases by Google for the year – or will prove to be in the coming 12 months.

Google’s Rajan Patel, the lead engineer for Google Lens, took to Twitter yesterday and highlighted some of the areas in which we are going to see Lens improve and grow in the coming months.

Before readers get too excited, it should be noted that Mr. Patel gave no indication of when we will see these new features and functionalities within Lens.  That said, it is exciting to read that Google is thinking about where to take the app & service next.  Truly exciting stuff!

Google Lens in Google Assistant Begins General Rollout

Google Lens, the visual search tool that the company announced this summer at Google I/O, is starting to generally roll out to Pixel owners.  The new tool is embedded in Google Assistant and allows you to tap the Lens icon, point your phone’s camera at an object, tap it, and identify it.

The good news is that the new feature is rolling out to Google Pixel and Pixel 2 owners in the United States, United Kingdom, Australia, Canada, India, and Singapore, but the device’s language must be set to English for the feature to enable and work.  You will also be prompted, when Lens comes to your device, to give it permission to access your camera.
