Google Lens Will Soon Work in Real Time and Be Native in Other Camera Apps

If there is one app outside of Google Photos that shows the power of Artificial Intelligence in Googleville, it is Google Lens.  The AI-driven camera feature allows you to quickly identify objects simply by pointing your camera at them and tapping them on screen.  At Google I/O yesterday, Google announced some of the next steps for Lens, and they are going to be impressive when they land over the next few months.

First, and perhaps the biggest news, is that Lens will be built into the native camera app from several manufacturers.  That means the feature will be available to a lot more people right out of the box.  If you have a device from LGE, Motorola, Xiaomi, Sony Mobile, HMD/Nokia, Transsion, TCL, OnePlus, BQ, or Asus, you can expect an update to your phone's native camera app to support Google Lens.

Second is smart text selection.  With this, you can point Lens at a menu, select a piece of text, and have Google provide you a description or explanation of what that menu item is.  It will be great for menus but will also work on Wi-Fi codes, recipes, and gift card codes, to name but a few.

Next is Google Lens working in real time.  Right now, Lens is passive: you point your camera at an object with Lens activated and tap that object to get Google's results on what it is.  That will soon be changing.  Working in real time, Lens will overlay tappable dots on objects in the space around you, letting you pull up information about them as you go.  It can also surface information on similar styles of things like clothing, shoes, and furniture as you look around a room.  Find a book?  Tap it and you can get information on that book in real time.

Google Lens Real Time

While we are likely still a few months away from all of these new features landing in Google Lens, it is exciting to see the vision Google has for it and just how powerful it will be in day-to-day life.
