
How to use Google Lens to identify objects using your smartphone


Google Lens, first introduced at Google I/O 2017, is one of the most exciting Android features in years. Originally exclusive to Pixel smartphones, Google Lens is now baked into many Android handsets and is also available as a standalone app in the Play Store.


Google Lens uses A.I. and machine learning to provide information about many of the things you interact with in daily life. Instead of simply identifying an object, Google Lens can understand its context. Take a picture of a flower, for example, and Google Lens won't just identify the flower; it will also surface other helpful information, such as nearby florists. It also handles useful tasks like scanning QR codes, copying printed text, and even translating other languages live.

How to access Google Lens

While there are a few ways to access Google Lens, the easiest is to long-press the home button to open Google Assistant. Once it's open, tap the compass icon in the bottom-right corner to open the Explore menu, then tap the camera-shaped Google Lens icon to the left of the microphone to open a viewfinder. Point the camera at the item you're interested in and tap it. If you don't see the Google Lens icon, make sure you have downloaded the app from the Play Store.

Some phones let you access the feature directly from the camera app. On the LG G8 ThinQ, for example, just open the camera app and double-tap A.I. Cam to jump straight to the Google Lens screen.
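If you're a developer or tinkerer and want to jump to Lens programmatically rather than by tapping through Assistant, a minimal Kotlin sketch along these lines could launch the standalone Lens app by its Play Store package name. The package name "com.google.ar.lens" is an assumption here and may vary by device or app version.

// Sketch: launch the Google Lens app from another Android app, if it's installed.
import android.content.Context
import android.content.Intent
import android.widget.Toast

fun launchGoogleLens(context: Context) {
    // Ask the package manager for the Lens app's launcher intent.
    // "com.google.ar.lens" is an assumed package name for the Play Store app.
    val lensIntent: Intent? =
        context.packageManager.getLaunchIntentForPackage("com.google.ar.lens")
    if (lensIntent != null) {
        context.startActivity(lensIntent)
    } else {
        // Lens isn't installed; in a real app you might link to the Play Store listing instead.
        Toast.makeText(context, "Google Lens is not installed", Toast.LENGTH_SHORT).show()
    }
}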

Once Google Lens identifies an item, you can continue to interact with Assistant to learn more. If you point it at a book, for example, you’ll be presented with options to read a New York Times review, purchase the book on the Google Play Store, or use one of the recommended subject bubbles that appear below the image.

If Google Lens focuses on the wrong item, tap the back button and give it another try.

If it’s too dark, you can tap the light icon in the top left to switch on your device’s flash. You can even use Google Lens on pictures you’ve already taken by tapping the gallery icon in the top right.
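Beyond the built-in gallery button, Android's standard share mechanism is another way to hand an existing photo to an app. The Kotlin sketch below illustrates that general pattern; it assumes the Lens app accepts shared images and again treats "com.google.ar.lens" as an assumed package name. Without setPackage(), the system share sheet would appear and let the user pick a target themselves.

// Sketch: share an existing photo to a specific app (here, assumed to be Google Lens).
import android.content.Context
import android.content.Intent
import android.net.Uri

fun sendImageToLens(context: Context, imageUri: Uri) {
    val shareIntent = Intent(Intent.ACTION_SEND).apply {
        type = "image/*"
        putExtra(Intent.EXTRA_STREAM, imageUri)
        setPackage("com.google.ar.lens") // assumed Lens package name
        addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION) // let the target read the image URI
    }
    context.startActivity(shareIntent)
}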

Google Lens isn’t perfect. Google itself says the technology works best for identifying books, landmarks, movie posters, album art, and similar items. Still, we were impressed when it offered up reviews, social media accounts, and business information after we pointed it at the awning of a small store. Point it at a business card, and it will offer to save the person as a contact, filling in all the details from the card for you.

While Google Lens is still in its infancy, it shows a lot of promise, and its machine learning underpinnings mean it should only get better over time. Google Lens is currently available on most Android smartphones that support Google Assistant, and you can expect Google to keep adding new features to it.

Mark Jansen
Mobile Evergreen Editor