
Google Assistant gains eyes with Google Lens, now rolling out to Pixel phones

Google Assistant is getting smarter. While the digital assistant has traditionally only used the microphone to hear, now it’ll also use the phone’s camera to see. That’s thanks to Google Lens, which, after some testing, is now rolling out to all users of Google Pixel phones.

Google announced the news in a blog post, and while it was expected, it's exciting all the same. Google Lens promises to apply Google's machine learning expertise to whatever the phone's camera can see. Lens was first announced at Google I/O in May.

“Looking at a landmark and not sure what it is? Interested in learning more about a movie as you stroll by the poster? With Google Lens and your Google Assistant, you now have a helpful sidekick to tell you more about what’s around you, right on your Pixel,” said Google in its blog post.

That will manifest in a number of ways. Previously, Google Lens was only available through Google Photos, which meant taking a photo, switching apps, and tapping the Lens button. Lens in Google Assistant promises to be not only more intuitive, but also smarter. According to Google, the feature lets users save information from a photo of a business card, follow links, and recognize objects. You can also point Lens at a movie poster to get information about the film, or at a landmark like the Eiffel Tower to learn more about its history. Last but not least, Assistant can look up products by scanning their barcodes.


Of course, we’ll have to wait and see how it all works once it’s rolled out, but the good thing about Google Lens is that it doesn’t really rely on a great camera — it’s more dependent on software, so it can be updated and improved over time.

Google Lens is currently rolling out to Pixel phones in the U.S., U.K., Australia, Canada, India, and Singapore. Google says it will roll out “over coming weeks.” When it is finally available on your phone, you’ll see the Google Lens logo at the bottom right-hand corner of your screen after you activate Google Assistant.

Christian de Looper
The Google app on your Android phone is getting a helpful new feature
Google app on Android beta showing Notifications.

The Google app for Android phones is getting a helpful new feature to make search even better. The latest beta adds a dedicated "Notifications" feed to its bottom bar. The addition, first noticed by 9to5Google, follows the feature's debut in the mobile version of Google earlier this year.

The app now includes a Notifications option at the bottom, next to Discover, Search, and Saved items. The Notifications section displays a running list of alerts from Google Search covering weather conditions, flight information, sports scores, movies and TV shows, and more, grouped under “Today” and “Earlier.” This feature should prove handy if you miss a notification from the Google app, as it provides a more focused view than Android's system-level history.

Read more
It’s impossible to recommend a cheap Google Pixel phone
The back of the Google Pixel 8a.


I had a terrible time explaining which cheap Pixel 8 phone you should buy when I reviewed the Google Pixel 8a. I eventually settled on saying the Pixel 8a is the low-cost Pixel phone you should buy ... except for those times when it isn’t.

Read more
Google has a magical new way for you to control your Android phone
Holding the Google Pixel 8 Pro, showing its Home Screen.

You don’t need your hands to control your Android phone anymore. At Google I/O 2024, Google announced Project Gameface for Android, an incredible new accessibility feature that will let users control their devices with head movements and facial gestures.

Project Gameface supports 52 unique facial gestures, including raising your eyebrows, opening your mouth, glancing in a certain direction, looking up, smiling, and more. Each gesture can be mapped to an action such as pulling down the notification shade, switching to the previous app, opening the app drawer, or returning to the home screen. Users can customize facial expressions, gesture sizes, cursor speed, and more.

Read more