
The Google Lens app is now available on Google Photos for iOS


As announced via the official Google Photos Twitter account on March 15, users of Google Photos for iOS can now use Google Lens to analyze and extract text, hyperlinks, and other information from photos.

Originally announced at Google’s I/O event, Google Lens uses machine learning to extract text and hyperlinks from images, identify landmarks from around the world, and perform a host of other promised tricks. It first launched on Google’s Pixel phones at the tail end of 2017, before rolling out to all Android phones in March 2018. As of today, March 16, iOS users can also tap into Google Lens’ deep learning smarts through the iOS Google Photos app.

Starting today and rolling out over the next week, those of you on iOS can try the preview of Google Lens to quickly take action from a photo or discover more about the world around you. Make sure you have the latest version (3.15) of the app.https://t.co/Ni6MwEh1bu pic.twitter.com/UyIkwAP3i9

— Google Photos (@googlephotos) March 15, 2018

Anyone looking to play with Google Lens should make sure their Google Photos app is updated to the latest version (3.15). Then open the Google Photos app, open a photo, and tap the Google Lens logo. If you’re struggling to find it, Google has posted a small guide on its support website. Some Twitter users have complained that they can’t yet access the feature, which suggests the update is still rolling out worldwide. It’s also worth noting that, for the time being, Google Lens only works if your iOS device’s language is set to English.

But what can you do with Google Lens? It’s capable of extracting text from your Google Photos, and while that may not sound impressive, it’s then able to use that text to find businesses, extract hyperlinks, find addresses, or identify books, movies, and games. If you take a picture of a business card, Google Lens will offer to save the information as a new contact, taking some of the fuss out of business networking. Landmarks can also be identified, and information on ratings, tours, and history will be offered as a result.

Use Google Lens to copy and take action on text you see. Visit a website, get directions, add an event to your calendar, call a number, copy and paste a recipe, and more. pic.twitter.com/E4ww2cxVUd

— Google Photos (@googlephotos) March 15, 2018

The Google Photos account has been sharing more than a few ways to make Google Lens work for you, and while the fact that it’s currently restricted to the Google Photos app on iOS makes it a bit harder to use in everyday circumstances, it’s a really cool addition, and a great indication of what the future has in store for us.

Mark Jansen