How to Use Google Lens on Your iPhone or iPad
Google Lens is Google's visual search tool. It first appeared in Google Photos and shines on the image-recognition hardware in Google's Pixel phones, but you don't need a Pixel to use it: Lens works on iPhones and iPads as well. Here's how to set it up and use it on your iPhone or iPad.
What is Google Lens?
Google Lens is a visual search tool that uses your device’s camera to identify objects, landmarks, and other things around you. You can then learn more about what you’re seeing, save it for later, or take action such as calling a number or sending an email. To use Google Lens, simply point your camera at something and tap the Lens icon that appears. It’ll show you relevant information related to what you’re looking at. For example, when I was in Times Square, I pointed my phone at one of the iconic Coca-Cola billboards.
It identified the brand as Coca-Cola, noted the location (Times Square), and surfaced some of the sign's history. Pointing my phone at another billboard brought up information about an upcoming Broadway show, and as I scanned my surroundings, the app recognized nearby buildings and landmarks.
There are also ways to personalize your Google Lens experience. With auto-recognition turned on, Lens automatically matches what you point your camera at against images it has already analyzed in your photo library. With the feature off, Lens has to re-recognize each object from scratch every time you scan it.
To get the most accurate results on iOS, turn off Augmented Reality mode: swipe up from the bottom of the screen until the Camera viewfinder appears, tap the Settings button at the top-right corner of the viewfinder, and disable AR Mode under More Settings. AR Mode doesn't pair well with Google Lens on iOS because iPhones run Apple's ARKit rather than Google's ARCore, so Google can't offer the same AR functionality on Apple devices that it does on Android.
What do you need to get started?
To use Google Lens, you'll need an iPhone or iPad running iOS 11 or later with the Google app installed. Open the Google app and tap the Search bar. Then point your camera at something and tap the Google Lens icon that appears. The first time you use the feature, you'll be prompted to allow Google access to your camera. After you give permission, a card will appear with information about whatever is in front of your lens.
For example, if you're pointing it at a restaurant, you can tap to call the restaurant or visit its website. You can also change Google Lens settings by tapping the three-dot menu button next to Tap for more info. From there, scroll down until you see Device information and select it. Under Location accuracy, toggle off High accuracy. What does that do? High accuracy provides location data for objects near where you are when you take a photo.
If you disable this setting, the location data may be less accurate, but any matches will show up on top of your photo without your having to zoom in.
One other thing worth knowing: switching from Local to Global mode changes how Google Lens detects language. The difference between these two modes comes down to how Google matches fonts in images against written text from sources such as books, magazines, and newspapers.
What happens when Local mode finds a match? In Local mode, Google Lens tries to identify text found in the image before determining whether the language is supported by its library of fonts. If the language isn't supported, no font matching takes place at all; instead, a ranked list of possible matches appears below each object in your viewfinder window.
What happens when Global mode finds a match? In Global mode, Google Lens tries to match both the text in the image and any language its library supports. Once it locates a close enough font match for what's in front of your lens, it presents the best options below the item being viewed.
The choice between Global and Local mode comes down to how much you value precise results versus ease of use.
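If you're curious what a Lens-style lookup looks like under the hood, Google exposes similar label and landmark detection to developers through its Cloud Vision REST API. The sketch below only builds the request payload (no network call, no API key); the endpoint feature names come from the public Cloud Vision documentation, not from the Lens app itself:

```python
import base64
import json

def build_vision_request(image_bytes: bytes) -> dict:
    """Build a Cloud Vision 'images:annotate' payload asking for the
    same kinds of results Lens shows: labels and landmarks."""
    return {
        "requests": [
            {
                "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
                "features": [
                    {"type": "LABEL_DETECTION", "maxResults": 5},
                    {"type": "LANDMARK_DETECTION", "maxResults": 5},
                ],
            }
        ]
    }

# A fake 3-byte "image" just to show the payload shape.
payload = build_vision_request(b"\x89PN")
print(json.dumps(payload, indent=2))
```

In a real request, you'd POST this JSON to the Cloud Vision `images:annotate` endpoint with your own API key; the Lens app handles all of that invisibly.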
What can you do with it?
You can use Google Lens to quickly learn more about the world around you—just by pointing your camera at something. For example, you can point your camera at a flower and Google Lens will tell you what kind of flower it is. You can also use it to scan and translate text, identify landmarks, add events to your calendar, look up movie showtimes, and much more.
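When Lens offers to "add events to your calendar," it's essentially recognizing date text in the image and converting it into a structured event. Here's a minimal, hypothetical sketch of that step using only Python's standard library; the poster format and field names are made up for illustration, not how Lens actually parses:

```python
from datetime import datetime

def parse_event(text: str) -> dict:
    """Turn scanned poster text like 'Jazz Night  Mar 14 2025 7:30 PM'
    into a minimal calendar-event dict. The double-space separator and
    date format are assumptions for this example."""
    title, _, when = text.rpartition("  ")  # split title from the date part
    start = datetime.strptime(when, "%b %d %Y %I:%M %p")
    return {"title": title, "start": start.isoformat()}

print(parse_event("Jazz Night  Mar 14 2025 7:30 PM"))
# {'title': 'Jazz Night', 'start': '2025-03-14T19:30:00'}
```

The hard part Lens solves for you is the OCR that produces the text in the first place; once the text exists, turning it into an event is ordinary date parsing.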
And as with all of Google's machine-learning products, Google will keep adding features over time to make Lens even better, so keep an eye out for updates. To get started with Google Lens, download the latest version of the Google Photos app on iOS or Android, open any photo, and tap Share, then Google Lens. If you're using a newer phone that supports ARCore (like the Pixel 3), your phone's camera should open automatically after you tap Google Lens so you can point it at objects in the real world. If you don't have a compatible device, follow these steps instead:
– Tap Get Started under Try Now;
– On the following screen, select Upload Photo;
– Find and upload a photo from your device;
– Point your camera at an object within the image to identify it.
– Hold down on an object to extract more information about it.
– Swipe right to find similar items and left to see different views of the same item.
– Scroll down below the list of results to view more results.
– Drag your finger on top of a result to copy its URL into your clipboard.
– Tap Done when you're finished exploring, or Cancel to exit.

The idea behind Google Lens is to provide additional context, like identifying things in the real world and translating the language on signs. There are a few ways you can use the service without downloading an app.
For instance, many mobile devices come with Google Assistant installed, which gives them access to some of these features through voice commands. And if you're not someone who loves talking to their phone, there are other ways to reach the feature without installing anything at all.
All you need is a modern browser: Chrome or Safari (sorry, Firefox users).
Simply load up any webpage that contains text or images and click the L button in the upper-right corner of your browser window. As soon as you do, a panel will appear along the bottom of your screen. Click on the magnifying glass icon in that panel and presto — Google Lens magically appears!
What does this mean? Well, you now have instant access to the power of Google search within any website you happen to be browsing.
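Lens can also be reached on the web directly. Assuming the `lens.google.com/uploadbyurl` endpoint (an assumption based on links seen in the wild, not an officially documented API), you can build a link that sends any hosted image straight to Lens:

```python
from urllib.parse import urlencode

def lens_url(image_url: str) -> str:
    """Build a link that opens Google Lens on a hosted image.
    The 'uploadbyurl' endpoint is an assumption, not a documented API."""
    return "https://lens.google.com/uploadbyurl?" + urlencode({"url": image_url})

print(lens_url("https://example.com/photo.jpg"))
# https://lens.google.com/uploadbyurl?url=https%3A%2F%2Fexample.com%2Fphoto.jpg
```

Note that `urlencode` handles the percent-escaping, so the image URL survives being nested inside another URL.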
Sample projects
If you’re like me, you’re always looking for ways to use your iPhone or iPad to be more productive. And one of the coolest productivity tools available is Google Lens. With it, you can scan and identify objects, text, and even barcodes.
Here are a few ways you can use Google Lens to be more productive:
- Get information about an object: Tap the camera icon in Lens, then point your phone's camera at something and tap Identify. You'll see what Wikipedia has to say about it, or find similar, related items.
- Get info about a restaurant: Point your phone's camera at a restaurant's sign (or menu) from any distance and hit Identify. You'll see reviews from people who've been there, as well as what others have posted about it on Instagram.
- Scan a book cover: Point your phone's camera at the cover of any book from any distance and tap Scan. You'll get all sorts of info, including ratings, reviews, publication date, a jacket summary, an author bio, and more.
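Book covers work because the barcode on the back is an ISBN-13 (an EAN-13 code). After reading the digits, a scanner validates the check digit before looking anything up; here's a sketch of that check (Lens's internals aren't public, but the ISBN-13 checksum itself is standard):

```python
def isbn13_is_valid(isbn: str) -> bool:
    """Validate an ISBN-13 (the EAN-13 barcode on most book covers).
    Digits alternate weights 1 and 3; the total must be divisible by 10."""
    digits = [int(c) for c in isbn if c.isdigit()]
    if len(digits) != 13:
        return False
    total = sum(d * (3 if i % 2 else 1) for i, d in enumerate(digits))
    return total % 10 == 0

print(isbn13_is_valid("978-0-306-40615-7"))  # True (a valid ISBN)
print(isbn13_is_valid("978-0-306-40615-6"))  # False (bad check digit)
```

This is why a slightly misread barcode usually fails outright rather than returning the wrong book: a single-digit error changes the weighted sum and breaks the checksum.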