How to use Visual Intelligence, Apple’s version of Google Lens

The recent release of iOS 18.2 finally brings many of the promised Apple Intelligence features, such as Genmoji and Image Playground. One long-awaited tool is Visual Intelligence, a feature currently reserved for the iPhone 16 Pro and Pro Max and first introduced at the company’s September event.

Visual Intelligence is Apple’s answer to Google Lens. It uses the camera system and AI to analyze images in real time and provide useful information. This can help people learn more about the world around them and is particularly useful for shopping, looking up details about a restaurant or store, translating written text, summarizing text, or reading text aloud. It also integrates with Google Image Search and ChatGPT.

There are two caveats. The launch of Apple Intelligence was a complicated mess, and that trend continues with Visual Intelligence. Currently, the tool only works on the iPhone 16 Pro and Pro Max, the most powerful of the company’s latest phones. Apple has hinted that the feature could eventually come to older models. Google Lens, after all, has been around since 2017, back when the Pixel 2 was the hottest phone on the market.

There is also a waitlist that applies to all Apple Intelligence features. To join, go to Settings, search for “Apple Intelligence & Siri,” then tap “Join Waitlist.” Once approved, the software is ready to use.

As of this writing, the only way to launch Visual Intelligence is to long-press the Camera Control button, the new control on the bottom right side of the handset. After pressing it, the Visual Intelligence interface opens.

One button.

Apple

Now the fun begins. Simply point your phone at something and select ChatGPT via the bottom-left icon or Google Image Search via the bottom-right icon. Alternatively, if the field of view contains text, tap the circle at the bottom of the screen. You can also point the phone at a business to get useful information.

Hold the phone in front of the text, open Visual Intelligence and tap the circle at the bottom of the screen. This parses the text. After analysis, there are a few options. Tap Translate at the bottom of the screen to translate the text into another language. To have Siri read the text, tap Read Aloud. Tap Summarize for a quick summary of the copy.

The tool also identifies contact information in text, such as phone numbers, email addresses and websites. Depending on the type of text, users can take action. For example, tap a phone number to place a call. Other actions include starting an email, creating a calendar event, or visiting a website. Tap the More button to see all available options. Tap Close or swipe up to end the session.

Visual Intelligence can provide details about a business right in front of you. Simply open the tool and point the camera at the signage. The business name should appear at the top of the screen. Tap Schedule to see hours or Order to place an order. View the menu or available services by tapping Menu, and make a reservation by tapping Reservation. To call the business, read reviews, or view its website, tap More.

Swipe up or tap Close to end the session. This feature is currently only available to US customers.

First, point the camera at an object. Open Visual Intelligence and tap the ChatGPT icon at the bottom left of the screen. Tap the Ask button to get information about the item. We used it on a bottle of hand cream, which it clearly identified. A text field then appears for follow-up questions. Users can ask whatever they want, but results may vary. We asked ChatGPT where to buy the hand cream and how much it costs, and it handled the task excellently. Yay, shopping.

Integration.

Engadget/Cherlynn Low

Tap the close button or swipe up to remove all fields, which will also exit Visual Intelligence.

When you select Google Image Search, a Safari dialog appears containing similar photos from the web. A good use case here is hunting for deals. We took a photo of a bottle of hand cream, and the Safari results offered many different price points to choose from. However, users have to find the best offer and complete the purchase themselves.

The tool in action.

Engadget/Cherlynn Low

Tap the Close button to clear these results, then swipe up from the bottom of the screen to close the tool.
