With the iPhone 16 lineup, Apple has introduced a new Apple Intelligence feature called Visual Intelligence. It lets users click and hold the Camera Control button to look up details about places, interact with text, and identify different items.
Users simply point their device’s camera at the item they want to learn more about.
While the feature was initially limited to the iPhone 16 lineup, Apple has expanded it to the iPhone 15 Pro and iPhone 15 Pro Max with the iOS 18.4 beta.
On devices without a Camera Control button, Visual Intelligence can be triggered with the Action Button, from the Lock Screen, or through the Control Center.
If you are still unsure what the Visual Intelligence feature of Apple Intelligence is and what you can do with it, this guide is for you.
Which iPhone models support Visual Intelligence?
Let’s take a look at the devices that support Visual Intelligence:
- iPhone 16
- iPhone 16 Plus
- iPhone 16 Pro
- iPhone 16 Pro Max
- iPhone 16e
- iPhone 15 Pro
- iPhone 15 Pro Max
To use Visual Intelligence, your iPhone 16 should be running iOS 18.2 or later. If you have an iPhone 16e, it should be running iOS 18.3 or later.
iPhone 15 Pro and iPhone 15 Pro Max users need iOS 18.4 (currently in beta) to use Visual Intelligence.
Things you can do with Visual Intelligence
Now that you know which devices support Visual Intelligence, here’s what you can do with it on your Apple Intelligence-enabled device.
Learn about places: You can point your iPhone’s camera at a business to learn more about it. You can view details like opening hours, services offered, and contact information, as well as make reservations, view the menu, call the business, open its website, and more.
Identify plants and animals: Visual Intelligence also lets iPhone users get information about animals and plants. You can point the camera at a plant to identify its species, or at an animal to learn about its breed.
Translate and summarize text: Visual Intelligence also lets you interact with text in various ways. For example, you can translate a piece of text, have your iPhone read it aloud, or have it summarized. And if the text contains actionable information such as a phone number, email address, or web link, you can tap it to quickly take the relevant action.
Find items on Google: Visual Intelligence also lets you search Google for similar items. Point the camera at any object and tap the image search button to find visually similar images on Google. This makes it easy to find and purchase items, compare prices, and look up other information.
Add events: If you come across an event flyer or poster, you can point your iPhone’s camera at it to create an event in the Calendar app.
Ask ChatGPT for information: If you see anything interesting around you and want to learn more about it, you can also ask ChatGPT. Just point the camera at the object and select the option to learn more about it through ChatGPT.