Visual Intelligence on the iPhone uses the camera to identify objects and answer questions.
The feature requires specific iOS versions (18.2, 18.3, or 18.4, depending on the model) and is limited to the iPhone 16 lineup plus the iPhone 15 Pro and iPhone 15 Pro Max.
To launch Visual Intelligence, tap and hold the Camera Control button on iPhone 16 models that have one, or customize the Action Button on the other supported models.
Visual Intelligence can identify plants, look up details about restaurants such as opening hours, and more.
The feature requires Apple Intelligence to be turned on in Settings, via Apple Intelligence & Siri.
One of the Apple Intelligence features that hasn’t been delayed is Visual Intelligence, which uses your iPhone’s camera to identify and answer questions about whatever’s around you in the world.
It lets you snap a pizza restaurant storefront and find out its opening hours, for example, or point your camera at a plant and find out what it’s called and how to care for it. If you’ve used Google Lens, you’ll get the idea.
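For developers curious what this kind of camera-based recognition looks like in code, here is a minimal sketch using Apple’s public Vision framework. It is not the Visual Intelligence pipeline itself, which Apple doesn’t expose as an API; it only illustrates the general “point the camera at something, get labels back” idea, and the classify() helper name is my own.

```swift
import Vision
import UIKit

// Minimal sketch of on-device image classification with Apple's public
// Vision framework. This is NOT how Visual Intelligence is implemented
// (that pipeline isn't exposed to developers); it only illustrates the
// general "point the camera at something, get labels back" idea.
// The classify() helper name is hypothetical.
func classify(_ image: UIImage) throws -> [(label: String, confidence: Float)] {
    guard let cgImage = image.cgImage else { return [] }

    let request = VNClassifyImageRequest()                  // built-in classifier (iOS 13+)
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])                          // runs entirely on device

    // Keep only reasonably confident labels, e.g. "plant" or "pizza".
    return (request.results ?? [])
        .filter { $0.confidence > 0.3 }
        .map { ($0.identifier, $0.confidence) }
}
```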
This isn’t available to everyone, though. You have to be using iOS 18.2 on the iPhone 16, iPhone 16 Plus, iPhone 16 Pro, or iPhone 16 Pro Max; iOS 18.3 on the iPhone 16E; or iOS 18.4 on the iPhone 15 Pro and iPhone 15 Pro Max. You’ll also need to have Apple Intelligence turned on, via Apple Intelligence & Siri in Settings.
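If you build apps and want to point users toward the feature, note that there is no public “does this device support Visual Intelligence” API; about all you can gate on is the OS floor. A minimal sketch assuming that approach (the function name is hypothetical):

```swift
import Foundation

// Minimal sketch, not a real Apple API: iOS has no public check for
// Visual Intelligence support, so an app can only gate on the lowest OS
// version that ships the feature (iOS 18.2) and leave the hardware
// requirement (iPhone 15 Pro or newer) to the system.
func runsOnVisualIntelligenceCapableOS() -> Bool {
    if #available(iOS 18.2, *) {
        return true   // iPhone 16 lineup minimum; the 16E needs 18.3, the 15 Pro/Pro Max need 18.4
    }
    return false
}
```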
How to launch Visual Intelligence
If you have an iPhone 16 with a Camera Control button on the right-hand side, you can tap and hold this button to bring up the camera and Visual Intelligence.
If you’ve got an iPhone 16E, iPhone 15 Pro, or iPhone 15 Pro Max, you’ve got a few different options to choose from:
You can customize the Action Button to launch Visual Intelligence: Go to Settings, tap Action Button, then swipe left or right to find Vis …
Q. What is Visual Intelligence?
A. Visual Intelligence is an Apple Intelligence feature that uses your iPhone’s camera to identify and answer questions about whatever’s around you in the world.
Q. Which iPhones support Visual Intelligence?
A. Visual Intelligence is available on iOS 18.2 for iPhone 16, iPhone 16 Plus, iPhone 16 Pro, and iPhone 16 Pro Max; iOS 18.3 for iPhone 16E; and iOS 18.4 for iPhone 15 Pro and iPhone 15 Pro Max.
Q. How do I launch Visual Intelligence on my iPhone?
A. To launch Visual Intelligence, tap and hold the Camera Control button if your iPhone 16 model has one, or customize the Action Button to launch it in Settings > Action Button.
Q. What is Google Lens?
A. Google Lens is a feature that uses AI to identify objects and provide information about them, similar to Apple’s Visual Intelligence.
Q. How do I enable Apple Intelligence on my iPhone?
A. To enable Apple Intelligence, go to Settings > Apple Intelligence & Siri and turn it on.
Q. What can I use Visual Intelligence for?
A. You can use Visual Intelligence to snap a pizza restaurant storefront and find out its opening hours, or point your camera at a plant and find out what it’s called and how to care for it.
Q. Is Visual Intelligence available to everyone?
A. No, Visual Intelligence is not available to everyone; you need to be using the specified iOS versions and have Apple Intelligence turned on.
Q. Can I customize the Action Button to launch Visual Intelligence?
A. Yes. If you have an iPhone 16E, iPhone 15 Pro, or iPhone 15 Pro Max, you can customize the Action Button to launch Visual Intelligence in Settings > Action Button.
Q. What is the difference between Visual Intelligence and Google Lens?
A. Both use AI to identify objects and provide information about them, but Visual Intelligence is built into iOS and exclusive to recent iPhones, while Google Lens is Google’s equivalent and is available across platforms.
Q. How do I access Visual Intelligence on my iPhone 16E or iPhone 15 Pro/Pro Max?
A. On both the iPhone 16E and the iPhone 15 Pro/Pro Max, which lack the Camera Control button, you can customize the Action Button to launch Visual Intelligence in Settings > Action Button. In all cases, Apple Intelligence must first be turned on in Settings > Apple Intelligence & Siri.