On iPhone 16 models, Visual Intelligence lets you use the camera to learn more about places and objects around you. It can also summarize text, read text out loud, translate text, search Google for items, ask ChatGPT, and more. And thanks to the latest iOS 18.4 update from Apple, iPhone 15 Pro models can now get in on the action, too.

Until recently, Visual Intelligence was limited to iPhone 16 models with Camera Control, which was required to activate the feature. In February, however, Apple debuted the iPhone 16e, which lacks Camera Control yet still supports Visual Intelligence, because it ships with a version of iOS that offers Visual Intelligence as an assignable option for the device's Action button.

Apple later confirmed that the same Visual Intelligence customization setting would be coming to iPhone 15 Pro models via a software update. That update is iOS 18.4, and it's available now. If you haven't updated yet, you can do so by opening Settings ➝ General ➝ Software Update.

After your device is up-to-date, you can assign Visual Intelligence to the device's Action button in the following way.

  1. Open Settings on your iPhone 15 Pro.
  2. Tap Action Button.
  3. Swipe through the options until Visual Intelligence is selected.
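
Visual Intelligence is a built-in option on that screen, but the same Action button menu also offers Shortcut and Controls entries that third-party apps can feed. For developers curious how an app's own action ends up there, it is typically exposed through the App Intents framework. The sketch below is a minimal, hypothetical example (the ScanLabelIntent name and its behavior are illustrative, not anything Apple ships):

    import AppIntents

    // Hypothetical intent an app could expose so that the Action button
    // (via the Shortcut assignment) can trigger it with a press and hold.
    struct ScanLabelIntent: AppIntent {
        static var title: LocalizedStringResource = "Scan Label"
        static var description = IntentDescription("Opens the app's scanner view.")

        // Bring the app to the foreground when the intent runs.
        static var openAppWhenRun: Bool = true

        @MainActor
        func perform() async throws -> some IntentResult {
            // A real app would navigate to its scanner screen here.
            return .result()
        }
    }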


Pressing and holding the Action button will now activate Visual Intelligence. Note that you can also activate Visual Intelligence using the new button option in Control Center. Here's how.

  1. Swipe down from the top-right corner of your iPhone's display to open Control Center, then long press on an empty area of the screen to enter edit mode.
  2. Tap Add a Control at the bottom.
  3. Use the search bar at the top to find Visual Intelligence, or swipe up to the "Apple Intelligence" section and tap the Visual Intelligence button.
  4. Tap the screen to exit the Control Center's edit mode.
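
The controls gallery you just searched is also open to third-party apps, which supply their own buttons through WidgetKit's ControlWidget API introduced in iOS 18. A rough sketch of what that looks like, assuming a hypothetical scanner app (none of these identifiers are real):

    import AppIntents
    import SwiftUI
    import WidgetKit

    // Hypothetical intent the control runs; defined here so the sketch
    // is self-contained.
    struct OpenScannerIntent: AppIntent {
        static var title: LocalizedStringResource = "Open Scanner"
        static var openAppWhenRun: Bool = true
        func perform() async throws -> some IntentResult { .result() }
    }

    // A button that appears in the Control Center controls gallery,
    // much like the Visual Intelligence control described above.
    struct ScannerControl: ControlWidget {
        var body: some ControlWidgetConfiguration {
            StaticControlConfiguration(kind: "com.example.scanner-control") {
                ControlWidgetButton(action: OpenScannerIntent()) {
                    Label("Open Scanner", systemImage: "camera.viewfinder")
                }
            }
        }
    }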


Using Visual Intelligence

The Visual Intelligence interface features a view from the camera, a button to capture a photo, and dedicated "Ask" and "Search" buttons. Ask queries ChatGPT, and Search sends an image to Google Search.
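
Apple hasn't documented how Visual Intelligence routes these requests internally, so the snippet below is purely a conceptual model of the two buttons as described above, nothing more:

    import UIKit

    // Conceptual model only: the two actions Visual Intelligence offers
    // on a capture, per the description above.
    enum VisualIntelligenceAction {
        case ask      // the image plus a question goes to ChatGPT
        case search   // the image goes to Google Search
    }

    func handle(_ action: VisualIntelligenceAction, image: UIImage) {
        switch action {
        case .ask:
            print("Would send the capture and a question to ChatGPT")
        case .search:
            print("Would run a Google image search on the capture")
        }
    }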

When using Visual Intelligence, you can either snap a photo using the shutter button and then select an option, or select an option directly in the live camera view. You cannot use photos that you took previously.
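
Third-party apps can't call Visual Intelligence directly, but a similar capture-then-analyze flow is possible with VisionKit's ImageAnalyzer, which exposes Live Text and Visual Look Up-style analysis on an image an app already has. A rough sketch, assuming a UIKit image view (the function name and setup are illustrative):

    import UIKit
    import VisionKit

    // Rough approximation of a capture-then-analyze flow using VisionKit
    // (iOS 16+). This is not the API Visual Intelligence itself uses.
    @MainActor
    func analyze(_ image: UIImage, in imageView: UIImageView) async throws {
        let analyzer = ImageAnalyzer()
        let configuration = ImageAnalyzer.Configuration([.text, .visualLookUp])
        let analysis = try await analyzer.analyze(image, configuration: configuration)

        // Attaching the analysis to an image view makes recognized text and
        // subjects tappable, similar in spirit to the system experience.
        let interaction = ImageAnalysisInteraction()
        imageView.addInteraction(interaction)
        interaction.analysis = analysis
        interaction.preferredInteractionTypes = .automatic
    }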

To learn about everything that you can do with Visual Intelligence, be sure to check out our dedicated guide.
