

Visual Intelligence has made the Camera Control on my iPhone 16 worth using

Using Visual Intelligence on an iPhone 16 Pro, showing a ChatGPT answer.
Christine Romero-Chan / Digital Trends

One of the big selling points of the iPhone 16 hardware is the Camera Control button. It’s a small physical button on the bottom right of the frame that also has some capacitive capabilities. With the initial launch of iOS 18, a single press launches your camera app of choice, and you can do half presses and sliding gestures to adjust camera settings. It’s a neat idea, but it has some flaws that prevent it from being a great shutter button.

But now we have iOS 18.2, which brings a lot of new Apple Intelligence features to our phones, especially if you have an iPhone 16. Among them, Apple finally added Visual Intelligence, a feature similar to Google Lens, but on the iPhone.


After playing with the latest update, I’m happy to report that Visual Intelligence is a real game-changer for the Camera Control, and it’s made the good but awkward button finally worth using.

Camera Control initially let me down

A person using the Camera Control on the Apple iPhone 16 Plus.
Andy Boxall / Digital Trends

When I got my iPhone 16 Pro on launch day, I was eager to try out Camera Control for all my photography needs. But as I used it as a shutter button and a way to adjust settings, the problems began to rear their heads.

First, the position of the Camera Control isn’t great if you want to use it as a shutter button. It’s more toward the lower center of the frame instead of being closer to the bottom. If you’re taking a photo in landscape orientation, you may still need to reach a bit to press the Camera Control. For me, this meant part of my thumb would end up in front of the screen, obstructing it. This would be a bigger hassle on the iPhone 16 Pro Max due to its size.

iPhone 16 Pro Max in Desert Titanium.
Christine Romero-Chan / Digital Trends

Another issue I had was that pressing the Camera Control introduced some slight camera shake, which could result in blur in a still image. Adjusting the pressure sensitivity helped a bit, but there’s always slightly more shake than when tapping the on-screen shutter button.

When the iPhone 16 first launched, I had trouble with the amount of pressure needed for a half press to reach the camera settings. Apple seems to have fixed that with recent updates, but I still find it faster to just use the touchscreen. A few months after launch, I pretty much only use Camera Control for launching the Camera app, and I keep taking photos with the on-screen shutter to make sure they don’t come out blurry or out of focus.

Visual Intelligence is what Camera Control needed

Using Visual Intelligence on an iPhone 16 Pro showing Google search results.
Christine Romero-Chan / Digital Trends

Prior to iOS 18.2, the Camera Control was just a camera-only Action button for me. But now that I have updated my phone and finally have access to Visual Intelligence, I’m actually using Camera Control more.

To activate Visual Intelligence, press and hold the Camera Control. This brings up a viewfinder so you can point the camera at something in the real world. From there, you can press the shutter/Camera Control button to take a quick capture (it isn’t saved to your photo library) before inquiring about it, or you can select Ask or Search. Ask defaults to prompting ChatGPT with a simple “What is this?”, though you can ask for more details about what you’re looking at. Search brings up Google results related to the object you’re inquiring about.

Visual Intelligence on iPhone.
Jesse Hollington / Digital Trends

What you get out of Visual Intelligence depends on what you’re pointing your camera at. So far, I’ve used it to identify plants, animals, and random objects, but you can also use it to look up details about points of interest, businesses, and services, pull up contact information, translate text, and more.

Though I haven’t had much time with it since installing iOS 18.2, I can see myself using this feature quite a bit when I’m out and about. The placement of Camera Control also feels better suited to Visual Intelligence than to a camera shutter button. I’m right-handed, so I typically hold my phone in my right hand with my thumb on Camera Control, and I can easily use Visual Intelligence one-handed, which I can’t say for using Camera Control as a shutter button.

I no longer regret getting my iPhone 16 Pro

iPhone 16 Pro Max next to the 16 Plus, 16 Pro and regular iPhone 16
Nirave Gondhia / Digital Trends

I’ve upgraded my iPhone every year since the beginning, but this was the first year I had some second thoughts, at least initially. When the iPhone 16 series launched, Apple Intelligence didn’t ship with it, so while the hardware was good, the software felt incomplete.

But now that Apple has rolled out the Apple Intelligence features it advertised so heavily, I’m satisfied with my iPhone 16 Pro purchase. In my opinion, Camera Control’s primary purpose is Visual Intelligence, along with quickly getting to the camera. Combine that with the fact that the smaller iPhone 16 Pro now has 5x optical zoom, and yeah, I’m a happy camper.

From what I’ve seen, it doesn’t sound like many people have used Camera Control since it debuted. I certainly only used it for one thing. But now, with iOS 18.2 and Visual Intelligence, I think Camera Control could be my new favorite iPhone feature.
