Does your iPhone already have Visual Intelligence? The AI that transforms your camera is coming to these models.

What if your phone could tell you what it's seeing? Not just capture it, but understand it. From identifying a flower to translating a sign in another language or telling you which restaurant is right in front of you. With the release of iOS 18.4, Apple has begun to make that vision a reality for more iPhone users.
The Visual Intelligence feature, until now reserved for newer models, is finally being extended to other devices. But what exactly is this technology? And why does Apple think it will be central to the future of its products, from iPhones to AirPods?
What is Visual Intelligence and how it turns the camera into a smart sensor

This isn't just a simple photography enhancement. Visual Intelligence is an artificial intelligence tool that uses the iPhone camera as an analytical eye. Thanks to this feature, the device can:
- Identify objects, plants, and animals
- Translate texts in real time
- Summarize visual documents
- Read aloud what the camera captures
- Recognize places, businesses and monuments
Instead of opening multiple apps for these tasks, you simply point the camera. Information appears instantly, already in context, thanks to a combination of visual recognition, geolocation, and generative AI.
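Visual Intelligence itself isn't exposed as a public API, but Apple's on-device Vision framework gives a feel for the kind of analysis involved. The sketch below is a hypothetical illustration, not Apple's actual pipeline: it classifies an image and reads any detected text aloud using VNClassifyImageRequest, VNRecognizeTextRequest, and AVSpeechSynthesizer.

```swift
import Vision
import AVFoundation
import UIKit

// Kept at file scope so speech isn't cut off when the function returns.
private let synthesizer = AVSpeechSynthesizer()

// Hypothetical sketch of on-device image understanding with Apple's
// Vision framework; not Apple's actual Visual Intelligence implementation.
func analyze(image: UIImage) throws {
    guard let cgImage = image.cgImage else { return }

    // 1. Classify what the camera sees (objects, plants, animals, ...).
    let classify = VNClassifyImageRequest()

    // 2. Recognize any text in the frame (signs, menus, documents).
    let readText = VNRecognizeTextRequest()
    readText.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([classify, readText])

    // Print the top classification labels above a confidence threshold.
    if let labels = classify.results?.prefix(3) {
        for observation in labels where observation.confidence > 0.3 {
            print("Looks like: \(observation.identifier)")
        }
    }

    // Collect recognized text and speak it, mirroring the
    // "read aloud what the camera captures" capability.
    let lines = readText.results?
        .compactMap { $0.topCandidates(1).first?.string } ?? []
    if !lines.isEmpty {
        let utterance = AVSpeechUtterance(string: lines.joined(separator: " "))
        synthesizer.speak(utterance)
    }
}
```

The real feature layers geolocation and generative AI on top of this kind of recognition; the sketch covers only the on-device vision step.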
Apple describes this feature as a way to "learn more about the world around you," and that's precisely its core: transforming the iPhone camera into a tool for understanding the environment.
More iPhones support Visual Intelligence after iOS 18.4

When it first launched in iOS 18.2, Visual Intelligence was limited to the iPhone 16 lineup, the models with the new Camera Control button. But iOS 18.4 has changed the game.
The following devices can now also activate Visual Intelligence:
- iPhone 15 Pro
- iPhone 15 Pro Max
- iPhone 16e
These models can access the feature by assigning it to the Action button, adding it to Control Center, or even placing it on the Lock Screen. Apple is thus gradually democratizing a technology that previously seemed reserved for a select few.
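Apple wires up its own Visual Intelligence control internally, but iOS 18's WidgetKit shows how this kind of camera action gets surfaced in Control Center and on the Lock Screen. A minimal hypothetical sketch (the intent, kind identifier, and names here are invented for illustration):

```swift
import WidgetKit
import SwiftUI
import AppIntents

// Hypothetical intent that opens the app's camera/scanner screen.
struct OpenScannerIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Scanner"
    static var openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult {
        // In a real app, deep-link to the camera view here.
        return .result()
    }
}

// An iOS 18 control, defined in a widget extension, that users can
// add to Control Center or the Lock Screen, analogous to the
// system's Visual Intelligence control.
struct ScannerControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.scanner") {
            ControlWidgetButton(action: OpenScannerIntent()) {
                Label("Scan", systemImage: "camera.viewfinder")
            }
        }
    }
}
```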
It's not just about expanding access: features are also being added with each release. Visual Intelligence already incorporated new capabilities in iOS 18.3, and everything points to this continuing in future system updates.
Apple's long-term plan: to make every device see, understand, and respond

Apple isn't stopping at the iPhone. According to sources like Mark Gurman, the company's goal is to bring Visual Intelligence to more products in the ecosystem. How? By integrating cameras in places where they didn't exist before, such as:
- AirPods with a camera, capable of understanding gestures or detecting objects
- New versions of the Apple Watch with integrated lenses, designed to interact with the visual world without the need for a phone
This expansion reflects a clear strategy: to enable all Apple devices to see and process the physical world through their own lenses, without relying on external technologies. Apple wants Visual Intelligence to work with its own AI models, rather than those developed by OpenAI or Google.
Towards a camera that thinks, instead of just seeing

Each update is a building block in a larger structure. Visual Intelligence isn't just a useful innovation; it's a key piece in redesigning how we use our devices. Today, you can point your camera to translate or identify something. Tomorrow, your device may tell you what it sees before you even ask.
Apple, as it did with Face ID and the Neural Engine, is laying out a roadmap where artificial intelligence becomes invisible, integrated, and ubiquitous. The interesting thing is that it's starting with something as mundane as the camera.