As expected, WWDC 2025 – mainly its opening keynote – came and went without a formal Siri update. Apple is still working on the AI-infused update, which promises a much more natural and capable virtual assistant. TechRadar's Editor-at-Large, Lance Ulanoff, broke down what's behind the delay after a conversation with Craig Federighi, here.
Now, even without the AI-infused Siri, Apple delivered a pretty significant upgrade to Apple Intelligence – just not necessarily in the place you'd expect. It's giving Visual Intelligence – a feature exclusive to the iPhone 16 family, the iPhone 15 Pro, and the iPhone 15 Pro Max – an upgrade that brings on-screen awareness and a new way of searching, all powered by a screenshot.
It's a companion to the original Visual Intelligence feature set – a long press of the Camera Control button (or a customized Action Button on the 15 Pro) pulls up a live view from your iPhone's camera, along with the option to take a shot and to "Ask" or "Search" about whatever your iPhone is looking at.
It's kind of a more basic version of Google Lens, in that you can identify plants and pets and search visually. Much of that doesn't change with iOS 26, but now you can use Visual Intelligence on screenshots. After a short demo at WWDC 2025, I'm eager to use it again.
Visual Intelligence makes screenshots much more actionable and could even save space on your iPhone … especially if your Photos app, like mine, is full of screenshots. Apple's bigger play here is that this gives us a taste of on-screen awareness.
In the demo I saw, screenshotting a Messages chat containing a poster for an upcoming movie night revealed the new interface. It's the iPhone's classic screenshot view, but the familiar "Ask" button sits at the bottom left with "Search" on the right, while in the middle is a suggestion from Apple Intelligence that varies based on what you screenshot.
In this case, it was "Add to Calendar," which let me quickly create an event with the movie night's name, the correct date and time, and the location. Essentially, it identifies the elements on the screen and extracts the relevant information.
Pretty neat! Instead of just saving a screenshot of the image, you can get an event added to your calendar in a matter of seconds. It bakes in functionality that I think many iPhone owners will appreciate – even if Android phones like the best Pixels or the Galaxy S25 Ultra have been able to do this for a while.
Apple Intelligence will surface these suggestions when it deems them relevant – that might mean creating an event or a reminder, translating another language into your own, summarizing text, or even reading it aloud.
All of that is very practical, but say you're scrolling TikTok or Instagram Reels and see a product – maybe a nice button-down or a poster that catches your eye – Visual Intelligence has a solution for that too, and it's kind of Apple's answer to Circle to Search on Android.
You take a screenshot and then simply scrub over the part of the image you want to search. It's a similar on-screen effect to selecting an object for removal with Clean Up in Photos, but here it lets you search via Google or ChatGPT. Other apps can also opt in through an API that Apple provides.

And this is where it gets pretty exciting – you can scroll through all the available places to search, such as Etsy or Amazon. I think this will be a fan favorite when it ships, though not quite a reason to go out and buy an iPhone that supports Visual Intelligence … yet, at least.
If you'd rather search the entire screenshot instead, that's where the "Ask" and "Search" buttons come in; with them you can use either Google or ChatGPT. Beyond analyzing screenshots and making suggestions, or searching with a selection, Apple is also expanding the types of things Visual Intelligence can recognize past pets and plants to include books, landmarks, and works of art.
Not everything will be available right at launch, but Apple is clearly working to expand Visual Intelligence's capabilities and build out Apple Intelligence's feature set. Given that this offers a first glimpse of on-screen awareness, I'm pretty excited.