- Apple is reportedly redesigning Siri to support ongoing conversations instead of one-off commands
- The new Siri will integrate across apps and use personal context
- A refreshed interface and chat-style experience aims to make Siri feel more like ChatGPT and other AI tools
Apple opens its Worldwide Developers Conference on June 8 with a familiar promise, but one that may finally be fulfilled. The company will unveil a revamped Siri, answering a question that's been asked for years: can Apple bring its longtime assistant up to par with ChatGPT and other newer AI tools, and make it feel relevant again?
The company has spent years refining Siri in small ways, but the upcoming version tied to iOS 27 is expected to be a much deeper overhaul. Early leaked details point to a redesigned interface that supports a shift towards conversational AI and a broader role for Siri, bigger than it’s likely had in years.
Meet the new Siri
The biggest change will be how you talk to Siri. Until now, short, specific instructions delivered one at a time have been the norm. “Siri, what’s the weather?” “Siri, set a timer,” and so on. Each request exists on its own.
The new Siri is expected to support continuous conversations, allowing users to ask follow-up questions and combine multiple requests into a single interaction. It will be able to transfer context across exchanges, meaning you can refine a request instead of starting over each time.
It’s the kind of interaction people have already gotten used to with tools like ChatGPT. The difference is that Apple brings it directly into the operating system. Siri will track what you’re asking and adapt as the conversation evolves.
Reports suggest that Apple is leaning on large language model capabilities with support from Google’s Gemini models. Questions that once required a web search or a separate chatbot could now fall within Siri’s reach. The result should, in theory, be a smartphone assistant capable of a far wider range of interactions.
Apple appears to be moving Siri out of its traditional full-screen takeover and into something more integrated. On newer iPhones, the assistant is expected to live in the Dynamic Island and expand into view when activated.
There are also signs that Apple is building a dedicated Siri app, with chat history and a look that will be familiar to users of other AI chatbots. The more significant transformation, though, is happening below the surface.
A reworked AI
Siri is being reworked to work across apps and services, drawing on personal context and what’s currently on the screen. This is where Apple’s long-promised ideas about awareness and integration begin to take shape. The assistant will be able to refer to messages, emails and other data to perform multi-step tasks.
In practical terms, this could mean asking Siri to look at a conversation, pull up a relevant detail and act on it without the need for separate instructions.
Siri’s overhaul comes after a period in which Apple has been seen as a laggard in the AI space. While competitors pushed forward with conversational systems and generative tools, Apple moved more cautiously. There seemed to be a growing gap between what Siri could do and what users expected based on what ChatGPT, Gemini and Claude could do.
Apple wants to close this gap, though its approach remains consistent: the new features are being built directly into Siri rather than spun out under its newer Apple Intelligence brand. Google, by contrast, replaced Assistant with Gemini (via Bard). Matching that experience while maintaining Apple’s emphasis on privacy and control is no simple task, but Apple clearly thinks it’s time to try.