Apple is reportedly preparing a major Siri overhaul for iOS 27, turning the assistant into a multi-agent AI system designed to handle more complex requests by tapping into a wider range of specialized chatbots and services. The shift signals Apple’s aim to move beyond a single, tightly integrated chatbot experience and give Siri the flexibility to route tasks to whichever AI is best suited for the job.
At the center of the plan is a new approach that would let Siri work with multiple third-party AI agents. Reports say Apple intends to support popular options such as Google’s Gemini and Anthropic’s Claude. Instead of forcing users into one default model, Siri would be able to hand off a question or task to a chosen agent on request—or automatically select an agent when Siri can’t confidently answer on its own. In practice, this would act like a “handoff” system: Siri stays as the front door, but the heavy lifting can be completed by another AI app when needed.
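The handoff logic described above can be sketched as a simple router: the built-in assistant answers when it is confident, defers to a user-chosen agent on request, and otherwise picks the best-suited installed agent. This is purely illustrative; every name here (`Agent`, `route_request`, the confidence threshold) is an assumption, not an actual Apple API.

```python
# Hypothetical sketch of a multi-agent "handoff" router. All names and
# numbers are invented for illustration; nothing here reflects real
# Apple APIs or internals.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Agent:
    name: str
    can_handle: Callable[[str], float]  # 0..1 suitability score for a query
    answer: Callable[[str], str]

CONFIDENCE_THRESHOLD = 0.6  # below this, Siri hands the query off

def route_request(query: str, siri_confidence: float,
                  agents: list[Agent],
                  preferred: Optional[str] = None) -> str:
    # An explicit user choice always wins ("ask Claude about ...").
    if preferred is not None:
        for agent in agents:
            if agent.name == preferred:
                return agent.answer(query)
    # Otherwise Siri answers itself when it is confident enough...
    if siri_confidence >= CONFIDENCE_THRESHOLD:
        return f"Siri: answer to {query!r}"
    # ...and hands off to the best-scoring installed agent when not.
    best = max(agents, key=lambda a: a.can_handle(query), default=None)
    if best is not None:
        return best.answer(query)
    return f"Siri: best-effort answer to {query!r}"
```

In this toy model, Siri stays the single entry point while the routing decision (self-answer, explicit handoff, or automatic handoff) happens behind the scenes, which matches the "front door" framing in the reports.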
To make this work, Apple is said to be building an “Extensions” area inside the Apple Intelligence and Siri section in the Settings app. From there, users would be able to add supported chatbots via the App Store. Once installed, these AI tools would be able to operate in sync with Siri, creating a more modular assistant that can expand over time as new agents become available.
This multi-agent direction also opens up a clear business opportunity. Apple reportedly plans to take a share of subscription revenue generated by these chatbot services, creating a new monetization path tied to the growing AI app economy.
Alongside third-party support, Apple is also reportedly developing a dedicated Siri chatbot experience that would run on Google’s cloud infrastructure and TPUs, while still being owned and controlled by Apple. The company is said to be emphasizing that this arrangement won’t weaken its privacy protections, suggesting Apple will keep its existing safeguards in place even as it leans on external compute resources.
The upcoming Siri chatbot is expected to be far more capable than today’s assistant. It’s described as being built directly into Apple’s software so it can do more than answer general questions. The reported feature set includes using personal context, taking actions inside apps, searching the web, generating content (including images), helping with coding tasks, summarizing and analyzing information, and uploading files. Apple is also said to be developing a system that lets the Siri chatbot understand what’s on your screen—such as open windows and visible content—so it can respond with more awareness and even adjust device settings and features based on what you’re doing.
Another notable change: Siri may no longer be a primarily voice-driven tool. Apple is reportedly working on a dedicated Siri app in iOS 27 that would store and organize your conversation history, making it easier to revisit past chats, continue threads, and treat Siri as a persistent AI assistant rather than a one-off voice command feature.
Apple is also testing interface changes. While Siri activation would still work through voice and the power button, a new in-progress UI may live in the Dynamic Island. On top of that, Apple is reportedly exploring the idea of replacing the current Spotlight search experience with Siri, bringing web and on-device search, app discovery, and proactive “Siri Suggestions” into a single, unified interface. Those suggestions would continue to span apps, upcoming calendar items, and AI-recommended setting changes.
Under the hood, the Siri chatbot is rumored to be powered by a more advanced Gemini-based system, internally referred to as Apple Foundation Models version 11. The expectation is that it will be competitive with the next generation of leading AI models and significantly more capable than the model behind the current Siri revamp.
Separate reporting also claims Apple has gained broad access to Google’s Gemini models through a confidential arrangement. A key benefit would be model distillation, a machine learning technique where a smaller “student” model is trained to mimic a larger “teacher” model. By learning from a powerful server-side model, Apple could substantially improve its on-device AI performance—one of the biggest factors in making an assistant feel fast, private, and reliable in everyday use.
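The core idea of distillation can be shown in a few lines: the student is trained to match the teacher's softened output distribution, typically by minimizing the KL divergence between the two. A minimal NumPy illustration follows; the logits and temperature are invented for the example and do not describe any actual Apple or Google model.

```python
# Toy illustration of model distillation: a small "student" model is
# nudged toward the softened output distribution of a larger "teacher".
# All logits and numbers are invented for illustration only.
import numpy as np

def softmax(logits, temperature=1.0):
    z = logits / temperature
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence between the softened teacher and student
    # distributions; a temperature > 1 exposes the teacher's relative
    # probabilities for non-top answers ("dark knowledge").
    p = softmax(teacher_logits, temperature)  # teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    return float(np.sum(p * (np.log(p) - np.log(q))))

teacher = np.array([4.0, 1.0, 0.5])   # large server-side model's logits
student = np.array([2.5, 1.5, 1.0])   # small on-device model's logits

loss = distillation_loss(student, teacher)
# Training the student minimizes this loss so its output distribution
# approaches the teacher's, without needing the teacher's size.
```

A student whose logits sit closer to the teacher's yields a smaller loss, which is the signal gradient descent would follow during training; that is how a compact on-device model can inherit behavior from a much larger server-side one.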
If these changes land as described, iOS 27 could mark one of the biggest Siri upgrades in years: a more app-like assistant, a more capable chatbot, deeper on-device and in-app actions, and a multi-agent system that can tap different AI tools depending on what you ask.