Apple's New Siri Is Powered by Google Gemini — iOS 26.4 and Project Campos Explained
April 2, 2026 · 7 min read
TL;DR
Apple confirmed on March 1, 2026, that Siri in iOS 26.4 runs on Google Gemini, a 1.2-trillion-parameter model, under Project Campos, a partnership reportedly worth $1 billion per year. New capabilities: on-screen context awareness, multi-step cross-app automation, and complex reasoning. Privacy: Apple's Private Cloud Compute keeps user data off Google's servers, and no Google branding appears in the interface. Full agentic features arrive with iOS 27 in September 2026.
Apple spent over a decade insisting Siri would be built entirely in-house. The iOS 26.4 announcement is a complete reversal: Siri's intelligence now runs on Google's Gemini models, arguably the most significant strategic concession in Apple's history. Here is what changed, what the new Siri does, and what it means for iPhone users.
What Project Campos Is
Project Campos is the internal name for Apple's multi-year AI infrastructure partnership with Google. The joint statement from both companies on January 12, 2026 was direct: "Apple Foundation Models will be based on Google's Gemini models and cloud technology."
The deal is worth approximately $1 billion annually. Apple evaluated OpenAI (whose models power the ChatGPT integration Siri gained in iOS 18) and Anthropic before selecting Google as the foundation for its next-generation assistant. Google's 1.2-trillion-parameter Gemini model won the evaluation on capability, latency, and infrastructure alignment.
Despite the partnership, no Google branding appears anywhere in the Siri interface. The experience is entirely Apple-branded.
What the New Siri Can Do in iOS 26.4
| Feature | What It Does |
|---|---|
| On-screen context awareness | Siri sees and interprets what is currently displayed on your screen: it can summarize documents, extract flight details from emails, or book restaurant reservations from Safari reviews |
| Multi-step cross-app automation | One command executes sequences across apps: "Review urgent work emails, create a memo summarizing them, and set reminders for each action item" |
| Complex reasoning | Gemini's 1.2-trillion-parameter model replaces Apple's smaller on-device models for queries requiring nuanced understanding, context, and planning |
| Private Cloud Compute | Queries processed on Apple's own servers using de-identified data — Google's model runs the inference but does not receive identifiable user data |
The Privacy Architecture
The most important technical detail for users is how privacy works. Siri does not send your raw queries or personal data to Google. Apple's Private Cloud Compute layer sits between the user and Gemini:
- Your query is processed by Apple's servers
- Personal identifiers are stripped
- The anonymized query is sent to Gemini for inference
- The response returns through Apple's servers to your device
Google receives inference requests, not user identity. Apple maintains that this architecture meets its privacy commitments. Independent audits of this claim have not yet been published.
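The four-step flow above can be sketched as a simple proxy pipeline. This is an illustrative sketch only: Apple has not published the internals of Private Cloud Compute for this integration, so the function names, the regex-based identifier stripping, and the stubbed model call are all hypothetical stand-ins for what is, in reality, a far more sophisticated hardware-backed system.

```python
import re

def strip_identifiers(query: str) -> str:
    """Remove obvious personal identifiers before the query leaves
    Apple's servers. (Hypothetical; the real PCC pipeline is not public.)"""
    query = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[EMAIL]", query)  # email addresses
    query = re.sub(r"\+?\d[\d\s().-]{7,}\d", "[PHONE]", query)    # phone-like numbers
    return query

def siri_query(raw_query: str, call_model) -> str:
    """Proxy flow: de-identify on Apple's side, run inference,
    and return the response through Apple's servers."""
    anonymized = strip_identifiers(raw_query)  # steps 1-2: process and strip
    response = call_model(anonymized)          # step 3: inference without identity
    return response                            # step 4: back to the device via Apple

# Example with a stubbed model call standing in for Gemini:
echo = lambda q: f"Handled: {q}"
print(siri_query("Email jane.doe@example.com about the 3pm flight", echo))
```

The point of the sketch is the trust boundary: the model callable only ever sees the anonymized string, so whatever runs behind it never receives the identifiers stripped in the first step.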
What Is Coming in iOS 27
iOS 26.4's Siri is the first phase. The second phase — full agentic capabilities with deep app integration — is planned for iOS 27, expected in September 2026. This includes autonomous task completion across third-party apps, proactive suggestions, and persistent context across Siri sessions.
The September 2026 update is expected to position Siri as a true AI agent, comparable to Google Gemini Live and ChatGPT's computer-use capabilities.
What This Means for the AI Market
Apple's partnership with Google for Siri has three implications for the broader AI ecosystem:
- Google's distribution advantage grows: Gemini now reaches approximately 1.5 billion active iPhone users, making it the most widely distributed AI model by installed base.
- The cost of building frontier AI models is too high even for Apple: if Apple cannot field a competitive frontier model at scale, that validates smaller companies licensing models rather than training their own.
- OpenAI's position weakens: Siri will compete directly with ChatGPT integration in the iPhone interface. Apple controls which AI gets default placement — and Gemini is now the default.
Use AI Across Every Platform
While Siri gets smarter on your iPhone, Happycapy works across all your devices for complex planning, research, and workflow automation.
Try Happycapy →

Frequently Asked Questions
Is the new Siri in iOS 26.4 powered by Google Gemini?
Yes. Apple confirmed on March 1, 2026, that Siri in iOS 26.4 is powered by Google Gemini as its core AI engine, under a multi-year partnership known internally as Project Campos. Apple Foundation Models are now based on Gemini and Google Cloud technology.
How much is Apple paying Google for the Gemini partnership?
The Apple-Google Gemini partnership is reported to be worth approximately $1 billion annually. This is a multi-year agreement under which Apple Foundation Models are built on Gemini and Google Cloud technology.
Does Apple share your data with Google through the new Siri?
No, according to Apple. The new Siri uses Apple's Private Cloud Compute architecture to ensure user data is de-identified and processed on Apple's own servers. Google's model runs the inference, but Apple's privacy layer means Google does not receive identifiable user data from Siri queries.
What are the new features of Siri in iOS 26.4?
iOS 26.4's Siri adds on-screen context awareness (it sees what's on your screen), multi-step cross-app automation, and complex reasoning powered by Gemini's 1.2-trillion-parameter model. Full agentic capabilities with deep app integration are planned for iOS 27 in September 2026.
Sources
- CNBC: Apple picks Google's Gemini to run AI-powered Siri, January 12, 2026
- CNN Business: Apple teams up with Google Gemini for AI-powered Siri, January 12, 2026
- Google Blog: Joint statement from Google and Apple, January 12, 2026
- MacRumors: Apple Explains How Gemini-Powered Siri Will Work, January 30, 2026