Apple is apparently working on a Siri chatbot to rival Claude, Gemini, and ChatGPT, and it's aiming to debut the feature in less than six months when iOS 27 is unveiled at WWDC. Bloomberg shared details on the chatbot earlier today, but left one major question unanswered: what will Apple charge?

Anthropic, Google, OpenAI, and other companies that run major chatbots offer a free version, but it's often throttled and a paid subscription is required for full functionality. Apple is reportedly planning to integrate its Siri chatbot deeply into iOS, iPadOS, and macOS instead of offering a standalone app. A Siri chatbot available on billions of devices is going to be expensive to run, but Siri is also so core to Apple products that people aren't going to want to pay for what's always been free.
What the Siri Chatbot Can Do
Per Bloomberg, the Siri chatbot will be able to "search the web for information, create content, generate images, summarize information and analyze uploaded files." It will also be able to control Apple devices and use personal data and on-screen information for search and to complete tasks. That sounds like just about everything that existing chatbots like ChatGPT can do, plus Apple is integrating the chatbot into all of its apps.
On-Device Siri Chatbot?
Some of those tasks can be completed on-device using the powerful A-series and M-series chips Apple has been building into its products, but Apple is using a custom AI model developed with the Google Gemini team. According to Bloomberg, the model is roughly comparable to Gemini 3, and the full version of Gemini 3 can't run on a high-end Mac, let alone a mobile device.
Apple is going to need servers to run the Siri chatbot, and while it has been building Private Cloud Compute servers for AI features, it's unlikely that it has enough for a Siri chatbot. Bloomberg suggests that Apple is actually discussing running its chatbot on Google servers, and Google isn't going to do that for free.
Compute Costs and Infrastructure
Whether Apple uses its own Private Cloud Compute servers or Google's tensor processing unit (TPU) servers, it needs serious compute power. Every question Siri answers and every image it generates will cost Apple money.
OpenAI is not profitable, and it spends billions on inference each year. The company has committed to spending $1.4 trillion on infrastructure to keep up with demand, an amount of money it doesn't have yet. Google planned to spend roughly $85 billion on infrastructure in 2025 to meet AI demand. In August, Google said that the median Gemini Apps text prompt uses 0.24 watt-hours of energy. Multiplied across every Google product and device, that can add up to hundreds of millions of dollars per year in electricity alone.
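To give a rough sense of the scale, here's a back-of-the-envelope sketch in Swift that turns Google's published 0.24 Wh-per-prompt figure into an annual electricity bill. The daily prompt volume and electricity rate below are illustrative assumptions, not figures from the Bloomberg report or from Google.

```swift
import Foundation

// Back-of-the-envelope estimate of annual electricity cost for serving chatbot prompts.
// The only figure taken from the article is Google's published median of 0.24 Wh per text prompt;
// the prompt volume and electricity price are assumptions chosen purely for illustration.

let wattHoursPerPrompt = 0.24                  // Google's reported median energy per Gemini text prompt
let assumedPromptsPerDay = 10_000_000_000.0    // assumption: 10 billion prompts per day across all products
let assumedPricePerKWh = 0.12                  // assumption: $0.12 per kilowatt-hour

let kWhPerDay = wattHoursPerPrompt * assumedPromptsPerDay / 1_000.0
let kWhPerYear = kWhPerDay * 365.0
let annualCost = kWhPerYear * assumedPricePerKWh

print(String(format: "Energy per year: %.0f MWh", kWhPerYear / 1_000.0))
print(String(format: "Electricity cost per year: $%.0f million", annualCost / 1_000_000.0))
// With these assumptions: roughly 876,000 MWh per year, or about $105 million in electricity,
// before accounting for cooling, networking, and other data center overhead.
```

Even under these assumptions the electricity bill alone lands around nine figures a year, and that's before training costs, hardware, and everything else that goes into running a frontier model.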
How Gemini Is Priced
Google has already integrated Gemini into its Pixel smartphones and other Android devices. It uses a two-tier pricing model that Apple might adopt.
Android users have access to a free version of Gemini that costs Google less to run. It can answer questions, summarize text, write emails, and control apps and smartphone features. Getting more requires paying $20 per month for Gemini Advanced, which offers better reasoning, a longer context window for analyzing bigger documents, and improved coding capabilities.
Apple could do something similar, offering a basic version of Siri that's accessible to everyone, with more advanced models available via subscription. iCloud already provides a template for a free/paid split: Apple gives every user 5GB of cloud storage for free, but anything more will cost you.
Temporarily Free?
Apple could make its Siri chatbot free at launch, which would lure users who are currently paying for services like ChatGPT. ChatGPT, Claude, and Gemini all run around $20 per month, so Apple eating the chatbot's costs for a year or two would be hard for rivals to compete with. Even undercutting current prices would likely attract customers and make Apple an immediate key player in the AI market.
Right now, Apple Intelligence is entirely free to use, even for images generated with Image Playground, but the capabilities are limited and some of the functionality runs on-device, which keeps Apple's server costs down.
Possible Cost
Apple might not be able to absorb AI costs, and there could be paid options right when the Siri chatbot launches. If that's the case, pricing will likely be competitive with existing chatbots.
AI companies have largely settled on $20 per month for entry-level plans, but it's not clear whether that price point is actually sustainable given the growing costs of training new models and supporting more users.
- ChatGPT Plus - $20/month
- Copilot Pro - $20/month
- Gemini Advanced - $19.99/month
- Claude Pro - $20/month
- Perplexity Pro - $20/month
Siri ChatGPT Integration
Right now, Apple has a partnership with OpenAI to hand complex requests off to ChatGPT. Apple doesn't pay OpenAI for this feature, but it does put ChatGPT in front of millions of Apple users. When Apple launches its Siri chatbot, ChatGPT integration could be removed.
Eliminating the ChatGPT integration might also affect Apple's legal battle with Elon Musk. Musk's xAI sued Apple and OpenAI, accusing them of colluding to promote ChatGPT over rival chatbots like Grok and arguing that Apple should let other chatbots integrate with Siri.
If Apple stops offering ChatGPT through Siri in favor of its own chatbot, that would be no different from Google integrating Gemini into all Android devices, or Meta limiting its smart glasses to Meta AI.
Launch Timing
We'll probably be hearing more about the Siri chatbot in the coming months. Apple is aiming to unveil the functionality in iOS 27, iPadOS 27, and macOS 27, which will be previewed in June at WWDC.