Apple Announces Foundation Models Framework for Developers to Leverage AI

Apple at WWDC today announced the Foundation Models framework, a new API that allows third-party developers to leverage the large language models at the heart of Apple Intelligence and build them into their apps.

With the Foundation Models Framework, developers can integrate Apple's on-device models directly into apps, allowing them to build on Apple Intelligence.

"Last year, we took the first steps on a journey to bring users intelligence that's helpful, relevant, easy to use, and right where users need it, all while protecting their privacy. Now, the models that power Apple Intelligence are becoming more capable and efficient, and we're integrating features in even more places across each of our operating systems," said Craig Federighi, Apple's senior vice president of Software Engineering. "We're also taking the huge step of giving developers direct access to the on-device foundation model powering Apple Intelligence, allowing them to tap into intelligence that is powerful, fast, built with privacy, and available even when users are offline. We think this will ignite a whole new wave of intelligent experiences in the apps users rely on every day. We can't wait to see what developers create."

The Foundation Models framework lets developers build AI-powered features that work offline, protect privacy, and incur no inference costs. For example, an education app can generate quizzes from user notes on-device, and an outdoors app can offer offline natural language search.

Apple says the framework is available for testing starting today through the Apple Developer Program at developer.apple.com, and a public beta will be available through the Apple Beta Software Program next month at beta.apple.com. It includes built-in features like guided generation and tool calling for easy integration of generative capabilities into existing apps.
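To give a concrete sense of the API, here is a minimal sketch of guided generation in the spirit of the education-app quiz example above. It assumes the LanguageModelSession type and the @Generable and @Guide macros Apple showed at WWDC; the exact names and signatures reflect the initial beta and could change before release.

```swift
import FoundationModels

// Sketch only: a quiz question the on-device model can generate directly,
// using the announced guided-generation macros. No JSON parsing needed;
// the model fills in this Swift type itself.
@Generable
struct QuizQuestion {
    @Guide(description: "A question drawn from the user's notes")
    var prompt: String

    @Guide(description: "Four possible answers, exactly one of them correct")
    var choices: [String]

    @Guide(description: "The index of the correct answer within choices")
    var correctAnswerIndex: Int
}

func makeQuizQuestion(from notes: String) async throws -> QuizQuestion {
    // Sessions run entirely on-device: offline, private, no per-token costs.
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Write one multiple-choice question based on these notes: \(notes)",
        generating: QuizQuestion.self
    )
    return response.content
}
```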

Top Rated Comments

heretiq Avatar
26 weeks ago

Aren't the on-device models quite limited in capabilities? What can they do? In any case, even access to a limited model could be huge.
While Apple’s OpenELM LLM is no ChatGPT, it is a very capable resource for incorporating conversational interface, on-device RAG and LoRA fine-tuning into an app.

We used it to incorporate a fine-tuned model into one of our apps and were very pleased with the results. We held off on shipping the updated app because at the time it would have required the app to download a 3.8 GB fine-tuned OpenELM model from Hugging Face — which we didn't want to require users to do.

We also tried implementing the same AI feature set by incorporating ChatGPT and Perplexity via API. It worked functionally but the latency and API costs were prohibitive for our use case.

Need to see the details but this announcement could possibly eliminate all of these problems — assuming the foundation models include OpenELM equivalents and the API supports fine-tuning via Adapters as was announced at last year’s WWDC. I can’t wait to see the details and start working with the beta!

Follow-up: after watching the WWDC Platforms State of the Union keynote, I can say the AFM framework is a bona fide game changer!

The AFM framework and APIs exceed my expectations by delivering utility that goes well beyond providing the desired functionality — which for me was simply (a) a capable on-device LLM that eliminates the need for costly, high-latency, off-board 3rd-party LLMs, (b) adapters to allow model fine-tuning, (c) user data privacy, and (d) off-line operation.

The unexpected benefits include tool calling, response streaming and model macros that eliminate complex and error-prone LLM response parsing and mapping to app data structures.
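For context, the streaming pattern mentioned here looks roughly like the following under the announced API. This assumes streamResponse yields progressively more complete snapshots of the answer, which may differ in the shipping SDK.

```swift
import FoundationModels

// Sketch: stream a response so the UI can render text as it is generated.
// Assumes streamResponse yields cumulative snapshots of the answer so far.
func streamSummary(of notes: String) async throws {
    let session = LanguageModelSession()
    let stream = session.streamResponse(to: "Summarize these notes: \(notes)")
    for try await partial in stream {
        // Each iteration delivers the response generated up to this point.
        print(partial)
    }
}
```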

I took the last item (complex, imprecise, and time-consuming parsing and mapping) as a given, something developers should simply expect to do when incorporating LLMs into an app with structured data, and was completely surprised to hear that Apple has eliminated this issue for Apple platform developers! This is a really big deal, because this single issue is a limiting factor for app development use cases. Before Apple's specialized macro utility, the solution was either complex, brittle regular expressions that were guaranteed to fail eventually (because LLM output is non-deterministic), or ballooning LLM API cost and latency from adding guardrails to constrain LLM output to behave more deterministically.

The final word will depend on the stability of the AFM implementation and how well it aligns with what was demonstrated, but this developer is very pleased. The AFM API is a year late, but definitely way better than what was expected.

Bravo, Apple. Thank you!
Score: 6 Votes
MacTwick Avatar
26 weeks ago
This is huge. I have so many ideas for my app now! I can't wait.
Score: 4 Votes
macduke Avatar
26 weeks ago
I think this could open up a lot of new and exciting apps for developers to build, but probably at the expense of battery life. It will be interesting to see how this evolves. I think this will be one of those things that is a much bigger deal a few years down the road, so it's better to see it now than later.
Score: 3 Votes
heretiq Avatar
26 weeks ago

Finally!! Been waiting a year for this. Need to see the details but this could be a game changer for devs.
Score: 1 Votes
name99 Avatar
26 weeks ago

Aren't the on-device models quite limited in capabilities? What can they do? In any case, even access to a limited model could be huge.
They have two main capabilities:
- "understanding" language, and
- "understanding" images.

The obvious thing you can do is take baby steps toward a language-driven UI. Imagine telling UberEats something like, "What was that Asian food I ordered last week? Can you order me that again?"
I think at least part of why Apple is doing this is research, to see how this plays out in the real world.

There are also some less obvious capabilities this allows. For example, imagine a note-taking app that creates quizzes from your last week of notes, so you can see what you remember versus what you don't. (I used the word "remember" here deliberately. A better app would create quizzes to see what you UNDERSTAND, but that's probably still too much to expect, even from a leading-edge LLM, let alone a small edge model.)

Similarly, we will presumably see things like photo editing apps where you can just tell the app "remove Jenna's face" and see what happens.
Again this is research. Ultimately the goal is a system-wide language UI, not dedicated per-app code handling this stuff. But at least this gets Apple some of the way there for a year or two, while they figure out the bigger solution.
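For what it's worth, here is a rough sketch of how an app could expose that kind of per-app action through the framework's tool calling, using the Tool protocol Apple described. The order-lookup tool, its names, and the canned result are hypothetical.

```swift
import FoundationModels

// Hypothetical tool: lets the on-device model query an app's order history.
// Everything here beyond the Tool protocol itself is illustrative only.
struct PastOrderTool: Tool {
    let name = "findPastOrder"
    let description = "Finds a previous food order matching a description."

    @Generable
    struct Arguments {
        @Guide(description: "Keywords describing the order, e.g. cuisine or dish")
        var query: String
    }

    func call(arguments: Arguments) async throws -> ToolOutput {
        // A real app would search its own order history using arguments.query.
        ToolOutput("Pad thai from Thai Garden, ordered last Tuesday")
    }
}

func answerOrderQuestion(_ userRequest: String) async throws -> String {
    // The session decides when to invoke the tool while forming its answer.
    let session = LanguageModelSession(tools: [PastOrderTool()])
    let response = try await session.respond(to: userRequest)
    return response.content
}
```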
Score: 1 Votes