Apple Announces Foundation Models Framework for Developers to Leverage AI

Apple at WWDC today announced the Foundation Models Framework, a new API allowing third-party developers to leverage the large language models at the heart of Apple Intelligence and build them into their apps.

With the Foundation Models Framework, developers can integrate Apple's on-device models directly into apps, allowing them to build on Apple Intelligence.

"Last year, we took the first steps on a journey to bring users intelligence that's helpful, relevant, easy to use, and right where users need it, all while protecting their privacy. Now, the models that power Apple Intelligence are becoming more capable and efficient, and we're integrating features in even more places across each of our operating systems," said Craig Federighi, Apple's senior vice president of Software Engineering. "We're also taking the huge step of giving developers direct access to the on-device foundation model powering Apple Intelligence, allowing them to tap into intelligence that is powerful, fast, built with privacy, and available even when users are offline. We think this will ignite a whole new wave of intelligent experiences in the apps users rely on every day. We can't wait to see what developers create."

The Foundation Models framework lets developers build AI-powered features that work offline, protect privacy, and incur no inference costs. For example, an education app can generate quizzes from user notes on-device, and an outdoors app can offer offline natural language search.

Apple says the framework is available for testing starting today through the Apple Developer Program at developer.apple.com, and a public beta will be available through the Apple Beta Software Program next month at beta.apple.com. It includes built-in features like guided generation and tool calling for easy integration of generative capabilities into existing apps.
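For developers, access centers on a lightweight Swift API. The sketch below shows what a first integration might look like, using the LanguageModelSession type and the @Generable and @Guide macros from Apple's announcement; it is a minimal illustration of guided generation, and exact names and signatures may shift during the beta.

```swift
import FoundationModels

// A typed target for guided generation: the framework constrains the
// model's output to this structure instead of returning free-form text.
@Generable
struct Quiz {
    @Guide(description: "A short title for the quiz")
    var title: String

    @Guide(description: "Three to five questions drawn from the notes")
    var questions: [String]
}

// Generates a quiz entirely on-device: no network, no inference cost.
func makeQuiz(from notes: String) async throws -> Quiz {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Create a short quiz from these notes: \(notes)",
        generating: Quiz.self
    )
    return response.content
}
```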


Top Rated Comments

heretiq Avatar
23 weeks ago

> Aren't the on-device models quite limited in capabilities? What can they do? In any case, even access to a limited model could be huge.
While Apple’s OpenELM LLM is no ChatGPT, it is a very capable resource for incorporating a conversational interface, on-device RAG, and LoRA fine-tuning into an app.

We used it to incorporate a fine-tuned model into one of our apps and were very pleased with the results. We held off on shipping the updated app because, at the time, it would have required the app to download a 3.8GB fine-tuned OpenELM model from Hugging Face, which we didn’t want to impose on users.

We also tried implementing the same AI feature set by incorporating ChatGPT and Perplexity via API. It worked functionally but the latency and API costs were prohibitive for our use case.

Need to see the details, but this announcement could eliminate all of these problems, assuming the foundation models include OpenELM equivalents and the API supports fine-tuning via adapters, as announced at last year’s WWDC. I can’t wait to dig into the details and start working with the beta!

Follow-up: After watching the WWDC Platforms State of the Union keynote... the AFM framework is a bona fide game changer!

The AFM framework and APIs exceed my expectations by delivering utility that goes well beyond providing the desired functionality — which for me was simply (a) a capable on-device LLM that eliminates the need for costly, high-latency, off-board 3rd-party LLMs, (b) adapters to allow model fine-tuning, (c) user data privacy, and (d) off-line operation.

The unexpected benefits include tool calling, response streaming and model macros that eliminate complex and error-prone LLM response parsing and mapping to app data structures.

I took the last item (complex, imprecise, and time-consuming parsing and mapping) as a given, something developers should simply expect to do when incorporating LLMs into an app with structured data, and was completely surprised to hear that Apple has eliminated this issue for Apple platform developers! This is a really big deal, because this single issue is a limiting factor for many app development use cases. Before Apple's specialized macro utility, the options were either complex, brittle regular expressions that were guaranteed to eventually fail (because LLM output is non-deterministic), or ballooning LLM API cost and latency from adding guardrails to make the output behave more deterministically.
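To make that concrete, here's roughly the before-and-after as I understand it from the session (type names per the announced API; the shipping beta may differ, and the hosted-LLM call is a hypothetical stand-in):

```swift
import FoundationModels

// Old approach: free-form text from a hosted LLM, scraped with regexes
// that break whenever the model phrases its answer differently.
// let raw = try await hostedLLM.complete(prompt)    // hypothetical API
// let dish = raw.firstMatch(of: /Dish:\s*(.+)/)?.1  // brittle parsing

// New approach: guided generation maps output straight onto a typed struct.
@Generable
struct RecipeCard {
    @Guide(description: "Name of the dish")
    var name: String

    @Guide(description: "Ingredients, one per entry")
    var ingredients: [String]
}

func summarize(_ recipeText: String) async throws -> RecipeCard {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Summarize this recipe: \(recipeText)",
        generating: RecipeCard.self
    )
    return response.content  // a RecipeCard, not a string: nothing to parse
}
```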

The final word will depend on the stability of the AFM implementation and how well it aligns with what was demonstrated, but this developer is very pleased. The AFM API is a year late, but far better than expected.

Bravo, Apple. Thank you!
Score: 6 Votes (Like | Disagree)
MacTwick Avatar
23 weeks ago
This is huge. I have so many ideas for my app now! I can't wait.
Score: 4 Votes (Like | Disagree)
macduke Avatar
23 weeks ago
I think this could open up a lot of new and exciting apps for developers to build, but probably at the expense of battery life. It will be interesting to see how this evolves. I think this will be one of those things that is a much bigger deal a few years down the road, so better to see it now than later.
Score: 3 Votes (Like | Disagree)
heretiq Avatar
23 weeks ago

Finally!! Been waiting a year for this. Need to see the details but this could be a game changer for devs.
Score: 1 Votes (Like | Disagree)
name99 Avatar
23 weeks ago

> Aren't the on-device models quite limited in capabilities? What can they do? In any case, even access to a limited model could be huge.
They have two main capabilities:
- "understanding" language, and
- "understanding" images.

The obvious thing you can do is take baby steps towards a language-driven UI. Imagine telling UberEats something like, "What was that Asian food I ordered last week? Can you order me that again?"
I think at least part of why Apple is doing this is research, to see how this plays out in the real world.

There are also some less obvious capabilities this allows. For example, imagine a note-taking app that creates quizzes from your last week of notes, so you can see what you remember versus what you don't. (I used the word "remember" here deliberately. A better app would create quizzes to see what you UNDERSTAND, but that's probably still too much to expect, even from a leading-edge LLM, let alone a small edge model.)

Similarly, we will presumably see things like photo-editing apps where you can just tell the app "remove Jenna's face" and see what happens.
Again, this is research. Ultimately the goal is a system-wide language UI, not dedicated per-app code handling this stuff. But at least this gets Apple some of the way there for a year or two, while they figure out the bigger solution.
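For the curious, a rough sketch of how an app might wire up that UberEats-style request with the framework's tool calling (the Tool protocol shape follows what Apple announced; the order-history lookup and reorder step are hypothetical stand-ins for real app code):

```swift
import FoundationModels

// Hypothetical stand-in for the app's real order-history lookup.
func pastOrder(matching query: String) -> String { "Pad Thai" }

struct ReorderTool: Tool {
    let name = "reorderPreviousMeal"
    let description = "Finds a past order matching a description and reorders it"

    @Generable
    struct Arguments {
        @Guide(description: "The user's description, e.g. 'Asian food from last week'")
        var query: String
    }

    func call(arguments: Arguments) async throws -> ToolOutput {
        let meal = pastOrder(matching: arguments.query)
        // A real app would place the order here.
        return ToolOutput("Reordered \(meal).")
    }
}

// The model decides when to invoke the tool based on the user's request.
func handle(_ request: String) async throws -> String {
    let session = LanguageModelSession(tools: [ReorderTool()])
    let response = try await session.respond(to: request)
    return response.content
}
```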
Score: 1 Votes (Like | Disagree)