Apple Releases Open Source AI Models That Run On-Device

Apple today released several open source large language models (LLMs) that are designed to run on-device rather than through cloud servers. Called OpenELM (Open-source Efficient Language Models), the LLMs are available on the Hugging Face Hub, a community for sharing AI code.

As outlined in a white paper [PDF], there are eight total OpenELM models: four pre-trained using the CoreNet library, and four instruction-tuned variants. Apple uses a layer-wise scaling strategy aimed at improving accuracy and efficiency.
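In broad strokes, layer-wise scaling means each transformer layer gets a different share of the parameter budget instead of every layer being identical. The sketch below illustrates one simple form of the idea, linearly interpolating attention-head counts and FFN widths across depth; the function name and the specific interpolation ranges are illustrative assumptions, not Apple's actual hyperparameters.

```python
def layerwise_scaling(num_layers, d_model, head_dim,
                      alpha=(0.5, 1.0), beta=(0.5, 4.0)):
    """Allocate per-layer attention heads and FFN widths by linearly
    interpolating a scale factor from the first to the last layer.

    alpha scales the head count; beta is the FFN width multiplier.
    (Illustrative values only -- see the OpenELM paper for the real ones.)
    """
    plan = []
    for i in range(num_layers):
        t = i / (num_layers - 1) if num_layers > 1 else 0.0
        a = alpha[0] + (alpha[1] - alpha[0]) * t   # head-count scale
        b = beta[0] + (beta[1] - beta[0]) * t      # FFN width multiplier
        n_heads = max(1, round(a * d_model / head_dim))
        ffn_dim = int(round(b * d_model))
        plan.append({"layer": i, "heads": n_heads, "ffn_dim": ffn_dim})
    return plan

# Early layers get fewer heads and narrower FFNs; later layers get more,
# so a fixed parameter budget is spent unevenly across depth.
plan = layerwise_scaling(num_layers=4, d_model=1024, head_dim=64)
for row in plan:
    print(row)
```

The uniform-transformer baseline corresponds to constant `alpha` and `beta`; the claimed accuracy gains come from reallocating the same total budget rather than adding parameters.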

Apple provided code, training logs, and multiple versions rather than just the final trained model, and the researchers behind the project hope that it will lead to faster progress and "more trustworthy results" in the natural language AI field.

We introduce OpenELM, a state-of-the-art open language model. OpenELM uses a layer-wise scaling strategy to efficiently allocate parameters within each layer of the transformer model, leading to enhanced accuracy. For example, with a parameter budget of approximately one billion parameters, OpenELM exhibits a 2.36% improvement in accuracy compared to OLMo while requiring 2x fewer pre-training tokens.

Diverging from prior practices that only provide model weights and inference code, and pre-train on private datasets, our release includes the complete framework for training and evaluation of the language model on publicly available datasets, including training logs, multiple checkpoints, and pre-training configurations.

Apple says that it is releasing the OpenELM models to "empower and enrich the open research community" with state-of-the-art language models. Sharing open source models gives researchers a way to investigate risks and data and model biases. Developers and companies are able to use the models as-is or make modifications.
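Because the checkpoints are hosted on the Hugging Face Hub, they can in principle be loaded with standard tooling. The sketch below assumes the `transformers` library; the repo-id pattern, the `trust_remote_code` requirement, and the choice of a LLaMA-style tokenizer are assumptions based on the Hub listing, so verify them against the model cards before use.

```python
def openelm_repo(size: str, instruct: bool = False) -> str:
    """Build a plausible Hub repo id for an OpenELM variant,
    e.g. "apple/OpenELM-270M-Instruct". The naming scheme is an
    assumption; check the actual Hub listing."""
    return f"apple/OpenELM-{size}" + ("-Instruct" if instruct else "")

def load_openelm(size: str = "270M", instruct: bool = True):
    """Download and return (model, tokenizer). Requires network access.
    OpenELM ships custom modeling code, hence trust_remote_code=True."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = openelm_repo(size, instruct)
    model = AutoModelForCausalLM.from_pretrained(repo, trust_remote_code=True)
    # Assumption: OpenELM reuses a LLaMA-style tokenizer rather than
    # shipping its own -- consult the model card for the recommended one.
    tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
    return model, tokenizer
```

Since the full training framework and checkpoints are public, the same repos can also be used as starting points for fine-tuning or bias audits rather than inference only.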

The open sharing of information has become an important recruiting tool for Apple, helping it attract top engineers and scientists by offering opportunities to publish research papers that would otherwise have been off-limits under Apple's secretive policies.

Apple has not yet brought these kinds of AI capabilities to its devices, but iOS 18 is expected to include a number of new AI features, and rumors suggest that Apple is planning to run its large language models on-device for privacy purposes.


Top Rated Comments

Populus
24 months ago

What? No comments? Hey, ChatGPT, generate me a comment....
[Hey ChatGPT, please generate a comment in the style of a typical MacRumors average user, with a touch of acid humor, regarding this piece of news.]

"Oh great, Apple's finally joining the open-source party—just a decade late and probably still with strings attached somewhere in those 'open' terms. They're throwing us a bone with OpenELM, but let's be real, they’re probably just doing it to lure in some AI hotshots tired of their corporate overlords. Now we just have to sit back and wait for iOS 18, where they'll inevitably limit these models to the latest hardware, forcing us all to upgrade. Because, you know, my current iPhone can't possibly handle a couple more AI tricks without combusting."

Score: 78 Votes
NT1440
24 months ago

It's clear they are behind and need help. OpenAI's GPT-4 has 1.76 trillion parameters. Apple's OpenELM model has 3 billion parameters.
Does ChatGPT even attempt to run on device? You know, the whole point of this?

The thing I’ve noticed about all these AI hype-people is they certainly know what the “leader” of the pack day to day is, but somehow can’t imagine smaller models being a better solution for a given task. Instead of a one size fits all approach, what’s wrong with invoking a trained AI that is smaller but specifically suited for the task at hand *automatically based on the context of what you’re currently doing*?

Just some food for thought…
Score: 42 Votes
24 months ago

Does ChatGPT even attempt to run on device? You know, the whole point of this?

The thing I’ve noticed about all these AI hype-people is they certainly know what the “leader” of the pack day to day is, but somehow can’t imagine smaller models being a better solution for a given task. Instead of a one size fits all approach, what’s wrong with invoking a trained AI that is smaller but specifically suited for the task at hand *automatically based on the context of what you’re currently doing*?

Just some food for thought…
Thank you for adding something meaningful to this otherwise nauseatingly cynical thread. 🙏🏽
Score: 22 Votes
24 months ago

All this for an improved Siri is crazy ngl
Yep, it will now reply 10x faster with "I'm sorry, I seem to be having a problem right now"
Score: 16 Votes
Arislan
24 months ago

It's clear they are behind and need help. OpenAI's GPT-4 has 1.76 trillion parameters. Apple's OpenELM model has 3 billion parameters.
And does that 1.76 trillion parameter model run on a phone without cloud servers?
Score: 16 Votes
24 months ago

Apple finally decides to join the open-source party, huh? Only a decade late and I bet there are still some hidden strings attached to their so-called 'open' terms. They're throwing us a bone with OpenELM, but let's be real, they're probably just trying to lure in some AI hotshots tired of their corporate overlords. And now we have to wait for iOS 18, where they'll undoubtedly limit these models to the latest hardware, forcing us all to upgrade. Because, you know, my current iPhone surely can't handle a few more AI tricks without spontaneously combusting. Classic Apple move.
Hmmm .. FYI: it seems that @Populus beat you to using ChatGPT to “contribute” to the discussion; only difference is that he acknowledged he was using it .. 😏
Score: 15 Votes