Third-Party Devs Will Be Able to Access iPadOS Apple Pencil Latency Improvements for Art Apps

In iPadOS, Apple introduced performance improvements for the Apple Pencil on the iPad Pro, cutting latency from 20ms to 9ms with the new software.

Third-party developers who make apps that use the Apple Pencil will also be able to take advantage of some of these latency improvements, Apple software engineering chief Craig Federighi confirmed last week.

Federighi shared the information in a response to an email sent by Artstudio Pro developer Claudio Juliano, who tweeted Federighi's reply last week. The info was highlighted today in a tweet by developer Steve Troughton-Smith.

In the email, Federighi explains that third-party developers have had access to predicted touches via UIKit since iOS 9, and with iOS 13, developers will receive the "latest and greatest" touch prediction advancements in minimizing PencilKit drawing latency.
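In UIKit, predicted touches are delivered through `UIEvent.predictedTouches(for:)`. As a rough illustration of the underlying idea (a toy model, not Apple's implementation), prediction amounts to extrapolating recent stylus samples a few milliseconds ahead so the app can draw ink before the real sample arrives:

```swift
import Foundation

// Toy touch-prediction sketch: given the last two stylus samples, linearly
// extrapolate where the pencil will be `dt` seconds ahead. The Sample type
// and predict function are illustrative, not Apple API.
struct Sample {
    var x: Double
    var y: Double
    var t: Double  // timestamp in seconds
}

func predict(_ a: Sample, _ b: Sample, ahead dt: Double) -> Sample {
    let span = b.t - a.t
    guard span > 0 else { return b }
    // Velocity between the two real samples, extended forward in time.
    let vx = (b.x - a.x) / span
    let vy = (b.y - a.y) / span
    return Sample(x: b.x + vx * dt, y: b.y + vy * dt, t: b.t + dt)
}

// A stylus moving right at 1000 pt/s: predicting 9ms ahead lands
// 9 points further along the stroke.
let a = Sample(x: 0, y: 0, t: 0.000)
let b = Sample(x: 10, y: 0, t: 0.010)
let p = predict(a, b, ahead: 0.009)
print(p.x)  // prints 19.0
```

In a real app the predicted points are drawn provisionally and discarded once the next hardware sample arrives, which is why the perceived latency drops even though the touch hardware's report rate is unchanged.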

Federighi also explains how Apple achieved the latency improvements, and he points out that there's a small 4ms portion of the gain that developers won't be able to replicate at the current time because Apple didn't have a way to safely expose the capability. From Federighi's email:

Note that we achieve low latency through a combination of several techniques: Metal rendering optimizations, touch prediction, and mid-frame event processing. Third-party developers can achieve similar low-latency drawing experiences by taking advantage of Metal rendering and touch prediction best practices covered in the WWDC Sessions I've referenced below.

With these you can achieve nearly all of the improvements you've seen in PencilKit drawing with your own renderer. (There does remain a small gap: 4 ms of our improvement comes from a technique called mid-frame event processing; we are looking for ways to expose this capability to third party engines in the future, but for this year this one was only safely achievable through tight integration within our frameworks).

For developers, the WWDC sessions Federighi suggests include PencilKit, Adopting Predicted Touches, and Metal Performance Optimization.
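For apps that don't need a custom renderer, adopting PencilKit itself is a short exercise. A minimal sketch using the iOS 13 APIs (assuming a standard UIKit view controller) might look like this:

```swift
import UIKit
import PencilKit

// Minimal PencilKit adoption sketch: a PKCanvasView gets the system's
// low-latency Pencil rendering path for free, and PKToolPicker provides
// the standard drawing tool palette.
class CanvasViewController: UIViewController {
    let canvasView = PKCanvasView()

    override func viewDidLoad() {
        super.viewDidLoad()
        canvasView.frame = view.bounds
        canvasView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(canvasView)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // The shared tool picker attaches to whichever responder is first;
        // it needs the window, so this runs after the view appears.
        if let window = view.window,
           let toolPicker = PKToolPicker.shared(for: window) {
            toolPicker.setVisible(true, forFirstResponder: canvasView)
            toolPicker.addObserver(canvasView)
            canvasView.becomeFirstResponder()
        }
    }
}
```

Apps with their own Metal-based canvas, like Artstudio Pro, instead combine the predicted-touch APIs with the Metal rendering practices from the sessions Federighi cites.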

In a nutshell, the information shared by Federighi confirms that third-party apps that take advantage of the Apple Pencil will be getting some of the same latency improvements that we'll be seeing when using the Apple Pencil within native features like Markup.

The ‌Apple Pencil‌ latency improvements are built into iPadOS, the version of iOS 13 that is designed to run on the iPad. All of Apple's current iPads support the ‌Apple Pencil‌. ‌iPad Pro‌ models work with the ‌Apple Pencil‌ 2, while the 6th-generation ‌iPad‌, iPad mini, and iPad Air work with the original ‌Apple Pencil‌.

Related Forum: iOS 13


Top Rated Comments

thisisnotmyname Avatar
84 months ago
I would have expected that was automatically exposed from the OS. I'm a bit surprised that they had to explicitly make some of those capabilities available to third party developers.
Score: 12 Votes (Like | Disagree)
cocky jeremy Avatar
84 months ago
I mean... duh?
Score: 6 Votes (Like | Disagree)
cmaier Avatar
84 months ago
I would have expected that was automatically exposed from the OS. I'm a bit surprised that they had to explicitly make some of those capabilities available to third party developers.
It is, if you use the appropriate control. But developers may want to integrate it into their own canvas or controls, in which case it is harder to expose, since things you do in your own code can interfere with the ability of the pen code to get the cycles it needs from the GPU and CPU.
Score: 4 Votes (Like | Disagree)
nexusrule Avatar
84 months ago
How nice of Apple. You would think they would limit functionality improvements to their own apps.
I think you don’t know how development works. When you start creating code, you can’t always abstract it in a way that’s usable by third-party devs through an API. What Federighi meant is that right now the code that allows for that part of the delay reduction is split between several different Apple software technologies. To be made safely accessible to other devs it needs to be abstracted and made independent, because private frameworks can’t be exposed for security reasons. You build these frameworks after you have the working feature; it’s simply impossible to abstract a solution that doesn’t exist. And this sort of work can require a massive rewrite of some parts of the relevant underlying technologies, which takes time.
Score: 4 Votes (Like | Disagree)
NickName99 Avatar
84 months ago
I love that he gets into such detail. That’s interesting about the 4ms improvement they got using something they apparently can’t expose as a public method without some risk.

Now I’m curious about “mid-frame event processing”, but googling it hasn’t immediately got me anything.
Score: 4 Votes (Like | Disagree)
Cayden Avatar
84 months ago
I love that he gets into such detail. That’s interesting about the 4ms improvement they got using something they apparently can’t expose as a public method without some risk.

Now I’m curious about “mid-frame event processing”, but googling it hasn’t immediately got me anything.
Now I’m not sure, so take this with a grain of salt, but as an engineer I’m inclined to believe “mid-frame event processing” means they are updating some pixel information (likely just the pixels associated with the pencil) in between the frame updates in which all pixel information is updated and displayed. In other words, in between hardware detections of the pencil location, software would update where it predicts the pencil to be on the next update, and it can start looking for the pencil there instead of looking arbitrarily, meaning the location can (usually) be found quicker. What I’m not sure about is whether these pixels are actually being updated mid-frame, or whether the processing is simply keeping this information stored until the next frame is ready to update. I can’t see how the pixels could be updated mid-frame unless they had an individual refresh rate, so I’m inclined to believe the second case. If it’s the second case, it would make sense why Apple doesn’t want to give developers access to this, as it could quickly lead to timing errors between the software and hardware interrupts, such that it would only work within Apple’s frameworks and not an arbitrary code framework.
Score: 3 Votes (Like | Disagree)