Third-Party Devs Will Be Able to Access iPadOS Apple Pencil Latency Improvements for Art Apps

With iPadOS, Apple introduced performance improvements between the iPad Pro and the Apple Pencil, cutting latency from 20ms to 9ms with the new software.

Third-party developers who make apps that use the Apple Pencil will also be able to take advantage of some of these latency improvements, Apple software engineering chief Craig Federighi confirmed last week.

Federighi shared the information in a response to an email sent by Artstudio Pro developer Cladio Juliano, who tweeted what Federighi had to say last week. The info was highlighted today in a tweet by developer Steve Troughton-Smith.

In the email, Federighi explains that third-party developers have had access to predicted touches via UIKit since iOS 9, and that with iOS 13, they will receive the "latest and greatest" touch prediction advancements Apple uses to minimize PencilKit drawing latency.
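As a rough illustration of what that UIKit API looks like in practice, here is a minimal sketch of a custom drawing view that consumes both coalesced and predicted touches; the view and property names are hypothetical, not taken from Apple's sample code:

```swift
import UIKit

// Hypothetical drawing view: StrokeCanvasView, committedPoints, and
// predictedPoints are illustrative names, not Apple API.
class StrokeCanvasView: UIView {

    private var committedPoints: [CGPoint] = []
    private var predictedPoints: [CGPoint] = []

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }

        // Coalesced touches include every sample the digitizer captured since
        // the last event, not just the single touch delivered per frame.
        for sample in event?.coalescedTouches(for: touch) ?? [touch] {
            committedPoints.append(sample.location(in: self))
        }

        // Predicted touches are UIKit's estimate of where the Pencil will be
        // a few milliseconds from now; draw them provisionally and discard
        // them when the next batch of real samples arrives.
        predictedPoints = (event?.predictedTouches(for: touch) ?? [])
            .map { $0.location(in: self) }

        setNeedsDisplay()
    }
}
```

Predicted samples are rendered provisionally and replaced by real samples on the next touch event, which hides a frame or so of input latency without committing speculative points to the stroke.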

Federighi explains how Apple achieved the latency improvements, and he points out that there's a small 4ms gap developers won't have access to for now because Apple hasn't found a way to safely expose that capability outside of its own frameworks. From Federighi's email:

Note that we achieve low latency through a combination of several techniques: Metal rendering optimizations, touch prediction, and mid-frame event processing. Third-party developers can achieve similar low-latency drawing experiences by taking advantage of Metal rendering and touch prediction best practices covered in the WWDC Sessions I've referenced below.

With these you can achieve nearly all of the improvements you've seen in PencilKit drawing with your own renderer. (There does remain a small gap: 4 ms of our improvement comes from a technique called mid-frame event processing; we are looking for ways to expose this capability to third party engines in the future, but for this year this one was only safely achievable through tight integration within our frameworks).

For developers, the WWDC sessions Federighi suggests include PencilKit, Adopting Predicted Touches, and Metal Performance Optimization.
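For apps that can hand their canvas over to Apple's framework, adopting PencilKit is the most direct way to pick up the system's low-latency drawing. A minimal sketch under that assumption (the view controller name and tool settings are illustrative):

```swift
import UIKit
import PencilKit

// Minimal PencilKit adoption on iOS 13: embedding a PKCanvasView gives an
// app the system's low-latency Pencil rendering without a custom Metal
// pipeline. SketchViewController is an illustrative name.
class SketchViewController: UIViewController {

    private let canvasView = PKCanvasView()

    override func viewDidLoad() {
        super.viewDidLoad()
        canvasView.frame = view.bounds
        canvasView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        canvasView.tool = PKInkingTool(.pen, color: .black, width: 4)
        view.addSubview(canvasView)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // The shared tool picker supplies the standard Pencil tool palette;
        // it needs the window, which is only available once the view appears.
        if let window = view.window, let picker = PKToolPicker.shared(for: window) {
            picker.setVisible(true, forFirstResponder: canvasView)
            picker.addObserver(canvasView)
            canvasView.becomeFirstResponder()
        }
    }
}
```

Apps with their own rendering engines would instead pair Metal rendering with the predicted-touch handling shown earlier, along the lines described in the sessions Federighi references.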

In a nutshell, Federighi's email confirms that third-party apps that take advantage of the Apple Pencil will be getting some of the same latency improvements users will see when using the Apple Pencil with built-in features like Markup.

The Apple Pencil latency improvements are built into iPadOS, the version of iOS 13 that is designed to run on the iPad. All of Apple's current iPads support the Apple Pencil. iPad Pro models work with the Apple Pencil 2, while the 6th-generation iPad, iPad mini, and iPad Air work with the original Apple Pencil.

Related Forum: iOS 13

Top Rated Comments

thisisnotmyname
86 months ago
I would have expected that was automatically exposed from the OS. I'm a bit surprised that they had to explicitly make some of those capabilities available to third party developers.
Score: 12 Votes
cocky jeremy
86 months ago
I mean... duh?
Score: 6 Votes
cmaier
86 months ago
I would have expected that was automatically exposed from the OS. I'm a bit surprised that they had to explicitly make some of those capabilities available to third party developers.
It is, if you use the appropriate control. But developers may want to integrate it into their own canvas or controls, in which case it is harder to expose, since things you do in your own code can interfere with the ability of the pen code to get the cycles it needs from the GPU and CPU.
Score: 4 Votes
nexusrule
86 months ago
How nice of Apple. You would think they would limit functionality improvements to their own apps.
I think you don’t know how development works. When you start creating code you can’t always abstract it in a way that’s usable by third-party devs through an API. What Federighi meant is that right now the code that allows for that part of the delay reduction is split across several of Apple's software technologies. To be made safely accessible to other devs it needs to be abstracted and made independent, because private frameworks can’t be exposed for security reasons. You build these frameworks after you have the working feature; it’s simply impossible to abstract a solution that doesn’t exist. And this sort of work can require a massive rewrite of some parts of the relevant underlying technologies, and it requires time.
Score: 4 Votes
NickName99
86 months ago
I love that he gets into such detail. That’s interesting about the 4ms improvement they got using something they apparently can’t expose as a public method without some risk.

Now I’m curious about “mid-frame event processing”, but googling it hasn’t immediately got me anything.
Score: 4 Votes
Cayden
86 months ago
I love that he gets into such detail. That’s interesting about the 4ms improvement they got using something they apparently can’t expose as a public method without some risk.

Now I’m curious about “mid-frame event processing”, but googling it hasn’t immediately got me anything.
Now I’m not sure so take this with a grain of salt, but as an engineer I’m inclined to believe “mid-frame event processing” means they are updating some pixel information (likely just the pixels associated with the pencil) in between the frame updates in which all pixel information is updated and displayed. In other words, in between hardware detections of the pencil location, software would update where it predicts the pencil to be on the next update, and it can start looking for the pencil there instead of looking arbitrarily, meaning the location can (usually) be found quicker. What I’m not sure about is whether these pixels are actually being updated mid-frame or whether the processing simply keeps this information stored until the next frame is ready to update. I can’t see how the pixels could be updated mid-frame unless they had an individual refresh rate, so I’m inclined to believe the second case. If it’s the second case, it would make sense why Apple doesn’t want to give developers access to this, as this could quickly lead to timing errors between the software and hardware interrupts, such that it would only work within Apple’s framework and not an arbitrary code framework.
Score: 3 Votes