Third-Party Devs Will Be Able to Access iPadOS Apple Pencil Latency Improvements for Art Apps

In iPadOS, Apple introduced performance improvements for the Apple Pencil on the iPad Pro, cutting latency from 20ms to 9ms with the new software.

Third-party developers whose apps use the ‌Apple Pencil‌ will also be able to take advantage of some of these latency improvements, Apple software engineering chief Craig Federighi confirmed last week.

Federighi shared the information in an email reply to Artstudio Pro developer Claudio Juliano, who tweeted the response last week. The exchange was highlighted today in a tweet by developer Steve Troughton-Smith.

In the email, Federighi explains that third-party developers have had access to predicted touches via UIKit since iOS 9, and that with iOS 13 they will receive the "latest and greatest" touch prediction advancements that minimize PencilKit drawing latency.
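To make the idea concrete, here is a conceptual, standalone sketch of what touch prediction does (this is not Apple's implementation, which is more sophisticated; in a real app you would read predictions from UIKit's UIEvent.predictedTouches(for:) rather than compute them yourself). The sketch simply extrapolates the stylus position linearly from the two most recent samples:

```swift
// Conceptual sketch of touch prediction, not Apple's algorithm:
// extrapolate the stylus position linearly from the two most
// recent hardware samples so the renderer can draw ahead of input.

struct Sample {
    let x: Double
    let y: Double
    let t: Double  // timestamp in milliseconds
}

// Predict where the stylus will be at time `t`, given the two
// most recent samples `a` and `b` (with a.t < b.t).
func predictNext(_ a: Sample, _ b: Sample, at t: Double) -> Sample {
    let dt = b.t - a.t
    guard dt > 0 else { return b }
    let vx = (b.x - a.x) / dt  // x velocity
    let vy = (b.y - a.y) / dt  // y velocity
    return Sample(x: b.x + vx * (t - b.t),
                  y: b.y + vy * (t - b.t),
                  t: t)
}

// Two samples one ~120Hz frame (8ms) apart, predicted one frame ahead.
let a = Sample(x: 0, y: 0, t: 0)
let b = Sample(x: 2, y: 1, t: 8)
let p = predictNext(a, b, at: 16)
print(p.x, p.y)  // 4.0 2.0
```

A renderer can draw the predicted point immediately and replace it with the real sample once it arrives, which is roughly how predicted touches hide latency from the user.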

Federighi also explains how Apple achieved the latency improvements, pointing out that there's a small 4ms gap developers won't have access to for now because Apple didn't have a way to safely expose the capability. From Federighi's email:

Note that we achieve low latency through a combination of several techniques: Metal rendering optimizations, touch prediction, and mid-frame event processing. Third-party developers can achieve similar low-latency drawing experiences by taking advantage of Metal rendering and touch prediction best practices covered in the WWDC Sessions I've referenced below.

With these you can achieve nearly all of the improvements you've seen in PencilKit drawing with your own renderer. (There does remain a small gap: 4 ms of our improvement comes from a technique called mid-frame event processing; we are looking for ways to expose this capability to third party engines in the future, but for this year this one was only safely achievable through tight integration within our frameworks).

For developers, the WWDC sessions Federighi suggests include PencilKit, Adopting Predicted Touches, and Metal Performance Optimization.
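As a back-of-envelope illustration of how those numbers fit together (the 20ms, 9ms, and 4ms figures come from the article; treating the gains as simply additive is an assumption on my part):

```swift
// Rough latency budget. The 20ms, 9ms, and 4ms figures come from
// Federighi's email; assuming the gains are additive is illustrative.
let oldLatencyMs = 20.0    // Apple Pencil latency before iPadOS
let newLatencyMs = 9.0     // PencilKit latency on iPadOS
let midFrameGainMs = 4.0   // gain from mid-frame event processing (private)

let totalGainMs = oldLatencyMs - newLatencyMs          // 11.0
let thirdPartyGainMs = totalGainMs - midFrameGainMs    // 7.0
let thirdPartyFloorMs = newLatencyMs + midFrameGainMs  // 13.0

print(totalGainMs, thirdPartyGainMs, thirdPartyFloorMs)
```

Under that assumption, a third-party renderer adopting the Metal and touch-prediction best practices could get to roughly 13ms, with the remaining 4ms reserved for Apple's own frameworks for now.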

In a nutshell, Federighi's email confirms that third-party apps that take advantage of the ‌Apple Pencil‌ will get some of the same latency improvements we'll see when using the ‌Apple Pencil‌ with native features like Markup.

The ‌Apple Pencil‌ latency improvements are built into iPadOS, the version of iOS 13 that is designed to run on the iPad. All of Apple's current iPads support the ‌Apple Pencil‌. ‌iPad Pro‌ models work with the ‌Apple Pencil‌ 2, while the 6th-generation ‌iPad‌, iPad mini, and iPad Air work with the original ‌Apple Pencil‌.


Top Rated Comments

thisisnotmyname:
I would have expected that to be automatically exposed by the OS. I'm a bit surprised that they had to explicitly make some of those capabilities available to third-party developers.
cocky jeremy:
I mean... duh?
cmaier (replying to thisisnotmyname):
It is, if you use the appropriate control. But developers may want to integrate it into their own canvas or controls, in which case it is harder to expose, since things you do in your own code can interfere with the ability of the pen code to get the cycles it needs from the GPU and CPU.
nexusrule (replying to "How nice of Apple. You would think they would limit functionality improvements to their own apps."):
I think you don't know how development works. When you write code, you can't always abstract it in a way that's usable by third-party devs through an API. What Federighi means is that right now the code that allows for that part of the delay reduction is split across several different Apple software technologies. To be made safely accessible to other devs, it needs to be abstracted and made independent, because private frameworks can't be exposed for security reasons. You build these frameworks after you have the working feature; it's simply impossible to abstract a solution that doesn't exist. And this sort of work can require a massive rewrite of parts of the relevant underlying technologies, which takes time.
NickName99:
I love that he gets into such detail. That's interesting about the 4ms improvement they got using something they apparently can't expose as a public method without some risk.

Now I'm curious about "mid-frame event processing," but googling it hasn't immediately gotten me anything.
Cayden (replying to NickName99):
I'm not sure, so take this with a grain of salt, but as an engineer I'm inclined to believe "mid-frame event processing" means they are updating some pixel information (likely just the pixels associated with the pencil) in between the frame updates in which all pixel information is updated and displayed. In other words, between hardware detections of the pencil location, software would update where it predicts the pencil to be on the next update, and it can start looking for the pencil there instead of looking arbitrarily, meaning the location can (usually) be found more quickly.

What I'm not sure about is whether these pixels are actually being updated mid-frame, or whether the processing simply keeps this information stored until the next frame is ready to update. I can't see how the pixels could be updated mid-frame unless they had an individual refresh rate, so I'm inclined to believe the second case. If so, it would make sense why Apple doesn't want to give developers access to this, as it could quickly lead to timing errors between the software and hardware interrupts, such that it would only work within Apple's framework and not an arbitrary code framework.