Third-Party Devs Will Be Able to Access iPadOS Apple Pencil Latency Improvements for Art Apps

In iPadOS, Apple introduced performance improvements for the Apple Pencil on the iPad Pro, cutting latency from 20ms to 9ms with the new software.

Third-party developers who make apps that use the Apple Pencil will also be able to take advantage of some of these latency improvements, Apple software engineering chief Craig Federighi confirmed last week.

Federighi shared the information in response to an email from Artstudio Pro developer Claudio Juliano, who tweeted Federighi's reply last week. The exchange was highlighted today in a tweet by developer Steve Troughton-Smith.

In the email, Federighi explains that third-party developers have had access to predicted touches via UIKit since iOS 9, and that with iOS 13, they will get the "latest and greatest" touch prediction advancements Apple uses to minimize PencilKit drawing latency.
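
To illustrate the UIKit side, here is a minimal sketch of how a third-party drawing view might read coalesced and predicted touches; the DrawingCanvasView class and its point arrays are hypothetical stand-ins, not Apple's own code.

import UIKit

// Sketch of a custom drawing view that uses UIKit touch prediction.
// DrawingCanvasView and its stroke storage are hypothetical examples.
class DrawingCanvasView: UIView {

    private var currentStroke: [CGPoint] = []
    private var predictedPoints: [CGPoint] = []

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, let event = event else { return }

        // Coalesced touches carry every sample the hardware reported,
        // not just the single touch delivered with this event.
        for coalesced in event.coalescedTouches(for: touch) ?? [touch] {
            currentStroke.append(coalesced.preciseLocation(in: self))
        }

        // Predicted touches are UIKit's guess at where the Pencil is heading;
        // draw them provisionally and replace them when real samples arrive.
        predictedPoints = (event.predictedTouches(for: touch) ?? [])
            .map { $0.preciseLocation(in: self) }

        setNeedsDisplay()
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Discard the provisional predictions once the stroke is finished.
        predictedPoints = []
        setNeedsDisplay()
    }
}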

Federighi also explains how Apple achieved the latency improvements, and he points out that there's a small 4ms gap that third-party developers won't get for now, because Apple didn't have a way to safely expose that capability outside of its own frameworks. From Federighi's email:

Note that we achieve low latency through a combination of several techniques: Metal rendering optimizations, touch prediction, and mid-frame event processing. Third-party developers can achieve similar low-latency drawing experiences by taking advantage of Metal rendering and touch prediction best practices covered in the WWDC Sessions I've referenced below.

With these you can achieve nearly all of the improvements you've seen in PencilKit drawing with your own renderer. (There does remain a small gap: 4 ms of our improvement comes from a technique called mid-frame event processing; we are looking for ways to expose this capability to third party engines in the future, but for this year this one was only safely achievable through tight integration within our frameworks).
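
On the Metal side, one commonly cited low-latency practice is shrinking the drawable queue so finished frames spend less time waiting to be presented. The sketch below assumes a hypothetical CAMetalLayer-backed view; it illustrates the general technique rather than Apple's PencilKit renderer.

import UIKit
import Metal
import QuartzCore

// Sketch: configuring a CAMetalLayer for lower presentation latency.
// LowLatencyMetalView is a hypothetical example class.
class LowLatencyMetalView: UIView {

    override class var layerClass: AnyClass { CAMetalLayer.self }

    override init(frame: CGRect) {
        super.init(frame: frame)
        guard let metalLayer = layer as? CAMetalLayer else { return }

        metalLayer.device = MTLCreateSystemDefaultDevice()
        metalLayer.pixelFormat = .bgra8Unorm

        // A two-deep drawable queue keeps fewer frames in flight,
        // trading throughput headroom for lower latency.
        metalLayer.maximumDrawableCount = 2

        // Render-target-only drawables let the system apply extra optimizations.
        metalLayer.framebufferOnly = true
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}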

For developers, the WWDC sessions Federighi suggests include PencilKit, Adopting Predicted Touches, and Metal Performance Optimization.
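
For apps that don't need a fully custom renderer, the most direct way to pick up Apple's latency work is to adopt PencilKit's PKCanvasView. Here's a minimal sketch; SketchViewController is a hypothetical example, and the tool picker wiring follows the iOS 13 API.

import UIKit
import PencilKit

// Sketch: hosting a PencilKit canvas, which draws with Apple's
// low-latency pipeline. SketchViewController is a hypothetical example.
class SketchViewController: UIViewController {

    private let canvasView = PKCanvasView()

    override func viewDidLoad() {
        super.viewDidLoad()
        canvasView.frame = view.bounds
        canvasView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        canvasView.tool = PKInkingTool(.pen, color: .black, width: 4)
        view.addSubview(canvasView)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // The view is in a window by now, so the shared tool picker can attach.
        if let window = view.window,
           let toolPicker = PKToolPicker.shared(for: window) {
            toolPicker.setVisible(true, forFirstResponder: canvasView)
            toolPicker.addObserver(canvasView)
            canvasView.becomeFirstResponder()
        }
    }
}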

In a nutshell, the information shared by Federighi confirms that third-party apps that take advantage of the Apple Pencil will get some of the same latency improvements users will see when using the Apple Pencil with native features like Markup.

The Apple Pencil latency improvements are built into iPadOS, the version of iOS 13 that is designed to run on the iPad. All of Apple's current iPads support the Apple Pencil. iPad Pro models work with the Apple Pencil 2, while the 6th-generation iPad, iPad mini, and iPad Air work with the original Apple Pencil.


Top Rated Comments

thisisnotmyname
84 months ago
I would have expected that was automatically exposed from the OS. I'm a bit surprised that they had to explicitly make some of those capabilities available to third party developers.
Score: 12 Votes
cocky jeremy
84 months ago
I mean... duh?
Score: 6 Votes
cmaier
84 months ago
I would have expected that was automatically exposed from the OS. I'm a bit surprised that they had to explicitly make some of those capabilities available to third party developers.
It is, if you use the appropriate control. But developers may want to integrate it into their own canvas or controls, in which case it is harder to expose, since things you do in your own code can interfere with the pen code's ability to get the cycles it needs from the GPU and CPU.
Score: 4 Votes
nexusrule
84 months ago
How nice of Apple. You would think they would limit functionality improvements to their own apps.
I think you don't know how development works. When you start creating code, you can't always abstract it in a way that's usable by third-party devs through an API. What Federighi meant is that right now the code that allows for that part of the delay reduction is split across several of Apple's software technologies. To be made safely accessible to other devs it needs to be abstracted and made independent, because private frameworks can't be exposed for security reasons. You build these frameworks after you have the working feature; it's simply impossible to abstract a solution that doesn't exist. And this sort of work can require a massive rewrite of some parts of the relevant underlying technologies, which takes time.
Score: 4 Votes
NickName99
84 months ago
I love that he gets into such detail. That’s interesting about the 4ms improvement they got using something they apparently can’t expose as a public method without some risk.

Now I’m curious about “mid-frame event processing”, but googling it hasn’t immediately got me anything.
Score: 4 Votes
Cayden
84 months ago
I love that he gets into such detail. That’s interesting about the 4ms improvement they got using something they apparently can’t expose as a public method without some risk.

Now I’m curious about “mid-frame event processing”, but googling it hasn’t immediately got me anything.
Now I'm not sure, so take this with a grain of salt, but as an engineer I'm inclined to believe "mid-frame event processing" means they are updating some pixel information (likely just the pixels associated with the pencil) in between the frame updates in which all pixel information is updated and displayed. In other words, in between hardware detections of the pencil location, software would update where it predicts the pencil to be on the next update, and it can start looking for the pencil there instead of looking arbitrarily, meaning the location can (usually) be found more quickly. What I'm not sure about is whether these pixels are actually being updated mid-frame or if the processing is simply keeping this information stored until the next frame is ready to update. I can't see how the pixels could be updated mid-frame unless they had an individual refresh rate, so I'm inclined to believe the second case. If it's the second case, it would make sense why Apple doesn't want to give developers access to this, as it could quickly lead to timing errors between the software and hardware interrupts, such that it would only work within Apple's framework and not an arbitrary code framework.
Score: 3 Votes