Third-Party Devs Will Be Able to Access iPadOS Apple Pencil Latency Improvements for Art Apps

In iPadOS, Apple introduced performance improvements for the Apple Pencil on the iPad Pro, cutting latency from 20ms to 9ms with the new software.

Third-party developers who make apps that use the Apple Pencil will also be able to take advantage of some of these latency improvements, Apple software engineering chief Craig Federighi confirmed last week.

Federighi shared the information in a response to an email sent by Artstudio Pro developer Claudio Juliano, who tweeted what Federighi had to say last week. The info was highlighted today in a tweet by developer Steve Troughton-Smith.

In the email, Federighi explains that third-party developers have had access to predicted touches via UIKit since iOS 9, and with iOS 13, developers will receive the "latest and greatest" touch prediction advancements in minimizing PencilKit drawing latency.
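In UIKit, predictions are delivered alongside real touches via `UIEvent`'s `predictedTouches(for:)` method inside a view's `touchesMoved(_:with:)` handler. As a toy illustration of what a predictor does (this is a simple linear extrapolation I've written for the example, not Apple's private algorithm), a sketch in Swift:

```swift
import Foundation

// Toy sketch of touch prediction -- NOT Apple's private predictor.
// In a real app you would read predictions from UIKit via
// event.predictedTouches(for: touch) inside touchesMoved(_:with:).
struct TouchSample {
    var x: Double
    var y: Double
    var timestamp: TimeInterval
}

/// Linearly extrapolates the last two samples to a future timestamp,
/// letting a renderer draw slightly ahead of the hardware to hide latency.
func predictTouch(_ samples: [TouchSample], at futureTime: TimeInterval) -> TouchSample? {
    guard samples.count >= 2 else { return nil }
    let a = samples[samples.count - 2]
    let b = samples[samples.count - 1]
    let dt = b.timestamp - a.timestamp
    guard dt > 0 else { return b }
    let t = (futureTime - b.timestamp) / dt
    return TouchSample(x: b.x + (b.x - a.x) * t,
                       y: b.y + (b.y - a.y) * t,
                       timestamp: futureTime)
}

let samples = [TouchSample(x: 0, y: 0, timestamp: 0.000),
               TouchSample(x: 10, y: 5, timestamp: 0.010)]
if let p = predictTouch(samples, at: 0.020) {
    print(p.x, p.y)  // extrapolates the stroke to (20.0, 10.0)
}
```

Because predictions can overshoot when the pen changes direction, apps typically draw predicted segments provisionally and replace them once real touches arrive.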

Federighi explains how Apple achieved the latency improvements, and points out that there's a small 4ms gap developers won't have access to for now, because Apple hasn't found a way to safely expose that capability to developers. From Federighi's email:

Note that we achieve low latency through a combination of several techniques: Metal rendering optimizations, touch prediction, and mid-frame event processing. Third-party developers can achieve similar low-latency drawing experiences by taking advantage of Metal rendering and touch prediction best practices covered in the WWDC Sessions I've referenced below.

With these you can achieve nearly all of the improvements you've seen in PencilKit drawing with your own renderer. (There does remain a small gap: 4 ms of our improvement comes from a technique called mid-frame event processing; we are looking for ways to expose this capability to third party engines in the future, but for this year this one was only safely achievable through tight integration within our frameworks).
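To put those numbers in rough context (the 120Hz figure is my assumption, based on the iPad Pro's ProMotion display; the latency figures are Apple's):

```swift
import Foundation

// Figures from the article: 20 ms old latency, 9 ms new, of which 4 ms
// comes from mid-frame event processing that isn't exposed to third parties.
// The 120 Hz refresh rate is an assumption (iPad Pro ProMotion display).
let refreshRateHz = 120.0
let framePeriodMs = 1000.0 / refreshRateHz        // ~8.33 ms per frame

let pencilKitLatencyMs = 9.0
let midFrameGainMs = 4.0

// Best case for a third-party renderer this year: everything except
// the mid-frame gain, i.e. roughly 9 + 4 = 13 ms.
let thirdPartyFloorMs = pencilKitLatencyMs + midFrameGainMs

print(framePeriodMs, thirdPartyFloorMs)
```

In other words, PencilKit's 9ms is close to a single 120Hz frame, and a third-party renderer following the Metal and touch-prediction guidance could plausibly get to around 13ms.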

For developers, the WWDC sessions Federighi suggests include PencilKit, Adopting Predicted Touches, and Metal Performance Optimization.

In a nutshell, the information shared by Federighi confirms that third-party apps that take advantage of the Apple Pencil will get some of the same latency improvements that we'll see when using the Apple Pencil in built-in features like Markup.

The Apple Pencil latency improvements are built into iPadOS, the version of iOS 13 designed to run on the iPad. All of Apple's current iPads support the Apple Pencil: iPad Pro models work with the Apple Pencil 2, while the 6th-generation iPad, iPad mini, and iPad Air work with the original Apple Pencil.



Top Rated Comments

thisisnotmyname Avatar
83 months ago
I would have expected that was automatically exposed from the OS. I'm a bit surprised that they had to explicitly make some of those capabilities available to third party developers.
Score: 12 Votes
cocky jeremy Avatar
83 months ago
I mean... duh?
Score: 6 Votes
cmaier Avatar
83 months ago
I would have expected that was automatically exposed from the OS. I'm a bit surprised that they had to explicitly make some of those capabilities available to third party developers.
It is, if you use the appropriate control. But developers may want to integrate it into their own canvas or controls, in which case it is harder to expose, since things you do in your own code can interfere with the ability of the pen code to get the cycles it needs from the GPU and CPU.
Score: 4 Votes
nexusrule Avatar
83 months ago
How nice of Apple. You would think they would limit functionality improvements to their own apps.
I think you don’t know how development works. When you start creating code you can’t always abstract it in a way that’s usable by third party devs through an API. What Federighi meant is that right now the code that allows for that part of the delay reduction is split across several different Apple software technologies. To be made safely accessible to other devs it needs to be abstracted and made independent, because private frameworks can’t be exposed for security reasons. You build these frameworks after you have the working feature; it’s simply impossible to abstract a solution that doesn’t exist. And this sort of work can require a massive rewrite of some parts of the relevant underlying technologies, and it requires time.
Score: 4 Votes
NickName99 Avatar
83 months ago
I love that he gets into such detail. That’s interesting about the 4ms improvement they got using something they apparently can’t expose as a public method without some risk.

Now I’m curious about “mid-frame event processing”, but googling it hasn’t immediately got me anything.
Score: 4 Votes
Cayden Avatar
83 months ago
I love that he gets into such detail. That’s interesting about the 4ms improvement they got using something they apparently can’t expose as a public method without some risk.

Now I’m curious about “mid-frame event processing”, but googling it hasn’t immediately got me anything.
Now I’m not sure so take this with a grain of salt, but as an engineer I’m inclined to believe “mid-frame event processing” means they are updating some pixel information (likely just the pixels associated with the pencil) in between frame updates in which all pixel information is updated and displayed. In other words, in between hardware detections of the pencil location, software would update where it predicts the pencil to be on the next update, and it can start looking for the pencil there instead of looking arbitrarily, meaning the location can (usually) be found quicker. What I’m not sure about is whether these pixels are actually being updated mid-frame or the processing is simply keeping this information stored until the next frame is ready to update. I can’t see how the pixels could be updated mid-frame unless they had an individual refresh rate, so I’m inclined to believe the second case. If it’s the second case, it would make sense why Apple doesn’t want to give developers access to this, as it could quickly lead to timing errors between the software and hardware interrupts, such that it would only work within Apple’s framework and not an arbitrary code framework.
Score: 3 Votes