Third-Party Devs Will Be Able to Access iPadOS Apple Pencil Latency Improvements for Art Apps

In iPadOS, Apple introduced performance improvements for the Apple Pencil on the iPad Pro, cutting latency from 20ms to 9ms with the new software.

Third-party developers who make apps that use the ‌Apple Pencil‌ will also be able to take advantage of some of these latency improvements, Apple software engineering chief Craig Federighi confirmed last week.

Federighi shared the information in a response to an email sent by Artstudio Pro developer Claudio Juliano, who tweeted what Federighi had to say last week. The info was highlighted today in a tweet by developer Steve Troughton-Smith.

In the email, Federighi explains that third-party developers have had access to predicted touches via UIKit since iOS 9, and that with iOS 13, developers will receive the "latest and greatest" touch prediction advancements Apple used to minimize PencilKit drawing latency.
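Touch prediction works by extrapolating recent stylus samples forward in time to cover the gap between input and display. A minimal, framework-free sketch of the idea in Swift (the types and names here are illustrative, not Apple API; UIKit's actual predictor, surfaced through UIEvent's predictedTouches(for:), is considerably more sophisticated than this linear extrapolation):

```swift
// A stylus sample: position plus timestamp (illustrative type, not Apple API).
struct Sample {
    var x: Double
    var y: Double
    var t: Double  // seconds
}

// Linearly extrapolate the last two samples `dt` seconds into the future.
// This is a toy stand-in for the prediction UIKit performs internally.
func predict(from samples: [Sample], dt: Double) -> Sample? {
    guard samples.count >= 2 else { return nil }
    let a = samples[samples.count - 2]
    let b = samples[samples.count - 1]
    let span = b.t - a.t
    guard span > 0 else { return nil }
    let vx = (b.x - a.x) / span
    let vy = (b.y - a.y) / span
    return Sample(x: b.x + vx * dt, y: b.y + vy * dt, t: b.t + dt)
}

// Two samples 8ms apart; predict where the pencil will be 8ms from now.
let samples = [Sample(x: 0, y: 0, t: 0.000),
               Sample(x: 2, y: 4, t: 0.008)]
let p = predict(from: samples, dt: 0.008)!
print(p.x, p.y)
```

Drawing at the predicted position rather than the last reported one is what makes the ink appear to keep up with the pencil tip.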

Federighi explains how Apple achieved the latency improvements, and he points out that there's a small 4ms gap that developers won't have access to for now, because Apple hasn't yet found a way to safely expose the capability to developers. From Federighi's email:

Note that we achieve low latency through a combination of several techniques: Metal rendering optimizations, touch prediction, and mid-frame event processing. Third-party developers can achieve similar low-latency drawing experiences by taking advantage of Metal rendering and touch prediction best practices covered in the WWDC Sessions I've referenced below.

With these you can achieve nearly all of the improvements you've seen in PencilKit drawing with your own renderer. (There does remain a small gap: 4 ms of our improvement comes from a technique called mid-frame event processing; we are looking for ways to expose this capability to third party engines in the future, but for this year this one was only safely achievable through tight integration within our frameworks).

For developers, the WWDC sessions Federighi suggests include PencilKit, Adopting Predicted Touches, and Metal Performance Optimization.
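The usual pattern when adopting predicted touches is to draw predicted samples as a provisional tail on the live stroke, then discard them when the next batch of real samples arrives, so mispredictions never become permanent ink. A simplified sketch of that bookkeeping in plain Swift (LiveStroke and its methods are illustrative names, not Apple API, and positions are 1-D for brevity):

```swift
// Bookkeeping for a live stroke that mixes committed (real) samples with
// provisional (predicted) ones: predicted points are drawn for one frame,
// then thrown away and replaced once the real touch samples arrive.
struct LiveStroke {
    private(set) var committed: [Double] = []   // real sample positions
    private var predicted: [Double] = []        // provisional tail, redrawn each frame

    // New real samples arrive: the previous frame's predictions are discarded.
    mutating func commit(_ real: [Double]) {
        predicted.removeAll()
        committed.append(contentsOf: real)
    }

    // Predictions for the upcoming frame replace any stale ones.
    mutating func setPredicted(_ points: [Double]) {
        predicted = points
    }

    // What the renderer draws this frame: real stroke plus predicted tail.
    var renderPoints: [Double] { committed + predicted }
}

var stroke = LiveStroke()
stroke.commit([0, 1, 2])
stroke.setPredicted([3, 4])       // drawn this frame to hide latency
print(stroke.renderPoints)
stroke.commit([2.9, 3.9])         // real samples supersede the predictions
print(stroke.renderPoints)
```

Combined with a fast Metal renderer, this is how a third-party canvas can approach the responsiveness Federighi describes, minus the 4ms from mid-frame event processing.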

In a nutshell, the information shared by Federighi confirms that third-party apps that take advantage of the ‌Apple Pencil‌ will be getting some of the same latency improvements that we'll be seeing when using the ‌Apple Pencil‌ within native functions like Markup.

The ‌Apple Pencil‌ latency improvements are built into iPadOS, the version of iOS 13 that is designed to run on the iPad. All of Apple's current iPads support the ‌Apple Pencil‌. ‌iPad Pro‌ models work with the ‌Apple Pencil‌ 2, while the 6th-generation ‌iPad‌, iPad mini, and iPad Air work with the original ‌Apple Pencil‌.


Top Rated Comments

thisisnotmyname
80 months ago
I would have expected that was automatically exposed from the OS. I'm a bit surprised that they had to explicitly make some of those capabilities available to third party developers.
Score: 12 Votes
cocky jeremy
80 months ago
I mean... duh?
Score: 6 Votes
cmaier
80 months ago
I would have expected that was automatically exposed from the OS. I'm a bit surprised that they had to explicitly make some of those capabilities available to third party developers.
It is, if you use the appropriate control. But developers may want to integrate it into their own canvas or controls, in which case it is harder to expose, since things you do in your own code can interfere with the pen code getting the cycles it needs from the GPU and CPU.
Score: 4 Votes
nexusrule
80 months ago
How nice of Apple. You would think they would limit functionality improvements to their own apps.
I think you don’t know how development works. When you start writing code you can’t always abstract it in a way that’s usable by third-party devs through an API. What Federighi meant is that right now the code that allows for that part of the delay reduction is split across several of Apple’s software technologies. To be made safely accessible to other devs it needs to be abstracted and made independent, because private frameworks can’t be exposed for security reasons. You build these frameworks after you have the working feature; it’s simply impossible to abstract a solution that doesn’t exist. And this sort of work can require a massive rewrite of some parts of the relevant underlying technologies, which takes time.
Score: 4 Votes
NickName99
80 months ago
I love that he gets into such detail. That’s interesting about the 4ms improvement they got using something they apparently can’t expose as a public method without some risk.

Now I’m curious about “mid-frame event processing”, but googling it hasn’t immediately got me anything.
Score: 4 Votes
Cayden
80 months ago
I love that he gets into such detail. That’s interesting about the 4ms improvement they got using something they apparently can’t expose as a public method without some risk.

Now I’m curious about “mid-frame event processing”, but googling it hasn’t immediately got me anything.
Now I’m not sure, so take this with a grain of salt, but as an engineer I’m inclined to believe “mid-frame event processing” means they are updating some pixel information (likely just the pixels associated with the pencil) in between the frame updates in which all pixel information is updated and displayed. In other words, in between hardware detections of the pencil location, software would update where it predicts the pencil to be on the next update, and it can start looking for the pencil there instead of looking arbitrarily, meaning the location can (usually) be found quicker. What I’m not sure about is whether these pixels are actually being updated mid-frame or if the processing is simply keeping this information stored until the next frame is ready to update. I can’t see how the pixels could be updated mid-frame unless they had an individual refresh rate, so I’m inclined to believe the second case. If it’s the second case, it would make sense why Apple doesn’t want to give developers access to this, as it could quickly lead to timing errors between the software and hardware interrupts, such that it would only work within Apple’s framework and not an arbitrary code framework.
Score: 3 Votes