Third-Party Devs Will Be Able to Access iPadOS Apple Pencil Latency Improvements for Art Apps

In iPadOS, Apple introduced performance improvements for the Apple Pencil on the iPad Pro, cutting latency from 20ms to 9ms with the new software.

Third-party developers who make apps that use the Apple Pencil will also be able to take advantage of some of these latency improvements, Apple software engineering chief Craig Federighi confirmed last week.

Federighi shared the information in response to an email sent by Artstudio Pro developer Claudio Juliano, who tweeted what Federighi had to say last week. The info was highlighted today in a tweet by developer Steve Troughton-Smith.

In the email, Federighi explains that third-party developers have had access to predicted touches via UIKit since iOS 9, and that with iOS 13, developers will get the "latest and greatest" touch prediction advancements Apple uses to minimize PencilKit drawing latency.
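Adopting predicted touches on the UIKit side takes only a few lines. Below is a minimal sketch of a custom canvas view (the class and property names are illustrative, not taken from Apple sample code) that folds both coalesced and predicted touches into a stroke:

```swift
import UIKit

class CanvasView: UIView {
    private var strokePoints: [CGPoint] = []
    private var predictedPoints: [CGPoint] = []

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }

        // Coalesced touches include every sample the hardware reported
        // since the last event delivery, not just the latest one.
        for coalesced in event?.coalescedTouches(for: touch) ?? [touch] {
            strokePoints.append(coalesced.location(in: self))
        }

        // Predicted touches are UIKit's estimate of where the Pencil will
        // be next. Render them this frame, then throw them away and replace
        // them with real samples when the next event arrives.
        predictedPoints = (event?.predictedTouches(for: touch) ?? [])
            .map { $0.location(in: self) }

        // draw(_:) (not shown) would render strokePoints plus predictedPoints.
        setNeedsDisplay()
    }
}
```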

Federighi goes on to explain how Apple achieved the latency improvements, and he points out that a small 4ms gap will remain off limits to developers for now, because Apple has not yet found a way to safely expose the capability. From Federighi's email:

Note that we achieve low latency through a combination of several techniques: Metal rendering optimizations, touch prediction, and mid-frame event processing. Third-party developers can achieve similar low-latency drawing experiences by taking advantage of Metal rendering and touch prediction best practices covered in the WWDC Sessions I've referenced below.

With these you can achieve nearly all of the improvements you've seen in PencilKit drawing with your own renderer. (There does remain a small gap: 4 ms of our improvement comes from a technique called mid-frame event processing; we are looking for ways to expose this capability to third party engines in the future, but for this year this one was only safely achievable through tight integration within our frameworks).
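For apps that drive their own Metal canvas, the "Metal rendering optimizations" Federighi refers to largely come down to keeping the drawable queue shallow so a new Pencil sample doesn't sit behind already-queued frames. A rough sketch of the kind of CAMetalLayer configuration involved (the helper function is hypothetical, and the exact settings will vary by renderer):

```swift
import QuartzCore
import Metal

// Hypothetical setup for a CAMetalLayer-backed drawing view.
func configureForLowLatency(_ layer: CAMetalLayer, device: MTLDevice) {
    layer.device = device
    layer.pixelFormat = .bgra8Unorm
    layer.framebufferOnly = true        // let Core Animation scan out the drawable directly

    // Two drawables instead of the default three keeps the queue shallow,
    // so a new stroke sample waits behind at most one in-flight frame.
    layer.maximumDrawableCount = 2

    // Present within the Core Animation transaction so the drawable
    // appears on the next screen update rather than a later one.
    layer.presentsWithTransaction = true
}
```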

For developers, the WWDC sessions Federighi suggests include PencilKit, Adopting Predicted Touches, and Metal Performance Optimization.
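For apps that can simply adopt PencilKit rather than maintaining their own renderer, the full low-latency pipeline Federighi describes, including the mid-frame event processing that isn't exposed to third-party engines, comes along with PKCanvasView. A minimal, hypothetical embedding using the iOS 13 PencilKit API:

```swift
import UIKit
import PencilKit

// Hypothetical view controller that hosts Apple's canvas instead of a
// custom renderer, inheriting PencilKit's low-latency drawing path.
class DrawingViewController: UIViewController {
    private let canvasView = PKCanvasView()

    override func viewDidLoad() {
        super.viewDidLoad()
        canvasView.frame = view.bounds
        canvasView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        canvasView.tool = PKInkingTool(.pen, color: .black, width: 4)
        view.addSubview(canvasView)
    }
}
```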

In a nutshell, the information shared by Federighi confirms that third-party apps that take advantage of the Apple Pencil will get some of the same latency improvements users will see when using the Apple Pencil with built-in features like Markup.

The Apple Pencil latency improvements are built into iPadOS, the version of iOS 13 that is designed to run on the iPad. All of Apple's current iPads support the Apple Pencil. iPad Pro models work with the Apple Pencil 2, while the 6th-generation iPad, iPad mini, and iPad Air work with the original Apple Pencil.

Top Rated Comments

thisisnotmyname
75 months ago
I would have expected that was automatically exposed from the OS. I'm a bit surprised that they had to explicitly make some of those capabilities available to third party developers.
Score: 12 Votes
cocky jeremy
75 months ago
I mean... duh?
Score: 6 Votes
cmaier
75 months ago
I would have expected that was automatically exposed from the OS. I'm a bit surprised that they had to explicitly make some of those capabilities available to third party developers.
It is, if you use the appropriate control. But developers may want to integrate it into their own canvas or controls, in which case it is harder to expose, since things you do in your own code can interfere with the ability of the pen code to get the cycles it needs from the GPU and CPU.
Score: 4 Votes
nexusrule
75 months ago
How nice of Apple. You would think they would limit functionality improvements to their own apps.
I think you don't know how development works. When you start writing code, you can't always abstract it in a way that's usable by third-party devs through an API. What Federighi means is that right now the code that allows for that part of the delay reduction is split across several of Apple's software technologies. To be made safely accessible to other devs it needs to be abstracted and made independent, because private frameworks can't be exposed for security reasons. You build these frameworks after you have the working feature; it's simply impossible to abstract a solution that doesn't exist. And this sort of work can require a massive rewrite of some parts of the relevant underlying technologies, and that takes time.
Score: 4 Votes
NickName99
75 months ago
I love that he gets into such detail. That’s interesting about the 4ms improvement they got using something they apparently can’t expose as a public method without some risk.

Now I’m curious about “mid-frame event processing”, but googling it hasn’t immediately got me anything.
Score: 4 Votes
Cayden
75 months ago
I love that he gets into such detail. That’s interesting about the 4ms improvement they got using something they apparently can’t expose as a public method without some risk.

Now I’m curious about “mid-frame event processing”, but googling it hasn’t immediately got me anything.
Now I'm not sure, so take this with a grain of salt, but as an engineer I'm inclined to believe "mid-frame event processing" means they are updating some pixel information (likely just the pixels associated with the pencil) in between the frame updates in which all pixel information is updated and displayed. In other words, in between hardware detections of the pencil location, software would update where it predicts the pencil to be on the next update, and it can start looking for the pencil there instead of looking arbitrarily, meaning the location can (usually) be found more quickly. What I'm not sure about is whether these pixels are actually being updated mid-frame or whether the processing is simply keeping this information stored until the next frame is ready to update. I can't see how the pixels could be updated mid-frame unless they had an individual refresh rate, so I'm inclined to believe the second case. If it's the second case, it would make sense why Apple doesn't want to give developers access to this, as it could quickly lead to timing errors between the software and hardware interrupts, such that it would only work within Apple's framework and not an arbitrary code framework.
Score: 3 Votes