Third-Party Devs Will Be Able to Access iPadOS Apple Pencil Latency Improvements for Art Apps

With iPadOS, Apple introduced performance improvements for the Apple Pencil on the iPad Pro, cutting latency from 20ms to 9ms with the new software.

Third-party developers who make apps that use the ‌Apple Pencil‌ will also be able to take advantage of some of these latency improvements, Apple software engineering chief Craig Federighi confirmed last week.

Federighi shared the information in response to an email from Artstudio Pro developer Claudio Juliano, who tweeted Federighi's reply last week. The exchange was highlighted today in a tweet by developer Steve Troughton-Smith.

In the email, Federighi explains that third-party developers have had access to predicted touches via UIKit since iOS 9, and that with iOS 13, developers will receive the "latest and greatest" touch prediction advancements used to minimize PencilKit drawing latency.
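Touch prediction of this kind is essentially short-horizon extrapolation of the stylus path. Below is a minimal sketch of the idea in Swift, assuming a simple linear model; UIKit's actual predictions come from `UIEvent.predictedTouches(for:)` and are more sophisticated, so the types and math here are illustrative only:

```swift
// Hypothetical sketch of the idea behind touch prediction: given recent
// sampled points, extrapolate where the stylus will be a short time ahead.
// This is NOT Apple's implementation; UIKit delivers real predictions via
// UIEvent.predictedTouches(for:).
struct SamplePoint {
    let x: Double
    let y: Double
    let timestamp: Double  // seconds
}

// Linearly extrapolate a future position from the last two samples.
func predictPoint(from samples: [SamplePoint], lookAhead: Double) -> SamplePoint? {
    guard samples.count >= 2 else { return nil }
    let a = samples[samples.count - 2]
    let b = samples[samples.count - 1]
    let dt = b.timestamp - a.timestamp
    guard dt > 0 else { return nil }
    // Velocity between the last two samples, applied over the look-ahead window.
    let vx = (b.x - a.x) / dt
    let vy = (b.y - a.y) / dt
    return SamplePoint(x: b.x + vx * lookAhead,
                       y: b.y + vy * lookAhead,
                       timestamp: b.timestamp + lookAhead)
}
```

In a real app, the predicted points would be drawn immediately to hide latency, then replaced once the actual touch samples arrive.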

Federighi explains how Apple achieved the latency improvements, and he points out that there's a small 4ms gap that developers can't access for now because Apple hasn't found a way to safely expose the capability. From Federighi's email:

Note that we achieve low latency through a combination of several techniques: Metal rendering optimizations, touch prediction, and mid-frame event processing. Third-party developers can achieve similar low-latency drawing experiences by taking advantage of Metal rendering and touch prediction best practices covered in the WWDC Sessions I've referenced below.

With these you can achieve nearly all of the improvements you've seen in PencilKit drawing with your own renderer. (There does remain a small gap: 4 ms of our improvement comes from a technique called mid-frame event processing; we are looking for ways to expose this capability to third party engines in the future, but for this year this one was only safely achievable through tight integration within our frameworks).

For developers, the WWDC sessions Federighi suggests include PencilKit, Adopting Predicted Touches, and Metal Performance Optimization.
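The pattern those sessions teach for using predictions in a custom renderer is straightforward: each frame, draw the committed (real) points plus the current predicted points, and discard the predictions as soon as the next batch of real samples arrives. A sketch of that bookkeeping in Swift (the `Stroke` type here is hypothetical, not an Apple API):

```swift
// Sketch of the predicted-touch drawing pattern: predictions are
// display-only and are thrown away once real samples replace them.
struct Stroke {
    private(set) var committed: [(x: Double, y: Double)] = []
    private var predicted: [(x: Double, y: Double)] = []

    // Real samples (including coalesced touches) become part of the stroke.
    mutating func appendReal(_ points: [(x: Double, y: Double)]) {
        predicted.removeAll()  // stale predictions are discarded first
        committed.append(contentsOf: points)
    }

    // Predicted samples overwrite any earlier predictions; never committed.
    mutating func setPredicted(_ points: [(x: Double, y: Double)]) {
        predicted = points
    }

    // What the renderer should draw this frame.
    var pointsToRender: [(x: Double, y: Double)] {
        return committed + predicted
    }
}
```

Because predictions are never committed, a mispredicted point only survives on screen for a single frame before the real sample corrects it.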

In a nutshell, the information shared by Federighi confirms that third-party apps that take advantage of the ‌Apple Pencil‌ will get some of the same latency improvements seen when using the ‌Apple Pencil‌ in native features like Markup.

The ‌Apple Pencil‌ latency improvements are built into iPadOS, the version of iOS 13 that is designed to run on the iPad. All of Apple's current iPads support the ‌Apple Pencil‌. ‌iPad Pro‌ models work with the ‌Apple Pencil‌ 2, while the 6th-generation ‌iPad‌, iPad mini, and iPad Air work with the original ‌Apple Pencil‌.



Top Rated Comments

thisisnotmyname, 71 months ago
I would have expected that was automatically exposed from the OS. I'm a bit surprised that they had to explicitly make some of those capabilities available to third party developers.
Score: 12 Votes
cocky jeremy, 71 months ago
I mean... duh?
Score: 6 Votes
cmaier, 71 months ago
Quoting thisisnotmyname: "I would have expected that was automatically exposed from the OS. I'm a bit surprised that they had to explicitly make some of those capabilities available to third party developers."

It is, if you use the appropriate control. But developers may want to integrate it into their own canvas or controls, in which case it is harder to expose, since things you do in your own code can interfere with the ability of the pen code to get the cycles it needs from the GPU and CPU.
Score: 4 Votes
nexusrule, 71 months ago
Quoting an earlier comment: "How nice of Apple. You would think they would limit functionality improvements to their own apps."

I think you don't know how development works. When you start creating code you can't always abstract it in a way that's usable by third-party devs through an API. What Federighi meant is that right now the code that allows for that part of the delay reduction is split between different Apple software technologies. To be made safely accessible to other devs it needs to be abstracted and made independent, because private frameworks can't be exposed for security reasons. You build these frameworks after you have the working feature; it's simply impossible to abstract a solution that doesn't exist. And this sort of work can require a massive rewrite of some parts of the relevant underlying technologies, and it requires time.
Score: 4 Votes
NickName99, 71 months ago
I love that he gets into such detail. That’s interesting about the 4ms improvement they got using something they apparently can’t expose as a public method without some risk.

Now I’m curious about “mid-frame event processing”, but googling it hasn’t immediately got me anything.
Score: 4 Votes
Cayden, 71 months ago
Quoting NickName99: "I love that he gets into such detail. That's interesting about the 4ms improvement they got using something they apparently can't expose as a public method without some risk. Now I'm curious about 'mid-frame event processing', but googling it hasn't immediately got me anything."
Now I'm not sure, so take this with a grain of salt, but as an engineer I'm inclined to believe "mid-frame event processing" means they are updating some pixel information (likely just the pixels associated with the pencil) in between frame updates in which all pixel information is updated and displayed. In other words, in between hardware detections of the pencil location, software would update where it predicts the pencil to be on the next update, and it can start looking for the pencil there instead of looking arbitrarily, meaning the location can (usually) be found quicker.

What I'm not sure about is whether these pixels are actually being updated mid-frame or if the processing is simply keeping this information stored until the next frame is ready to update. I can't see how the pixels could be updated mid-frame unless they had an individual refresh rate, so I'm inclined to believe the second case. If it's the second case, it would make sense why Apple doesn't want to give developers access to this, as it could quickly lead to timing errors between the software and hardware interrupts, such that it would only work within Apple's framework and not an arbitrary code framework.
Score: 3 Votes