
Third-Party macOS Security Tools Vulnerable to Malware Code-Signing Bypasses for Years

Hackers have had an "easy way" to get certain malware past signature checks in third-party security tools since Apple's OS X Leopard operating system in 2007, according to a detailed new report today by Ars Technica. Researchers discovered that hackers could essentially trick the security tools -- designed to sniff out suspiciously signed software -- into reporting that malicious code was officially signed by Apple when in fact it was not.


The researchers said that the signature-bypassing method is so "easy" and "trivial" that pretty much any hacker who discovered it could pass off malicious code as an app that appeared to be signed by Apple. These digital signatures are a core security function that lets users know an app was signed with the private key of a trusted party, as Apple does with its first-party apps.

Joshua Pitts, senior penetration testing engineer for security firm Okta, said he discovered the technique in February and informed Apple and the third-party developers about it soon after. Okta today also published information about the bypass, including a detailed disclosure timeline that began on February 22 with a report submitted to Apple and runs through today's public disclosure.

Ars Technica broke down how the method was used and which third-party tools are affected:
The technique worked using a binary format, alternatively known as a Fat or Universal file, that contained several files that were written for different CPUs used in Macs over the years, such as i386, x86_64, or PPC. Only the first so-called Mach-O file in the bundle had to be signed by Apple. At least eight third-party tools would show other non-signed executable code included in the same bundle as being signed by Apple, too.

Affected third-party tools included VirusTotal, Google Santa, Facebook OSQuery, the Little Snitch Firewall, Yelp's OSXCollector, Carbon Black's Cb Response, and several tools from Objective-See. Many companies and individuals rely on some of the tools to help implement whitelisting processes that permit only approved applications to be installed on a computer, while forbidding all others.
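To make the file format concrete, here is a minimal C sketch (written for this explanation, handling only the classic 32-bit fat header with minimal error checking) that enumerates the slices of a Fat/Universal binary using the structures from Apple's <mach-o/fat.h>. The point is simply that each slice is an independent Mach-O executable with its own signature status, so a checker that only inspects one slice can be misled about the file as a whole:

/* list_slices.c -- illustrative sketch, not production code.
 * Build: clang list_slices.c -o list_slices
 * Enumerates the slices of a classic (32-bit header) Fat/Universal binary. */
#include <stdio.h>
#include <mach-o/fat.h>          /* struct fat_header, struct fat_arch, FAT_MAGIC */
#include <libkern/OSByteOrder.h> /* OSSwapBigToHostInt32 */

int main(int argc, char *argv[]) {
    if (argc != 2) { fprintf(stderr, "usage: %s <fat-binary>\n", argv[0]); return 1; }

    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }

    /* The fat header and arch table are stored big-endian on disk. */
    struct fat_header hdr;
    if (fread(&hdr, sizeof(hdr), 1, f) != 1 ||
        OSSwapBigToHostInt32(hdr.magic) != FAT_MAGIC) {
        fprintf(stderr, "not a (32-bit) fat binary\n");
        fclose(f);
        return 1;
    }

    uint32_t n = OSSwapBigToHostInt32(hdr.nfat_arch);
    printf("%u slices:\n", n);

    for (uint32_t i = 0; i < n; i++) {
        struct fat_arch arch;
        if (fread(&arch, sizeof(arch), 1, f) != 1) break;
        /* Each slice is a complete, separate Mach-O; its signature (or lack of
         * one) is independent of every other slice in the same file. */
        printf("  slice %u: cputype=0x%x offset=%u size=%u\n", i,
               OSSwapBigToHostInt32((uint32_t)arch.cputype),
               OSSwapBigToHostInt32(arch.offset),
               OSSwapBigToHostInt32(arch.size));
    }
    fclose(f);
    return 0;
}
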
Developer Patrick Wardle spoke on the topic, explaining that the bypass was due to ambiguous documentation and comments provided by Apple regarding the use of publicly available programming interfaces that make digital signature checks function: "To be clear, this is not a vulnerability or bug in Apple's code... basically just unclear/confusing documentation that led to people using their API incorrectly." It's also not an issue exclusive to Apple and macOS third-party security tools, as Wardle pointed out: "If a hacker wants to bypass your tool and targets it directly, they will win."
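The "API" Wardle mentions is Apple's Security framework interface for static code checks. As a rough, hypothetical illustration of the pattern at issue (a generic checker sketched for this explanation, not any particular vendor's code), a tool asking "was this file signed by Apple?" typically creates a static code object for the file, builds an "anchor apple" requirement, and validates it, often with default flags:

/* apple_signed.c -- hypothetical checker, illustrating the common API pattern.
 * Build: clang apple_signed.c -o apple_signed -framework Security -framework CoreFoundation */
#include <Security/Security.h>
#include <CoreFoundation/CoreFoundation.h>
#include <stdio.h>

int main(int argc, char *argv[]) {
    if (argc != 2) { fprintf(stderr, "usage: %s <path>\n", argv[0]); return 1; }

    CFStringRef path = CFStringCreateWithCString(kCFAllocatorDefault, argv[1],
                                                 kCFStringEncodingUTF8);
    CFURLRef url = CFURLCreateWithFileSystemPath(kCFAllocatorDefault, path,
                                                 kCFURLPOSIXPathStyle, false);

    SecStaticCodeRef code = NULL;
    SecRequirementRef req = NULL;

    /* "anchor apple" requires the signing chain to terminate at Apple's own CA,
     * i.e. code signed by Apple itself. */
    if (SecStaticCodeCreateWithPath(url, kSecCSDefaultFlags, &code) != errSecSuccess ||
        SecRequirementCreateWithString(CFSTR("anchor apple"), kSecCSDefaultFlags,
                                       &req) != errSecSuccess) {
        fprintf(stderr, "could not set up the check\n");
        return 1;
    }

    /* Per the Okta report, a default-flags check like this could report success
     * on a crafted universal binary whose Apple-signed slice is not the slice
     * the loader would actually execute. */
    OSStatus status = SecStaticCodeCheckValidity(code, kSecCSDefaultFlags, req);
    printf("%s\n", status == errSecSuccess ? "signed by Apple" : "NOT signed by Apple");

    CFRelease(req); CFRelease(code); CFRelease(url); CFRelease(path);
    return status == errSecSuccess ? 0 : 2;
}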

For its part, Apple was said to have stated on March 20 that it did not see the bypass as a security issue that needed to be directly addressed. On March 29, the company updated its documentation to clarify the matter, stating that "third-party developers will need to do additional work to verify that all of the identities in a universal binary are the same if they want to present a meaningful result."
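The Security framework does expose flags aimed at exactly this kind of per-architecture validation. The sketch below (a variant of the hypothetical checker above, using flags documented in SecStaticCode.h) asks the framework to validate every slice of a universal binary and to apply strict validation; whether that alone is sufficient for a given tool is part of the "additional work" Apple describes, since a tool presenting a single verdict still has to confirm that the identities found across slices agree.

/* Stricter variant of the hypothetical check above. */
#include <Security/Security.h>
#include <CoreFoundation/CoreFoundation.h>
#include <stdbool.h>

static bool apple_signed_all_arches(CFURLRef url) {
    SecStaticCodeRef code = NULL;
    SecRequirementRef req = NULL;
    bool ok = false;

    if (SecStaticCodeCreateWithPath(url, kSecCSDefaultFlags, &code) == errSecSuccess &&
        SecRequirementCreateWithString(CFSTR("anchor apple"), kSecCSDefaultFlags,
                                       &req) == errSecSuccess) {
        /* kSecCSCheckAllArchitectures: evaluate every slice of a universal
         * binary rather than a single one; kSecCSStrictValidate: apply the
         * framework's stricter consistency checks. */
        SecCSFlags flags = kSecCSCheckAllArchitectures | kSecCSStrictValidate;
        ok = (SecStaticCodeCheckValidity(code, flags, req) == errSecSuccess);
    }
    if (req) CFRelease(req);
    if (code) CFRelease(code);
    return ok;
}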



Top Rated Comments


2 weeks ago
These companies are prioritizing speed over security. We can assume they'll now implement proper checks, but it will come at the cost of speed.

I'm sure most won't bother to read this article and blame Apple, but the real blame here is with developers including Little Snitch, xFence, and Facebook's OSquery. They're the ones that failed to properly check these signatures.
Rating: 12 Votes
2 weeks ago
Wow, but somehow, I'm less concerned about the security threat than I am excited to have discovered the job title "Senior Penetration Testing Engineer". ...someone's up for a performance review & promotion!
Rating: 6 Votes
2 weeks ago
Does Apple give a damn?? Obviously not. It's focused now on important kindergarten stuff like animojis and AR gimmicks.
Rating: 5 Votes
2 weeks ago


I'm sure most won't bother to read this article and blame Apple, but the real blame here is with developers including Little Snitch, xFence, and Facebook's OSquery. They're the ones that failed to properly check these signatures.


It's Apple's fault. When 8 separate developers use the API in the wrong way, there's an issue with the API and instructions.
Rating: 4 Votes
2 weeks ago
This is very bad. Thank goodness for white-hats who find this stuff out.
Rating: 4 Votes
2 weeks ago

I'm not sure I'd blame the devs here. The problem is the documentation. Once again reminding us that tech writers are an underappreciated bunch.


The current Apple documentation insists on the need to vet all certificates. But that slows things down, which is why some developers have chosen not to do so.

Is it the state's fault if people don't follow speed limit signs?
Rating: 2 Votes
2 weeks ago

It's Apple's fault. When 8 separate developers use the API in the wrong way, there's an issue with the API and instructions.


No, it's really not. It's the developers' responsibility to use the proper security procedures in their apps. Is it the state's fault that people fail to follow speed limit signs?
Rating: 2 Votes
2 weeks ago

Can someone explain the issue at hand, please? I don't really understand the problem here.
I have Little Snitch -- is Little Snitch easily hacked or what?
Bit of a confusing article. :confused:

Not easily hacked - see this discussion
https://forums.obdev.at/viewtopic.php?f=1&t=11372

It seems that LS 4.1 addressed the problem, but we will need to wait for the developers to respond to this speculation.
Rating: 1 Votes
2 weeks ago
"stating that "third-party developers will need to do additional work to verify that all of the identities in a universal binary are the same if they want to present a meaningful result."
Rating: 1 Votes
2 weeks ago

So... I'm confused. Doesn't macOS itself check whether an app is properly signed or not before it's allowed to run? Is that feature working properly (and does it actually exist?) If so, then this really doesn't matter.

If not... then perhaps there's actually a need for virus checkers in macOS that I wasn't aware of?


That is the inherent problem with X.509 and Gatekeeper: it defaults to trusting anything in its trust store. With Gatekeeper you just pay for a developer account and get your signing key, or steal someone else's, and your code will run on any Mac by default. Determining code authenticity is a solved problem with TOFU (https://en.wikipedia.org/wiki/Trust_on_first_use) tools like PGP/GPG. There is some really critical software out there that does not have GPG signatures: https://mostvulnerable.com.
Rating: 1 Votes
