Jonathan Zdziarski


'Jonathan Zdziarski' Articles

F-Secure Acquires Jonathan Zdziarski's Mac Security App 'Little Flocker'

Cyber security company F-Secure has acquired Little Flocker, the behavioral analysis-based monitoring app for Macs developed by iPhone forensics expert and security researcher Jonathan Zdziarski, who joined Apple last month. The Helsinki-based firm announced the news in a press release posted to its site, where it revealed that Little Flocker will be built into a new security product it's releasing, called XFENCE.

Little Flocker protects Macs by using advanced behavior-based analysis, and monitors apps that attempt to access confidential files and system resources. It also detects and blocks Mac ransomware. F-Secure will build Little Flocker's next-generation security engine into its new XFENCE technology. XFENCE will complement F-Secure's existing endpoint solutions to provide advanced behavioral Mac protection for both corporate and consumer customers.

F-Secure said that the "myth" of Macs not requiring protection against ransomware, backdoors, and other malicious software is fading away, due to "Apple's popularity among senior-level employees and other high-value targets". By acquiring Little Flocker, the company hopes to further enhance its products' existing cyber security capabilities for the sophisticated detection of zero-day attacks. For businesses, the core technology is to be combined with F-Secure's security cloud and packaged into its Protection Service for Business, a security solution with centrally managed computer, mobile, and server security with integrated patch management and mobile device management. Consumer customers can make use of the Flocker

Apple Hires iPhone Security Expert Jonathan Zdziarski

iPhone forensics expert, security researcher, and former jailbreak community developer Jonathan Zdziarski today announced he has accepted a position with Apple's Security Engineering and Architecture team. He did not reveal his official starting date or responsibilities at the company.

I'm pleased to announce that I've accepted a position with Apple's Security Engineering and Architecture team, and am very excited to be working with a group of like-minded individuals so passionate about protecting the security and privacy of others. This decision marks the conclusion of what I feel has been a matter of conscience for me over time. Privacy is sacred; our digital lives can reveal so much about us – our interests, our deepest thoughts, and even who we love. I am thrilled to be working with such an exceptional group of people who share a passion to protect that.

Zdziarski has provided input on a number of important iOS-related security matters over the years, ranging from Apple's high-profile battle with the FBI over unlocking an iPhone used by a shooter in the 2015 San Bernardino attack to smaller incidents such as a potential WhatsApp flaw uncovered last year. Zdziarski was known as "NerveGas" within the jailbreaking community, and was formerly part of both the iPhone Dev Team and Chronic Dev Team. He used to be an active Twitter user, but it appears he has recently disabled his account, possibly due to his employment at Apple.

Apple Says it Syncs Call Logs on iCloud As a 'Convenience to Customers' Amid Security Concerns

Earlier today, reports surfaced on The Intercept and Forbes claiming Apple "secretly" syncs Phone and FaceTime call history logs to iCloud, complete with phone numbers, dates and times, and duration. The information comes from Russian software firm Elcomsoft, which said the call history logs are stored for up to four months. Likewise, on iOS 10, Elcomsoft said incoming missed calls made through third-party VoIP apps that use Apple's CallKit framework, such as Skype, WhatsApp, and Viber, also get synced to iCloud. The call logs have been collected since at least iOS 8.2, released in March 2015, so long as a user has iCloud enabled. Elcomsoft said the call logs are automatically synced, even if backups are turned off, with no way to opt out beyond disabling iCloud entirely.

"You can only disable uploading/syncing notes, contacts, calendars and web history, but the calls are always there," said Vladimir Katalov, CEO of Elcomsoft. "One way call logs will disappear from the cloud is if a user deletes a particular call record from the log on their device; then it will also get deleted from their iCloud account during the next automatic synchronization."

Given that Apple currently possesses the encryption keys to unlock an iCloud account, U.S. law enforcement agencies can obtain direct access to the logs with a court order. Worse, The Intercept claims the information could be exposed to hackers and anyone else who might be able to obtain a user's iCloud credentials. In some cases, hackers could access an iCloud account even without account credentials, such as by using

WhatsApp Security Flaw Leaves 'Trace of All Your Chats' Even After Deletion

Popular third-party chat app WhatsApp is leaving a "forensic trace" of every supposedly deleted chat log, meaning anyone with access to your smartphone -- or another device connected through the cloud -- could potentially access data from the app. The discovery comes from iOS researcher Jonathan Zdziarski, who shared the information in a blog post after discovering the potential security flaw in the latest version of WhatsApp (via The Verge). Zdziarski tested his theory by starting a few chat threads, then archiving, clearing, and deleting them, and found that none of the app's deletion methods, not even Clear All Chats, "made any difference in how deleted records were preserved." The central flaw lies in the app's use of SQLite, whose database retains deleted chat records that could be recovered by a malicious individual with the right "popular forensics tools."

In his post, Zdziarski noted that the problem isn't unique to WhatsApp; in a separate blog post he has gone into detail about "forensic trace leakage" in Messages on iOS and OS X, and ways Apple could address such privacy issues. He explained succinctly that short-lived chats between friends and family using these apps are "not ephemeral on disk," which not only could be a cause for concern for users, but could allow law enforcement legal access to thought-to-be-deleted WhatsApp messages, thanks to the lack of encrypted communication between WhatsApp and iCloud.

The core issue here is that ephemeral communication is not ephemeral on disk. This is a problem that Apple has struggled
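The SQLite behavior Zdziarski describes is easy to reproduce in general: by default, deleting a row only marks its space as free, leaving the payload bytes in the database file until they happen to be overwritten. A minimal Python sketch (illustrative only; the table name and schema are invented, not WhatsApp's actual database) showing how SQLite's `secure_delete` pragma changes this:

```python
import os
import sqlite3
import tempfile

def deleted_text_survives(secure_delete: bool) -> bool:
    """Insert a message, delete it, then scan the raw file for its bytes."""
    path = tempfile.mktemp(suffix=".db")
    con = sqlite3.connect(path)
    con.execute(f"PRAGMA secure_delete = {'ON' if secure_delete else 'OFF'}")
    con.execute("CREATE TABLE messages (id INTEGER PRIMARY KEY, body TEXT)")
    con.execute("INSERT INTO messages (body) VALUES ('supposedly deleted chat')")
    con.commit()
    con.execute("DELETE FROM messages WHERE id = 1")  # "delete" the chat
    con.commit()
    con.close()
    with open(path, "rb") as f:
        raw = f.read()
    os.remove(path)
    return b"supposedly deleted chat" in raw

print(deleted_text_survives(False))  # typically True: payload lingers on disk
print(deleted_text_survives(True))   # False: SQLite zeroes freed content
```

With `secure_delete` off (the default), the deleted row's text is still readable with a hex editor or forensics tool; turning it on makes SQLite overwrite freed content with zeros, which is one mitigation Zdziarski has suggested for chat apps.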

Apple Confirms Unencrypted Kernel in iOS 10 Beta is Intentional

Yesterday it was discovered that iOS 10 does not feature an encrypted kernel, allowing users and researchers access to the core of the operating system and its inner workings. It was unclear at the time whether the lack of encryption was an accident or intentional, but today Apple confirmed to TechCrunch that the company left the kernel unencrypted for a reason.

"The kernel cache doesn't contain any user info, and by unencrypting it we're able to optimize the operating system's performance without compromising security," an Apple spokesperson told TechCrunch.

The kernel, which dictates how software can use hardware and keeps the device secure, is unencrypted so that developers and researchers can "poke around" and find potential security flaws. Because the kernel is easier to access and flaws may be easier to find, Apple can patch potential issues more easily and more quickly. The move is a shift for Apple, which had encrypted the kernel in past versions of iOS, leaving developers and researchers out of the loop on the inner workings of the operating system. As noted by security expert Jonathan Zdziarski, it's likely that Apple has made this shift to prevent groups from "hoarding" vulnerabilities in Apple's software, like the vulnerability used by the FBI to break into the iPhone 5c of the San Bernardino shooter.

iOS 10 Beta Features Unencrypted Kernel Making it Easier to Discover Vulnerabilities

Apple's iOS 10 preview, seeded to developers last week, does not feature an encrypted kernel and thus gives users access to the inner workings of the operating system and potential security flaws, reports MIT Technology Review. It is not known whether this was an unintentional mistake or done deliberately to encourage more bug reports.

Security experts say the famously secretive company may have adopted a bold new strategy intended to encourage more people to report bugs in its software, or perhaps made an embarrassing mistake.

In past versions of iOS, Apple has encrypted the kernel, the core of the operating system, which dictates how software uses the iPhone's hardware and keeps it secure. According to experts who spoke to MIT Technology Review, leaving the kernel unencrypted doesn't compromise the security of iOS 10, but it does make it easier to find flaws in the operating system. Security flaws in iOS can be used to create jailbreaks or malware.

The goodies exposed publicly for the first time include a security measure designed to protect the kernel from being modified, says security researcher Mathew Solnik. "Now that it is public, people will be able to study it [and] potentially find ways around it," he says.

Apple declined to comment on whether the lack of encryption was intentional or a mistake, but security expert Jonathan Zdziarski believes it was done by choice, because it's not a mistake Apple is likely to have made. "This would have been an incredibly glaring oversight, like forgetting to put doors on an elevator," he told MIT Technology Review.
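One common way researchers spot a change like this: an encrypted kernelcache looks like uniform random noise, while an unencrypted one contains readable strings and lower-entropy structure. A hedged Python sketch of the byte-entropy heuristic (a rough indicator only; compressed data also scores high, and the sample inputs below are stand-ins, not real firmware):

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte; 8.0 means the bytes are
    indistinguishable from uniform random noise (as ciphertext is)."""
    if not data:
        return 0.0
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Stand-ins: random bytes mimic an encrypted image; repeated readable
# strings mimic the plaintext structure researchers found in iOS 10.
encrypted_like = os.urandom(64 * 1024)
plaintext_like = b"com.apple.kernel " * 4096

print(round(shannon_entropy(encrypted_like), 2))  # close to 8.0
print(round(shannon_entropy(plaintext_like), 2))  # far lower
```

High entropy alone doesn't prove encryption (LZSS/LZFSE-compressed kernelcaches also look noisy), which is why researchers combine it with a search for readable strings before drawing conclusions.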

Senate Draft Encryption Bill Called 'Absurd,' 'Dangerous,' and Technically Inept

A draft of an encryption bill created by Senate Intelligence Committee leaders Richard Burr and Dianne Feinstein was released last night, revealing the scope of legislation that would require technology companies to decrypt data and share it in an "intelligible format" when served with a legal order. The Compliance with Court Orders Act of 2016, a copy of which was shared by Re/code, starts out by declaring that "no person or entity is above the law." It says that all providers of communication services and products, from hardware to software, must protect the privacy of residents of the United States through "implementation of appropriate data security," while still respecting the "rule of law" and complying with legal requirements and court orders to provide information stored either on devices or remotely.

To uphold both the rule of law and protect the interests and security of the United States, all persons receiving an authorized judicial order for information or data must provide, in a timely manner, responsive, intelligible information or data, or appropriate technical assistance to obtain such information.

In acknowledgement of the disagreement between the FBI and Apple, the legislation does include a clause that prevents it from authorizing "any government officer to require or prohibit any specific design or operating system to be adopted by any covered entity," and it shies away from specific technical demands. But the wording of the act itself, with no contingencies for inaccessible data, makes end-to-end encryption impossible. Any data encrypted by

Shooter's iPhone Could Harbor 'Dormant Cyber Pathogen', Claims San Bernardino DA

The iPhone at the center of the ongoing encryption dispute between Apple and the FBI may contain a "dormant cyber pathogen", according to the San Bernardino County District Attorney. The curious claim appears in an amicus brief filed by Michael Ramos with a California court on Thursday. In the document, Ramos speculates that the iPhone used by terror suspect Syed Rizwan Farook "may contain evidence that can only be found on the seized phone that it was used as a weapon to introduce a lying dormant cyber pathogen that endangers San Bernardino's infrastructure." The apparent threat is cited as a violation of California Penal Code Section 502, which covers protections against tampering, interference, damage, and unauthorized access to computer systems. The reference suggests Ramos believes some sort of malware may be contained on the iPhone, but he offers no justification for the claim, nor any explanation of its odd wording.

"It sounds like he's making up these terms as he goes," said iPhone forensics expert Jonathan Zdziarski, speaking to Ars Technica about the filing. "We've never used these terms in computer science." Zdziarski believes the amicus brief is simply designed to mislead the courts and manipulate a decision in the FBI's favor. "It offers no evidence whatsoever that the device has, or even might have, malware on it. It offers no evidence that their network was ever compromised."

The claim in the court filing is the first time a law enforcement agency has alluded to what may be contained on the iPhone at the center of the federal investigation. It also

Twitter, eBay, Airbnb, Reddit and More Officially Supporting Apple in FBI Fight [Updated]

Sixteen technology companies today teamed up to officially support Apple in its ongoing encryption dispute with the FBI, filing a joint amicus brief, a copy of which has been shared by Apple. Twitter, Airbnb, eBay, LinkedIn, Square, Atlassian, Automattic, Cloudflare, GitHub, Kickstarter, Mapbox, Meetup, Reddit, Squarespace, Twilio, and Wickr filed the brief [PDF] backing Apple's assertion that the FBI's use of the All Writs Act to force Apple to help the government unlock the iPhone used by San Bernardino shooter Syed Farook is both unprecedented and dangerous.

The government's demand here, at its core, is unbound by any legal limits. It would set a dangerous precedent, in which the government could sidestep established legal procedures authorized by thorough, nuanced statutes to obtain users' data in ways not contemplated by lawmakers.

The filing, which urges the court to vacate the government's motion to compel Apple to unlock the phone, argues that handling user data in a "safe, secure, and transparent manner" that protects privacy is of the "utmost importance" to protect consumers from hackers and other wrongdoers, while also recognizing the government's "important work" in law enforcement and national security. It says the companies oppose forced backdoors, but will continue to comply with "proper and reasonable" requests for data. Dozens of technology companies, industry trade groups, and encryption experts have been submitting documents in support of Apple, all catalogued on Apple's website. AT&T, Intel, and the Electronic Frontier Foundation filed separate amicus briefs this morning,

Hackers Using Law Enforcement Tools to Access iCloud Backups Unprotected by Two-Factor Authentication

Earlier today, Apple issued a press release stating that an iCloud/Find My iPhone breach had not been responsible for the leak of several private celebrity photos over the weekend, instead pointing towards a "very targeted attack on user names, passwords, and security questions" that hackers used to gain access to celebrity accounts. The company did not divulge specific details on how hackers accessed the iCloud accounts, leading Wired writer Andy Greenberg to investigate the methods the hackers might have used to acquire the stolen media. Greenberg visited Anon-IB, a popular anonymous image board where some of the celebrity photos first originated, and discovered that hackers there openly discuss exploiting software designed for law enforcement and government officials. Called ElcomSoft Phone Password Breaker (EPPB), the software in question lets hackers enter a stolen username and password to obtain a victim's full iPhone/iPad backup.

"Use the script to hack her passwd...use eppb to download the backup," wrote one anonymous user on Anon-IB, explaining the process to a less-experienced hacker. "Post your wins here ;-)"

Acquiring just a user name and password gives hackers access to content on iCloud.com, but with the ElcomSoft software, a complete backup can reportedly be downloaded into easy-to-access folders containing the device's contents. According to security researcher Jonathan Zdziarski, who spoke to Wired, metadata from some of the leaked photos is in line with the use of the ElcomSoft software and possibly the iBrute software, which
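The iBrute-style attack reportedly worked because the targeted endpoint did not throttle or lock out repeated password guesses. A toy Python sketch (the `Account` class and word list are invented for illustration; this is not any real Apple API) showing how an unthrottled login falls to a simple dictionary attack while a basic lockout counter stops it:

```python
# A tiny sample wordlist; real attacks used lists of leaked passwords.
COMMON_PASSWORDS = ["123456", "password", "qwerty", "letmein", "monkey"]

class Account:
    """Hypothetical login endpoint with an optional failed-attempt lockout."""

    def __init__(self, password, max_attempts=None):
        self.password = password
        self.max_attempts = max_attempts  # None = no throttling at all
        self.failures = 0
        self.locked = False

    def try_login(self, guess):
        if self.locked:
            return False
        if guess == self.password:
            return True
        self.failures += 1
        if self.max_attempts is not None and self.failures >= self.max_attempts:
            self.locked = True  # lock after too many failures
        return False

def dictionary_attack(account):
    """Return the cracked password, or None if the account resists."""
    for guess in COMMON_PASSWORDS:
        if account.try_login(guess):
            return guess
    return None

print(dictionary_attack(Account("letmein")))                  # cracked
print(dictionary_attack(Account("letmein", max_attempts=3)))  # None: locked out
```

Rate limiting and lockouts raise the cost of guessing, but only a second factor (as in two-factor authentication) protects an account once the password itself is known, which is why the headline vulnerability here is backups unprotected by 2FA.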

Apple Addresses iOS 'Backdoor' Concerns by Outlining Legitimate Uses for Targeted Services [Updated]

Earlier this week, forensic expert Jonathan Zdziarski attracted attention for his disclosures of what appeared to be "backdoors" in iOS that could allow for covert collection of users' information from their devices. While Apple issued a statement denying that anything nefarious was involved, the company has now posted a new support document (via Cabel Sasser) offering a limited description of the three services highlighted in Zdziarski's talk.

Each of these diagnostic capabilities requires the user to have unlocked their device and agreed to trust another computer. Any data transmitted between the iOS device and trusted computer is encrypted with keys not shared with Apple. For users who have enabled iTunes Wi-Fi Sync on a trusted computer, these services may also be accessed wirelessly by that computer.

The three processes are:

- com.apple.mobile.pcapd: Diagnostic packet capture to a trusted computer, used for diagnosing app issues and enterprise VPN connection problems.
- com.apple.mobile.file_relay: Used on internal devices and can be accessed (with user permission) by AppleCare for diagnostic purposes on the user's device.
- com.apple.mobile.house_arrest: Used by iTunes for document transfer and by Xcode during app development and testing.

Security experts will undoubtedly have additional questions about just how these services work and whether there are better and more secure ways of accomplishing the tasks they handle. At the very least, however, today's disclosure demonstrates a willingness by Apple to share information about the legitimate need

Forensic Expert Questions Covert 'Backdoor' Services Included in iOS by Apple

As part of a recent Hackers On Planet Earth (HOPE/X) conference presentation, forensic scientist and iPhone jailbreak expert Jonathan Zdziarski detailed several backdoor security mechanisms that are secretly included in iOS by Apple. These mechanisms make covert data collection easier for Apple and governmental authorities, reports ZDNet. Zdziarski confirms that iOS is reasonably secure from attack by a malicious hacker, but notes that the mobile OS includes several forensic services and notable design omissions that make it vulnerable to snooping by forensic tools. These services, such as "lockdownd," "pcapd" and "mobile.file_relay," can bypass encrypted backups to obtain data, and can be utilized via USB, Wi-Fi, and possibly cellular. They are not documented by Apple, and they are not developer or carrier tools, since they access personal data that would not be used for network testing or app debugging purposes. While detailing these backdoors, Zdziarski makes it clear that he is not a conspiracy theorist, but he does want to know why Apple appears to be deliberately compromising the security of the iPhone and opening the door to professional, covert data access.

I am not suggesting some grand conspiracy; there are, however, some services running in iOS that shouldn't be there, that were intentionally added by Apple as part of the firmware, and that bypass backup encryption while copying more of your personal data than ever should come off the phone for the average consumer. I think at the very least, this warrants an explanation and disclosure to