Apple Shares Updated iOS Security Guide With Info on Shortcuts, Siri Suggestions, Screen Time and More

Apple today published an updated version of its iOS security white paper [PDF], with information on the new features and changes introduced in iOS 12.

According to Apple's Document Revision History, the updated guide covers iOS 12 features like Siri Suggestions, Siri Shortcuts, the Shortcuts app, Screen Time, Password AutoFill, student ID cards, and more.


On Siri Suggestions, for example, Apple explains that suggestions for apps and shortcuts are generated using on-device machine learning; the only data sent to Apple is information that can't be used to identify the user.

On the Shortcuts app, Apple explains that shortcuts can optionally be synced across Apple devices using iCloud or shared with other users. Because custom shortcuts can run user-specified JavaScript on websites in Safari, the document also describes how Apple guards against malicious scripts:
Custom shortcuts can also run user-specified JavaScript on websites in Safari when invoked from the share sheet. To protect against malicious JavaScript that, for example, tricks the user into running a script on a social media website that harvests their data, updated malware definitions are downloaded to identify malicious scripts at run-time. The first time a user runs JavaScript on a domain, the user is prompted to allow shortcuts containing JavaScript to run on the current webpage for that domain.
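The white paper doesn't show how that gate is implemented, but the flow it describes is easy to sketch. Below is a minimal Swift illustration of a per-domain consent check plus a run-time malware-definition check; the names (ScriptPermissionStore, runShortcutJavaScript) and the persistence details are hypothetical, not Apple's actual Shortcuts code.

```swift
import Foundation

/// Hypothetical per-domain consent gate for user-specified JavaScript,
/// loosely modeled on the behavior Apple describes for Shortcuts.
struct ScriptPermissionStore {
    var approvedDomains: Set<String> = []

    /// True if the user has already allowed JavaScript on this domain.
    func isApproved(_ domain: String) -> Bool {
        approvedDomains.contains(domain)
    }

    /// Records the user's one-time approval for a domain.
    mutating func approve(_ domain: String) {
        approvedDomains.insert(domain)
    }
}

func runShortcutJavaScript(_ script: String,
                           on domain: String,
                           store: inout ScriptPermissionStore,
                           promptUser: (String) -> Bool,
                           isFlaggedAsMalicious: (String) -> Bool) {
    // Run-time check against (hypothetical) downloaded malware definitions.
    guard !isFlaggedAsMalicious(script) else {
        print("Blocked: script matches malware definitions")
        return
    }
    // The first run on a given domain requires explicit user consent.
    if !store.isApproved(domain) {
        guard promptUser("Allow shortcuts to run JavaScript on \(domain)?") else {
            print("User declined to run JavaScript on \(domain)")
            return
        }
        store.approve(domain)
    }
    print("Running script on \(domain)")  // In reality the script runs in Safari via the share sheet.
}
```

The important design point is that consent is remembered per domain, so a script approved for one website can't silently run on another.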
Screen Time, meanwhile, uses CloudKit's end-to-end encryption to protect usage data. Apple collects Screen Time statistics only if iPhone and Apple Watch analytics is turned on, and even then it monitors just whether Screen Time was turned on during Setup Assistant, whether Screen Time is currently on, whether Downtime is enabled, the number of times the "Ask for more" feature is used, and the number of app limits applied.
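To make the scope of that collection concrete, here is a small Swift sketch that models only the data points Apple lists; the type and property names are hypothetical, since Apple's actual Screen Time implementation isn't public.

```swift
import Foundation

/// Hypothetical model of the Screen Time statistics Apple says it collects,
/// and only when iPhone and Apple Watch analytics is turned on.
struct ScreenTimeAnalytics: Codable {
    let enabledDuringSetupAssistant: Bool
    let screenTimeEnabled: Bool
    let downtimeEnabled: Bool
    let askForMoreRequestCount: Int
    let appLimitCount: Int
}

func reportIfOptedIn(_ stats: ScreenTimeAnalytics, analyticsSharingEnabled: Bool) {
    // Nothing is reported unless the user has opted in to analytics sharing.
    guard analyticsSharingEnabled else { return }
    // Actual usage data (which apps, for how long) is not part of this payload;
    // per Apple, that data is synced between a user's devices with CloudKit
    // end-to-end encryption instead.
    print("Would submit: \(stats)")
}
```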

One interesting bit in the document relates to the new iOS 12 feature that lets a second appearance be added to Face ID. Adding a secondary appearance, Apple notes, increases the probability that a random person could unlock the iPhone, from 1 in 1,000,000 to 1 in 500,000.
The probability that a random person in the population could unlock your iPhone is 1 in 50,000 with Touch ID or 1 in 1,000,000 with Face ID. This probability increases with multiple enrolled fingerprints (up to 1 in 10,000 with five fingerprints) or appearances (up to 1 in 500,000 with two appearances).
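The arithmetic behind those figures is straightforward if you assume each enrolled fingerprint or appearance gives a random person an independent chance to match. A quick Swift sketch (an approximation, not Apple's published methodology) reproduces the numbers in the quote:

```swift
import Foundation

/// Approximate combined false-accept probability for n independent enrollments,
/// each with a per-enrollment false-accept probability of p.
func combinedFalseAcceptProbability(perEnrollment p: Double, enrollments n: Int) -> Double {
    // P(at least one match) = 1 - (1 - p)^n, roughly n * p when p is tiny.
    1.0 - pow(1.0 - p, Double(n))
}

let touchID = combinedFalseAcceptProbability(perEnrollment: 1.0 / 50_000, enrollments: 5)
let faceID = combinedFalseAcceptProbability(perEnrollment: 1.0 / 1_000_000, enrollments: 2)

print("Touch ID, five fingerprints: about 1 in \(Int((1.0 / touchID).rounded()))")  // ~1 in 10,000
print("Face ID, two appearances: about 1 in \(Int((1.0 / faceID).rounded()))")      // ~1 in 500,000
```

In other words, each extra enrollment adds another way for a stranger to match, which is why the odds get worse rather than better.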
Apple's security document explains in detail how iOS 12's features work and how they are protected. The guide is filled with small but significant details, and for anyone interested in the security of the iPhone and the iPad, the full document is worth reading.



Top Rated Comments

13 weeks ago

I still don’t understand any of that. Shouldn’t increasing the appearances or fingerprints decrease the probability of false positives, as in make it harder for unauthorized access because there’s more data to screen against before granting access?


No, because adding an additional face or fingerprint isn't giving additional data to an existing entry - it is adding a second entry.

Think of it this way. Let's say you use a 4-digit PIN to unlock your phone. The chance of a person guessing that PIN is 1 in 10000. Now let's say your phone can be unlocked not just with that one PIN, but also with a second PIN. Suddenly, the chance of a person guessing a working PIN becomes 2 in 10000, or 1 in 5000.

The same idea goes for FaceID and TouchID, the difference being that someone isn't going to be "guessing" your fingerprint or face - but that a person with similar fingerprints or face may be able to unlock the phone. This is called a false positive - someone is able to unlock the phone when they shouldn't be able to (versus a false negative, when someone should be able to unlock the phone but they can't).

At the moment, the false positive rate for FaceID is 1 in 1000000 - i.e. the chance of a person who looks similar enough to you unlocking your phone is 1 in a million. If you add a second appearance (either of your own face or someone else's), then the false positive rate will double to 2 in 1000000, or 1 in 500000.
Rating: 5 Votes
13 weeks ago

Adding a secondary appearance, says Apple, will decrease the probability that a random person can unlock the iPhone from 1 in 1,000,000 to 1 in 500,000.

That is a misinterpretation of the information. Adding additional fingerprints or appearances increases the probability of false positives. It is stated in the quote just after:

The probability that a random person in the population could unlock your iPhone is 1 in 50,000 with Touch ID or 1 in 1,000,000 with Face ID. This probability increases with multiple enrolled fingerprints (up to 1 in 10,000 with five fingerprints) or appearances (up to 1 in 500,000 with two appearances).

Rating: 5 Votes
13 weeks ago

I came, I saw, I closed the App.

I am a power user and I can't think of a single use case.

Then that makes you an average user. :p
Rating: 2 Votes
13 weeks ago

I came, I saw, I closed the App.

I am a power user and I can't think of a single use case.


Then you aren't really a power user.
Rating: 2 Votes
12 weeks ago

I'm sorry, but what's the difference between a mathematical representation and a pixel representation? They're both unique, so there isn't anything more secure about one vs the other. You might save some storage space at most.


It’s a huge difference! A mathematical representation isn’t going to be reverse-engineered back into your face. There are no identifying markers to trace it back to you.

That's what they say, but you don't know if that's what they do. They don't let you verify their software... It's proprietary; they can say one thing and do another... Just like any malicious entity.
It's easy to make a software button look unselected but make the internal choice selected. It's also easy to write around the signed authorization... The fact that they have the ability to do this at all should concern people.
If they didn't want people to have access to this information, they wouldn't build a door to get it... Which is suspicious in my view. And it is probably designed for abuse from the beginning...


Do you really think that Apple would secretly do this? Do you know what kind of PR nightmare that would turn into if someone found out?

Seriously, if you’re this paranoid, just don’t use FaceID. But stop spreading FUD.
Rating: 2 Votes
12 weeks ago

There certainly are identifying markers in a mathematical model... That's what makes each face unique... And I'm sure a government, with Apple's help, could determine who each person is by looking at the model or some national database.



It's spreading awareness... These technologies are already abused and continue to be abused to the detriment of our liberties.

Yeah, I think Apple would and has done this... I'm sure they've had to bow down to US intelligence services in one form or another... They were a PRISM member, after all... Of course it would be a PR nightmare. But they documented a capability in plain sight and no one is talking about it... Who's to say Apple doesn't have a secret certificate and authentication team swapping out code for governments to take advantage of?

I have no doubt Apple goes to extreme lengths to protect their PR. Faux court cases, for one example.


Believe what you will, but you’re basing your assumptions on zero evidence. Like I said, if you don’t trust Apple or are truly paranoid, don’t use FaceID - but through your wild accusations without any proof you’re simply spreading fear, uncertainty and doubt.
Rating: 1 Votes
13 weeks ago

I still don’t understand any of that. Shouldn’t increasing the appearances or fingerprints decrease the probability of false positives, as in make it harder for unauthorized access because there’s more data to screen against before granting access?


No, if you have 5 fingerprints to match against instead of 1, then there is now 5 times the chance that a fingerprint will match and unlock the phone. Same with Face ID, where now 2 faces unlock the phone instead of 1 - more of a chance. So more fingerprints/faces to match against random people = a higher probability of getting a false positive.

I think you're confusing that with the chance of you yourself unlocking the phone, which also increases when you have 5 fingerprints enrolled instead of 1. They are two different things.
Rating: 1 Votes
13 weeks ago
I came, I saw, I closed the App.

I am a power user and I can't think of a single use case.
Rating: 1 Votes
12 weeks ago

You realize that a picture of your face isn't actually stored on the phone? It's only a mathematical representation of your face, just like with TouchID (TouchID never actually stores images of your fingerprints, just a mathematical representation).


I'm sorry, but what's the difference between a mathematical representation and a pixel representation? They're both unique, so there isn't anything more secure about one vs the other. You might save some storage space at most.


In addition, in the paragraph you posted, it explicitly says that a digitally signed authorization is required before enabling FaceID - i.e., it is not on by default, and can only be turned on by the user when explicit permission is given.


That's what they say, but you don't know if that's what they do. They don't let you verify their software... It's proprietary; they can say one thing and do another... Just like any malicious entity.
It's easy to make a software button look unselected but make the internal choice selected. It's also easy to write around the signed authorization... The fact that they have the ability to do this at all should concern people.
If they didn't want people to have access to this information, they wouldn't build a door to get it... Which is suspicious in my view. And it is probably designed for abuse from the beginning...
Rating: 1 Votes
12 weeks ago

Lots of evidence points to this kind of track record. Just do a Google search for iOS backdoor research and you'll find people who've studied the OS and found evidence of data gathering. Apple isn't your friend, and I think you're making a mistake if you think they have your privacy in mind.

And I do have evidence: the mere fact that Apple has the ability to pull FaceID information with a valid certificate is enough to raise the accusation that this system is rigged for those whom Apple is obliged to grant access to.

So your accusation that it's FUD is more ridiculous than anything. People should be aware of this and of these injustices.


I think, at this point, the only response I can give to you is simply:

OK
Rating: 1 Votes