Deep Neural Networks for Face Detection Explained on Apple's Machine Learning Journal

Apple today published a new entry in its online Machine Learning Journal, this time covering an on-device deep neural network for face detection, the technology that powers the facial recognition feature in Photos and other apps.

Facial detection was first introduced as part of iOS 10 in the Core Image framework, where it was used on-device to detect faces in photos so people could view their images by person in the Photos app.
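For context, the Core Image face detection API mentioned above is exposed to developers through the CIDetector class. Below is a minimal Swift sketch of calling it; the photo path is a placeholder and the code is illustrative rather than Apple's own implementation.

```swift
import CoreImage
import Foundation

// Create a Core Image face detector (the CIDetector API referenced above).
let options: [String: Any] = [CIDetectorAccuracy: CIDetectorAccuracyHigh]
guard let detector = CIDetector(ofType: CIDetectorTypeFace, context: nil, options: options),
      // Placeholder path; point it at a real photo to try this out.
      let image = CIImage(contentsOf: URL(fileURLWithPath: "/path/to/photo.jpg")) else {
    fatalError("Could not create the detector or load the image")
}

// Each detected face is returned as a CIFaceFeature with a bounding box in image coordinates.
for case let face as CIFaceFeature in detector.features(in: image) {
    print("Face at \(face.bounds)")
}
```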


Implementing this technology was no small feat, says Apple, as it required "orders of magnitude more memory, much more disk storage, and more computational resources."
Apple explains in the journal entry:

Apple's iCloud Photo Library is a cloud-based solution for photo and video storage. However, due to Apple's strong commitment to user privacy, we couldn't use iCloud servers for computer vision computations. Every photo and video sent to iCloud Photo Library is encrypted on the device before it is sent to cloud storage, and can only be decrypted by devices that are registered with the iCloud account. Therefore, to bring deep learning based computer vision solutions to our customers, we had to address directly the challenges of getting deep learning algorithms running on iPhone.
Apple's Machine Learning Journal entry describes how Apple overcame these challenges by leveraging the GPU and CPU in iOS devices; developing memory optimizations for network inference, image loading, and caching; and implementing the network in a way that does not interfere with the other tasks expected of an iPhone.
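According to the journal, the resulting deep-learning-based face detector is also exposed to third-party developers through the Vision framework. As a rough illustration (not Apple's internal code; the file path is a placeholder), a face-rectangle request in Swift might look like this:

```swift
import Vision
import Foundation

// Build a face-rectangle request; the completion handler receives VNFaceObservation results.
let request = VNDetectFaceRectanglesRequest { request, _ in
    guard let faces = request.results as? [VNFaceObservation] else { return }
    for face in faces {
        // boundingBox is normalized to the image (0...1, origin at the bottom-left).
        print("Face bounding box: \(face.boundingBox)")
    }
}

// Placeholder path; in a real app, run this off the main thread.
let handler = VNImageRequestHandler(url: URL(fileURLWithPath: "/path/to/photo.jpg"), options: [:])
do {
    try handler.perform([request])  // Runs the detector synchronously on the calling thread.
} catch {
    print("Face detection failed: \(error)")
}
```

Vision decides internally how to schedule the work across the CPU and GPU, which is the layer where the kinds of optimizations described in the journal come into play.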

The new entry is well worth reading if you're interested in the specific details behind how Apple overcame these challenges to successfully implement the feature. The technical details are dense but understandable, and the entry provides some interesting insight into how facial recognition works.

With its Machine Learning Journal, Apple aims to share the complex concepts behind its technology so the users of its products can get a look behind the curtain. It also serves as a way for Apple's engineers to participate in the AI community.

Apple has previously shared several articles on Siri, including how "Hey Siri" works, and a piece on using machine learning and neural networks to refine synthetic images.



Top Rated Comments

12 months ago

Quoting another comment: "But bahgawd, the technology Apple is pulling off really is falling squarely into the realm of magical."

The entire AI community is working on some voodoo. If you couple it with advances in robotics... we are about 2 hours and 23 minutes away from the uprising. This mofo is untethered.
[Embedded YouTube video: fRj34o4hN4I]
12 months ago

Quoting another comment: "Anyone else just wowed by the amount of technology embedded into this new iPhone? Our phones are learning more about us than we ever knew before."

Yet people keep complaining about the price because all it has is the same hardware as the iPhone 8, making childish comments that attribute the extra cost to the OLED screen, and claiming that there is no innovation to be had in the smartphone industry. These high-school-minded kids have got to grow up.
12 months ago
Anyone else just wowed by the amount of technology embedded into this new iPhone? Our phones are learning more about us than we ever knew before.
12 months ago
Do I enjoy paying over $1,000 for a phone (as I have for what, 3 straight years now?)? No. Will I continue to do so? Until they make me a 512GB phone south of that mark, yes. Yes I will.

But bahgawd, the technology Apple is pulling off really is falling squarely into the realm of magical.