Apple Launches New Blog to Share Details on Machine Learning Research

Apple today debuted a new blog called the "Apple Machine Learning Journal," with a welcome message for readers and an in-depth look at the blog's first topic: "Improving the Realism of Synthetic Images." Apple describes the Machine Learning Journal as a place where readers can find posts written by the company's engineers about the machine learning work behind technologies in Apple's products.

In the welcome message, Apple encourages those interested in machine learning to contact the company at the blog's dedicated email address, machine-learning@apple.com.


Welcome to the Apple Machine Learning Journal. Here, you can read posts written by Apple engineers about their work using machine learning technologies to help build innovative products for millions of people around the world. If you’re a machine learning researcher or student, an engineer or developer, we’d love to hear your questions and feedback. Write us at machine-learning@apple.com

In the first post -- described as Vol. 1, Issue 1 -- Apple's engineers delve into training neural networks to refine synthetic images so that they look more realistic. Using synthetic images reduces cost, Apple's engineers point out, but the images "may not be realistic enough" and could result in "poor generalization" on real test images. Because of this, Apple set out to find a way to enhance synthetic images using machine learning.

Most successful examples of neural nets today are trained with supervision. However, to achieve high accuracy, the training sets need to be large, diverse, and accurately annotated, which is costly. An alternative to labelling huge amounts of data is to use synthetic images from a simulator. This is cheap as there is no labeling cost, but the synthetic images may not be realistic enough, resulting in poor generalization on real test images. To help close this performance gap, we’ve developed a method for refining synthetic images to make them look more realistic. We show that training models on these refined images leads to significant improvements in accuracy on various machine learning tasks.
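
The recipe described in that excerpt is essentially a refiner network trained adversarially against real images, with an extra term that keeps each refined image close to its synthetic source so the original annotations stay valid. As a rough illustration only, here is a minimal PyTorch sketch of that kind of training loop; the architectures, loss weighting, and function names are assumptions made for the example, not Apple's published implementation.

```python
# Illustrative sketch only: a simplified refiner/discriminator training step in
# the spirit of refining synthetic images with an adversarial loss plus a
# self-regularization term. Architectures and hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Refiner(nn.Module):
    """Small fully convolutional net: synthetic image in, refined image out."""
    def __init__(self, channels=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels, 1),
        )

    def forward(self, x):
        # Predict a residual so the refined image stays close to the input.
        return torch.tanh(x + self.net(x))

class Discriminator(nn.Module):
    """Classifies image patches as real (1) or refined/synthetic (0)."""
    def __init__(self, channels=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 1, 1),
        )

    def forward(self, x):
        return self.net(x)  # per-patch real/fake logits

def refiner_step(refiner, disc, opt_r, synthetic, lambda_reg=0.5):
    """One refiner update: fool the discriminator while staying near the input."""
    opt_r.zero_grad()
    refined = refiner(synthetic)
    logits = disc(refined)
    adv_loss = F.binary_cross_entropy_with_logits(logits, torch.ones_like(logits))
    reg_loss = F.l1_loss(refined, synthetic)  # preserve annotations (e.g. gaze)
    loss = adv_loss + lambda_reg * reg_loss
    loss.backward()
    opt_r.step()
    return loss.item()

def discriminator_step(disc, refiner, opt_d, real, synthetic):
    """One discriminator update: separate real images from refined ones."""
    opt_d.zero_grad()
    with torch.no_grad():
        refined = refiner(synthetic)
    real_logits, fake_logits = disc(real), disc(refined)
    loss = (F.binary_cross_entropy_with_logits(real_logits, torch.ones_like(real_logits))
            + F.binary_cross_entropy_with_logits(fake_logits, torch.zeros_like(fake_logits)))
    loss.backward()
    opt_d.step()
    return loss.item()
```

In practice, the two steps would alternate over batches of unlabeled real images and labeled synthetic ones, and the refined synthetic images, which keep their original labels, would then be used to train the downstream model.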

In December 2016, Apple's artificial intelligence team released its first research paper, which had the same focus on advanced image recognition as the first volume of the Apple Machine Learning Journal does today.

The new blog represents Apple's latest step in opening up about its work on AI and machine learning. During an AI conference in Barcelona last year, the company's head of machine learning, Russ Salakhutdinov, provided a peek behind the scenes at some of Apple's initiatives in these fields, including health and vital signs, volumetric detection using LiDAR, prediction with structured outputs, image processing and colorization, intelligent assistants and language modeling, and activity recognition. All of these could be potential subjects for research papers and blog posts in the future.

Check out the full first post in the Apple Machine Learning Journal on Apple's website.


Top Rated Comments

MikhailT
90 months ago
Except no, it isn’t. It isn’t in this area or other areas.

Apple is sharing their knowledge, and others? They aren't! Except infomercials.
I think he meant that Apple has been having a hard time recruiting AI researchers/scientists who need to be able to publish their work (they're not the typical engineer type). For Apple to benefit from their minds, it has to start opening up to the public. This isn't your traditional CS work; this is scientific research with a long history of journal-based review and public access.

There were many rumors that AI researchers turned down jobs at Apple simply because they would not be able to publish their work. For these scientists, it is not about the money or the company; it is all about having their work published under their name.

In addition, this is one of the areas where being aware of each other's research benefits everyone at the same time.

Google, Facebook, Microsoft, and others are in fact publishing their work through various mediums (magazines, research papers, etc.).

In fact, they all started a partnership to share research with each other, the Partnership on AI: https://www.partnershiponai.org (Apple is a founding member along with Microsoft, IBM, Google, Facebook, Amazon, etc.).
Score: 11 Votes
AngerDanger
90 months ago
In the interest of being all scientific and sharing stuff, I read about half of the blog post and realized some of the implications of its content. The blog specifically uses human eye images as its example when explaining machine learning and the refinement of synthetic images. Hmmmm, I wonder what Apple could be using all of this ocular information for? ;)

Assessing Gaze
Part of the blog places emphasis on knowing which direction the sampled eyes are looking. In fact, if the refinement process moves the iris too much, that output is (I think) weighted as less accurate. In the rumors leading up to the iP8 release, many commenters have voiced concern over the device's ability to understand whether or not you actually want it to unlock; it seems Apple might be attempting to address that concern.
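
As a toy illustration of the weighting idea described in that paragraph (down-ranking a refined image whose iris has drifted too far from its annotated position), something like the sketch below could be used; the threshold and linear decay are purely hypothetical and are not taken from Apple's post.

```python
# Hypothetical sketch: weight a refined training sample by how far refinement
# moved the annotated iris center. Threshold and decay are illustrative only.
import numpy as np

def sample_weight(iris_center_synthetic, iris_center_refined, max_shift_px=2.0):
    """Return a weight in [0, 1] that decays as the iris center drifts."""
    shift = np.linalg.norm(np.asarray(iris_center_refined, dtype=float)
                           - np.asarray(iris_center_synthetic, dtype=float))
    # Full weight while the gaze annotation still holds; linear decay to zero
    # once the drift exceeds twice the tolerated shift.
    return float(np.clip(1.0 - max(0.0, shift - max_shift_px) / max_shift_px,
                         0.0, 1.0))
```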



Use of Monochrome Samples
Folks have also discussed the potential inability of iris/eye scanning technology to work in the dark, but perhaps they're not considering that your iPhone (or Android) can already see you in the dark. When held to your face during a call in a dark environment, it will shut the screen off. Next to the earpiece, there's a little IR LED that illuminates objects held close to it, and when the phone sees that particular kind of IR light reflected back, it shuts the screen off.



If that light were brighter, it could illuminate the user's entire face. However, because it's only IR light, it wouldn't see the full visible spectrum of light (RGB); it would only see monochrome faces in the dark. It just so happens that the sample images Apple is using are already monochrome.

Anyway, I gotta go buy more tinfoil for my hat!

Score: 6 Votes
Crzyrio
90 months ago
Wait?!

Apple launches a blog with employees talking about how they are doing their job???????

Hell froze over, Steve Jobs DEFINITELY wouldn’t allow THAT!
It is a must in the AI field
Score: 5 Votes
alwaysbeincontact
90 months ago
Neat, interesting stuff; nice to see Apple getting into blogging now and posting about this future tech.
Score: 4 Votes
dabirdwell
90 months ago
Interesting! I didn't know about this partnership. I wonder how Elon Musk feels, and why Tesla hasn't joined.
He has OpenAI.

https://www.wired.com/2016/04/openai-elon-musk-sam-altman-plan-to-set-artificial-intelligence-free/
Score: 2 Votes
Zirel
90 months ago
Wait?!

Apple launches a blog with employees talking about how they are doing their job???????

Hell froze over, Steve Jobs DEFINITELY wouldn’t allow THAT!
Score: 1 Vote