Apple Aims to Prevent Blurry or Underexposed iPhone Photos with Automatic Image Buffering and Comparison
A newly published Apple patent application, discovered by AppleInsider, describes methods that would allow an iPhone to buffer a series of photos before the user presses the camera's shutter button and then automatically select the best one.
Camera shake or a less-than-optimal angle can easily produce blurry or dark photos in low-light conditions, even with the iPhone's relatively capable camera. Under the patented approach, the camera begins capturing a series of photos before the user presses the shutter release, then automatically compares them with the one taken at the moment the button was pressed. If the system judges one of the buffered photos to be better, it stores that photo in place of the one taken at shutter release.
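The core mechanism amounts to a rolling buffer of recent frames that is compared against the frame captured at shutter release. A minimal sketch of that idea follows; the `Frame` type, buffer size, and single sharpness score are stand-in assumptions, not anything taken from the filing itself:

```swift
import Foundation

// Stand-in frame record: in a real pipeline this would wrap pixel data.
struct Frame {
    let timestamp: TimeInterval
    let sharpness: Double   // higher = crisper (e.g. a contrast-based focus measure)
}

final class PreShutterBuffer {
    private var frames: [Frame] = []
    private let capacity: Int

    init(capacity: Int = 10) {
        self.capacity = capacity
    }

    // Called continuously while the camera preview runs, before the shutter is pressed.
    func append(_ frame: Frame) {
        frames.append(frame)
        if frames.count > capacity {
            frames.removeFirst()     // keep only the most recent frames
        }
    }

    // Called when the shutter fires: keep whichever frame scores best,
    // whether buffered or captured at the moment of the press.
    func bestFrame(comparedTo shutterFrame: Frame) -> Frame {
        let candidates = frames + [shutterFrame]
        defer { frames.removeAll() }  // the remaining frames are discarded
        return candidates.max(by: { $0.sharpness < $1.sharpness }) ?? shutterFrame
    }
}
```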
In particular, the system seeks to minimize the camera shake that can accompany pressing the iPhone's volume button or tapping the screen to trigger the shutter, by capturing images before the button or screen is even touched.
The algorithm described in the patent application uses a scoring system that measures contrast (the usual method used to judge focus), image resolution, dynamic range (the balance of light and dark tones in the image), and color rendering properties to determine which is the best version of the photo. The others are then discarded.
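The filing does not publish its actual metrics or weights, but a selection pass over the named criteria might look something like the sketch below; the `ImageMetrics` fields, the 0–1 normalization, and the weights are illustrative assumptions rather than Apple's values:

```swift
import Foundation

// Hypothetical per-image metrics mirroring the criteria named in the filing.
struct ImageMetrics {
    let contrast: Double        // proxy for focus, normalized to 0...1
    let resolution: Double      // effective detail retained, 0...1
    let dynamicRange: Double    // balance of shadows and highlights, 0...1
    let colorRendering: Double  // color fidelity, 0...1
}

// Weighted sum; contrast (focus) is weighted most heavily here as an assumption.
func qualityScore(_ m: ImageMetrics) -> Double {
    return 0.4 * m.contrast
         + 0.2 * m.resolution
         + 0.2 * m.dynamicRange
         + 0.2 * m.colorRendering
}

// The candidate with the highest score is kept; the rest would be discarded.
func selectBest(_ candidates: [ImageMetrics]) -> ImageMetrics? {
    candidates.max(by: { qualityScore($0) < qualityScore($1) })
}
```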
While the selection of the image is an automatic process, the system could allow the user to confirm the device's choice of the best available photo.
The patent application was filed in October of last year but references an earlier application filed in 2009, so it is possible that elements of this approach are used in current iPhones and iPads, although it is clear that the current Camera app for iOS does not include all aspects of the system.
Top Rated Comments
Time for Apple to start innovating and stop rehashing other people's ideas...
Yes, many other physical cameras and apps already do this. They take a quick series of pictures either side of the shutter button press, then attempt to work out which one is the best by analysing them for motion blur. The one with the least motion blur is then saved out. It's often mis-marketed as an "image stabiliser" function but effectively achieves the same thing most of the time.
Some cameras with face detection go further and attempt to discard photos where people are blinking. Chances are, if it has taken 10 photos either side of the shutter press, at least one will be crisp with nobody blinking. I'm not sure why Apple are now trying to patent this common technique - it's hardly new.
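For what it's worth, the blink-filtering step described in this comment can be approximated today with Core Image's face detector. A rough sketch, which assumes the burst capture and sharpness ranking happen elsewhere:

```swift
import CoreImage

// Drop frames in which any detected face has a closed eye; the sharpest of the
// remaining frames would then be chosen by a separate blur measure.
func framesWithoutBlinks(_ frames: [CIImage]) -> [CIImage] {
    guard let detector = CIDetector(ofType: CIDetectorTypeFace,
                                    context: nil,
                                    options: [CIDetectorAccuracy: CIDetectorAccuracyHigh]) else {
        return frames
    }
    return frames.filter { image in
        // CIDetectorEyeBlink asks Core Image to report per-eye closed/open state.
        let faces = detector.features(in: image, options: [CIDetectorEyeBlink: true])
        return faces.compactMap { $0 as? CIFaceFeature }
                    .allSatisfy { !$0.leftEyeClosed && !$0.rightEyeClosed }
    }
}
```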