Apple has been awarded a patent by the U.S. Patent and Trademark Office (via AppleInsider) for a digital camera including a refocusable imaging mode adaptor, with the document also discussing the potential use of a similar camera system in a device like the iPhone.
The patent details a camera that can be configured to operate in a lower-resolution mode with refocusing capability as well as a high-resolution non-refocusable mode, with the camera body containing an imaging mode adaptor to switch between the two.
Also cited as prior art in the patent is the plenoptic imaging system used in the Lytro light-field camera. Apple draws inspiration from that design, but notes that its own microlens array can produce higher-quality images because of its higher spatial resolution.
Microlens (440) inserted into light path for lower-resolution refocusable images
A digital camera system configurable to operate in a low-resolution refocusable mode and a high-resolution non-refocusable mode comprising: a camera body; an image sensor mounted in the camera body having a plurality of sensor pixels for capturing a digital image; an imaging lens for forming an image of a scene onto an image plane, the imaging lens having an aperture; and an adaptor that can be inserted between the imaging lens and the image sensor to provide the low-resolution refocusable mode and can be removed to provide the high-resolution non-refocusable mode, the adaptor including a microlens array with a plurality of microlenses; wherein when the adaptor is inserted to provide the low-resolution refocusable mode, the microlens array is positioned between the imaging lens and the image sensor.
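As a rough illustration, the two modes in the claim can be modeled as a single sensor whose pixels are either read out directly or grouped under microlenses. All names and numbers below are illustrative assumptions, not values from the patent:

```python
# Toy model of the two imaging modes described in the patent claim.
# All names and numbers here are illustrative assumptions, not from the patent.

from dataclasses import dataclass

@dataclass
class CameraConfig:
    sensor_pixels: int      # total pixels on the image sensor
    microlens_pitch: int    # sensor pixels behind each microlens (per axis)
    adaptor_inserted: bool  # True = low-resolution refocusable mode

    def output_pixels(self) -> int:
        """Effective output resolution for the current mode."""
        if self.adaptor_inserted:
            # Each microlens trades spatial resolution for angular samples:
            # an N x N patch of sensor pixels yields one refocusable pixel.
            return self.sensor_pixels // (self.microlens_pitch ** 2)
        # Adaptor removed: every sensor pixel contributes to the image.
        return self.sensor_pixels

# Example: a hypothetical 8 MP sensor with 10 x 10 pixels per microlens.
cam = CameraConfig(sensor_pixels=8_000_000, microlens_pitch=10, adaptor_inserted=True)
print(cam.output_pixels())   # 80000 -> 0.08 MP refocusable image
cam.adaptor_inserted = False
print(cam.output_pixels())   # 8000000 -> full 8 MP standard image
```

The sketch captures the trade-off at the heart of the claim: the same sensor yields either a full-resolution conventional image or a much smaller refocusable one, depending on whether the microlens adaptor sits in the light path.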
Microlens (440) removed from light path for higher-resolution standard images
Apple's patent outlines how such a lens system could be integrated with a more complete camera solution incorporating image correction and other features, either in a standalone product or within a mobile device.
The Lytro-like technology naturally leads to speculation that it could be used in Apple's rumored standalone point-and-shoot digital camera, which was first rumored in 2012 after Steve Jobs was quoted in his Walter Isaacson biography as saying that he wanted to reinvent three industries, one of them being photography. Isaacson's biography also noted that Jobs had met with the CEO of Lytro, although it has been unclear how much direct interest Apple had in Lytro's technology.
This is stupid. Nobody has ever had a need to refocus after the shot, because you can focus when you TAKE the shot in the first place. Also, smartphones' small sensors have huge depth of field anyway. You only get shallow/unfocused areas with large sensors....
This is what I was getting at. A human user would not likely want to re-focus a shot, but a computer might. The computer would do the re-focus in order to gain depth information. With that info it could create a wire frame and a texture map.
Combine this wire-frame 3D image with the 3D sensor they reported yesterday and you can drop a real person into a video game.
Today, if you tried that with a still image, you'd have a "cardboard cutout" dropped into the game. It would look bad. But a real 3D character? People would buy that.
You could turn it around backwards too. Take a refocusable image of a room. Now you can drop a virtual camera into the scene and move it around. In a game you could place the characters in your environment, but for, say, real-estate sales you can make better presentations because you have the 3D data to allow perspective changes with viewpoint changes.
So YES, I agree, who would want to refocus an image? Answer: software would.
This is stupid. Nobody has ever had a need to refocus after the shot, because you can focus when you TAKE the shot in the first place. Also, smartphones' small sensors have huge depth of field anyway. You only get shallow/unfocused areas with large sensors.
It's a dead-end technology.
The most important and useful photography technology that Apple could implement would be to add optical image stabilization. The next would be larger sensors.
Other options would be to allow for interchangeable lenses, and to provide Aperture capability on a mobile device.
A professional photographer needs to edit and publish photos as quickly as possible. The genius of smartphones is that they allow the editing/publishing part to happen on a mobile device in the field. The next step would be to implement a higher-quality imaging system (35mm full-frame sensors, various lenses, flash/strobe mounts, other SLR features, etc.).
No need to get silly with light field tech. Just look at what's needed (high-end imaging and rapid publishing) and implement a solution for that.
Although this is very cool, I would much appreciate a megapixel update on the next iPhone if possible, Apple... even just a little to keep up with the Nokia Lumia!!
It's not the number of pixels, but the size that counts.
Although this is very cool, I would much appreciate a megapixel update on the next iPhone if possible, Apple... even just a little to keep up with the Nokia Lumia!!
good that you're not in charge of Apple. Megapixels are not everything.
Light field technology is the only way smart phone cameras can continue to shrink in size and increase in quality. Good news.
Quality? The light field image has much lower resolution. That is why Apple's patent allows you to switch from normal to light field: one mode gives you a good image with one plane in focus, and the other gives you a refocusable image, but at much lower resolution.
If the sensor has only so many pixels, you can use those pixels in two ways. A light field camera might use 100 sensor pixels per image pixel.
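With the 100-pixels-per-image-pixel figure above, the trade-off works out as follows (the sensor size is an assumed, illustrative number):

```python
# Toy arithmetic for the spatial-resolution cost of light field capture,
# assuming a 10 x 10 patch of sensor pixels behind each microlens
# (100 angular samples per final image pixel; sensor size is hypothetical).

sensor_megapixels = 40           # hypothetical high-resolution sensor
pixels_per_image_pixel = 10 * 10 # sensor pixels consumed per output pixel

refocusable_megapixels = sensor_megapixels / pixels_per_image_pixel
print(refocusable_megapixels)    # 0.4 -> a 40 MP sensor yields a 0.4 MP refocusable image
```

At that ratio, even a very large sensor produces only a small refocusable image, which is why the patent's removable adaptor matters: it lets the full sensor resolution come back when refocusing isn't needed.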
How could Apple use this? The technology makes for a good 3D camera too. I doubt many people will want to re-focus their images but they might want 3D and stereo images with one click. Light field can do that.