Apple has been awarded a patent by the U.S. Patent and Trademark Office (via AppleInsider) for a digital camera including a refocusable imaging mode adaptor, with the document also discussing the potential use of a similar camera system in a device like the iPhone.
The patent details a camera that can be configured to operate in a lower-resolution mode with refocusing capability, in addition to a high-resolution non-refocusable mode, with the camera body containing an imaging mode adaptor to switch between the two.
The patent also cites the plenoptic imaging system used in the Lytro light-field camera as prior art; Apple draws on a similar approach, but notes that its own microlens array can produce higher-quality images because of its higher spatial resolution.
Microlens (440) inserted into light path for lower-resolution refocusable images
A digital camera system configurable to operate in a low-resolution refocusable mode and a high-resolution non-refocusable mode, comprising: a camera body; an image sensor mounted in the camera body having a plurality of sensor pixels for capturing a digital image; an imaging lens for forming an image of a scene onto an image plane, the imaging lens having an aperture; and an adaptor that can be inserted between the imaging lens and the image sensor to provide the low-resolution refocusable mode and can be removed to provide the high-resolution non-refocusable mode, the adaptor including a microlens array with a plurality of microlenses; wherein when the adaptor is inserted to provide the low-resolution refocusable mode, the microlens array is positioned between the imaging lens and the image sensor.
Microlens (440) removed from light path for higher-resolution standard images
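For readers unfamiliar with plenoptic capture, the refocusing that the claimed microlens mode enables is commonly illustrated with a shift-and-add scheme over the per-microlens sub-images: each viewpoint's sub-image is translated in proportion to its angular offset, then all sub-images are averaged. A minimal sketch of that general technique (illustrative only; the function name and array layout are assumptions, not taken from the patent):

```python
import numpy as np

def refocus(light_field, shift):
    """Shift-and-add refocus of a 4-D light field.

    light_field: array of shape (U, V, S, T) -- one S x T sub-image
    per microlens viewpoint (u, v). `shift` selects the synthetic
    focal plane: each sub-image is translated in proportion to its
    offset from the central viewpoint, then all are averaged.
    """
    U, V, S, T = light_field.shape
    cu, cv = (U - 1) / 2, (V - 1) / 2
    out = np.zeros((S, T))
    for u in range(U):
        for v in range(V):
            dy = int(round(shift * (u - cu)))
            dx = int(round(shift * (v - cv)))
            out += np.roll(light_field[u, v], (dy, dx), axis=(0, 1))
    return out / (U * V)
```

With `shift = 0` this reduces to a plain average of all viewpoints; sweeping `shift` moves the synthetic focal plane through the scene after capture.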
Apple's patent outlines how such a lens system could be integrated with a more complete camera solution incorporating image correction and other features, either in a standalone product or within a mobile device.
The Lytro-like technology naturally leads to speculation that it could be used in Apple's rumored standalone point-and-shoot digital camera, first rumored in 2012 after Steve Jobs was quoted in his biography by Walter Isaacson as saying he wanted to reinvent three industries, one of them being photography. Isaacson's biography also noted that Jobs had met with the CEO of Lytro, although it has been unclear how much direct interest Apple had in Lytro's technology.
This is stupid. Nobody has ever had a need to refocus after the shot, because you can focus when you TAKE the shot in the first place. Also, smartphones' small sensors have a huge depth of field anyway. You only get shallow/unfocused images with large sensors....
This is what I was getting at. A human user would not likely want to re-focus a shot, but a computer might. The computer would do the re-focus in order to gain depth information. With such info it could create a wireframe and a texture map.
Combine this wireframe 3D image with the 3D sensor they reported yesterday and you can drop a real person into a video game.
Today, if you tried that with a still image you'd have a "cardboard cutout" dropped into the game. It would look bad. But a real 3D character? People would buy that.
You could turn it around backwards too. Take a re-focusable image of a room. Now you can drop a virtual camera into the scene and move the camera around. In a game you could place the characters in your environment, and for something like real-estate sales you could make better presentations, because you have the 3D data to allow perspective changes with viewpoint changes.
So YES, I agree: who would want to refocus an image? Answer: software would.
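The depth-extraction idea above can be illustrated: given the same scene refocused at several depths (which a light-field capture provides), software can pick, per pixel, the depth at which the image is sharpest. A toy sketch with hypothetical names, not any shipping algorithm:

```python
import numpy as np

def depth_from_focal_stack(stack):
    """Pick, per pixel, the focal slice with the strongest local
    contrast (a crude sharpness measure) as its depth index.

    stack: (N, H, W) array of the same scene refocused at N depths.
    Returns an (H, W) integer depth-index map.
    """
    sharp = []
    for img in stack:
        # discrete Laplacian magnitude as a sharpness proxy
        lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
               np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
        sharp.append(np.abs(lap))
    return np.argmax(np.stack(sharp), axis=0)
```

A real pipeline would smooth the sharpness maps and regularize the result, but the per-pixel depth index is exactly the raw material a wireframe-plus-texture reconstruction would start from.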
It's a dead end technology.
The most important and useful photography technology that Apple could implement would be to add optical image stabilization. The next would be larger sensors.
Other options would be to allow for interchangeable lenses, and to provide Aperture capability on a mobile device.
A professional photographer has a need to edit and publish photos as quickly as possible. The genius thing about smartphones is that they allow the editing/publishing part to happen on mobile devices in the field. The next step would be to implement a higher-quality imaging system (35mm full-frame sensors, various lenses, flash/strobe mounts, other SLR features, etc.).
No need to get silly with light field tech. Just look at what's needed (high-end imaging and rapid publishing) and implement a solution for that.
Although this is very cool, I would much appreciate a megapixel update on the next iPhone if possible, Apple... even just a little, to keep up with the Nokia Lumia!!
It's not the number of pixels, but the size that counts.
Good thing you're not in charge of Apple. Megapixels are not everything.
Light field technology is the only way smart phone cameras can continue to shrink in size and increase in quality. Good news.
Quality? The light-field image has much lower resolution. That is why Apple's patent allows you to switch between normal and light-field modes: one mode gives you good images with a single plane in focus, and the other gives a refocusable image, but at much lower resolution.
If the sensor has only so many pixels, you can use those pixels in two ways. A light-field camera might use 100 sensor pixels per image pixel.
How could Apple use this? The technology makes for a good 3D camera too. I doubt many people will want to re-focus their images but they might want 3D and stereo images with one click. Light field can do that.
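The pixel budget described above is simple arithmetic. With the quoted 100 sensor pixels per image pixel, the tradeoff is stark (the sensor size here is hypothetical; the patent specifies none):

```python
# Hypothetical numbers -- assume an 8 MP sensor and a microlens array
# that puts a 10 x 10 patch of sensor pixels behind each microlens
# (the "100 sensor pixels per image pixel" mentioned above).
sensor_pixels = 8_000_000
pixels_per_image_pixel = 10 * 10

refocusable_pixels = sensor_pixels // pixels_per_image_pixel
print(refocusable_pixels)  # 80000 -- i.e. 0.08 MP vs. 8 MP in normal mode
```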
Read the article... it's not necessarily a standalone camera...