Chris Chang of Nomura Securities has issued a note to investors claiming that Sony may be running behind schedule in supplying Apple with dual-lens camera modules for the next-generation 5.5-inch iPhone, which rumors suggest may be called the iPhone 7 Plus or iPhone Pro.
We think Sony may not be able to deliver its full share of dual cameras to Apple due to: (1) lower-than-expected yield, and (2) damage to its production facility from the April earthquake in Kumamoto.
Chang believes that Apple will instead turn to LG as its primary supplier of dual-lens camera modules for the larger-sized iPhone expected to launch in September.
Both investment bank Nomura Securities and research firm Citi Research believe all 5.5-inch iPhones will be equipped with dual-lens camera modules, rather than just one model, echoing comments made by KGI Securities analyst Ming-Chi Kuo earlier this month.
Kuo previously said that Apple had two 5.5-inch versions of the iPhone 7 Plus in development, one with a single iSight rear-facing camera and another with a dual-lens camera module, but he has since walked back that claim. Meanwhile, the 4.7-inch iPhone 7 is widely expected to retain a single-lens camera.
Nomura Securities also believes that Apple will include optical image stabilization (OIS) on both the 4.7-inch and 5.5-inch iPhone, whereas the feature has been exclusive to the larger iPhone 6 Plus and iPhone 6s Plus over the past two generations.
Leaked images, components, and renders potentially offer a first look at Apple's dual-lens camera system, but rumors have been conflicting about the exact design.
The switch to dual-lens camera modules has been linked to Apple's acquisition of LinX Imaging, which could lead to "DSLR-quality" photos on iPhones. LinX's multi-aperture cameras are also smaller than single-aperture cameras, meaning the iPhone 7 Plus could have a slightly less protruding camera lens.
LinX camera modules offer a number of other benefits, including 3D depth mapping, better color accuracy and uniformity, ultra HDR, low noise levels, higher resolution, low costs, zero shutter lag, and a compact design that allows for edge-to-edge displays. A recent video demo provides a good overview of dual-camera technology.
Apple recently patented a dual-camera system consisting of one standard wide-angle lens, similar to what is found in the latest iPhones, and a second telephoto lens capable of capturing zoomed-in video and photos. In a recent video, we visualized what the interface could look like on future iOS devices.
Dual-camera smartphones like the Huawei P9 and P9 Plus are expected to inspire "killer apps" from smartphone manufacturers and third-party developers.
Top Rated Comments
This is a tech company, not a clothing company. Adding color and size options is just a distraction for stalled innovation.
I like it better when they cater to me.
The term "optical zoom," for example, is the most misused in all of mobile photography. OPTICAL zoom changes the angle of view and the perspective/visual relationship between foreground and background objects as you move in or out, because it changes the focal length of the lens.
OTOH, "digital zoom," no matter how you cut it, is simply in-camera cropping, NOT true optical zoom. True optical zoom may have been accomplished in some phone or other, but even so, it is virtually non-existent on cell cams to date, although external clip-on lenses CAN change the perspective (at the cost of some optical degradation).
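To make that distinction concrete, here is a minimal sketch of what "digital zoom" amounts to, using NumPy with nearest-neighbor upsampling (the library choice and resampling method are my assumptions, not anything from the article):

```python
import numpy as np

def digital_zoom(image: np.ndarray, factor: float) -> np.ndarray:
    """Simulate 'digital zoom' by center-cropping and resizing back.

    This is just cropping plus upsampling -- no new optical
    information is captured, unlike a true focal-length change.
    """
    h, w = image.shape[:2]
    ch, cw = int(h / factor), int(w / factor)
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = image[top:top + ch, left:left + cw]
    # Upsample back to the original size by repeating rows/columns
    # (nearest-neighbor interpolation).
    rows = np.linspace(0, ch - 1, h).astype(int)
    cols = np.linspace(0, cw - 1, w).astype(int)
    return crop[rows][:, cols]

frame = np.arange(64, dtype=np.uint8).reshape(8, 8)
zoomed = digital_zoom(frame, 2.0)
```

Note that every pixel in `zoomed` comes from the central crop of `frame`; the "zoom" only enlarges what was already there.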
Once we go to multiple lenses, here are two of the approaches used to date, and from the article/video, it looks like Apple might be going in a third.
1. The LG G5's two-lens array combines a "normal" cell-perspective lens with a super-wide-angle lens and doesn't blend the images at all: it's just two cams with different focal lengths (and different sensors, one 16MP and one 8MP). That gives you an "optical JUMP" between the two, each of which can be digitally zoomed, rather than a smooth zoom across many focal lengths. In the interface, however, it does give the illusion of optical zoom by digitally zooming as far as one lens can go and then jumping abruptly to the other focal length.
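A rough sketch of the jump behavior that G5-style interface implies (the focal-length ratios, function name, and thresholds here are made up for illustration, not LG's actual logic):

```python
def pick_lens(requested_zoom: float,
              wide_equiv: float = 0.5,
              normal_equiv: float = 1.0) -> tuple[str, float]:
    """Pick a lens and the residual digital-zoom factor to apply.

    Below 1x we use the super-wide lens; at 1x and above we jump
    abruptly to the "normal" lens. There is no smooth optical
    transition between the two focal lengths, only this switch
    plus in-camera cropping on whichever lens is active.
    """
    if requested_zoom < normal_equiv:
        return "wide", requested_zoom / wide_equiv
    return "normal", requested_zoom / normal_equiv
```

So a 0.6x request stays on the wide lens with a 1.2x digital crop, while a 2x request jumps to the normal lens and crops from there.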
2. The Huawei P9's dual-lens module works on a different principle, one that resembles the Lab color space in Photoshop, and it sounds more like what I THOUGHT Apple was up to until I read/watched the article.

In this system, both lenses have the same focal length, but one gathers only color (chrominance) information while the other is strictly B/W and gathers only light-level (luminance) information, producing a very sharp, colorless rendering. The chroma and luminance data can then be combined into a much sharper picture.

With two lenses shooting from slightly different angles, there are also a variety of interesting possibilities for recording and utilizing the stereoscopic (3D) data generated by the lenses' slightly different views of the scene, including some beyond a 3D effect, though that likely wouldn't be a primary result if the two lenses are gathering two different classes of data.
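The luma/chroma fusion described above can be sketched in a few lines; the BT.601 luma weights and the simple additive chroma model are assumptions for illustration, not Huawei's actual pipeline:

```python
import numpy as np

# ITU-R BT.601 luma weights -- an assumed stand-in for the real pipeline.
W = np.array([0.299, 0.587, 0.114])

def fuse(color_rgb: np.ndarray, mono_luma: np.ndarray) -> np.ndarray:
    """Keep the color camera's chrominance but swap in the sharper
    monochrome camera's luminance, P9-style.

    Both inputs are floats in [0, 1]; color_rgb is HxWx3 and
    mono_luma is HxW.
    """
    luma = color_rgb @ W                    # luminance of the color frame
    chroma = color_rgb - luma[..., None]    # what's left is the color offset
    fused = mono_luma[..., None] + chroma   # recombine with the mono luminance
    return np.clip(fused, 0.0, 1.0)
```

A handy sanity check: if the mono frame equals the color frame's own luminance, the fusion returns the original image unchanged, and in general (absent clipping) the fused image's luminance matches the mono capture exactly.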
3. According to the demo video here, you also wouldn't be getting digital blending, but rather the ability to shoot split-screen videos (or stills) at two focal lengths simultaneously, more like the LG G5 but without the "semi-fake" optical zoom. To me this sounds like a waste of what could be done with the two approaches above.
For one thing, if both focal lengths are the same and used much like the principle behind multi-mirror telescopes, then gathering twice the light in the same exposure should inherently result in much improved low-light shots.
So I hope it's not as shown in the video, or at least that there will be considerably more to it than that.
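The low-light claim in point 3 above is easy to check numerically: merging two frames with independent sensor noise cuts the noise standard deviation by roughly √2. The scene, noise level, and Gaussian noise model below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
scene = np.full((256, 256), 0.5)   # a flat gray "scene"

# Two simultaneous exposures, each with independent sensor noise.
sigma = 0.05
frame_a = scene + rng.normal(0, sigma, scene.shape)
frame_b = scene + rng.normal(0, sigma, scene.shape)

# Averaging N independent frames cuts the noise std by sqrt(N):
# one lens gives sigma; two lenses give roughly sigma / sqrt(2).
merged = (frame_a + frame_b) / 2.0
```

That ~1.4x noise reduction is the same reason multi-mirror telescopes and multi-frame night modes combine light from several apertures or exposures.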
Would you prefer different iterations of OS X too?