Apple today shared a gallery of photos shot by customers on the iPhone 12 mini, iPhone 12, iPhone 12 Pro, and iPhone 12 Pro Max, with scenes including cityscapes, landscapes, portraits of people, and more, captured both by day and at night.
Shot on iPhone 12 Pro Max by "NKCHU" in China (top) and shot on iPhone 12 Pro Max by Rohit Vohra in India (bottom)
iPhone 12 mini and iPhone 12 models have a dual-camera system with Ultra Wide and Wide lenses, while iPhone 12 Pro models add a Telephoto lens for optical zoom. Apple explains some of the key camera features across the lineup:
iPhone 12 and iPhone 12 mini feature a powerful dual-camera system with an expansive Ultra Wide camera and a new Wide camera with an ƒ/1.6 aperture that provides 27 percent more light for improved photos and videos in low-light environments. Both models also introduce new computational photography features, which include Night mode and faster-performing Deep Fusion on all cameras, for improved photos in any environment. Smart HDR 3 uses machine learning to intelligently adjust the white balance, contrast, texture, and saturation of a photo for remarkably natural-looking images.
The reimagined pro camera system on iPhone 12 Pro and iPhone 12 Pro Max is even more versatile with Ultra Wide, Wide, and Telephoto cameras, and provides even more creative control to users. iPhone 12 Pro Max takes the pro camera experience even further with a 65 mm focal length Telephoto camera for increased flexibility and 5x optical zoom range, as well as an advanced Wide camera boasting a 47 percent larger sensor with 1.7μm pixels for a massive 87 percent improvement in low-light conditions. A LiDAR Scanner also unlocks advanced capabilities for Pro models, including up to 6x faster autofocus in low-light scenes and the introduction of Night mode portraits.
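Both headline numbers in Apple's description can be sanity-checked with basic lens math: light gathering scales with the inverse square of the f-number, and the "optical zoom range" is the ratio of the longest to the shortest 35mm-equivalent focal length in the camera system. A minimal sketch follows; note that the ƒ/1.8 baseline (the prior-generation Wide camera) and the 13 mm / 65 mm equivalent focal lengths come from Apple's published specs rather than this article, so treat them as assumptions:

```swift
// Light gathering is proportional to aperture area, which scales as 1/N^2
// for f-number N, so a faster lens admits (oldN / newN)^2 times the light.
func lightGainPercent(from oldN: Double, to newN: Double) -> Double {
    ((oldN / newN) * (oldN / newN) - 1) * 100
}

// Assumed baseline: iPhone 11 Wide (f/1.8) -> iPhone 12 Wide (f/1.6)
print(lightGainPercent(from: 1.8, to: 1.6))  // ~26.6, i.e. Apple's "27 percent more light"

// "Optical zoom range" = longest / shortest 35mm-equivalent focal length.
let ultraWide = 13.0   // mm equivalent, Ultra Wide (assumed published spec)
let telephoto = 65.0   // mm equivalent, iPhone 12 Pro Max Telephoto
print(telephoto / ultraWide)  // 5.0 -> the quoted "5x optical zoom range"
```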
I always wondered, mostly because I've never had an iPhone... Are "regular" users capable of taking such photos? I'm assuming they aren't just "point and shoot." Are they retouched in Photoshop? Or are the advanced camera settings (ISO, exposure, etc.) adjusted so that these photos come out so pretty?
Absolutely. It has little to do with the camera; it's more about imagination and what moves you. Even though I have DSLRs and mirrorless cameras, I've been shooting with iPhones since 2011, and exclusively for about the last six years. The key is always having a camera (for me, an iPhone) in my pocket. I use them as point-and-shoot devices. I do a little post-processing in Lightroom, but that's something I always did when using "regular" cameras too.
Here's a photo I made looking up at some trees in a residential neighborhood in Palo Alto, California. The view reminded me of a person's carotid artery feeding the brain, so I snapped a photo with the phone I had at the time, an iPhone X.
Looking back at my pictures from the iPhone 6/6s, they just get better and better through the years. However, when I view the photos on something bigger than a phone screen, the limitations really show. My fervent wish is for Apple to partner with someone like Fujifilm to bring the smarts and ease of taking photos on the iPhone to a much larger sensor. Sometimes when taking photos of my kid crawling around with my camera, I wish the process could be as easy as it is on my phone, but with the greater light-gathering, detail, and information that comes with a larger sensor. Sigh.
I would expect it to be the latter. Sure, a "mere mortal" can do incredible things, but so much of the process is knowing how to set up the tools used to do incredible things. The guy who did most of the cabinetry in my house (20 years ago) had a lot of the same power tools I did. I felt better about buying those tools, and could do "average things" with them, but he was an artist. He took those tools and made some incredible cabinets. He knew how to really use them. He asked me why I didn't do the cabinets myself. Well, I wanted them square, even, and solid, not a Picasso experience. Not trying to trash my own woodworking skills, but I recognize my limits.
I'm sure Apple's engineers set up these devices for optimum image creation. We mere mortals at least have a chance...
These were shot with the iPhone 11 Pro at night, just to test: no special composition, no special settings, just point and shoot, and no retouching.
I haven't been keeping an eye on camera makers over the last few years, but surely they must be looking into processors for computational photography in their cameras; otherwise, they are in danger of being left behind.