Samsung's "Space Zoom" feature has come under fire amid complaints that images of the moon are being artificially enhanced to an extreme extent.
Samsung introduced a 100x zoom feature with the Galaxy S20 Ultra in 2020, and it has been a mainstay of the company's flagship handsets ever since. Throughout that time, Samsung has touted its devices' ability to take impressive pictures of the moon. Unlike brands such as Huawei, which simply overlay a PNG of the moon on such images, Samsung says that no overlays or texture effects are applied.
Yet on Friday, a Samsung user on the subreddit r/Android shared a detailed post purporting to "prove" that Samsung's moon shots are "fake." Their methodology involved downloading a high-resolution image of the moon, downsizing it to just 170 by 170 pixels, clipping the highlights, and applying a gaussian blur to heavily obscure the moon's surface details. This low-resolution image was then displayed on a monitor and photographed from across the room with a Samsung Galaxy device. The resulting image has considerably more detail than its source.
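For readers who want to reproduce the test, the degradation steps map to a few lines of image-processing code. The sketch below uses Python and Pillow; the file names, highlight cutoff, and blur radius are assumptions, since the post only specifies a 170-by-170 downsize, clipped highlights, and a gaussian blur.

```python
# Minimal sketch of the degraded test image described in the Reddit post.
# File names, the highlight cutoff, and the blur radius are assumptions.
from PIL import Image, ImageFilter

src = Image.open("moon_high_res.jpg").convert("L")  # hypothetical source image

# Downsize to 170x170 pixels, discarding most fine surface detail.
small = src.resize((170, 170), Image.LANCZOS)

# Clip the highlights so the brightest regions lose their remaining texture.
clipped = small.point(lambda v: min(v, 200))  # 200 is an assumed cutoff

# Apply a gaussian blur to obscure what detail is left.
blurred = clipped.filter(ImageFilter.GaussianBlur(radius=3))

# Display the result full-screen on a monitor and photograph it from a distance.
blurred.save("moon_test_target.png")
```

Photographing that on-screen target from across the room and comparing the phone's output with the blurred source is the crux of the test: any crater detail appearing in the result cannot have come from the scene itself.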
Samsung devices seemingly achieve this effect by applying a machine learning model trained on a large number of moon images, making the enhancement essentially computational. This has led to accusations that a texture is, in effect, still being applied to images of the moon, and that the feature misrepresents the camera hardware's actual capabilities. The findings have triggered heated debate online, with some even questioning the iPhone's reliance on computational photography.
Oh, that explains it. I was having surf and turf at a restaurant and took a photo of my food on this Samsung phone, but when I looked at the photo it turned into a picture of a cow, a potato, a cod, and a bag of panko.
One of the things Samsung claimed about its moon shot feature is that it uses multiple frames to create the final image, but since this experiment used a permanently blurred source image, there is no way the phone is combining multiple frames to reconstruct a sharper one. I'd be interested to hear how Samsung explains that discrepancy.
I think we can expect more and more computational / AI-enhanced photography.
I just wish we had more control over whether all these features are enabled, and to what extent.
As a photographer, I already find images from the iPhone quite unnatural looking / HDR-y. You can mitigate some of that by playing with Photographic Styles or using a third-party app, but I'd prefer something more straightforward.