Samsung's "Space Zoom" feature has come under fire amid complaints that images of the moon are being artificially enhanced to an extreme extent.
Samsung introduced a 100x zoom feature with the Galaxy S20 Ultra in 2020, and it has since become a mainstay of the company's flagship handsets. From its debut, Samsung has touted its devices' ability to take impressive pictures of the moon. Unlike brands such as Huawei, which simply overlay a PNG of the moon onto such images, Samsung says no overlays or texture effects are applied.
Yet on Friday, a Samsung user on the r/Android subreddit shared a detailed post purporting to "prove" that Samsung's moon shots are "fake." Their methodology involved downloading a high-resolution image of the moon, downsizing it to just 170 by 170 pixels, clipping the highlights, and applying a Gaussian blur to heavily obscure the moon's surface detail. This low-resolution image was then displayed on a monitor and photographed from a distance with a Samsung Galaxy device. The resulting image contains considerably more detail than its source.
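For readers who want to reproduce the test, the poster's preparation steps can be sketched with Pillow. This is a minimal sketch assuming Pillow is installed; aside from the stated 170x170 size, the highlight clip, and the Gaussian blur, the specific parameter values (clip level, blur radius) are illustrative guesses, not the poster's exact settings.

```python
# Sketch of the Reddit experiment's test-image preparation: downsize a
# high-resolution moon photo, clip the highlights, and blur it so the
# phone's camera cannot be seeing real surface detail. Parameter values
# beyond the 170x170 size are illustrative, not the poster's settings.
from PIL import Image, ImageFilter

def prepare_moon_test_image(src_path: str, dst_path: str) -> None:
    img = Image.open(src_path).convert("L")        # grayscale moon photo
    img = img.resize((170, 170), Image.LANCZOS)    # downsize to 170x170 px
    # Clip the highlights: cap pixel values so bright detail is flattened.
    img = img.point(lambda p: min(p, 200))
    # Heavy Gaussian blur to obscure the remaining surface detail.
    img = img.filter(ImageFilter.GaussianBlur(radius=4))
    img.save(dst_path)
```

Displaying the resulting file full-screen on a monitor in a dark room and photographing it with Space Zoom from across the room should reproduce the experiment.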
Samsung devices seemingly achieve this effect by applying machine learning trained on a large number of moon images, making the photography effect largely computational. This has led to accusations that a texture is, in effect, still being applied to images of the moon, and that the feature misrepresents the camera hardware's actual capabilities. The findings have triggered heated debate online, even calling into question the iPhone's reliance on computational photography.
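The accusation can be illustrated with a deliberately simplified toy: if a pipeline detects a moon-like subject and blends in detail derived from its training data, the output contains information that was never in the capture. This is purely illustrative pseudologic under that assumption, not Samsung's actual algorithm; the detector heuristic and blend weights are invented for the example.

```python
# Toy illustration of the accusation: recognize a moon-like patch, then
# blend in "learned" detail. Functionally, this behaves like a texture
# overlay even though no literal PNG is pasted on. Not Samsung's code.

LEARNED_MOON_DETAIL = [  # stands in for detail a model absorbed in training
    [10, 200, 30],
    [180, 20, 190],
    [40, 210, 15],
]

def looks_like_moon(frame):
    """Crude detector: a patch that is partly, but not entirely, bright."""
    flat = [p for row in frame for p in row]
    bright = sum(1 for p in flat if p > 150)
    return 0.2 < bright / len(flat) < 0.9

def enhance(frame):
    """Blend captured pixels 50/50 with 'learned' detail when triggered."""
    if not looks_like_moon(frame):
        return frame
    return [
        [(cap + mem) // 2 for cap, mem in zip(frow, mrow)]
        for frow, mrow in zip(frame, LEARNED_MOON_DETAIL)
    ]

# A featureless bright blob, like the Reddit test image, gains "detail":
blurry_moon = [
    [20, 160, 20],
    [160, 160, 160],
    [20, 160, 20],
]
enhanced = enhance(blurry_moon)
```

The point of the toy is that `enhanced` differs from the captured frame even though the capture itself carried no surface detail, which is exactly the behavior the Reddit post demonstrated at phone scale.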
Oh, that explains it. I was having surf and turf at a restaurant and took a photo of my food on this Samsung phone, but when I looked at the photo it turned into a picture of a cow, a potato, a cod, and a bag of panko.
One of the things Samsung claimed about its moon shot feature is that it uses multiple frames to create the final image, but since this experiment used a permanently blurred image, the phone cannot be combining multiple frames to reconstruct a sharper one. I'd be interested to hear how Samsung explains that discrepancy.
I think we can expect more and more of computational / AI-enhanced photography.
I just wish we had more control over whether all these features are enabled, and to what extent.
As a photographer, I already find images from the iPhone quite unnatural-looking / HDR-y. You can mitigate some of that by playing with Photographic Styles or by using a third-party app, but I'd prefer something more straightforward.