Samsung's "Space Zoom" feature has come under fire amid complaints that images of the moon are being artificially enhanced to an extreme extent.
Samsung introduced its 100x zoom feature with the Galaxy S20 Ultra in 2020, and it has since become a mainstay on the company's flagship handsets. From the start, Samsung has touted its devices' ability to take impressive pictures of the moon. Unlike brands such as Huawei, which simply overlay a PNG of the moon on such images, Samsung says that no overlays or texture effects are applied.
Yet on Friday, a Samsung user on the subreddit r/Android shared a detailed post purporting to "prove" that Samsung's moon shots are "fake." Their methodology involved downloading a high-resolution image of the moon, downsizing it to just 170 by 170 pixels, clipping the highlights, and applying a Gaussian blur to heavily obscure the moon's surface details. This low-resolution image was then displayed on a monitor and photographed from a distance with a Samsung Galaxy device. The resulting photo contains considerably more detail than its source.
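The preparation steps described in the post can be sketched in a few lines of dependency-free Python. This is an illustration of the idea, not the Redditor's actual script: the image is modeled as rows of 0-255 grayscale values, nearest-neighbour sampling stands in for whatever resize the poster used, a box blur stands in for the Gaussian blur, and the clip ceiling of 200 is an arbitrary assumption.

```python
# Toy reproduction of the test-image preparation: downsize, clip
# highlights, then blur away remaining surface detail. A grayscale
# image is a list of rows of 0-255 ints.

def downsize(img, size):
    """Nearest-neighbour downsample to size x size pixels."""
    h, w = len(img), len(img[0])
    return [[img[r * h // size][c * w // size] for c in range(size)]
            for r in range(size)]

def clip_highlights(img, ceiling=200):
    """Cap bright pixels so highlight detail is destroyed."""
    return [[min(p, ceiling) for p in row] for row in img]

def blur(img, radius=2):
    """Crude box blur standing in for the post's Gaussian blur."""
    h, w = len(img), len(img[0])
    out = []
    for r in range(h):
        row = []
        for c in range(w):
            vals = [img[rr][cc]
                    for rr in range(max(0, r - radius), min(h, r + radius + 1))
                    for cc in range(max(0, c - radius), min(w, c + radius + 1))]
            row.append(sum(vals) // len(vals))
        out.append(row)
    return out
```

After these steps the on-screen image genuinely lacks fine crater detail, which is the crux of the test: any detail present in the phone's output must have come from somewhere other than the scene.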
Samsung devices apparently achieve this effect with machine learning models trained on a large number of moon images, making the result largely computational rather than optical. This has led to accusations that a texture is, functionally, still being applied to images of the moon, and that the feature misrepresents the camera hardware's actual capabilities. The findings have triggered heated debate online, with some even questioning the iPhone's reliance on computational photography.
Oh, that explains it. I was having surf and turf at a restaurant and took a photo of my food on this Samsung phone, but when I looked at the photo it turned into a picture of a cow, a potato, a cod, and a bag of panko.
One of Samsung's claims about the moon shot feature is that it uses multiple frames to create the final image. But since this experiment used a permanently blurred source image, the phone can't be reconstructing a sharper image from multiple frames. I'd be interested to hear how Samsung explains that discrepancy.
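The commenter's point can be demonstrated with a toy example, which is in no way Samsung's actual pipeline: the scene values, noise level, and `stack` helper below are all invented for illustration. Averaging frames cancels random noise that differs between exposures, but if every frame contains the same blur, the average is just that same blur.

```python
# Toy model: multi-frame stacking recovers detail hidden by per-frame
# noise, but cannot recover detail that is absent from every frame.
import random

def stack(frames):
    """Average equal-length pixel rows element-wise."""
    n = len(frames)
    return [sum(f[i] for f in frames) / n for i in range(len(frames[0]))]

clean = [10, 200, 10, 200]                       # "true" scene with contrast
random.seed(0)
noisy = [[p + random.gauss(0, 20) for p in clean] for _ in range(50)]
blurred = [sum(clean) / len(clean)] * 4          # detail destroyed at the source

# Stacking noisy frames converges toward the clean signal...
recovered = stack(noisy)
# ...but stacking identical blurred frames just returns the blur.
still_blurred = stack([blurred] * 50)
```

So if the phone's output shows crater detail that was absent from the blurred source, frame stacking alone cannot account for it.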
I think we can expect more and more computational / AI-enhanced photography.
I just wish we had more control over whether these features are enabled, and to what extent.
As a photographer, I already find images from the iPhone quite unnatural-looking / HDR-y. You can mitigate some of that by playing with Photographic Styles, or by using a third-party app, but I'd prefer something more straightforward.