Samsung's "Space Zoom" feature has come under fire amid complaints that images of the moon are being artificially enhanced to an extreme extent.
Samsung introduced a 100x zoom feature with the Galaxy S20 Ultra in 2020, and it has since become a mainstay on the company's flagship handsets. Since its debut, Samsung has touted its devices' ability to take impressive pictures of the moon. Unlike brands such as Huawei, which simply overlay a PNG of the moon on such images, Samsung says that no overlays or texture effects are applied.
Yet on Friday, a Samsung user on the subreddit r/Android shared a detailed post purporting to "prove" that Samsung's moon shots are "fake." Their methodology involved downloading a high-resolution image of the moon, downsizing it to just 170 by 170 pixels, clipping the highlights, and applying a Gaussian blur to heavily obscure the moon's surface details. This low-resolution image was then displayed on a monitor and photographed from a distance with a Samsung Galaxy device. The resulting image showed considerably more detail than its source.
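The degradation step is straightforward to reproduce. Below is a minimal sketch using Pillow; the filename, clipping threshold, and blur radius are assumptions on my part, since the post only specifies the 170-by-170 target size:

```python
# Sketch of the Reddit experiment's image preparation, assuming Pillow is
# installed and "moon_hires.jpg" is any high-resolution moon photo. The
# clipping threshold and blur radius are hypothetical values.
from PIL import Image, ImageFilter

img = Image.open("moon_hires.jpg").convert("L")

# Downsize to 170x170 pixels, discarding most surface detail.
img = img.resize((170, 170), Image.LANCZOS)

# Clip the highlights so bright regions lose their remaining texture.
img = img.point(lambda p: min(p, 180))

# Apply a Gaussian blur to obscure whatever detail survives.
img = img.filter(ImageFilter.GaussianBlur(radius=3))

img.save("moon_degraded.png")  # then displayed full-screen on a monitor
```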
Samsung devices seemingly achieve this effect by applying machine learning trained on a large number of moon images, making the photography effect purely computational. This has led to accusations that a texture is functionally still being applied to images of the moon, and that the feature is a disingenuous representation of the camera hardware's actual capabilities. The finding has triggered heated debate online, even calling into question the iPhone's reliance on computational photography.
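To be clear about what is and is not known: Samsung has not published its pipeline. But the kind of scene-aware enhancement critics describe could, in rough outline, resemble the sketch below, where every name and threshold is a hypothetical placeholder rather than Samsung's actual implementation:

```python
# Purely illustrative sketch of a scene-aware enhancement pipeline of the
# kind critics allege; NOT Samsung's code. Model objects and thresholds
# are hypothetical.
def enhance_photo(frame, scene_classifier, moon_enhancer):
    """Apply a learned, moon-specific enhancer when the scene looks lunar."""
    label, confidence = scene_classifier.predict(frame)
    if label == "moon" and confidence > 0.9:
        # A network trained on many moon photos can synthesize plausible
        # craters and maria the sensor never captured: detail is
        # generated, not recovered.
        return moon_enhancer.predict(frame)
    return frame  # other scenes pass through the normal pipeline
```

The point of contention is the last step: if the enhancer synthesizes crater detail the sensor never recorded, the line between "enhancement" and "overlay" becomes blurry.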
Oh, that explains it. I was having surf and turf at a restaurant and took a photo of my food on this Samsung phone, but when I looked at the photo it turned into a picture of a cow, a potato, a cod, and a bag of panko.
One of the things Samsung claimed about its moon-shot feature is that it uses multiple frames to create the final image, but since this experiment used a permanently blurred source image, it's impossible for the phone to have combined multiple frames into a sharper one. I'd be interested to hear how Samsung explains that discrepancy.
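The commenter's reasoning can be demonstrated directly: stacking frames averages out per-shot noise, but it converges to the source image, so detail blurred away before capture can never come back. A toy NumPy simulation (all sizes and noise levels are arbitrary choices for the demo):

```python
# Demonstration that multi-frame stacking converges to the blurred
# source, never the original detail. Parameters are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
original = rng.random((64, 64))            # stand-in for real moon detail

# Crude box blur: average each pixel with its 5x5 neighborhood.
k = np.ones((5, 5)) / 25.0
pad = np.pad(original, 2, mode="edge")
blurred = np.zeros_like(original)
for i in range(64):
    for j in range(64):
        blurred[i, j] = (pad[i:i+5, j:j+5] * k).sum()

# "Multi-frame" capture: 50 noisy photos of the blurred screen.
frames = [blurred + rng.normal(0, 0.05, blurred.shape) for _ in range(50)]
stacked = np.mean(frames, axis=0)

print(np.abs(stacked - blurred).mean())    # small: stack converges to blur
print(np.abs(stacked - original).mean())   # large: lost detail stays lost
```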
I think we can expect more and more computational / AI-enhanced photography.
I just wish we got more control over whether all these features are enabled, and to what extent.
As a photographer, I already find the images from the iPhone quite unnatural-looking / HDR-y. You can mitigate some of that by playing with Photographic Styles, or by using a third-party app, but I'd prefer something more straightforward.