Adobe Lightroom Gains AI-Based Generative Remove Tool for Eliminating Unwanted Objects

Adobe today updated its Lightroom software for the iPad and Mac with Generative Remove, an AI-powered feature that can remove any element from a photo with a single click. Generative Remove eliminates unwanted objects non-destructively, intelligently filling the removed area with AI-generated content that matches the surrounding photo.

Generative Remove can fix everything from an unwanted reflection in water to wrinkles in a tablecloth in a food photo. It works with complicated backgrounds and can be used to remove distractions from images or to retouch photos.

According to Adobe, users can expect to see "high-quality, realistic and stunning results." Generative Remove is powered by Adobe Firefly, the generative AI technology behind features previously available in Photoshop. Generative Remove is available today as an early access feature for Lightroom for mobile, desktop, iPad, web, and Classic.

Alongside Generative Remove, Adobe is adding automatic presets for Lens Blur, a feature that can apply blur effects to any part of a photograph, and Lens Blur now includes improved subject detection. Other improvements include HDR optimization, a streamlined mobile toolbar on the iPad, and instant access to photo libraries in the iPad and Mac apps for faster editing.

The new features are free for existing Lightroom users, with Lightroom access available starting at $9.99 per month.

Note: MacRumors is an affiliate partner with Adobe. When you click a link and make a purchase, we may receive a small payment, which helps us keep the site running.

Top Rated Comments

turbineseaplane
23 months ago
This should be handy after divorces and breakups

Click + Delete the Ex!
Score: 21 Votes
23 months ago
All these degenerative apps look great in marketing demos and edited YouTube videos but 80% of the time when you use them yourself the results are WTF levels of entertainment.
Score: 13 Votes
MrGimper
23 months ago

This should be handy after divorces and breakups

Click + Delete the Ex!
Yep, she makes money disappear from your bank accounts

You make her disappear from photos.
Score: 8 Votes
23 months ago
Retouching has been done since the invention of photography, and paintings were interpretations of what is real long before that. We might argue that film, developing chemistry, even exposure and f-stop, framing, the lens you choose… even choosing a situation and a specific moment to capture from a specific viewpoint, all distort the objective truth. Every detail of photography is not a representation of reality but a subjective chain of decisions. Look at any Cartier-Bresson, who was a photo reporter, and you will see decisions that turn reality into subjective authorship. Whether black and white or Polaroid, these images were always chemical transcriptions of reality, 2D simulacra.
Same with photography today: even an untouched, unedited, straight-from-the-camera, no-filters photo is already edited, no longer truthful. It is important to understand that images represent and re-interpret, and to see that as what photography IS. It never was about reality.

Personally, as someone who has taken or edited images daily since the early 90s, I think this is an important distinction. Images are narrative, and the job is to capture or find that narrative and use tools like Photoshop to elevate and fine-tune the image. Over-the-top colors or edits, black and white, subtle minor repairs… it always depends on what you want the photo to be. But even a «real» photo is a narrative choice, edited to look untouched.

One of the reasons for that is that cameras can capture things your brain filters out. People will see things in photos that they have not noticed in real life, even in their daily surroundings; we once had a client who seemed to notice a door in his shop for the first time in the photos we took of his interior.

So the job is to find a narrative. When you do portraits, the job is to find the beauty and essence of the person captured. In architecture you might want to edit out things that distract from the building: cars, people, signs, etc. I personally have a tendency to take out distractions, the granular stuff that distracts the eye, to make the image a bit cleaner and lighter. Reflections, creases, fire alarms, buttons, cigarette butts, signage - the detritus, the distractions to the eye. And if the tools for doing that get better, as they have since the invention of photography, and we have more choice in how to represent reality, all the better.
Score: 7 Votes
23 months ago
For the mods/article writer - it might be worth mentioning that they've also added support for the Neural Engine on Apple silicon Macs for AI denoising! Quite topical given all the AI stuff lately.
Score: 6 Votes
steggerwoof
23 months ago

Luminar Neo has a great feature that can remove lens/sensor dust spots in a single click. I'd love to see Adobe copy this so I don't have to round-trip into Photoshop to fire up Luminar to do the removal. Right now it looks like I would still have to click on each dust spot.
Or you could clean your sensor?
Score: 6 Votes