
Google made waves yesterday by showcasing a set of lightweight smart glasses featuring deep Gemini integration and an optional in-lens display. The demo has reignited interest in Apple's own smart glasses project, which has been the subject of rumors for nearly a decade. Here's a recap of where things stand.

Apple Glass

Current Development Status

Apple is actively working on new chips specifically designed for smart glasses that would compete with Google's XR glasses and Meta's Ray-Ban smart glasses. According to Bloomberg, these chips are currently in development, with Apple targeting mass production in 2026 or 2027 for a potential launch within the next two years.

Bloomberg's Mark Gurman notes that while such a product wouldn't be a proper augmented reality device like Apple Vision Pro, it would include AI capabilities, microphones, and cameras to create a "pretty good user experience."

The smart glasses Apple is designing will reportedly include multiple cameras, microphones, and integrated AI functionality, similar to Meta's Ray-Ban offering. They would likely support features such as capturing photos, recording video, and offering translation options. Apple could potentially integrate a Visual Intelligence-like feature that would allow wearers to scan their environment, obtain product information, and receive directions.


Meta Ray-Bans

The custom chip Apple is developing for these glasses is based on Apple Watch SoCs, which consume less energy than iPhone chips. Apple has reportedly already made optimizations to improve power efficiency for this application.

The Journey to Smart Glasses

Apple's smart glasses project has been anything but smooth. The company had been exploring true augmented reality glasses designed to pair with Mac computers for power, but Bloomberg reported in January that this project was halted. The initial concept involved glasses that looked like regular eyewear but offered AR capabilities.

Apple engineers determined that the AR glasses would need to provide the performance of an iPhone with a tenth of the power consumption, otherwise the chip would simply run too hot. Adding a battery to the glasses would also be problematic because of the weight.

Initially, Apple wanted the glasses to connect to iPhones, but iPhones lack sufficient power and battery life. The company then pivoted to using Macs as a power source, but executives ultimately weren't convinced this approach would succeed, leading to the project's cancellation.


Google XR glasses demo

The Vision of Apple Glass

Despite setbacks, Apple CEO Tim Cook remains "hell bent" on bringing true augmented reality glasses to market before Meta can achieve the same feat, according to Bloomberg. Cook has apparently made glasses a "top priority" for the company and is personally invested in product development efforts.

Bloomberg has said that it will take "many years" for true AR glasses to be ready, with several technologies still requiring perfection, including high-resolution displays, high-performance chips, and tiny batteries capable of all-day power. In the meantime, Apple is pushing ahead with the development of less ambitious Ray-Ban-style smart glasses.

Development Continues

Apple is conducting user studies at its offices to gauge the appeal of various features and interfaces. Codenamed "Atlas," these studies are reportedly led by Apple's Product Systems Quality team within the hardware engineering division. The company is also developing a version of visionOS that will run on glasses.


Meta smart glasses prototype

Work continues at a secretive facility in Santa Clara, a town away from Apple's Cupertino headquarters, where staff focus on AR technology. Apple also maintains a manufacturing facility there for testing future display technologies.

Expected Timeline

Gurman believes Apple could create a "smash hit" if it can "bring its design prowess, offer AirPods-level audio quality and tightly integrate the glasses with the iPhone." While Apple is actively pursuing these Meta Ray-Ban competitors, it continues parallel development of true augmented reality glasses, though that more ambitious product remains years away from market readiness. Based on current information, we can expect Apple's smart glasses to potentially reach consumers around 2026-2027 at the earliest.

Amazon today has $50 discounts on both Wi-Fi and cellular models of Apple's 11th generation iPad. Prices start at $299.00 for the 128GB Wi-Fi iPad, down from $349.00, a second-best price on this model. Best Buy is matching this deal, as well as a few of the other iPad discounts.

Note: MacRumors is an affiliate partner with Amazon. When you click a link and make a purchase, we may receive a small payment, which helps us keep the site running.

Additionally, Amazon has the 256GB Wi-Fi iPad for $399.00 ($50 off) and the 512GB Wi-Fi iPad for $594.95 ($55 off). These are also both solid second-best prices on the 11th generation iPad.



There are also numerous discounts on cellular models, starting at $449.00 for the 128GB model. Amazon is providing an estimated delivery of May 26 for free shipping, while Prime members should be able to get the tablets a bit faster in most cases.



The 11th generation iPad is mainly a spec bump for the tablet line, now featuring the A16 chip and more storage, with the same design as the 10th generation iPad. The new ‌iPad‌ starts with 128GB of storage, and is also available in 256GB and a new 512GB configuration. The previous model was only available in 64GB and 256GB configurations.

If you're on the hunt for more discounts, be sure to visit our Apple Deals roundup where we recap the best Apple-related bargains of the past week.


Deals Newsletter

Interested in hearing more about the best deals you can find in 2025? Sign up for our Deals Newsletter and we'll keep you updated so you don't miss the biggest deals of the season!

Related Roundup: Apple Deals

If you have macOS 13 or later installed on your Mac, you can use a nearby iPhone as your computer's microphone input. Keep reading to learn how it works.

When Apple released macOS Ventura in October 2022, it introduced a new take on its Continuity Camera feature that lets users turn their iPhone's camera into a webcam for their Mac. Continuity Camera works wirelessly or wired in FaceTime, Zoom, and other apps, and delivers video directly from a user's nearby ‌iPhone‌ camera, which has significantly better quality than the built-in camera on Macs.

Another function of Continuity Camera is the ability to use a nearby iPhone as the microphone input for your Mac. As long as the iPhone is running iOS 16 or later and signed into the same Apple Account, you can speak into it and the audio will be seamlessly delivered to your Mac, sans video.

The following steps show you how to set it up in macOS Ventura and later.

  1. On your Mac, click the Apple () symbol in the menu bar and select System Settings....
  2. Click Sound in the sidebar.
  3. Under "Output & Input," click the Input tab.
  4. Select the name of your nearby iPhone, which is listed with the type "Continuity Camera."


Your nearby iPhone will ping with a sound and show a "Connected to..." screen indicating that it has successfully connected. You can now use your iPhone as a microphone for your Mac.

You can pause the connection at any time using the Pause button. When you're finished, simply tap the red Disconnect button on your iPhone's screen.

Epson has announced AirPlay 2 and HomeKit support for its newest projector lineup, reports HomeKitNews. The added support means users can wirelessly stream content from their Apple devices and control projector functions through the Home app or Siri.

The AirPlay 2 functionality enables streaming videos, photos, presentations, and audio from iPhone, iPad, and Mac apps including Safari, while HomeKit compatibility lets users power projectors on or off via voice commands or include them in automated smart home scenes.

"By integrating Apple AirPlay 2 and HomeKit into our projector line-up, we're addressing the needs of teachers, business professionals, and home users who value ease of use," said Massimo Pizzocri, vice president of Epson Europe's video projector division.

Apple AirPlay 2 and Apple HomeKit come pre-installed on select Epson projectors, including models from the PowerLite EB-L6, EB-L7, and EB-L8 series. These are fixed-lens laser projectors offering WUXGA resolution and up to 8,000 lumens of brightness. Epson also offers more affordable options for business and education settings, such as the EB-994F, EB-FH54, and EB-L690SU.

The Feiyang Times is an unassuming tower in Shenzhen's Huaqiangbei district, but it has earned the nickname "the stolen iPhone building" in Apple community forums. According to a Financial Times investigation (paywalled), the building has become a major hub in a global network trafficking stolen iPhones.

When London tech entrepreneur Sam Amrani had his iPhone 15 Pro snatched by two men on electric bicycles, he tracked its journey via Find My to a repair shop in London, then to Hong Kong, before it finally settled in Huaqiangbei. "It was very quick, very organised and kind of targeted," Amrani told the FT.

Law enforcement in London estimates phone theft represents a £50 million ($63.5 million) annual criminal industry, with similar rises reported in Paris and New York.

The fourth floor of the Feiyang building specializes in selling second-hand iPhones from Western countries. Many are legitimate trade-ins, but traders admit that even remotely locked devices have their "market price."

Hong Kong serves as the critical intermediary in this supply chain, according to the report. Specifically, an industrial building at 1 Hung To Road in Kwun Tong houses hundreds of wholesalers openly advertising phones labeled "iCloud locked" through various messaging platforms.

"The [passcode-locked] ones were probably stolen or snatched in the U.S. They are sold to Hong Kong and then on to other countries including the Middle East," explained one Shenzhen-based seller visiting Hong Kong.

What makes Huaqiangbei valuable to thieves is its specialized market that can find buyers for every iPhone component – from screens and circuit boards to chips. Even when devices can't be unlocked, they're profitable when stripped for parts.

Many theft victims receive messages from individuals in Shenzhen (when an iPhone is put into Lost Mode, the owner can add a contact number for anyone who finds it) either cajoling or threatening them into removing their devices from Find My iPhone, which would substantially increase the devices' resale value.

According to the report, the criminal network thrives on Hong Kong's status as a free trade port with no import taxes, allowing traders to move stolen devices into mainland China while avoiding electronics tariffs.

The Hong Kong police told FT that it "will take appropriate actions where necessary according to actual circumstances and in accordance with the law."

Warner Bros. Pictures today announced that it will be premiering Apple's upcoming F1 film in more IMAX theaters than originally planned.

There will now be advance screenings of F1: The Movie at 400 IMAX theaters around the world on Monday, June 23 at 7:00 p.m. local time, ahead of the film's wider release on June 25 internationally and on June 27 in the U.S. and Canada. Warner Bros. Pictures decided to offer additional IMAX screenings due to the film's "overwhelming popularity" so far, with all 25 of the screenings it initially offered sold out.

Tickets for the early IMAX screenings and general showings go on sale starting tomorrow, May 21.

The film will tell a familiar underdog story. Brad Pitt stars as an F1 driver who was an up-and-coming talent in the 1990s, until an accident on the track nearly ended his career. Thirty years later, his character is invited to join a former teammate's struggling F1 team, in a last-shot bid to save the team and become the best in the world.


The film is directed by Joseph Kosinski, who is known for other popular action films, including "Tron: Legacy" and "Top Gun: Maverick."

F1: The Movie will be available to stream on Apple TV+ at a later date. In the U.S., Apple TV+ costs $9.99 per month, or $99 per year. The streaming service is available through the Apple TV app on a wide variety of devices, and on the web at tv.apple.com, with a free seven-day trial available. Apple TV+ is also available in Apple One bundles.

Google today held its annual I/O developer conference, where it shared a number of new features and tools that are coming to its products in the coming weeks and months. There was a heavy focus on AI capabilities, and Google is deeply integrating Gemini and other AI tools into its software.

New Gemini Features

Gemini is Google's AI product, equivalent to Anthropic's Claude and OpenAI's ChatGPT. Apple has no equivalent at the current time, but Gemini could soon be integrated into iOS like ChatGPT. There are multiple new capabilities coming to Gemini, some of which are available for iPhone users this week.

  • Gemini Live - Gemini Live is available in the Gemini app for iOS starting today, and it's free to use. It uses screen sharing or the ‌iPhone‌'s camera so that you can communicate with Gemini in real-time and ask questions about what you're seeing. It's useful for identifying objects, asking questions about things around you, getting help with DIY projects, shopping, organizing, and more. Gemini Live integrates with Google Calendar, Maps, Tasks, and Keep.
  • Gemini Agent Mode - Agent Mode in Gemini can do things like find sports game tickets at an ideal price, or help you locate the right apartment with specific requirements on price and layout. It's coming soon to Gemini.
  • Gemini Personal Context - Gemini will be able to incorporate search history for more personalized results, along with pulling information from other Google apps. Google is also aiming to make Gemini more proactive, with reminders about upcoming events and tools that can help with preparations. Gemini Personal Context sounds similar to the personalized features that Apple is planning to add to Siri at some point in the future, but Gemini is further along.
  • AI Ultra - Google has a new AI plan called AI Ultra, and it costs $250 per month. It includes expanded access to Google's latest AI tools with high rate limits and all of the newest features, plus YouTube Premium and 30TB of storage. Google AI Premium, which is $19.99 per month, has been renamed Google AI Pro. At $250, AI Ultra is more expensive than the top-tier plans from Anthropic and OpenAI. Apple doesn't charge for any Apple Intelligence functionality as of yet, and it's not clear if that will happen in the future.
  • Veo 3 - Google is updating its Veo video generation model with new capabilities, and it is now able to create videos that include sound effects, background noise, and dialogue. It can, for example, generate a video of birds with realistic sounds, and it is designed to be good at mimicking real-world physics. Veo 3 is limited to AI Ultra subscribers, but it is available starting today.
  • Imagen 4 - Google's new image generation model, Imagen 4, is coming to Gemini. Imagen 4 can create more photorealistic images with improved detail for things like hair, fur, and fabric. It's also better at generating text and at incorporating creative ideas, like making a word out of dinosaur bones when the image being created involves a dinosaur party. Imagen 4 is available in Gemini as of today.
  • Deep Research - Gemini supports uploading private PDFs and images for research reports, with Google Drive and Gmail integration coming soon.

Google Search

Gemini is being integrated more deeply into search, starting with a dedicated AI Mode that's rolling out in the U.S. this week.

  • AI Mode in Search - Google is adding a new AI Mode to search that is entirely AI, rather than just the AI suggestions that show up in the AI Overviews that are shown with Google searches. It uses a query fan-out technique that breaks down questions into multiple searches for a deeper dive than traditional search. AI Mode is rolling out to everyone in the United States starting today as a dedicated section in Search, and it will use the latest Gemini 2.5 model. It will soon incorporate more personalized suggestions that take into account your preferences and your actions in other Google apps, and agentic capabilities will let it do things like purchase concert tickets.
  • Deep Search - AI Mode has a Deep Search option that uses the same query fan-out technique, but it is able to conduct hundreds of searches at one time, reason across multiple results, and craft an "expert-level" report in minutes. AI Mode will also be able to analyze complex datasets and create graphics and charts.
  • AI Mode for Shopping - AI Mode will be able to help you find specific items that you're looking for, and it can let you "try on" clothes that you're shopping for. It scans your body using a photo of you and then puts a realistic looking clothing item on you so you can see what it might look like. It can show depth, and how the material will look draped over a body. AI Mode can make purchases, alert you to deals, and more. These features will launch in the coming months.
  • Google Search Live - Google Search is getting a feature that's basically Gemini Live, and users will be able to ask questions about what's being viewed through a smartphone camera. It's an AI Mode feature coming this summer.

Google Apps

Gmail, Chrome, and Meet are all getting new Gemini capabilities that are rolling out starting today.

  • Gmail Personal Context - With permission, Gemini will be able to use relevant context across Google apps to make AI responses in Gmail sound more like you. Gemini will be able to scan past emails, look up notes, and view documents in Google Drive. With the info, it can match greetings, capture tone and style, and mimic favorite word choices. It's coming this summer for Gemini subscribers.
  • Google Meet - Google Meet is getting a real-time translation option. Initially, it'll work in English and Spanish, but Google has plans to add more languages. It's available to Google AI Pro and AI Ultra subscribers.
  • Google Chrome - Chrome is getting integrated Gemini starting tomorrow. It can answer questions about what you're doing and the tabs that you've got open, with the first version able to clarify complex information on any webpage or summarize long pages. You can get to Gemini from the task bar. Chrome is also being updated with a new password feature that can automatically change passwords compromised in a leak, for participating websites.
  • FireSat - Google is developing FireSat, a system that watches for fires breaking out in areas as small as 270 square feet. It could be particularly useful in California, where Google is headquartered.

Android XR Glasses

Google already announced Android XR as a platform for companies that are building VR headsets, but today, Google said that it's also developing Android XR for augmented reality glasses. Google last tried this kind of product with Google Glass, but it didn't go over so well and Google Glass was discontinued after several years.

Google showed off a set of lightweight glasses that incorporate an in-lens display. On stage, Google demonstrated the glasses offering a live translation feature with words that appeared on the lenses, and providing turn-by-turn directions.

The glasses have cameras, microphones, and speakers, and are connected to Gemini. The AI is able to see what the wearer sees and hear what the wearer hears, so it can answer questions, offer image recognition capabilities, provide tailored directions, and more.

The smart glasses could compete with Apple's future smart glasses, as Apple is rumored to be working on a pair of lightweight augmented reality glasses that could eventually replace the ‌iPhone‌. Apple is still far off from being able to release AR glasses, so the Android XR version is likely to come out first.

Gentle Monster and Warby Parker are partnering with Google for Android XR glasses that are lightweight and stylish.

Samsung's XR headset will still be the first device that runs Android XR, and it's launching later this year. Samsung will also build Android XR glasses.

Popular game Fortnite is once again available in the U.S. App Store, as Apple has finally approved Epic Games' app submission. This is the first time that Fortnite has been on the iOS ‌App Store‌ since 2020.

Apple initially did not plan to allow Fortnite back in the ‌App Store‌, but the judge overseeing the ongoing legal battle between the two companies yesterday suggested that Apple and ‌Epic Games‌ work things out or face more time in court. The judge threatened to require the Apple official that oversees app decisions to appear in person in court, which apparently was enough to spur Apple to allow the game back on the ‌App Store‌.

Fortnite is available from the iOS ‌App Store‌ in the United States, and from the ‌Epic Games‌ Store alternative app marketplace in the European Union. It is not available on the ‌App Store‌ in other countries. The U.S. version of Fortnite includes an option for players to purchase in-app currency using the ‌Epic Games‌ website, with no in-app purchases.

Apple initially planned to prevent Fortnite from returning to the ‌App Store‌ until the legal dispute was entirely settled. As of now, Apple is appealing the court's ruling that forced it to change its U.S. ‌App Store‌ rules to allow developers to link customers to purchase options available outside of the ‌App Store‌.

Google today showed off a set of lightweight smart glasses that have deep Gemini integration and an optional in-lens display that can offer up relevant information like turn-by-turn directions.

Made to rival the Meta Ray-Bans and smart glasses coming from Apple in the future, Google's XR glasses feature a camera, microphones, and speakers. They connect to a smartphone for app access, and with Gemini integration, the glasses can answer questions about the wearer's surroundings, provide directions, and offer up live translations.

Gemini is able to use the cameras in the glasses to see what's around the wearer to provide feedback, and Google says the glasses will "see and hear what you do" so they'll understand context and "help you throughout your day." On stage at Google I/O, Google executives demonstrated how the Android XR glasses will be able to send messages to friends, make appointments, snap photos, and translate conversations in real-time.

Google plans to work with companies like Warby Parker and Gentle Monster to create stylish smart glasses that consumers will want to wear.

With iOS 18.4, Apple added support for robot vacuums to HomeKit, and some of the companies that make robot vacuums have started offering products with Matter integration. Matter-compatible robot vacuums can be added to ‌HomeKit‌ and controlled via Siri voice commands and the Home app.

Roborock is one of the companies that's adopting Matter, and the Roborock Saros 10R now works with ‌HomeKit‌. The Saros 10R is one of Roborock's flagship vacuums and it's relatively new, so like most of these Matter-enabled vacuum options, it does not come cheap. It's $1,600, but it does have Roborock's most advanced feature set.

Design and Size

The Saros 10R is just over three inches tall, so it's compact enough to fit under all of my furniture. I have a TV stand with about four inches of clearance that a different robot vacuum isn't able to fit underneath, but I haven't run into that problem with the Saros 10R.

There are a lot of components in robot vacuums, so the thin build is a feat of engineering, and having a robot that can get under all of my furniture is a major plus. When I manually vacuum with a Dyson stick, I can't reach all of the areas that the Saros 10R can, so my house ends up cleaner when the robot handles the vacuuming and mopping.

While the Saros 10R is thin, it's still on the larger side, measuring around 13 inches across. If you have a smaller space with a lot of furniture, it might be too large to clean effectively, but I've been impressed with the narrow spaces it can maneuver through. Since it's round, it can get itself into and out of some tight fits.

I mostly use the app to send the Saros 10R off to clean or to resume cleaning after a pause, but there are buttons on the device itself. The power button turns it on or sends it to clean if it's already on, and the dock button sends it home. A long press on the dock button activates a spot cleaning feature.

Like other robot vacuums, the Saros 10R has a base station, and it is fairly large, but it kind of blends into the background, the way a vacuum or a mop might. The base station is black plastic and it includes several components: a dust bag that collects the dust and hair the robot picks up, a clean water bin for wetting and washing the mop, a section for floor cleaning solution, a dirty water bin for collecting the water used to wash the mop, a dryer for drying the mop, and a ramp that helps the robot get up onto the base to charge.

The dock weighs around 25 pounds, so it's probably not something you're going to want to cart around if you have multiple floors in your home. I'm not sure if people who have multi-story homes purchase more than one vacuum, but carrying the robot and base station upstairs would be a workout. The dock has an LED on it that turns red if there's a problem or white when the robot is charging, and it needs to be placed on a hard surface with about two feet around it to make sure there's space for the robot to dock itself.


The dust bin in the Saros 10R, where dust and debris collect before being emptied into the dock.

Roborock says to use its proprietary cleaning solution because other solutions can damage the internals of the vacuum. Some companies like Bona make cleaning solutions specifically for robot vacuums, and I didn't have an issue using that kind of cleaning fluid.

There are rubber wheels at the bottom of the robot, and they extend far enough to help it get over thresholds and to lift it over carpets. I have a threshold that's about an inch high in one spot, and it's able to navigate it fine. The mops can lift up enough that the robot can vacuum rugs while mopping without getting the rugs wet.

Navigation

The Saros 10R has what Roborock calls its "StarSight" autonomous system, which is basically several sensors. It has a front camera and sensor for detecting obstacles, and it has a wall sensor for determining where walls are located.

The first cleaning with the Saros 10R starts off with a mapping feature where the robot maps all of the rooms that it is able to access, and then while cleaning, it continually scans for and identifies pet messes, pets, cords, furniture that's hard to navigate, and other obstacles. You can opt in to have the robot take a picture whenever it encounters an obstacle, so you can see what it's avoiding.

Its sensors do a good job mapping out a room and even identifying the furniture in it, which helps the robot determine what type of room it is. If it sees a dining table, for example, it'll label that room as a dining room.

It's able to identify pet messes so that it doesn't drag a mess around the entire house, and it's also excellent at detecting cables that should be avoided, so it doesn't get tangled up. It doesn't run into walls or furniture, and the extending mops and the side brush let it get into nooks, crannies, corners, and the sides of walls when cleaning even as it's avoiding obstacles.

Cleaning Features

Most of the robot vacuums have the same general vacuuming features, with variations in suction and the roller bar design. The Saros 10R has what Roborock calls a "DuoDivide" anti-tangle brush, and I haven't seen it get tangled. I have wood floors, four cats, and two people with long hair, and there hasn't been clogging or tangling. Hair and fuzz can get wrapped around the bearings of the rubber wheels on the bottom and that's not fun to remove, but it hasn't stopped the vacuum from functioning.

As for suction, it picks up every speck of dust and cat hair from my floors, which are wood. It hasn't damaged my wood floors, which is always a concern with a vacuum, and it does a good job keeping them almost spotless. It also does well with my large rugs, though it can struggle with smaller rugs that get stuck. Unlike my other robot vacuum from a different brand, it's never been entirely tripped up by a rug, so it's not a problem that requires my intervention. There is also a side brush that sweeps dust from corners, curves, around furniture, and along walls so that it can be vacuumed up, and it's effective.

Of note, I have one large rug that sheds continually and the robot can keep it clean. My Dyson vacuum has a hard time with this rug because the fluff clogs it up. I have another rug in my laundry room that is a lint and fur magnet. The standard hard floor attachment on my Dyson that I use 99 percent of the time can't clean it, but the Saros 10R sucks up all the lint that's become one with the rug with no problem. It's at least on par with my Dyson (a V15 Detect), and better for some areas.

I want to point out that I have all hard flooring with just rugs and no rooms with carpet, so I haven't tested the vacuum on carpet. If you have carpet, check out other reviews too, because I don't want to speak to its carpet performance with just rugs to test with.

The Saros 10R doesn't just vacuum, it also mops, and the mopping functions are what distinguish robot vacuums from one another. Roborock is using a set of dual spinning mops for the Saros 10R, each of which is about palm-sized. The two mops work in unison, and the robot can cover a good amount of floor space with no real gaps, providing a thorough clean.


The dust bag that dust from the vacuum empties into. I've had the robot a couple of months now, but it hasn't filled up yet.

The mops can clean well around chair legs and other furniture, because they can extend out from the robot somewhat for a closer clean. Dried-on stains like ketchup can take a few passes, but everything else comes up immediately. The Saros 10R goes back to the base station to wash the mop, and the default interval is 15 minutes.

Noise Levels

The Saros 10R is quiet when mopping, but when it's vacuuming, it sounds like a vacuum. You're not going to want to run it while you're trying to work or watch TV, but noise cancelling headphones like the AirPods Pro block it out enough to run it while you're home.

Roborock App

Robot vacuums seem to have apps with endless settings, and the Roborock app is no exception. Setup is simple: the app walks you through the initial steps (basically just connecting to Wi-Fi) and then through the mapping, cleaning, and maintenance process.

The built-in sensors in the Saros 10R can map all accessible areas, and can also detect furniture to determine room type. There are also settings that allow it to detect objects on the ground and snap photos, so you can see what it's avoiding and how the AI is working. After a map has been completed, the Saros 10R is ready to clean. There are multiple cleaning modes that combine vacuuming and mopping, and you can select different suction settings for the vacuum and water settings for the mopping. There are also modes for quick, standard, and deep cleaning.

You can follow along on the map to see where the robot has cleaned, the pattern it used, and areas that it might have skipped due to obstacles. It's super detailed, and the feature to show pictures of what the robot noticed and skipped is useful. It detects pets (and can avoid them), cables, furniture where it can become trapped, pet messes, fabric, extra dirty areas, and more. I have rarely had an issue with the Saros 10R getting itself stuck because the AI to avoid cables and other obstacles is excellent.

The map created by the robot can be edited, so if it doesn't do a great job separating out rooms, it's possible to manually fix it. I have an open floor plan, so I went in and sectioned things out so that I could have it clean specific rooms. If you have multiple floors, you can create more than one map. You can add areas, remove areas, and block areas for customized cleaning.

There is an option in the app to watch the Saros 10R camera live so you can see what it's doing and check in on pets. I didn't have many instances where there were issues, but if it does catch a cord or something, the app sends an alert so it can be addressed.

The app supports setting up a cleaning schedule, so I have assigned it to clean different areas on Monday, Tuesday, and Wednesday, which splits up the cleaning so it doesn't need to be running for hours a day. I also have different cleaning scenarios that group rooms together with variable cleaning parameters that change factors like suction power for rooms without rugs.

HomeKit Integration

Thanks to Matter support, the Roborock Saros 10R can be added to ‌HomeKit‌ and it shows up in the Home app. It can be controlled with the Home app, or with ‌Siri‌ voice commands.

‌HomeKit‌ is more limited than what's possible with the Roborock app, but ‌Siri‌ integration has proven more useful than expected. When the kitchen is messy after eating, I can say "Hey ‌Siri‌, clean the kitchen," and the Saros 10R is able to do that based on the rooms I set up in the Roborock app.

In the Home app, there's support for automations that rely on other ‌HomeKit‌ products. If for some reason I want the Saros 10R to vacuum when I turn on a light, I can set that up. The Home app can also be used to have the robot clean at specific times, when a sensor detects something, or when people leave the home, which is perhaps the most useful option.

For starting a random cleaning, the Home app is less useful. It doesn't include all of the various cleaning modes and settings, but you can choose to vacuum, mop, or vacuum and mop. You can have the robot clean the entire house or specific rooms.

Maintenance

The Saros 10R is a complicated piece of machinery that does require some maintenance, though it hasn't been too overwhelming yet. So far, I've had to empty the dirty water tank, fill the clean water tank, replace cleaning fluid, clean the sensors, and clean the base.


The bottom of the Saros 10R after 61 total hours of operation. I haven't cleaned it.

Eventually I'll need to replace the filter and the dust bag (which should be done about once a month), and swap in new mop heads. Roborock provides estimated lifespans for some of the hardware, like the side brush, and suggests replacing other parts on an as-needed basis. The mop and the bristles of the various brushes wear out over time, with replacements available from Roborock.


Battery and Cleaning Time

The Saros 10R can't clean my entire house on a single charge, and it typically needs to charge at least once if I'm not splitting up the cleaning by room. Cleaning time can vary based on settings, such as suction power and whether it's set to "deep" clean with multiple passes, but here are a couple of scenarios.

Cleaning 1

  • Started at 12:30, full cleaning. 150 square meters total.
  • Ran out of battery at 14:47 and had to charge.
  • Resumed cleaning at 16:59.
  • Finished at 20:07. Total cleaning time: 244 minutes, 1 charge.

Cleaning 2

  • Started at 10:31. Deep clean, 199 square meters.
  • Ran out of battery at 13:21, and had to recharge.
  • Resumed at 15:32, finished at 17:49. Total cleaning time: 291 minutes.

Cleaning 3

  • Started at 18:23, three main rooms, 81 square meters. Standard clean.
  • Finished at 21:49. Total cleaning time: 133 minutes.

Cleaning 4

  • Started at 10:00 on schedule. Full clean, 101 square meters.
  • Finished at 14:38. Total cleaning time: 164 minutes.

The Saros 10R is not quick unless I have it in the quick mode that leaves more space between passes. It takes a significant amount of time for a clean, and it can and will run almost all day if I set it to clean the entire house and it needs to charge. Cleaning in sections means it doesn't need to charge as often.

It takes around 2.5 hours to recharge, but the robot always resumes cleaning automatically when it runs out of battery mid-job.
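As a rough illustration of how those totals add up, active run time can be tallied from the app's start and stop timestamps. This is a minimal sketch using hypothetical timestamps, not the app's own method; the robot's reported totals also appear to exclude pauses such as mop-washing trips, so they won't always match naive wall-clock math.

```python
from datetime import datetime

def cleaning_minutes(segments):
    """Sum active cleaning time, in minutes, across run segments.

    Each segment is a (start, end) pair of "HH:MM" strings;
    gaps between segments (e.g. recharging) are excluded.
    """
    total = 0
    for start, end in segments:
        t0 = datetime.strptime(start, "%H:%M")
        t1 = datetime.strptime(end, "%H:%M")
        total += int((t1 - t0).total_seconds()) // 60
    return total

# Hypothetical run: cleaned 10:00-12:10, recharged, resumed 14:30-15:20
print(cleaning_minutes([("10:00", "12:10"), ("14:30", "15:20")]))  # 180
```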

Bottom Line

The Saros 10R is one of Roborock's newest cleaning bots, and of the few that I've tried, it's the most impressive. My floor looks and feels much cleaner, and it takes almost no interaction from me. There have been a few instances where the robot gets itself tangled in cords and loses a mop, but the obstacle avoidance is good enough that it avoids a lot of cable issues, stays away from any pet messes, and knows not to get itself trapped in furniture that's hard to navigate. It doesn't struggle with my rugs, or the thresholds between rooms.

It's the most hands-off cleaning experience that I've had, and that has value. I don't have to spend over an hour a week vacuuming and more when I mop, I don't have to fuss with the robot, and the floor is clean throughout the week. I do wish the battery lasted longer because it can't get through the house on a single charge when it's vacuuming and mopping, but that's manageable by splitting up the areas cleaned each day.

At this price, a robot vacuum needs to be essentially perfect, and able to hold up to long-term daily use. I tested the Saros 10R by using it daily for a little over two months, because I wanted to give it more time than I would the average review. It has so far held up and continues to work well, but I'm going to keep using it to give updates on its longevity.

It definitely cleans super well, navigates almost perfectly, and is easy to use, but I expect something priced this high to work for a long time, and that's not a metric I can judge just yet.

How to Buy

The Saros 10R can be purchased from the Roborock website or from Amazon for $1,600.

Note: Roborock provided MacRumors with a Saros 10R for the purpose of this review. No other compensation was received.

Nanoleaf, known for its range of iPhone-connected lighting products, today announced the launch of two new devices. The Nanoleaf Rope Light is a super flexible LED light strip that can be bent into almost any shape.

It is more flexible than Nanoleaf's traditional light strips, and it has a silicone cover that diffuses the light so it does not need to be under or behind a TV, display, or other device. It can be crafted into a specific shape, manipulated into an abstract design, or used to outline furniture and decor.

Each $70 rope light is five meters (16 feet) long, and it can display multiple colors at one time. Nanoleaf ships it with mounting clips so that it can be securely mounted to a wall in a myriad of designs. There are 420 total LEDs inside the light, and it outputs 300 lumens, making it more of an ambient lighting product than a lamp replacement.

The Matter Smart Multicolor Rope Light can be controlled via the Nanoleaf app, and it is also able to connect to HomeKit over Matter so that it can be controlled with the Home app and with Siri voice commands. Matter integration also allows it to be controlled alongside other ‌HomeKit‌ products in scenes and automations. More than 16 million colors are supported, with customizable colors and patterns available in the Nanoleaf app.

Nanoleaf's Solar Garden Lights are an affordable solar accent lighting option, priced at $50 for two. Unlike most Nanoleaf products, the Solar Garden Lights do not connect to ‌HomeKit‌ or the Nanoleaf app, and they are instead controlled via an included remote control.

There are eight bulbs per Solar Garden light, along with a solar panel that can be placed in the sun to keep the lights charged up. Alternatively, the lights include a USB-C port and can also be charged that way.

We were able to test out the Solar Garden Lights ahead of launch, and found that they worked well. A quick USB-C charge provided enough power to get them up and running, and charging in the sun kept them powered. The lights are not designed to come on during the day, and like many solar lights, will activate only at night.

While app controls would be nice, the remote cycles through 11 animated scenes with different colors, and eight solid color options. There are also warm and white light settings for those who don't care for multicolored lights. The stems for the bulbs can be manipulated into an ideal shape, and Nanoleaf ships stakes in two sizes so you can adjust height to your liking. The lights look nice positioned around plants given the spray design.

The Solar Garden Lights feature IP65 weatherproofing so they will hold up to rain and the elements, and there are controls to run them for 4-, 6-, or 8-hour increments. Compared to the inexpensive solar lights that are easy to find at big box stores, Nanoleaf's have a sturdier solar attachment and quality, flexible bulbs with a unique look. Up to 20 lights can be controlled at once, and Nanoleaf has multi-packs available.

Both of the new products can be purchased from Nanoleaf's website starting today. The Solar Garden Lights are priced starting at $50, and the Rope Light is $70.

In addition to introducing new products today, Nanoleaf also announced that it is expanding into 2,500 Walmart retail locations across the United States. Nanoleaf products will be available for purchase at Walmart in addition to Amazon, the Nanoleaf website, Best Buy, and other retailers.

Google today announced that it is adding a dedicated AI Mode to Google Search, which is an expansion of the existing AI Overviews feature that Google provides for standard searches. AI Mode is able to handle longer, harder search queries with multiple facets.

Questions are broken down into different parts, with multiple searches run simultaneously on separate databases to provide a timely answer that takes into account personal context to make results more relevant to each user. It can generate images, including custom charts, and it supports follow-up questions.

AI Mode is rolling out to all users in the United States starting today, and it will be accessible through a new tab in Google Search for desktop and the Google Search apps. It will let users make more complex queries that are designed to leverage AI capabilities.

Google CEO Sundar Pichai said that AI Mode's predecessor, AI Overviews, has been "one of the most successful launches for search in the last decade." Pichai did not address the impact that AI Overviews have had on publishers. Both AI Overviews and AI Mode pull content from websites, and while AI Mode will display links much like AI Overviews does, the detailed summaries it provides give people little reason to click through to the sites the information came from.

AI Mode will incorporate AI agent capabilities, so it will be able to field requests like "find two affordable tickets for this Saturday's Reds game," and it will also work for shopping. AI Mode will be able to help users find specific items, such as a rug, with refinements that take into account factors like children or pets, colors, and more. Google's AI Mode will be able to track prices on items, and let you know when something you want is on sale.

For clothing shopping, AI Mode includes a virtual try-on option. It can use a photo of the user to create a preview of what an item of clothing will look like when worn. The feature works by generating an AI model of the user's body and displaying how clothing will drape in a realistic way. There's also a "buy it for me" option that can be used for making purchases directly from search. The new shopping features are set to arrive in the coming months, though try-on is available in Labs now.

This summer, AI Mode will adopt personal context, aggregating past searches and information from apps like Gmail to provide more customized responses.

AI Mode will now power AI Overviews in the standard search interface, and some of the new AI Mode capabilities will be incorporated into AI Overviews.

Apple today released a new firmware update for the USB-C version of the AirPods Max headphones. The new firmware is version 7E108, up from the prior 7E101 release.

There is no word on what's new in the firmware as of yet, if anything. It is likely that the update focuses on under-the-hood performance improvements.

With the prior 7E101 firmware, Apple added support for lossless audio and ultra low-latency audio for the USB-C ‌AirPods Max‌, so there could also be a bug fix related to this capability.

The USB-C ‌AirPods Max‌ now support 24-bit 48 kHz lossless audio, which is designed to allow listeners to experience music the way the artist created it in the studio.

Firmware can be installed by putting the ‌AirPods Max‌ in Bluetooth range of an iPhone, iPad, or Mac that's connected to Wi-Fi, and then plugging them in to charge. It can take up to 30 minutes for firmware to update.

You can check your firmware version by going to Settings > Bluetooth and selecting the Info button next to the ‌AirPods Max‌ when they are connected to an ‌iPhone‌, ‌iPad‌, or Mac.

Update: Apple's release notes state only that the update delivers "bug fixes and performance improvements" without providing any detail on what has changed.


Google is bringing a new live view feature to its Gemini apps on iOS and Android, allowing users to live stream their surroundings to Gemini to get feedback on what they're seeing. Integration is rolling out starting today.

In a demo, Gemini identified objects that the user was looking at, corrected wrong assumptions about what an object was, and provided context by answering questions. Gemini's live view can identify objects in real-time, provide assistance with DIY or home improvement projects, help organize spaces, assist with shopping, and more.

You can also share your iPhone or iPad screen with Gemini to get feedback on something that you see online.

Gemini's Live option could be helpful for those who are blind or who have low vision, as it can provide a live vocal feed of a person's surroundings. Live is built on Project Astra, which Google previewed at I/O last year, and it previously rolled out to Pixel devices.

This summer, Google also plans to add personal context to Gemini for features like Gmail's smart reply. Gemini will be able to use relevant information across Google apps to provide a more personalized experience. For example, the AI reply feature in Gmail can sound like you, rather than having a generic voice. It is able to scan past emails to match tone, style, and word choice, plus it can look up notes, documents from Google Drive, and more to add context to emails.

In addition to sharing a complete WWDC 2025 schedule today, Apple has outlined details about developer labs and community events.

Apple will be hosting free group labs from Tuesday, June 10 through Friday, June 13. During these labs, developers will be able to join Apple engineers online to discuss and ask questions about WWDC 2025 announcements. Developers can register starting today, with available topics ranging from Apple Intelligence to SwiftUI.

Developers will also be able to set up one-on-one appointments with Apple experts online for guidance on various topics, such as app design, privacy, accessibility, and more. Apple says one-on-one lab requests will open immediately after the WWDC 2025 keynote on Monday, June 9 via this page and the Apple Developer app.

Apple has shared helpful tips for both the group and one-on-one labs.

During the week of WWDC, Apple's engineers will answer questions and help to solve technical issues in the Apple Developer Forums.

Apple has also listed various community events that will be held around the world leading up to, during, and shortly after WWDC 2025.


Apple will make its artificial intelligence models available to developers to use in their apps, reports Bloomberg. The company plans to introduce a new software development kit (SDK) in iOS 19 that will make it easier for app creators to add AI features.

The SDK will feature the same large language models that Apple is using for Apple Intelligence features like notification summaries, Writing Tools, Genmoji, and Image Playground, but Apple will first focus on the smaller models that are able to run on-device.

Apple has faced criticism for its failure to deliver ‌Apple Intelligence‌ Siri features in a timely manner. Apple announced a new personalized ‌Siri‌ experience at WWDC 2024 and intended to release the new capabilities as part of iOS 18, but the functionality was not ready in time and is now being held until ‌iOS 19‌.

Some of Apple's other features, like Writing Tools and ‌Image Playground‌, haven't seen widespread adoption. According to Bloomberg, Apple is hoping that opening up its AI models to developers will provide use cases that better attract consumers. Currently, developers can integrate notification summaries, Writing Tools, ‌Genmoji‌, and ‌Image Playground‌ into their apps, but they aren't able to create new AI features using Apple's framework. Instead, developers who want to include AI integrate third-party models, which Apple is aiming to change.

Apple's AI announcements will come at the Worldwide Developers Conference, which is set to take place on Monday, June 9.

Apple has also been working on in-house large language models (LLMs), and eventually the company plans to introduce a version of ‌Siri‌ that relies on LLMs and is more like ChatGPT, Claude, and other chatbots. The LLM version of ‌Siri‌ isn't expected until 2026 at the earliest, and it will likely be part of iOS 20.


Apple today announced a more detailed schedule for its annual developers conference WWDC, which runs from June 9 through June 13. The schedule confirms that Apple's keynote will begin on Monday, June 9 at 10 a.m. Pacific Time, with a live stream to be available on Apple.com, in the Apple TV app, and on YouTube.

During the keynote, Apple is expected to announce iOS 19, iPadOS 19, macOS 16, watchOS 12, tvOS 19, visionOS 3, and other software updates, along with new Apple Intelligence features. In some years, there are also hardware announcements at WWDC, but there have been no rumors so far about new devices being unveiled at this year's conference.

Apple has reminded developers that WWDC 2025 is "on the horizon" on its developer news page, where developers can now register for group labs.

The keynote will be followed by the Platforms State of the Union video on June 9 at 1 p.m. Pacific Time. This video will provide a deeper dive into the latest features and tools for developers across Apple's software platforms. Apple says the video will be available to stream via the Apple Developer website, app, and YouTube channel.

Both the keynote and the Platforms State of the Union will be available for on-demand playback after each live stream concludes.

In the past, the WWDC schedule included an in-person Apple Design Awards event, but Apple moved to pre-announcing the winning apps last year.

WWDC 2025 will primarily take place online, with more than a hundred videos to be shared across the Apple Developer website, Apple Developer app, and YouTube for free. There will also be an in-person component, as more than 1,000 developers and students have been invited to attend a special day at Apple Park on June 9 to watch the keynote video together, meet some of Apple's teams, socialize, and more.


If you told someone in the 1960s that one day a flying robot would be able to deliver a computer that fits in their pocket to their backyard, they might think it was a scene from The Jetsons. However, that is now a very real possibility.

Amazon today announced that it recently received approval from the U.S. Federal Aviation Administration to offer drone delivery for several new categories of items, including Apple products like iPhones, AirPods, and AirTags.

Amazon launched its Prime Air drone delivery in 2022, promising drop-offs to a backyard or other specified location in less than an hour. However, the service is currently limited to parts of the Phoenix, Arizona and College Station, Texas areas in the U.S., and it is only offered during daylight hours and favorable weather conditions. In addition, drone delivery is only available for a single item that weighs up to five pounds.

The delivery location must be clear of obstacles and reachable by a drone.

While this cutting-edge technology is currently limited to just two cities in the U.S., Amazon promises to expand it to additional cities over time. The company has also promised to launch the service in the UK and Italy, subject to regulatory approval.

Amazon's announcement was first reported by The Verge.
