Apple is looking to create an "entirely new application paradigm" for augmented and virtual reality according to a new job listing, highlighting the company's growing ambitions in the AR and VR space as it ramps up the development of its next-generation products.
The role will include "working closely with Apple's UI frameworks, Human Interface designers and system software teams" in building out Apple's augmented and virtual reality experiences. "This role will push you to think outside-the-box, and solve incredibly ambitious and interesting problems in the AR/VR space," the job listing adds.
Following years of research and development, Apple is expected to release its first AR-focused device in 2022. The device, likely to come in the form of a headset, will be Apple's first major push into the increasingly vibrant AR and VR space. The company's first headset is not expected to be a mainstream hit at first, with credible reports suggesting it will remain a niche product aimed largely at developers for media consumption, communication, and gaming.
Further down the line, Apple plans to release augmented reality glasses. These glasses, unlike the headset, will have a smaller form factor and are likely to appeal more to the masses. "Apple Glasses" are expected to debut in 2025 at the earliest, with the second generation of Apple's AR headset now rumored to launch in 2024.
AR is where it is at. Walk into a store. Look at items on the shelves. Eye tracking knows what you are looking at. HUD displays the price of the product, how often you buy the product. If you buy the product often and have run out at home, a reminder might pop up and indicate how many items would be worth purchasing (based on your purchase history).
You pick up a pack of biscuits, and instantly the ingredients list pops up (no more impossibly small ingredients lists on packages). If you are someone with allergies, it can warn you if there are ingredients that might cause you a problem. Perhaps even indicate a similar item that does not contain something you are allergic to.
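The allergen check described above is easy to picture in code. This is a minimal sketch only: the product data, field names, and substitute lookup are all hypothetical, and no real AR or retail API is assumed.

```python
# Hypothetical sketch of the allergen-warning scenario: match a recognized
# product's ingredients against a user's allergy profile, and suggest a
# same-category alternative that avoids the flagged ingredients.

def allergen_warnings(ingredients, user_allergens):
    """Return the subset of the user's allergens present in an ingredients list."""
    normalized = {item.strip().lower() for item in ingredients}
    return sorted(a for a in user_allergens if a.lower() in normalized)

def suggest_substitute(catalog, category, user_allergens):
    """Find a same-category product containing none of the user's allergens."""
    for product in catalog:
        if product["category"] == category and not allergen_warnings(
            product["ingredients"], user_allergens
        ):
            return product["name"]
    return None

# Made-up example data for illustration only.
biscuits = {"name": "Choco Biscuits", "category": "biscuits",
            "ingredients": ["wheat flour", "sugar", "peanuts", "palm oil"]}
catalog = [biscuits,
           {"name": "Oat Crunch", "category": "biscuits",
            "ingredients": ["oat flour", "sugar", "sunflower oil"]}]

warnings = allergen_warnings(biscuits["ingredients"], {"peanuts", "soy"})
print(warnings)  # ['peanuts']
print(suggest_substitute(catalog, "biscuits", {"peanuts"}))  # Oat Crunch
```

In practice the hard part is the recognition step (knowing which product you picked up), not this lookup; the matching itself is trivial once the data exists.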
As you scan the aisles looking for something, the system notices you are searching - prompting you to indicate what you are looking for. You say 'cheese', and a pop-up on the display shows 'aisle 14' with an arrow indicating the direction to head in.
All pricing labels, paper notices, stickers etc. will disappear. As a store owner - when there is a need to change the price of something - change it in the system and you are done. No need to update signs or anything - customers see the new price via their AR immediately. Add a new product to the shelves, same story.
Repairing something - HUD shows which screws to remove next, their type and size by highlighting them on the item you are looking at using AR. Complete a step and it shows you the next step. Diagnosing a hardware problem, status shows on the HUD.
It goes on and on, and it is all AR. VR might be 'cool' or useful in training, but I really think AR is where it is at.
The biggest issue is to have a data interchange format that all AR devices support, image recognition that works, and the ability for people to easily create AR resources such as a repair guide for a piece of hardware, or a map of a store and where items are on shelves. The likelihood that AR itself could be used as a tool for inputting this data in the first place is immensely high. Walk around your store and it creates a map; look at shelves and it picks up on products and where they are located. No need to type all this data in, making it easy for people to get started.
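To make the interchange-format idea concrete, here is a sketch of what a shared store-map document might look like, plus the 'where is the cheese?' query from earlier run against it. The JSON field names are illustrative assumptions, not any real standard.

```python
# Hypothetical interchange format for a store map that any AR device could
# consume: a JSON document listing aisles, shelf positions, and products.
import json

store_map = {
    "store": "Example Grocery",
    "aisles": [
        {"id": 14, "label": "Dairy",
         "shelves": [
             {"position": "A3", "product": "cheddar cheese", "price": 4.99},
             {"position": "A4", "product": "brie", "price": 6.50},
         ]},
    ],
}

def find_product(store_map, query):
    """Answer a 'where is X?' query against the shared map format."""
    for aisle in store_map["aisles"]:
        for shelf in aisle["shelves"]:
            if query.lower() in shelf["product"].lower():
                return {"aisle": aisle["id"], "position": shelf["position"]}
    return None

# The map round-trips through JSON, so any device could exchange it.
assert json.loads(json.dumps(store_map)) == store_map
print(find_product(store_map, "cheese"))  # {'aisle': 14, 'position': 'A3'}
```

A real format would need much more (coordinates for AR anchoring, versioning, localization), but the point stands: once the data exists in one agreed shape, the display layer on any headset becomes straightforward.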
Interesting comments here calling for a "killer app".
An anecdote: I was watching a presentation at a large mobile phone conference (GSM World, maybe) just when mobile internet was becoming available. Everyone was looking for the "killer app" that would make it take off. The presenter said "You're all looking for the killer app, but the killer app for mobile phones is voice!" The entire audience, many hundreds of mobile professionals, gave a standing ovation at that comment.
That was the view of professionals back then, and they completely failed to understand the usefulness of data on the phone. They couldn't envision anything beyond incremental progress. Look at what we have now, and realize that there is no single "killer app" for VR/AR. Entire industries will be created, and we have no idea what they will be from our perspective of today. Don't make the mistake of the phone people.