iOS 15's Live Text Feature Lets You Digitize Written Notes, Call a Number on a Sign, Translate a Menu, and Much More

In iOS 15, Apple is introducing a new feature called Live Text that recognizes text when it appears in your camera's viewfinder or in a photo you've taken and lets you perform several actions with it.

For example, Live Text allows you to capture a phone number from a storefront with the option to place a call, or look up a location name in Maps to get directions. It also incorporates optical character recognition, so you can search for a picture of a handwritten note in your photos and save it as text.

Live Text's content awareness extends to everything from QR codes to emails that appear in pictures, and this on-device intelligence feeds into Siri suggestions, too.
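
For developers curious how this kind of on-device recognition works: Live Text itself has no public API in iOS 15, but Apple's Vision framework exposes similar OCR through VNRecognizeTextRequest. The sketch below is a minimal illustration assuming a UIKit app target; the function names recognizeText and bestLines are illustrative, not part of any Apple API.

```swift
import UIKit
import Vision

// Picks the top-ranked candidate string from each observation's
// candidate list. A pure helper, separated out for clarity.
func bestLines(from candidates: [[String]]) -> [String] {
    return candidates.compactMap { $0.first }
}

// Runs Vision's on-device OCR over a UIImage and returns the
// recognized lines of text via the completion handler.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }
    let request = VNRecognizeTextRequest { request, _ in
        let observations = (request.results as? [VNRecognizedTextObservation]) ?? []
        // Each observation offers ranked candidate readings; keep the best per line.
        let candidates = observations.map { $0.topCandidates(1).map(\.string) }
        completion(bestLines(from: candidates))
    }
    request.recognitionLevel = .accurate   // slower, but better for handwriting
    request.usesLanguageCorrection = true
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

Once you have the recognized strings, system facilities like NSDataDetector can pick out phone numbers and addresses to act on, which is roughly the "content awareness" layer described above.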

For instance, if you take a picture that shows an email address and then open the Mail app and start composing a message, ‌Siri‌'s keyboard suggestions will offer up the option to add "Email from Camera" to the To field of your message.

Other Live Text options include the ability to copy text from the camera viewfinder or photos and paste it elsewhere, share it, look it up in the dictionary, or translate it into English, Chinese (both simplified and traditional), French, Italian, German, Spanish, or Portuguese.

By recognizing the text in pictures, it can even help sort your photos by location, people, scenes, objects, and more. For example, searching for a word or phrase in Spotlight will bring up pictures from your Camera Roll in which that text appears.

Live Text works in Photos, Screenshot, Quick Look, and Safari, as well as in live previews through the Camera app. In Camera, it's available whenever you point your iPhone at anything that displays text: a small icon appears in the bottom-right corner of the viewfinder whenever textual content is recognized. Tapping the icon highlights the recognized text so you can select it and perform an action. A similar icon appears in the Photos app when you're viewing a captured image.

In another Neural Engine-powered feature, Apple is introducing Visual Look Up, which lets you take photos of objects and scenes to get more information about them. Point your iPhone's camera at artwork, plants, animals, landmarks, or books, and an icon will indicate that the Camera recognizes the content and has relevant Siri Knowledge that can add context.

Since Live Text relies heavily on Apple's Neural Engine, the feature is only available on iPhones and iPads with an A12 Bionic chip or newer. That means if you have an iPhone X or earlier, or an iPad older than the iPad mini (5th generation), iPad Air (3rd generation, 2019), or iPad (8th generation, 2020), you unfortunately won't have access to it.

The iOS 15 beta is currently in the hands of developers, with a public beta set to be released next month. The official launch of iOS 15 is scheduled for the fall.

Top Rated Comments

ruka.snow Avatar
52 months ago
Let's see them digitise doctors' notes and prescriptions.
Score: 19 Votes (Like | Disagree)
Unggoy Murderer Avatar
52 months ago

They don’t want to. Hence why they aren’t enabling it for the intel macs. Just a cash grab as they hope everybody will sell their newish intel Mac, which they paid handsomely for, at a huge loss and then go and pay handsomely again for a new Apple silicon Mac.
I'm sure Apple could get it running on the A11, but people apparently forget there would be trade-offs to make that happen. What if enabling it on the A11 halved the battery life? Would that be a trade-off you'd be willing to make? Or what if the phone's performance dropped by 25%?

Apple can't (and shouldn't be expected to) support older hardware with every single new feature that's released. Apple does a far better job of supporting older hardware than its competitors (the iPhone 6s, a 2015 phone, gets iOS 15), and those devices still get at least some new features and improvements.

I'm sure some people will want to upgrade their hardware to support the latest software additions, but I (and I'm sure like a lot of other folks like me with newish machines), won't be bothered enough. Nice features, but I won't lose sleep over not having them.


Apple isn’t the company from heaven as most seem to think.
Of course they're not - Apple is a money-making enterprise.
Score: 14 Votes (Like | Disagree)
EmotionalSnow Avatar
52 months ago

Forgive me if I'm wrong, but wasn't the A11 the first chip with a Neural Engine? So surely they could make this work on the X (and even the 8 and 8 Plus) if they REALLY wanted to.
The later phones have an improved Neural Engine. I'm guessing that these are required in order to ensure good performance.
Score: 8 Votes (Like | Disagree)
Unggoy Murderer Avatar
52 months ago

The old Word Lens app (and its subsequent iterations after Google acquired it) was able to both capture and translate text in real time on substantially slower hardware.

Live Text would just be this, without the translation (whether on device or not). I can't see why the Neural Engine would be an absolute requirement, even if you try to make a 'performance' argument.
Using Word Lens for any sustained period absolutely hammered your battery. And just to be clear, that isn't a criticism of Word Lens - it was an exceptional app for its time, it just couldn't take advantage of optimised silicon.

The performance argument is *the* argument. Apple more than likely could implement it on all hardware, but what if it caused your battery life to be half of usual, or made Safari and the Camera app considerably slower while running CPU-bound neural networks on every image downloaded or every camera frame?

Also worth noting, text recognition isn't a trivial task, especially with handwriting.
Score: 8 Votes (Like | Disagree)
farewelwilliams Avatar
52 months ago
implemented better than android
Score: 5 Votes (Like | Disagree)
LukeHarrison Avatar
52 months ago

Since Live Text relies heavily on Apple's Neural Engine, the feature is only available on iPhones and iPads with an A12 Bionic chip or newer, which means if you have an iPhone X or earlier model or anything less than an iPad mini (5th generation), iPad Air (3rd generation, 2019), or iPad (8th generation, 2020), then unfortunately you won't have access to it.
Forgive me if I'm wrong, but wasn't the A11 the first chip with a Neural Engine? So surely they could make this work on the X (and even the 8 and 8 Plus) if they REALLY wanted to.
Score: 5 Votes (Like | Disagree)