iOS 15's Live Text Feature Lets You Digitize Written Notes, Call a Number on a Sign, Translate a Menu, and Much More

In iOS 15, Apple is introducing a new feature called Live Text that can recognize text when it appears in your camera's viewfinder or in a photo you've taken and let you perform several actions with it.

For example, Live Text allows you to capture a phone number from a storefront with the option to place a call, or look up a location name in Maps to get directions. It also incorporates optical character recognition, so you can search for a picture of a handwritten note in your photos and save it as text.
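Live Text itself is a built-in system feature, but the same kind of on-device text recognition is available to developers through Apple's Vision framework. As a rough, illustrative sketch of the underlying idea (not Apple's Live Text code; the function name is ours), a VNRecognizeTextRequest can pull the text out of a photo:

    import UIKit
    import Vision

    // Recognize text in a still image on-device and return the lines found.
    func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
        guard let cgImage = image.cgImage else {
            completion([])
            return
        }
        let request = VNRecognizeTextRequest { request, _ in
            let observations = request.results as? [VNRecognizedTextObservation] ?? []
            // Take the top candidate string for each detected text region.
            let lines = observations.compactMap { $0.topCandidates(1).first?.string }
            DispatchQueue.main.async { completion(lines) }
        }
        request.recognitionLevel = .accurate      // favor accuracy over speed for still photos
        request.usesLanguageCorrection = true
        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        DispatchQueue.global(qos: .userInitiated).async {
            do {
                try handler.perform([request])
            } catch {
                DispatchQueue.main.async { completion([]) }
            }
        }
    }

The .accurate recognition level trades speed for quality, which is exactly the kind of work that benefits from dedicated machine-learning hardware like the Neural Engine.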

Live Text's content awareness extends to everything from QR codes to emails that appear in pictures, and this on-device intelligence feeds into Siri suggestions, too.

For instance, if you take a picture that shows an email address and then open the Mail app and start composing a message, Siri's keyboard suggestions will offer up the option to add "Email from Camera" to the To field of your message.
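The "actionable" part, turning a recognized string into something you can call, email, or map, resembles what Foundation's data detectors do. A minimal sketch of that general approach (an illustration, not Apple's Live Text pipeline; the function name is ours):

    import Foundation

    // Scan recognized text for phone numbers, links/emails, and street addresses
    // so the UI can offer actions such as "call" or "get directions".
    func detectActionableItems(in text: String) {
        let types: NSTextCheckingResult.CheckingType = [.phoneNumber, .link, .address]
        guard let detector = try? NSDataDetector(types: types.rawValue) else { return }
        let fullRange = NSRange(text.startIndex..., in: text)
        detector.enumerateMatches(in: text, options: [], range: fullRange) { match, _, _ in
            guard let match = match else { return }
            switch match.resultType {
            case .phoneNumber:
                print("Phone number:", match.phoneNumber ?? "")
            case .link:
                // Email addresses surface here as mailto: URLs.
                print("Link:", match.url?.absoluteString ?? "")
            case .address:
                print("Address:", match.addressComponents ?? [:])
            default:
                break
            }
        }
    }

QR codes, which Live Text's content awareness also covers, are handled separately in Vision via VNDetectBarcodesRequest.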

Other Live Text options include the ability to copy text from the camera viewfinder or photos for pasting elsewhere, share it, look it up in the dictionary, and translate it for you into English, Chinese (both simplified and traditional), French, Italian, German, Spanish, or Portuguese.

The same on-device intelligence also lets Photos sort and search your pictures by location, people, scenes, objects, and more, and because Live Text recognizes the text in images, searching for a word or phrase in Spotlight will bring up pictures from your Camera Roll in which that text appears.

Live Text works in Photos, Screenshot, Quick Look, and Safari, as well as in live previews in the Camera app. In the Camera app, it's available whenever you point your iPhone's camera at anything that displays text: a small icon appears in the bottom-right corner of the viewfinder whenever textual content is recognized, and tapping it lets you select the recognized text and perform an action with it. A similar icon appears in the Photos app when you're viewing a captured image.
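In the viewfinder case, recognition has to run on live frames rather than a saved photo. A hedged sketch of how an app could do something similar with AVFoundation and Vision (it assumes an AVCaptureSession is already configured to deliver frames to this delegate; the session setup is omitted and the class name is ours):

    import AVFoundation
    import Vision

    // Runs lightweight text recognition on incoming camera frames so the UI can
    // show a "text detected" indicator, similar in spirit to the Live Text icon.
    final class ViewfinderTextScanner: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        var onTextDetected: (([String]) -> Void)?

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

            let request = VNRecognizeTextRequest { [weak self] request, _ in
                let observations = request.results as? [VNRecognizedTextObservation] ?? []
                let lines = observations.compactMap { $0.topCandidates(1).first?.string }
                guard !lines.isEmpty else { return }
                DispatchQueue.main.async { self?.onTextDetected?(lines) }
            }
            request.recognitionLevel = .fast   // lower latency for a live preview

            let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                                orientation: .right,
                                                options: [:])
            try? handler.perform([request])
        }
    }

A real app would throttle how often this runs rather than processing every single frame, since sustained per-frame recognition is exactly the kind of workload that strains older chips and batteries.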

In another Neural Engine-powered feature, Apple is introducing Visual Look Up, which lets you take photos of objects and scenes to get more information about them. Point your iPhone's camera at a piece of art, a plant, an animal, a landmark, or a book, and the Camera app will show an icon indicating that it recognizes the content and has relevant Siri Knowledge that can add context.
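Visual Look Up itself draws on Siri Knowledge and isn't exposed as an API, but the on-device recognition step it starts from looks a lot like Vision's built-in image classification. A sketch under that assumption (the function name is ours):

    import UIKit
    import Vision

    // Classify the scene/objects in a photo on-device and return confident labels,
    // e.g. "plant" or "dog", drawn from Vision's built-in taxonomy.
    func classifyImage(_ image: UIImage,
                       completion: @escaping ([(label: String, confidence: Float)]) -> Void) {
        guard let cgImage = image.cgImage else {
            completion([])
            return
        }
        let request = VNClassifyImageRequest { request, _ in
            let observations = request.results as? [VNClassificationObservation] ?? []
            let labels = observations
                .filter { $0.confidence > 0.3 }    // keep only reasonably confident labels
                .map { (label: $0.identifier, confidence: $0.confidence) }
            DispatchQueue.main.async { completion(labels) }
        }
        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        DispatchQueue.global(qos: .userInitiated).async {
            do {
                try handler.perform([request])
            } catch {
                DispatchQueue.main.async { completion([]) }
            }
        }
    }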

Since Live Text relies heavily on Apple's Neural Engine, the feature is only available on iPhones and iPads with an A12 Bionic chip or newer. That means if you have an iPhone X or earlier, or an iPad older than the iPad mini (5th generation), iPad Air (3rd generation, 2019), or iPad (8th generation, 2020), you unfortunately won't have access to it.

The iOS 15 beta is currently in the hands of developers, with a public beta set to be released next month. The official launch of iOS 15 is scheduled for the fall.

Top Rated Comments

ruka.snow
61 months ago
Let's see them digitise doctors notes and prescriptions.
Score: 19 Votes
61 months ago

They don't want to. Hence why they aren't enabling it for Intel Macs. Just a cash grab, as they hope everybody will sell their newish Intel Mac, which they paid handsomely for, at a huge loss and then go and pay handsomely again for a new Apple Silicon Mac.
I'm sure Apple could get it running on the A11, but people apparently forget there would be trade-offs to make that happen. What if enabling it on the A11 halved the battery life? Would you want that? Would that be a trade-off you'd be willing to make? Or what if the performance of the phone dropped by 25%?

Apple can't (and shouldn't be expected to) support older hardware with every single new feature that's released. Apple does a far better job of supporting older hardware than its competitors do (the iPhone 6s, a 2015 phone, gets iOS 15), and those devices will still get the benefit of at least some new features and improvements.

I'm sure some people will want to upgrade their hardware to support the latest software additions, but I (and, I'm sure, a lot of other folks with newish machines) won't be bothered enough. Nice features, but I won't lose sleep over not having them.


Apple isn’t the company from heaven as most seem to think.
Of course they're not; Apple is a money-making enterprise.
Score: 14 Votes
61 months ago

Forgive me if I'm wrong, but wasn't the A11 the first chip with a Neural Engine? So surely they could make this work on the X (and even the 8 and 8 Plus) if they REALLY wanted to.
The later phones have an improved Neural Engine. I'm guessing that these are required in order to ensure good performance.
Score: 8 Votes
61 months ago

The old Word Lens app (and subsequent iterations after Google acquired it) was able to both capture and translate text in real time using substantially slower hardware.

Live Text would just be this, without the translation (whether on device or not). I can't see why the Neural Engine would be an absolute requirement, even if you try to make a 'performance' argument.
If you used Word Lens for any sustained period, it absolutely hammered your battery. And just to be clear, that isn't a criticism of Word Lens; it was an exceptional app for its time, it just couldn't take advantage of optimised silicon.

The performance argument is *the* argument. Apple more than likely could implement it on all hardware, but what if it caused your battery life to be halved, or Safari and the Camera app to be considerably slower while running CPU-bound neural networks on every downloaded image or every camera frame?

Also worth noting, text recognition isn't a trivial task, especially with handwriting.
Score: 8 Votes
61 months ago
Implemented better than Android.
Score: 5 Votes
LukeHarrison
61 months ago

Since Live Text relies heavily on Apple's neural engine, the feature is only available on iPhones and iPads with at least an A12 Bionic or better chip, which means if you have an iPhone X or earlier model or anything less than an iPad mini (5th generation), iPad Air (2019, 3rd generation), or iPad (2020, 8th generation), then unfortunately you won't have access to it.
Forgive me if I'm wrong, but wasn't the A11 the first chip with a Neural Engine? So surely they could make this work on the X (and even the 8 and 8 Plus) if they REALLY wanted to.
Score: 5 Votes