iOS 15's Live Text Feature Lets You Digitize Written Notes, Call a Number on a Sign, Translate a Menu, and Much More

In iOS 15, Apple is introducing a new feature called Live Text that can recognize text when it appears in your camera's viewfinder or in a photo you've taken and let you perform several actions with it.

For example, Live Text allows you to capture a phone number from a storefront with the option to place a call, or look up a location name in Maps to get directions. It also incorporates optical character recognition, so you can search for a picture of a handwritten note in your photos and save it as text.
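Apple hasn't documented Live Text's internals, but the same kind of on-device OCR is exposed to developers through the Vision framework's `VNRecognizeTextRequest`, available since iOS 13. A minimal sketch of recognizing text in an image this way (a sketch of the related public API, not Live Text itself):

```swift
import Vision
import CoreGraphics

// Recognize text in an image entirely on-device using the Vision framework.
// The `.accurate` level uses the machine-learning OCR stack, as opposed to
// the faster character-based `.fast` path.
func recognizeText(in cgImage: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate      // ML-based, slower but more precise
    request.usesLanguageCorrection = true     // fix common OCR confusions (e.g. 0 vs O)

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    // Each observation carries ranked candidate strings; keep the best per line.
    return (request.results ?? []).compactMap {
        $0.topCandidates(1).first?.string
    }
}
```

Live Text builds actions (call, map, translate) on top of results like these; the system feature itself had no third-party API at launch.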

Live Text's content awareness extends to everything from QR codes to emails that appear in pictures, and this on-device intelligence feeds into Siri suggestions, too.

For instance, if you take a picture that shows an email address and then open the Mail app and start composing a message, ‌Siri‌'s keyboard suggestions will offer up the option to add "Email from Camera" to the To field of your message.

Other Live Text options include the ability to copy text from the camera viewfinder or photos for pasting elsewhere, share it, look it up in the dictionary, and translate it for you into English, Chinese (both simplified and traditional), French, Italian, German, Spanish, or Portuguese.

Live Text also makes the text in your pictures searchable: looking up a word or phrase in Spotlight will bring up photos from your library in which that text occurs, alongside the existing ability to search by location, people, scenes, and objects.

Live Text works in Photos, Screenshot, Quick Look, and Safari, as well as in live previews with Camera. In the Camera app, it's available whenever you point your iPhone's camera at anything that displays text: a small icon appears in the bottom-right corner of the viewfinder whenever textual content is recognized. Tapping the icon lets you select the recognized text and perform an action with it. A similar icon appears in the Photos app when you're viewing a photo you've taken.

In another Neural Engine-powered feature, Apple is introducing Visual Look Up, which lets you get more information about the objects and scenes in your photos. Point your iPhone's camera at a piece of art, flora, fauna, a landmark, or a book, and an icon will indicate that the Camera recognizes the content and has relevant Siri Knowledge that can add context.

Since Live Text relies heavily on Apple's Neural Engine, the feature is only available on iPhones and iPads with an A12 Bionic chip or newer. If you have an iPhone X or earlier, or anything older than an iPad mini (5th generation), iPad Air (3rd generation, 2019), or iPad (8th generation, 2020), you unfortunately won't have access to it.

The iOS 15 beta is currently in the hands of developers, with a public beta set to be released next month. The official launch of iOS 15 is scheduled for the fall.



Top Rated Comments

ruka.snow:
Let's see them digitise doctors' notes and prescriptions.
Unggoy Murderer:
> They don't want to. Hence why they aren't enabling it for the Intel Macs. Just a cash grab as they hope everybody will sell their newish Intel Mac, which they paid handsomely for, at a huge loss and then go and pay handsomely again for a new Apple Silicon Mac.

I'm sure Apple could get it running on the A11, but people apparently forget there would be trade-offs to enable that. What if enabling it on the A11 halved the battery life? Would that be a trade-off you'd be willing to make? Or what if the phone's performance dropped by 25%?

Apple can't (and shouldn't be expected to) support older hardware with every single new feature that's released. Apple supports older hardware far better than its competitors do (the iPhone 6s, a 2015 phone, gets iOS 15), and those devices still get at least some new features and improvements.

I'm sure some people will want to upgrade their hardware to get the latest software additions, but I (and I'm sure a lot of other folks with newish machines) won't be bothered enough. Nice features, but I won't lose sleep over not having them.

> Apple isn't the company from heaven as most seem to think.

Of course they're not: Apple is a money-making enterprise.
EmotionalSnow:
> Forgive me if I'm wrong, but wasn't the A11 the first chip with a Neural Engine? So surely they could make this work on the X (and even the 8 and 8 Plus) if they REALLY wanted to.

The later phones have an improved Neural Engine; I'm guessing that's required to ensure good performance.
Unggoy Murderer:
> The old Word Lens app (and subsequent iterations after Google acquired it) was able to both capture and translate text in real time using substantially slower hardware.
>
> Live Text would just be this, without the translation (whether on device or not). I can't see why the Neural Engine would be an absolute requirement, even if you try to make a 'performance' argument.

Try using Word Lens for any sustained period: it absolutely hammered your battery. And just to be clear, that isn't a criticism of Word Lens; it was an exceptional app for its time, it just couldn't take advantage of optimised silicon.

The performance argument is *the* argument. Apple more than likely could implement it on all hardware, but what if it halved your battery life, or made Safari and the Camera app considerably slower while CPU-bound neural networks ran on every downloaded image or every camera frame?

Also worth noting: text recognition isn't a trivial task, especially with handwriting.
farewelwilliams:
Implemented better than Android.
LukeHarrison:
> Since Live Text relies heavily on Apple's Neural Engine, the feature is only available on iPhones and iPads with an A12 Bionic chip or newer, which means if you have an iPhone X or earlier, or anything older than an iPad mini (5th generation), iPad Air (3rd generation, 2019), or iPad (8th generation, 2020), you unfortunately won't have access to it.

Forgive me if I'm wrong, but wasn't the A11 the first chip with a Neural Engine? So surely they could make this work on the X (and even the 8 and 8 Plus) if they REALLY wanted to.