    Apple’s got some game-changing accessibility features coming. Learn what’s dropping soon

    Synopsis

    Apple has recently announced new accessibility features. They cover both visual and audio aids, and they can be used across the Apple ecosystem.


    Apple Enhances Vision Pro to Better Support the Blind and Visually Impaired

    Apple recently announced new accessibility features coming later this year, including Accessibility Nutrition Labels, which will provide more detailed information for apps and games on the App Store. Users who are blind or have low vision can explore, learn, and interact using the new Magnifier app for Mac; take notes and perform calculations with the new Braille Access feature; and leverage the powerful camera system of Apple Vision Pro with new updates to visionOS.

    Additional announcements include Accessibility Reader, a new systemwide reading mode designed with accessibility in mind, along with updates to Live Listen, Background Sounds, Personal Voice, Vehicle Motion Cues, and more. Leveraging the power of Apple silicon, along with advances in on-device machine learning and artificial intelligence, users will experience a new level of accessibility across the Apple ecosystem.

    Accessibility Nutrition Labels Are Coming to the App Store

    Apple’s making it easier to know if an app actually works for you before you hit download. The new Accessibility Nutrition Labels will live right on App Store product pages, calling out key accessibility features like VoiceOver, Voice Control, Larger Text, High Contrast, Reduced Motion, captions, and more.

    It’s a win-win: users can instantly see whether an app supports their needs, and developers get a clear way to show how inclusive their design really is. These labels are rolling out globally, and Apple’s also offering developers detailed guidance on how to meet the criteria.

    Magnifier Comes to Mac — Here’s How It’s Leveling Up Accessibility

    The handy Magnifier tool that’s been on iPhone and iPad since 2016 is now making its way to Mac — and it’s bringing some serious upgrades. Designed for users who are blind or have low vision, the new Magnifier app for Mac uses your camera to zoom in on real-world objects — think screens, whiteboards, or even documents.
    It works seamlessly with Continuity Camera from your iPhone or any connected USB cam, and supports Desk View for scanning and reading docs. You can even run multiple live sessions at once — like following a presentation while reading along from a book.

    What’s more, you get full control: tweak brightness, contrast, apply colour filters, adjust the perspective, and save your custom views for later. And with Accessibility Reader built in, it can take physical text and turn it into an easy-to-read format tailored just for you.


    Braille Access Turns Apple Devices Into Powerful Braille Note Takers — Here’s How

    Apple’s introducing Braille Access — a brand-new feature that transforms your iPhone, iPad, Mac, and even Apple Vision Pro into fully-loaded braille note takers, built right into the ecosystem.

    With a built-in app launcher, users can open any app just by typing with Braille Screen Input or a connected braille device. You can jot down notes, do quick math using Nemeth Braille (the go-to for math and science), and even open BRF files — giving you access to tons of books and documents created on traditional braille devices.
    And here’s a game-changer: Live Captions now work directly with braille displays, letting users follow real-time conversations in braille as they happen.

    Meet Accessibility Reader — A Smarter Way to Read, Your Way

    Apple’s new Accessibility Reader is built to make reading easier for everyone — especially users with dyslexia, low vision, or other reading challenges. Rolling out across iPhone, iPad, Mac, and Apple Vision Pro, it’s a systemwide mode that lets you fully customize how text looks and feels.

    You get total control: tweak fonts, adjust colours and spacing, and even turn on Spoken Content for audio support. It works across any app — and it’s also baked right into the Magnifier app, so you can use it to read real-world text too, like books or restaurant menus.

    Live Captions Just Landed on Apple Watch

    Apple Watch is stepping up its accessibility game with Live Captions, now available for users who are deaf or hard of hearing. It works hand-in-hand with Live Listen, which uses your iPhone as a remote mic to stream audio straight to AirPods, Beats, or Made for iPhone hearing aids.

    Now, when Live Listen is active, your Apple Watch shows real-time captions of what your iPhone hears — so you can read along while listening. You can also control sessions right from your wrist: start, stop, or even rewind to catch something you missed, without needing to reach for your phone.

    And because it works from across the room, it’s perfect for meetings, classrooms, or everyday conversations. Plus, it pairs seamlessly with hearing health features on AirPods Pro 2, including Apple’s new clinical-grade Hearing Aid mode.

    Apple Vision Pro Is Getting Smarter for Low Vision and Blind Users


    Apple’s supercharged Vision Pro is about to get even more helpful for users who are blind or have low vision. Thanks to updates in visionOS, the device now uses its advanced camera system to expand accessibility like never before.

    With Zoom, you can magnify everything in view — not just on-screen elements, but your actual surroundings. And for VoiceOver users, a new Live Recognition feature taps into on-device machine learning to describe your environment, spot objects, read text, and more — all in real time.

    Plus, a new developer API will let approved apps access the main camera to offer live visual support — so apps like Be My Eyes can provide hands-free, person-to-person assistance whenever you need it.
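    Apple has not yet published details of that new API. As a rough illustration only, the sketch below uses ARKit's existing CameraFrameProvider path on visionOS (currently gated behind an Apple-granted camera-access permission) to show what consuming the main camera feed generally looks like; the function name and the shape of the announced accessibility API are assumptions, not the confirmed interface.

        import ARKit

        // Rough sketch: reading Apple Vision Pro's main camera frames via ARKit's
        // CameraFrameProvider (requires special camera-access permission from Apple).
        // The accessibility API announced above may expose this differently.
        func streamMainCameraFrames() async {
            let session = ARKitSession()
            let provider = CameraFrameProvider()

            // Ask for camera-access authorization before running the provider.
            _ = await session.queryAuthorization(for: [.cameraAccess])

            // Pick a supported video format for the main (left) camera.
            guard let format = CameraVideoFormat
                .supportedVideoFormats(for: .main, cameraPositions: [.left])
                .first else { return }

            do {
                try await session.run([provider])
            } catch {
                return
            }

            // Stream frames and hand each pixel buffer to an assistance pipeline,
            // e.g. relaying it to a remote sighted helper.
            guard let updates = provider.cameraFrameUpdates(for: format) else { return }
            for await frame in updates {
                guard let sample = frame.sample(for: .left) else { continue }
                let pixelBuffer = sample.pixelBuffer
                _ = pixelBuffer // placeholder for real processing
            }
        }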

    We also had the opportunity to talk to Sarah Herrlinger, Apple's Senior Director of Global Accessibility Policy and Initiatives.

    We asked her how Apple zeroes in on which accessibility feature to develop next. She replied:
    “The core of it is communication with the communities themselves. Sometimes that comes from our own employees. So, you know, obviously in the mantra of ‘nothing about us without us’, it starts by hiring people with lived experience who come in and say, gosh, I just wish my iPhone would help me solve this specific problem, and that might be the kernel that gets us to something. Or it could be a single email that comes in from someone who says, I love this device, but my specific disability means I have a limitation here, and we'll look at that and say, wow, that's a fascinating situation. I wonder how we could fix that.”

    She also shared her insights on how accessibility features can be triggered using the Action button.

    “There are some accessibility features that can be triggered by the Action button. We love the Action button for accessibility uses, you know, as much as everybody else. And so there are features like Magnifier and Reader, as well as even just starting the accessibility shortcut.”

    (Catch all the Business News, Breaking News, Budget 2024 Events and Latest News Updates on The Economic Times.)

    Subscribe to The Economic Times Prime and read the ET ePaper online.
