Apple on Tuesday announced a set of accessibility features aimed at helping users with disabilities. The new features, which are coming to the iPhone, Apple Watch, and Mac later this year, are said to combine advances in hardware, software, and machine learning to assist people who are visually impaired or have low vision, as well as those with physical or motor disabilities. The features include Door Detection for iPhone and iPad users, Apple Watch Mirroring, and Live Captions. Apple also announced updates to VoiceOver, with 20 additional locales and languages.
One of the most useful accessibility features in the latest update is Door Detection, which uses the LiDAR sensor on recent iPhone and iPad models to help users locate a door. The feature combines LiDAR, the camera, and on-device machine learning to gauge how far users are from a door and describe its attributes, including whether it is open or closed, the company said.
If the door is closed, Door Detection can tell users whether it opens by pushing, turning a knob, or pulling a handle. It is also claimed to read signs and symbols around the door, such as a room number, and even to recognise the presence of an accessible entrance symbol.
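Apple has not published how Door Detection is implemented, but the general technique it describes, pairing LiDAR depth with on-device image understanding, can be illustrated with public frameworks. The Swift sketch below is only a rough, assumption-based approximation, not Apple's pipeline: it uses ARKit's LiDAR-backed scene reconstruction to estimate the distance to the surface directly ahead and Vision's text recogniser to read nearby signage, and the class name and the idea of treating the centre of the frame as the door are hypothetical.

import ARKit
import Vision
import simd

// Illustrative sketch only; Apple has not published how Door Detection works.
// It shows how LiDAR-backed scene reconstruction (ARKit) and on-device text
// recognition (Vision) could be combined to estimate the distance to the
// surface ahead and read nearby signage.
final class DoorDetectionSketch: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // Scene reconstruction needs a LiDAR-equipped iPhone Pro or iPad Pro.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Ray-cast from the centre of the image to estimate how far away the
        // surface directly ahead (for example, a door) is.
        let query = frame.raycastQuery(from: CGPoint(x: 0.5, y: 0.5),
                                       allowing: .estimatedPlane,
                                       alignment: .vertical)
        if let hit = session.raycast(query).first {
            let surface = hit.worldTransform.columns.3
            let camera = frame.camera.transform.columns.3
            let metres = simd_distance(SIMD3(surface.x, surface.y, surface.z),
                                       SIMD3(camera.x, camera.y, camera.z))
            print(String(format: "Surface ahead at %.1f m", metres))
        }

        // Read any signage near the door (room numbers, printed labels) with
        // Vision's on-device text recogniser.
        let textRequest = VNRecognizeTextRequest { request, _ in
            let lines = (request.results as? [VNRecognizedTextObservation])?
                .compactMap { $0.topCandidates(1).first?.string } ?? []
            if !lines.isEmpty {
                print("Signage: \(lines.joined(separator: ", "))")
            }
        }
        textRequest.recognitionLevel = .accurate
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                            orientation: .right)
        try? handler.perform([textRequest])
    }
}

A real pipeline would also need a trained door classifier and a symbol detector, and would throttle the Vision work rather than running it on every camera frame; none of that is part of this sketch.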
The Door Detection feature, which will work with the iPhone 13 Pro, iPhone 13 Pro Max, iPhone 12 Pro, iPhone 12 Pro Max, iPad Pro 11-inch (2020), iPad Pro 11-inch (2021), and the iPad Pro 12.9-inch (2020) and iPad Pro 12.9-inch (2021), will be available through the pre-installed Magnifier app.
Apple's Magnifier app will gain a new Detection Mode that provides access to Door Detection alongside People Detection and Image Descriptions. The three features can be used on their own or simultaneously to support users who are visually impaired or have low vision.
Alongside the updates within Magnifier, Apple Maps will also get sound and haptic feedback for users who have enabled VoiceOver, to help them identify the starting point for walking directions, the company announced.
The Apple Watch will also get dedicated Apple Watch Mirroring support to let users control the smartwatch remotely from their paired iPhone. The new offering lets users control the Apple Watch using the iPhone's assistive features, including Voice Control and Switch Control, with inputs such as voice commands, sound actions, head tracking, and even external Made for iPhone switches serving as alternatives to tapping the Apple Watch display.
All this will assist people with physical and motor disabilities.
Apple said that Apple Watch Mirroring uses hardware and software integration, including advances built on AirPlay, to give users access to features including Blood Oxygen, Heart Rate tracking, and the Mindfulness app. The mirroring feature will work with the Apple Watch Series 6 and later models.
Apple Watch users will also get double-pinch gesture support. This will help users answer or end a phone call, dismiss a notification, take a photo, play or pause media in the Now Playing app, and start, pause, or resume a workout — all by using the double-pinch gesture. It will work with AssistiveTouch on Apple Watch.
For deaf users or those with hearing impairments, Apple announced Live Captions on the iPhone, iPad, and Mac. It will be available later this year in beta in English for users in the US and Canada on the iPhone 11 and later, iPad models with A12 Bionic and later, and Macs with Apple silicon.
Live Captions will work with any audio content, including phone and FaceTime calls, video conferencing and social media apps, and streaming media, and even when users are having an in-person conversation with someone next to them, the company said.
Apple is bringing Live Captions to iPhone, iPad, and Mac users (Photo Credit: Apple)
Users can adjust the font size for ease of reading. In FaceTime, Live Captions will also attribute auto-transcribed dialogue to individual call participants, making it easier for users with hearing disabilities to communicate over video calls.
On the Mac, Live Captions will come with the option to type a response and have it spoken aloud in real time to others in the conversation, Apple said. The company also claimed that Live Captions will be generated on device, keeping privacy and user safety in mind.
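Apple has not said which frameworks power Live Captions, but the on-device claim can be illustrated with APIs that already exist on iOS and macOS. The Swift sketch below is an assumption-heavy approximation rather than Apple's implementation: it uses the Speech framework's requiresOnDeviceRecognition flag to keep transcription local, and AVSpeechSynthesizer to speak a typed reply, loosely mirroring the Mac type-to-speak option. The function names are hypothetical, and permission prompts and microphone plumbing are omitted.

import Speech
import AVFoundation

// Illustrative sketch only; Apple has not published how Live Captions works.
// Transcribe buffered microphone audio entirely on device.
// (A real app must first call SFSpeechRecognizer.requestAuthorization and
// retain the returned recognition task.)
func makeOnDeviceCaptionRequest() throws -> SFSpeechAudioBufferRecognitionRequest {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_US")),
          recognizer.isAvailable else {
        throw NSError(domain: "CaptionSketch", code: 1)
    }
    let request = SFSpeechAudioBufferRecognitionRequest()
    request.requiresOnDeviceRecognition = true // keep audio off the network
    _ = recognizer.recognitionTask(with: request) { result, _ in
        if let result = result {
            print("Caption: \(result.bestTranscription.formattedString)")
        }
    }
    // Feed AVAudioPCMBuffers from an AVAudioEngine input tap into this request.
    return request
}

// Speak a typed response aloud, similar in spirit to the Mac option.
let synthesizer = AVSpeechSynthesizer()
func speakTypedResponse(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    synthesizer.speak(utterance)
}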
Apple's native screen reader — VoiceOver — is also getting 20 additional locales and languages, including Bengali, Bulgarian, Catalan, Ukrainian, and Vietnamese. There will also be dozens of new voices that are touted to be optimised for assistive features across all supported languages.
The new languages, locales, and voices will also be available for the Speak Selection and Speak Screen features. Further, VoiceOver on Mac will work with a new Text Checker tool that flags formatting issues such as duplicated spaces or misplaced capital letters.
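Apple has not described how Text Checker works under the hood, but the kinds of issues mentioned here are straightforward to detect with ordinary string processing, as the hypothetical Swift sketch below shows. The function name and the specific patterns are illustrative assumptions, not Apple's tool.

import Foundation

// Hypothetical illustration of the formatting checks described above;
// this is not Apple's Text Checker.
func reportFormattingIssues(in text: String) -> [String] {
    var issues: [String] = []
    // Two or more consecutive spaces.
    if text.range(of: "  +", options: .regularExpression) != nil {
        issues.append("duplicated spaces")
    }
    // An uppercase letter appearing after lowercase letters mid-word, e.g. "senTence".
    if text.range(of: "\\b\\p{Ll}+\\p{Lu}", options: .regularExpression) != nil {
        issues.append("misplaced capital letter")
    }
    return issues
}

// Prints ["duplicated spaces", "misplaced capital letter"]
print(reportFormattingIssues(in: "This  senTence has  issues."))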
Apple also introduced some additional accessibility features to celebrate Global Accessibility Awareness Day this week. These include Siri Pause Time, which lets users adjust how long the voice assistant waits before responding to a request; Buddy Control, which lets users ask a care provider or friend to help them play a game; and customisable Sound Recognition, which can be tuned to recognise sounds that are specific to a person's environment, like their home's unique alarm, doorbell, or appliances.
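Apple has not explained how the customised Sound Recognition training works, but the underlying technique, listening to the microphone and classifying audio against known sound classes on device, can be sketched with the public SoundAnalysis framework. The Swift example below uses Apple's built-in classifier rather than a user-trained one, so it is only a rough, assumption-based illustration; the class name and confidence threshold are hypothetical, and microphone permission handling is omitted.

import SoundAnalysis
import AVFoundation

// Illustrative sketch only; Apple has not published how the customised
// Sound Recognition feature is built. This listens to the microphone and
// classifies audio on device with the built-in sound classifier.
final class SoundListenerSketch: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let format = engine.inputNode.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)
        self.analyzer = analyzer

        // Stream microphone buffers into the analyzer.
        engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, when in
            analyzer.analyze(buffer, atAudioFramePosition: when.sampleTime)
        }
        try engine.start()
    }

    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8 else { return }
        // e.g. "door_bell" or "smoke_detector" from the built-in class list.
        print("Heard: \(top.identifier)")
    }
}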
The preloaded Apple Books app will also include new themes and customisation options, such as bolding text and adjusting line, character, and word spacing, to deliver a more accessible reading experience. Further, starting this week, the Shortcuts app on Mac and Apple Watch will gain a new Accessibility Assistant shortcut that recommends accessibility features based on user preferences.
Apple Maps will also get a new guide from the National Park Foundation, Park Access for All, to help users discover accessible features, programmes, and services to explore in parks across the US. A separate set of guides from Gallaudet University will additionally highlight businesses and organisations that value, embrace, and prioritise the Deaf community and signed languages.
Users will also get accessibility-focussed apps and stories from developers in the App Store, as well as the Transforming Our World collection in Apple Books with stories by and about people with disabilities. Apple Music will also highlight Saylists, a set of playlists that each focus on a different sound.
Similarly, the Apple TV app will feature the latest hit movies and shows featuring authentic representation of people with disabilities.
Users will also get the ability to explore guest-curated collections from the accessibility community's standout actors, including Marlee Matlin ("CODA"), Lauren Ridloff ("Eternals"), Selma Blair ("Introducing, Selma Blair"), and Ali Stroker ("Christmas Ever After"), among others.
The Apple Fitness+ service will also this week bring on trainer Bakari Williams, who will use American Sign Language (ASL) to highlight features including Audio Hints, short descriptive verbal cues that support users who are visually impaired or have low vision, while the Time to Walk and Time to Run episodes become "Time to Walk or Push" and "Time to Run or Push" for wheelchair users.
ASL will also be a part of every workout and meditation on Apple Fitness+, and all videos will include closed captioning in six languages. Trainers will also demonstrate modifications in each workout to help people requiring accessibility assistance join in.
Apple is additionally launching SignTime to connect Apple Store and Apple Support customers with on-demand ASL interpreters. SignTime is already available for customers in the US using ASL, the UK using British Sign Language (BSL), and France using French Sign Language (LSF). Furthermore, Apple Store locations around the world have already started offering live sessions throughout the week to help customers discover accessibility features on iPhone, and Apple Support social channels are showcasing how-to content, the company said.