
Apple introduces new accessibility features including Door Detection and Live Captions

On Tuesday, Apple announced a set of accessibility features designed to help users with disabilities. The new features, coming to iPhone, Apple Watch, and Mac later this year, are said to use advances in hardware, software, and machine learning to help people who are blind or have low vision, as well as those with physical and motor disabilities. They include Door Detection for iPhone and iPad users, Apple Watch Mirroring, and Live Captions. Apple also announced updates to VoiceOver with 20 additional languages and locales.

One of the most useful accessibility features Apple introduced as part of its latest updates is Door Detection, which uses the LiDAR sensor on the latest iPhone and iPad models to help users locate a door. The feature uses a combination of LiDAR, the camera, and on-device machine learning to understand how far users are from a door and to describe the door’s attributes, including whether it is open or closed, the company says.

If the door is closed, Door Detection can also tell people how it opens, whether by pushing, turning a knob, or pulling a handle. The feature is also claimed to read signs and symbols around the door, such as the room number, and even to recognize the presence of an accessible entrance symbol.
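Apple has not published how Door Detection works internally, but the ingredients it names (LiDAR, the camera, and on-device machine learning) map roughly onto public frameworks: ARKit's scene depth gives the distance to a surface, and Vision's text recognition can read nearby signage. The Swift sketch below is a hypothetical approximation of that pipeline, not Apple's implementation; recognizing the door itself would still require a dedicated model, which is left out here.

```swift
import ARKit
import Vision

// A rough sketch of a LiDAR + Vision pipeline in the spirit of Door Detection.
// This is NOT Apple's implementation; the real feature lives inside the Magnifier app.
final class DoorDistanceSketch: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // Scene depth requires a LiDAR-equipped device (the same Pro models the article lists).
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Distance (in meters) to whatever surface sits at the center of the depth map,
        // used here as a stand-in for "distance to the detected door".
        if let depthMap = frame.sceneDepth?.depthMap {
            CVPixelBufferLockBaseAddress(depthMap, .readOnly)
            let width = CVPixelBufferGetWidth(depthMap)
            let height = CVPixelBufferGetHeight(depthMap)
            let bytesPerRow = CVPixelBufferGetBytesPerRow(depthMap)
            if let base = CVPixelBufferGetBaseAddress(depthMap) {
                let offset = (height / 2) * bytesPerRow + (width / 2) * MemoryLayout<Float32>.size
                let meters = base.advanced(by: offset).assumingMemoryBound(to: Float32.self).pointee
                print(String(format: "Surface ahead: %.1f m", meters))
            }
            CVPixelBufferUnlockBaseAddress(depthMap, .readOnly)
        }

        // Read any signage near the door (room numbers, accessible-entrance text).
        // For brevity this runs on every frame; a real app would throttle it.
        let textRequest = VNRecognizeTextRequest { request, _ in
            let lines = (request.results as? [VNRecognizedTextObservation])?
                .compactMap { $0.topCandidates(1).first?.string } ?? []
            if !lines.isEmpty { print("Signage:", lines.joined(separator: " | ")) }
        }
        try? VNImageRequestHandler(cvPixelBuffer: frame.capturedImage).perform([textRequest])
    }
}
```

In practice, deciding that a region of the frame is a door, and whether it is open or closed, is the on-device machine learning Apple refers to, and that model is not exposed as a public API.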

The Door Detection feature will work with iPhone 13 Pro, iPhone 13 Pro Max, iPhone 12 Pro, iPhone 12 Pro Max, iPad Pro 11-inch (2020), iPad Pro 11-inch (2021), iPad Pro 12.9-inch (2020), and iPad Pro 12.9-inch (2021), and will be available through the preinstalled Magnifier app.

The Magnifier app will gain a new detection mode that provides access to the Door Detection feature. It will also offer People Detection and Image Descriptions, which can work on their own or alongside Door Detection to help people with visual impairments or low vision.

Along with the updates to Magnifier, the company announced that Apple Maps will also get audio and haptic feedback for users with VoiceOver enabled, to help them identify the starting point for walking directions.

The Apple Watch is also getting Apple Watch Mirroring, which lets users control the smartwatch remotely from their paired iPhone. The feature lets users control their Apple Watch with the iPhone's assistive features, including Voice Control and Switch Control, and use inputs such as voice commands, sound actions, head tracking, and even Made for iPhone external switches as an alternative to tapping the Apple Watch display.

All this will help people with physical and motor disabilities.

Apple said that Apple Watch Mirroring uses hardware and software integration, including advances built on AirPlay, to let users access features such as Blood Oxygen, heart rate tracking, and the Mindfulness app. The mirroring feature will work with Apple Watch Series 6 and later.

Apple Watch users will also get support for a double-pinch gesture. It will let users answer or end a phone call, dismiss a notification, take a photo, play or pause media in the Now Playing app, and start, pause, or resume a workout, and it builds on the technology used in AssistiveTouch on Apple Watch.

For users who are deaf or hard of hearing, Apple has announced Live Captions on iPhone, iPad, and Mac. The feature will be available in beta later this year, in English, for users in the US and Canada on iPhone 11 and later, iPad models with the A12 Bionic chip and later, and Macs with Apple silicon.

According to the company, Live Captions will work with any audio content, including phone and FaceTime calls, video conferencing and social media apps, and streaming media, as well as conversations with someone nearby.

Apple brings Live Captions to iPhone, iPad and Mac users
Photo Credit: Apple

Users can adjust the font size for readability. In FaceTime, the feature will also attribute automatically transcribed dialogue to individual call participants, making it easier for users who are deaf or hard of hearing to follow video calls.

On the Mac, Live Captions will include the ability to type a response and have it spoken aloud in real time to other participants in the conversation, Apple says. The company also claims that Live Captions are generated on device, keeping user privacy and security in mind.
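Live Captions itself is a system feature with no public API, but the on-device, privacy-preserving transcription Apple describes resembles what third-party apps can already do with the Speech framework. The following is a minimal sketch using SFSpeechRecognizer with on-device recognition enabled; it approximates the idea rather than Apple's implementation, and it omits the microphone and speech-recognition permission prompts a real app would need.

```swift
import AVFoundation
import Speech

// A minimal on-device transcription sketch in the spirit of Live Captions.
// Uses the public Speech framework; permission handling is omitted for brevity.
final class CaptionSketch {
    private let engine = AVAudioEngine()
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_US"))
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private var task: SFSpeechRecognitionTask?

    func start() throws {
        // Keep recognition on device, mirroring the privacy claim in the announcement.
        if recognizer?.supportsOnDeviceRecognition == true {
            request.requiresOnDeviceRecognition = true
        }
        request.shouldReportPartialResults = true

        // Feed microphone audio into the recognition request.
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            self.request.append(buffer)
        }
        engine.prepare()
        try engine.start()

        // Print rolling caption text as partial results arrive.
        task = recognizer?.recognitionTask(with: request) { result, error in
            if let text = result?.bestTranscription.formattedString {
                print("Caption:", text)
            }
            if error != nil || result?.isFinal == true {
                self.engine.stop()
            }
        }
    }
}
```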

Apple’s native screen reader, VoiceOver, is also getting 20 additional languages and locales, including Bengali, Bulgarian, Catalan, Ukrainian, and Vietnamese. There will also be dozens of new voices touted as optimized for assistive features across all supported languages.

New languages, locales, and voices will also be available for the Speak Selection and Speak Screen features. In addition, VoiceOver on Mac will work with a new Text Checker tool to discover formatting issues such as duplicated spaces or misplaced capital letters.
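As a rough illustration of the kind of check such a tool performs, the toy snippet below flags duplicated spaces and stray mid-word capital letters in a string; it is a simple stand-in for the concept, not the VoiceOver tool itself.

```swift
import Foundation

// A toy stand-in for the formatting checks described above:
// it flags duplicated spaces and capital letters that appear mid-word.
func formattingIssues(in text: String) -> [String] {
    var issues: [String] = []
    if text.contains("  ") {
        issues.append("duplicated spaces")
    }
    // A capital letter directly preceded by a lowercase letter, e.g. "senTence".
    if text.range(of: "[a-z][A-Z]", options: .regularExpression) != nil {
        issues.append("misplaced capital letter")
    }
    return issues
}

// Example: prints ["duplicated spaces", "misplaced capital letter"]
print(formattingIssues(in: "This  senTence has issues"))
```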

Apple has also introduced some additional accessibility features to mark Global Accessibility Awareness Day this week. These include Siri Pause Time, which lets users adjust how long the voice assistant waits before responding to a request; Buddy Controller, which lets users ask a care provider or friend to help them play a game; and customizable Sound Recognition, which can be tuned to recognize sounds specific to a person's environment, such as a unique home alarm, doorbell, or appliance.
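Sound Recognition is likewise a system feature, but the public SoundAnalysis framework gives a feel for how on-device sound classification works. The sketch below runs Apple's built-in classifier over an audio file; it is only an illustration, and recognizing custom sounds the way the accessibility feature does would require training on the user's own recordings, which is not shown here.

```swift
import SoundAnalysis

// A small sketch of on-device sound classification with the public SoundAnalysis
// framework -- an illustration, not Apple's Sound Recognition feature.
final class SoundSketch: NSObject, SNResultsObserving {
    func classify(fileAt url: URL) throws {
        let analyzer = try SNAudioFileAnalyzer(url: url)
        // Apple's built-in classifier recognizes a few hundred common sounds
        // (doorbell, siren, dog bark, and so on).
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)
        analyzer.analyze()
    }

    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        print("Heard \(top.identifier) (confidence \(top.confidence))")
    }
}

// Usage (with a hypothetical local file):
// try SoundSketch().classify(fileAt: URL(fileURLWithPath: "doorbell.wav"))
```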

The preloaded Apple Books app will also gain new themes and customization options, such as bolding text and adjusting line, character, and word spacing, to make reading more accessible. In addition, starting this week, the Shortcuts app on Mac and Apple Watch will recommend accessibility features based on user preferences through a new Accessibility Assistant shortcut.

Apple Maps will also receive a new Park Access for All guide from the National Park Foundation to help users discover accessible features, programs, and services to explore in parks across the US, as well as a travel guide from Gallaudet University highlighting businesses and organizations that value, support, and prioritize the Deaf community and sign language.

Users will also find accessibility-focused apps and stories from developers on the App Store, as well as the Transforming Our World collection in Apple Books, featuring stories by and about people with disabilities. Apple Music will also highlight Saylists playlists, each dedicated to a different sound.

Likewise, the Apple TV app will feature recent hit movies and shows with authentic representations of people with disabilities.

Users will also be able to explore collections curated by prominent members of the accessibility community, including Marlee Matlin ("CODA"), Lauren Ridloff ("Eternals"), Selma Blair ("Introducing, Selma Blair"), Ali Stroker ("Christmas Ever After"), and others.

Apple Fitness+ will also feature trainer Bakary Williams this week, sharing tips in American Sign Language (ASL) on key Fitness+ features, including Audio Hints (short, descriptive verbal cues to support users who are blind or have low vision) and Time to Walk and Time to Run episodes that become "Time to Walk or Push" and "Time to Run or Push" for wheelchair users.

ASL will also be part of every workout and meditation on Apple Fitness+, and all videos will include closed captions in six languages. Trainers will also demonstrate modifications for each workout so that users who need them can join in.

Apple also offers SignTime, which connects Apple Store and Apple Support customers with on-demand ASL interpreters. SignTime is already available to customers in the US using ASL, in the UK using British Sign Language (BSL), and in France using French Sign Language (LSF). In addition, Apple Stores around the world already offer live sessions throughout the week to help customers learn about accessibility features on iPhone, and Apple's social support channels showcase how-to content, the company said.

