Whether for work, school, relaxation or fun, smartphones and other devices have become a critical part of our lives. But for those with disabilities, these products aren’t always as accessible or inclusive as they should be. Luckily, Apple is committed to changing this.

In a recent announcement, Apple stated that it will introduce a slew of new accessibility features and updates to existing ones by the end of the year. These include features designed for users who are blind or have low vision, users who are deaf or hard of hearing, and those with physical or sensory challenges.

“We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO. “That’s why, for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”

Here’s what’s coming to Apple devices:

Eye Tracking

Powered by artificial intelligence, Eye Tracking gives users a built-in option for navigating iPad and iPhone with just their eyes. Designed for users with physical disabilities, Eye Tracking uses the front-facing camera to set up and calibrate in seconds, and with on-device machine learning, all data used to set up and control this feature is kept securely on the device and isn’t shared with Apple.

Eye Tracking works across iPadOS and iOS apps and doesn’t require additional hardware or accessories. With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes and other gestures solely with their eyes.

Music Haptics

Music Haptics is a new way for users who are deaf or hard of hearing to experience music on iPhone. With this accessibility feature turned on, the Taptic Engine in iPhone plays taps, textures and refined vibrations to the audio of the music. Music Haptics works across millions of songs in the Apple Music catalog and will be available as an API for developers to make music more accessible in their apps.

New Speech Features

With Vocal Shortcuts, iPhone and iPad users can assign custom utterances that Siri can understand to launch shortcuts and complete complex tasks. Listen for Atypical Speech, another new feature, uses on-device machine learning to recognize a user's speech patterns and improve speech recognition for a wider range of speech. Designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS) or stroke, these features provide a new level of customization and control, building on features introduced in iOS 17 for users who are nonspeaking or at risk of losing their ability to speak.

Vehicle Motion Cues

Vehicle Motion Cues is a new experience for iPhone and iPad that can help reduce motion sickness for passengers in moving vehicles. Research shows that motion sickness is commonly caused by a sensory conflict between what a person sees and feels, which can prevent some users from comfortably using an iPhone or iPad while riding in a moving vehicle. With Vehicle Motion Cues, animated dots on the edges of the screen represent changes in vehicle motion to help reduce sensory conflict without interfering with the main content. Using sensors built into iPhone and iPad, Vehicle Motion Cues recognizes when a user is in a moving vehicle and responds accordingly. The feature can be set to show automatically on the iPhone or turned on and off in the Control Center.

CarPlay Gets Voice Control

Accessibility features coming to CarPlay include Voice Control, Color Filters and Sound Recognition. With Voice Control, users can navigate CarPlay and control apps with just their voice. With Sound Recognition, drivers or passengers who are deaf or hard of hearing can turn on alerts to be notified of car horns and sirens. For users who are colorblind, Color Filters make the CarPlay interface easier to see; additional visual accessibility features include Bold Text.

Accessibility Features for visionOS

This year, accessibility features coming to visionOS will include systemwide Live Captions to help everyone—including users who are deaf or hard of hearing—follow along with spoken dialogue in live conversations and audio from apps. With Live Captions for FaceTime in visionOS, more users can easily enjoy the unique experience of connecting and collaborating using their Persona.

Apple Vision Pro will add the capability to move captions using the window bar during Apple Immersive Video and support for additional Made for iPhone hearing devices and cochlear hearing processors. Updates for vision accessibility will include the addition of Reduce Transparency, Smart Invert and Dim Flashing Lights for users with low vision or those who want to avoid bright lights and frequent flashing.

Other Features

Several smaller features will also be added or updated to make devices more accessible. These include new ways to start and stay in Braille Screen Input, a Reader Mode for Magnifier, expanded language availability for Personal Voice, new voice options for VoiceOver, improved compatibility between Live Speech and Live Captions, an updated dictionary for Voice Control and many others.

Visit Apple for more information.


This article was originally published on diversitycomm.net.
