New Accessibility Features For Your Device

Discover Apple's latest accessibility features, including Eye Tracking and Music Haptics, designed to enhance usability for everyone.

Apple has announced a suite of new accessibility features set to debut later this year, aimed at enhancing usability for individuals with disabilities. Among the standout features is Eye Tracking, which enables users to control their iPad or iPhone using only their eyes. Additional announcements include Music Haptics, Vocal Shortcuts, Vehicle Motion Cues, and a host of new accessibility tools for visionOS.

These features are designed to leverage Apple’s powerful hardware and software, including Apple silicon, artificial intelligence, and machine learning.

iPhone 15 Pro Accessibility Functions

Expanding the Reach of Accessibility

“We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO. “That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”

“Each year, we break new ground when it comes to accessibility,” added Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.”

Eye Tracking Comes To iPad & iPhone

Eye Tracking, powered by machine learning, offers a revolutionary way for users with physical disabilities to navigate their iPad or iPhone using only their eyes. The front-facing camera calibrates the feature within seconds, and on-device learning ensures all data is securely stored on the device.

Eye Tracking integrates seamlessly across iPadOS and iOS apps without the need for additional hardware. Users can navigate app elements and use Dwell Control to activate functions such as physical buttons and gestures, all with their eyes.

What Is Dwell Control?

Dwell Control is an accessibility feature that allows users with physical disabilities to interact with their device seamlessly. By leveraging eye-tracking technology or alternative pointing devices, users can navigate and select on-screen elements simply by gazing at them or hovering the cursor over them for a predetermined amount of time. This system eliminates the need for physical touch, making it easier for individuals with limited motor abilities to operate their devices.
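The dwell mechanic itself is easy to reason about: a selection fires once the gaze point or cursor stays within a small radius for a set duration. The Swift sketch below is an illustrative, simplified model of that logic only, not Apple's implementation; the DwellSelector type, its default values, and its update(point:at:) method are hypothetical.

```swift
import Foundation
import CoreGraphics

// Illustrative sketch of dwell selection: a pointer (gaze or cursor) that stays
// within `tolerance` points of the same spot for `dwellDuration` seconds
// triggers a selection. Hypothetical example, not Apple's implementation.
final class DwellSelector {
    private let dwellDuration: TimeInterval
    private let tolerance: CGFloat
    private var anchor: CGPoint?
    private var anchorTime: Date?

    init(dwellDuration: TimeInterval = 1.0, tolerance: CGFloat = 20) {
        self.dwellDuration = dwellDuration
        self.tolerance = tolerance
    }

    /// Feed the latest gaze/cursor position; returns true when a dwell completes.
    func update(point: CGPoint, at time: Date = Date()) -> Bool {
        if let anchor, let start = anchorTime {
            let dx = point.x - anchor.x
            let dy = point.y - anchor.y
            if (dx * dx + dy * dy).squareRoot() <= tolerance {
                // Still dwelling on the same spot; fire once the duration elapses.
                if time.timeIntervalSince(start) >= dwellDuration {
                    self.anchor = nil
                    self.anchorTime = nil
                    return true
                }
                return false
            }
        }
        // First sample, or the pointer drifted away: restart the dwell timer here.
        anchor = point
        anchorTime = time
        return false
    }
}
```

Resetting the anchor whenever the pointer drifts outside the tolerance radius is what prevents accidental activations while a user is simply scanning the screen.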

Music Haptics Makes Songs More Accessible

Music Haptics is designed to help users who are deaf or hard of hearing experience music through the Taptic Engine in iPhone. This feature converts audio into tactile sensations, enabling users to feel the rhythm and nuances of songs. Music Haptics will be available for millions of songs in the Apple Music catalog and accessible to developers through an API.
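Apple has not detailed that developer API here. As a rough illustration of the underlying idea, the sketch below uses the existing Core Haptics framework to map beat times to tactile pulses on the Taptic Engine; the BeatHaptics type and its methods are assumptions made for this example, not the announced Music Haptics API.

```swift
import CoreHaptics

// Illustrative only: a generic Core Haptics sketch that plays one tactile "tap"
// per beat. It shows the audio-to-haptics idea in principle and is not the
// Music Haptics API described above.
final class BeatHaptics {
    private var engine: CHHapticEngine?

    func start() throws {
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        engine = try CHHapticEngine()
        try engine?.start()
    }

    /// Plays one transient pulse per beat time (in seconds from now).
    func play(beats: [TimeInterval]) throws {
        let events = beats.map { time in
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6)
                ],
                relativeTime: time
            )
        }
        let pattern = try CHHapticPattern(events: events, parameters: [])
        let player = try engine?.makePlayer(with: pattern)
        try player?.start(atTime: CHHapticTimeImmediate)
    }
}
```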

New Features For A Wide Range Of Speech

Vocal Shortcuts allow users to assign custom utterances for Siri to launch shortcuts and perform complex tasks. Listen for Atypical Speech enhances speech recognition for users with conditions like cerebral palsy, ALS, or those recovering from a stroke. This feature uses on-device machine learning to adapt to unique speech patterns, providing greater customisation and control.
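The article does not specify how Vocal Shortcuts will hook into third-party apps. For context, the sketch below shows how an app can already expose an action to Siri and the Shortcuts app using the existing App Intents framework, which is the kind of action a custom utterance could plausibly trigger; the StartWorkoutIntent example is hypothetical.

```swift
import AppIntents

// A minimal sketch, assuming an app wants to expose an action that a user
// could bind to a spoken phrase via Shortcuts. Hypothetical example.
struct StartWorkoutIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Workout"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific work would happen here (e.g. starting a session).
        return .result(dialog: "Workout started.")
    }
}
```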

“Artificial intelligence has the potential to improve speech recognition for millions of people with atypical speech,” said Mark Hasegawa-Johnson, principal investigator at the Speech Accessibility Project. “We are thrilled that Apple is bringing these new accessibility features to consumers.”

Apple CarPlay Accessibility Features: Vehicle Motion Cues

Vehicle Motion Cues & CarPlay Voice Control

Vehicle Motion Cues aim to reduce motion sickness for iPhone and iPad users in moving vehicles. This feature uses animated dots on the screen’s edges to represent vehicle motion, helping to reduce sensory conflict without distracting from the main content. The feature activates automatically when motion is detected or can be manually controlled via Control Centre.

CarPlay is also receiving new accessibility features, including Voice Control, Colour Filters, and Sound Recognition. Voice Control allows users to navigate and control CarPlay with their voice. Sound Recognition alerts users who are deaf or hard of hearing to car horns and sirens, while Colour Filters make the CarPlay interface more accessible to colourblind users.

Accessibility Features Coming To visionOS

visionOS will introduce system-wide Live Captions, enabling users who are deaf or hard of hearing to follow along with spoken dialogue in live conversations and audio from apps. Updates for vision accessibility will include features like Reduce Transparency, Smart Invert, and Dim Flashing Lights for users with low vision.

Additional Updates

Apple also announced new features for the Magnifier app, including a Reader Mode for improved text accessibility. These features will come to iPhone, iPad, and Apple Vision Pro later this year.