iOS 18 Beta Unveils Accessibility Features Including Vocal Shortcuts and Eye Tracking to Help Users With Disabilities

Highlights

  • iOS 18 introduces Vocal Shortcuts for voice commands without Siri.
  • Eye Tracking enables control of iPhone and iPad using eye movements.
  • Music Haptics offers a new way for deaf users to experience music.
  • New CarPlay features include Sound Recognition, Colour Filters, and Voice Control.

Apple’s forthcoming iOS 18 update is set to make iPhones and iPads significantly more accessible.

In the latest beta, the company has released a set of features designed to help users with disabilities.

Vocal Shortcuts are now available, letting users issue voice commands throughout the system without invoking Siri.

Then there is Eye Tracking, which uses the front camera and on-device machine learning to offer eye-controlled navigation for users with physical disabilities.

“We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO. “That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”

“Each year, we break new ground when it comes to accessibility,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.”

“Apple Vision Pro is without a doubt the most accessible technology I’ve ever used,” said Ryan Hudson-Peralta, a Detroit-based product designer, accessibility consultant, and cofounder of Equal Accessibility LLC. “As someone born without hands and unable to walk, I know the world was not designed with me in mind, so it’s been incredible to see that visionOS just works. It’s a testament to the power and importance of accessible and inclusive design.”

“Artificial intelligence has the potential to improve speech recognition for millions of people with atypical speech, so we are thrilled that Apple is bringing these new accessibility features to consumers,” said Mark Hasegawa-Johnson, principal investigator of the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign.

“The Speech Accessibility Project was designed as a broad-based, community-supported effort to help companies and universities make speech recognition more robust and effective, and Apple is among the accessibility advocates who made the Speech Accessibility Project possible.”

Let’s jump into what these updates actually mean.

Vocal Shortcuts: Seamless Voice Commands

iOS 18 introduces Vocal Shortcuts for voice commands without Siri

Apple announced that in iOS 18, a potent new voice tool, Vocal Shortcuts, will be available.

Using this feature, users will be able to set up custom phrases that initiate system-wide actions without needing to say “Siri.”

Apple demonstrated an example where saying “Rings” would open the user’s Activity app and display their Activity Rings, but did not elaborate further.

More utility-driven applications of the feature will presumably follow.

Per Apple’s press release, Vocal Shortcuts will allow iPhone and iPad users to assign “custom utterances that Siri can understand to launch shortcuts and complete complex tasks.”

The feature’s explicit use of the word “shortcuts,” combined with Apple’s recent move to let the iPhone 15 Pro’s Action button trigger shortcuts, suggests Vocal Shortcuts will tie directly into the Shortcuts app.

One could set up simple voice shortcuts to enable Do Not Disturb, Low Power Mode, or similar system toggles.

Using the Shortcuts app, one could build complex, multi-step shortcuts that trigger the moment the magic words are spoken.
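Apple hasn’t published developer documentation for Vocal Shortcuts yet, but if the feature does hook into the Shortcuts ecosystem, any action an app exposes through the App Intents framework should be a candidate target. Here is a minimal sketch of such an intent; the intent name and its Work-focus behavior are invented for illustration:

```swift
import AppIntents

// Hypothetical intent: any action exposed this way shows up in the
// Shortcuts app, where a user could wrap it in a shortcut and (under
// the assumption above) bind it to a custom Vocal Shortcuts phrase.
struct ToggleWorkFocusIntent: AppIntent {
    static var title: LocalizedStringResource = "Toggle Work Focus"
    static var description = IntentDescription("Switches the hypothetical Work focus on or off.")

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would call into its own model layer here.
        let enabled = WorkFocusStore.shared.toggle()
        return .result(dialog: "Work focus is now \(enabled ? "on" : "off").")
    }
}

// Hypothetical app-side state that the intent flips.
final class WorkFocusStore {
    static let shared = WorkFocusStore()
    private var enabled = false
    func toggle() -> Bool { enabled.toggle(); return enabled }
}
```

Anything built this way already benefits Action button and Siri users today, which is why the shortcuts framing looks like the natural integration point.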

Removing the word “Siri” from voice triggers certainly opens the potential for far more accidental triggers.

Since iOS 18 will require you to configure each custom phrase manually and say it multiple times during setup, those unintentional triggers will hopefully be minimal.

Eye Tracking and a New Way to Experience Music

Eye Tracking enables control of iPhone and iPad using eye movements

More interestingly, Apple also released two new accessibility tools: Music Haptics and Eye Tracking.

Eye Tracking will enable users with physical disabilities to control their iPad or iPhone using only their eyes.

It can be set up and calibrated in seconds via the front-facing camera.

Setup and control data is processed with on-device machine learning and kept securely on the device.

Meanwhile, Music Haptics uses the iPhone’s Taptic Engine to give users who are deaf or hard of hearing a new way to experience music: the engine plays “taps, textures, and refined vibrations” that correspond with the audio of the music.

Motion Cues in Vehicles and CarPlay Compatibility

New CarPlay features include Sound Recognition, Colour Filters, and Voice Control

Apple has also released Vehicle Motion Cues, meant to help reduce the sensory conflict users can experience while riding in a moving vehicle.

The feature represents changes in vehicle motion with moving dots at the edges of the screen, without interfering with the primary content.

Vehicle Motion Cues uses the built-in sensors of iPhone and iPad to detect when a user is riding in a moving vehicle and to respond accordingly.

In addition, CarPlay will gain a number of accessibility features, including Sound Recognition, Colour Filters, and Voice Control.

Voice Control allows users to operate CarPlay and its apps using just their voice, keeping their hands free while driving.

FAQs

What are Vocal Shortcuts in iOS 18?

Vocal Shortcuts allow users to issue custom voice commands that initiate system-wide actions without invoking Siri, making the device more accessible.

How does the Eye Tracking feature work in iOS 18?

Eye Tracking uses the front camera and on-device machine learning to enable users with physical disabilities to control their iPhone or iPad using eye movements.

What is Music Haptics in iOS 18?

Music Haptics uses the Taptic Engine to provide a tactile experience for hard-of-hearing or deaf users, allowing them to feel the music through vibrations.

What are Vehicle Motion Cues in iOS 18?

Vehicle Motion Cues use built-in sensors to represent vehicle motion with moving dots on the screen, helping to reduce sensory conflict for users on the move.

What new accessibility features are available for CarPlay in iOS 18?

iOS 18 introduces Sound Recognition, Colour Filters, and Voice Control for CarPlay, enhancing hands-free operation and accessibility while driving.

What are the additional updates from Apple?

For users who are blind or have low vision, VoiceOver will include new voices, a flexible Voice Rotor, custom volume control, and the ability to customize VoiceOver keyboard shortcuts on Mac.

Magnifier will offer a new Reader Mode and the option to easily launch Detection Mode with the Action button.

Braille users will get a new way to start and stay in Braille Screen Input for faster control and text editing; Japanese language availability for Braille Screen Input; support for multi-line braille with Dot Pad; and the option to choose different input and output tables.

For users with low vision, Hover Typing shows larger text when typing in a text field, and in a user’s preferred font and color.

For users at risk of losing their ability to speak, Personal Voice will be available in Mandarin Chinese. Users who have difficulty pronouncing or reading full sentences will be able to create a Personal Voice using shortened phrases.

For users who are nonspeaking, Live Speech will include categories and simultaneous compatibility with Live Captions.

For users with physical disabilities, Virtual Trackpad for AssistiveTouch allows users to control their device using a small region of the screen as a resizable trackpad.

Switch Control will include the option to use the cameras in iPhone and iPad to recognize finger-tap gestures as switches.

Voice Control will offer support for custom vocabularies and complex words.

What are the best features of Eye Tracking introduced to iPad and iPhone?

Powered by artificial intelligence, Eye Tracking gives users a built-in option for navigating iPad and iPhone with just their eyes.

Designed for users with physical disabilities, Eye Tracking uses the front-facing camera to set up and calibrate in seconds. With on-device machine learning, all data used to set up and control this feature is kept securely on device and isn’t shared with Apple.

Eye Tracking works across iPadOS and iOS apps, and doesn’t require additional hardware or accessories. With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes.
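Apple doesn’t say whether developers must do anything to support this, but a reasonable assumption is that Eye Tracking targets the same accessibility elements that other assistive features navigate. Under that assumption, standard accessibility annotations, as in this minimal SwiftUI sketch, are all a custom control should need to be reachable by gaze and Dwell Control:

```swift
import SwiftUI

// Standard SwiftUI accessibility annotations. Assuming Eye Tracking
// walks the same element tree as VoiceOver and Switch Control, a
// labeled control like this needs no gaze-specific code.
struct PlayerControls: View {
    @State private var isPlaying = false

    var body: some View {
        Button {
            isPlaying.toggle()
        } label: {
            Image(systemName: isPlaying ? "pause.fill" : "play.fill")
        }
        .accessibilityLabel(isPlaying ? "Pause" : "Play")
        .accessibilityHint("Toggles playback")
    }
}
```

Either way, annotations like these also benefit VoiceOver and Switch Control users, so they are worth adding regardless of how Eye Tracking resolves its targets.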

Does Music Haptics Make Songs More Accessible?

Music Haptics is a new way for users who are deaf or hard of hearing to experience music on iPhone. With this accessibility feature turned on, the Taptic Engine in iPhone plays taps, textures, and refined vibrations to the audio of the music.

Music Haptics works across millions of songs in the Apple Music catalog, and will be available as an API for developers to make music more accessible in their apps.
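Apple hasn’t detailed the shape of that developer API in this announcement. As a general illustration of the primitives involved, the Core Haptics sketch below plays a single transient tap of the kind the Taptic Engine produces in time with audio; the function is illustrative and is not the Music Haptics API:

```swift
import CoreHaptics

// Illustrative only: one sharp transient "tap". A music app could
// schedule events like this against its own audio timeline; the real
// Music Haptics API presumably handles that synchronization for you.
func playTap() throws {
    // Bail out on hardware without a Taptic Engine (e.g. most iPads).
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // A single strong, fairly sharp transient at t = 0.
    let event = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
        ],
        relativeTime: 0
    )

    let pattern = try CHHapticPattern(events: [event], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```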

What new features has Apple introduced for a wider range of speech?

With Vocal Shortcuts, iPhone and iPad users can assign custom utterances that Siri can understand to launch shortcuts and complete complex tasks. Listen for Atypical Speech, another new feature, gives users an option for enhancing speech recognition for a wider range of speech.

Listen for Atypical Speech uses on-device machine learning to recognize user speech patterns.

Designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, these features provide a new level of customization and control, building on features introduced in iOS 17 for users who are nonspeaking or at risk of losing their ability to speak.

Can Vehicle Motion Cues Help Reduce Motion Sickness?

Vehicle Motion Cues is a new experience for iPhone and iPad that can help reduce motion sickness for passengers in moving vehicles.

Research shows that motion sickness is commonly caused by a sensory conflict between what a person sees and what they feel, which can prevent some users from comfortably using iPhone or iPad while riding in a moving vehicle.

With Vehicle Motion Cues, animated dots on the edges of the screen represent changes in vehicle motion to help reduce sensory conflict without interfering with the main content.

Using sensors built into iPhone and iPad, Vehicle Motion Cues recognizes when a user is in a moving vehicle and responds accordingly.

The feature can be set to show automatically on iPhone, or can be turned on and off in Control Center.
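Apple doesn’t specify which sensors or heuristics the feature uses. As a rough sketch of the kind of signal involved, Core Motion’s activity API can already tell an app when the device is in a vehicle; this is an illustration, not Apple’s implementation:

```swift
import CoreMotion

// Illustration only: Core Motion reports coarse motion activity,
// including an `automotive` flag, which is the sort of cue a feature
// like Vehicle Motion Cues would key off before showing its dots.
// (Requires the NSMotionUsageDescription key in Info.plist.)
let activityManager = CMMotionActivityManager()

func startVehicleDetection() {
    guard CMMotionActivityManager.isActivityAvailable() else { return }

    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity, activity.automotive else { return }
        // A real implementation would begin animating motion cues here.
        print("In a moving vehicle (confidence: \(activity.confidence.rawValue))")
    }
}
```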

What features has CarPlay added in the iOS 18 beta?

CarPlay Gets Voice Control, More Accessibility Updates

Accessibility features coming to CarPlay include Voice Control, Color Filters, and Sound Recognition. With Voice Control, users can navigate CarPlay and control apps with just their voice.

With Sound Recognition, drivers or passengers who are deaf or hard of hearing can turn on alerts to be notified of car horns and sirens.

For users who are colorblind, Color Filters make the CarPlay interface visually easier to use, with additional visual accessibility features including Bold Text and Large Text.

What are the Accessibility Features Coming to visionOS ?

This year, accessibility features coming to visionOS will include systemwide Live Captions to help everyone — including users who are deaf or hard of hearing — follow along with spoken dialogue in live conversations and in audio from apps.

With Live Captions for FaceTime in visionOS, more users can easily enjoy the unique experience of connecting and collaborating using their Persona. Apple Vision Pro will add the capability to move captions using the window bar during Apple Immersive Video, as well as support for additional Made for iPhone hearing devices and cochlear hearing processors.

Updates for vision accessibility will include the addition of Reduce Transparency, Smart Invert, and Dim Flashing Lights for users who have low vision, or those who want to avoid bright lights and frequent flashing.

These features join the dozens of accessibility features already available in Apple Vision Pro, which offers a flexible input system and an intuitive interface designed with a wide range of users in mind.

Features such as VoiceOver, Zoom, and Color Filters can also provide users who are blind or have low vision access to spatial computing, while features such as Guided Access can support users with cognitive disabilities.

Users can control Vision Pro with any combination of their eyes, hands, or voice, with accessibility features including Switch Control, Sound Actions, and Dwell Control that can also help those with physical disabilities.
