Apple ups accessibility ante with eye tracking, vocal shortcuts and music haptics (VIDEO)

Malay Mail

KUALA LUMPUR, May 16 — It’s that time of year again: Global Accessibility Awareness Day (GAAD), which falls on May 16 this year.

As technology becomes an essential part of the fabric of our lives, keeping it inclusive matters, and it’s encouraging that big tech names have embedded accessibility features into many of their offerings.

Apple has just announced accessibility improvements coming with its next big software update, and perhaps the most exciting of them is Eye Tracking.

Eye tracking devices and software tend to be prohibitively expensive, but Apple’s Eye Tracking will let iPhone and iPad users control their devices without needing their hands, which is no small feat when you consider that these are touchscreen devices.

In a video, Apple demonstrates how disabled users can navigate apps using just their eyes.

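Apple says Eye Tracking uses the front-facing camera and on-device machine learning. As a rough illustration of the underlying idea (not Apple’s implementation, which is built into the OS and needs no app code), here is a minimal Swift sketch that reads ARKit’s gaze estimate and maps it to a screen point; the GazeTracker class and its naive screen mapping are hypothetical:

```swift
import ARKit
import UIKit

// A minimal, hypothetical sketch of gaze-driven input via ARKit face
// tracking. Requires a device with a TrueDepth front camera.
final class GazeTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()

    // Receives a rough gaze point in screen coordinates.
    var onGazePoint: ((CGPoint) -> Void)?

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // lookAtPoint is where the eyes converge, in face-anchor space (metres).
        let gaze = face.lookAtPoint

        // Naive mapping of the gaze offset onto the screen; a real system
        // would calibrate per user and smooth the raw signal.
        let bounds = UIScreen.main.bounds
        let x = bounds.midX + CGFloat(gaze.x) * bounds.width
        let y = bounds.midY - CGFloat(gaze.y) * bounds.height
        onGazePoint?(CGPoint(x: x, y: y))
    }
}
```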

What else is coming? For those with low vision, the ever-useful Magnifier gains a Reader Mode that makes it easier to identify text in images, and the iPhone’s Action button can be configured to launch Detection Mode.
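
For a sense of the kind of text detection a Reader Mode relies on, here is a minimal Swift sketch using the Vision framework’s VNRecognizeTextRequest; the recogniseText helper is hypothetical and is not Magnifier’s actual implementation:

```swift
import Vision
import UIKit

// A hypothetical sketch: recognise lines of text in an image with Vision.
func recogniseText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { return completion([]) }

    let request = VNRecognizeTextRequest { request, _ in
        // Take the top candidate string from each detected text region.
        let lines = (request.results as? [VNRecognizedTextObservation])?
            .compactMap { $0.topCandidates(1).first?.string } ?? []
        completion(lines)
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage)
    try? handler.perform([request])
}
```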

For those who get motion sickness while travelling, the new Vehicle Motion Cues feature will make using an iPhone or iPad more comfortable.

Motion sickness is caused by a sensory conflict between what people see and what they feel, so on-screen cues (animated dots at the edges of the screen) will reflect changes in vehicle motion without interfering with the main content.

You can even set the feature to automatically run when it detects you’re in a moving vehicle.
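
To make the idea concrete, here is a minimal Swift sketch of motion-driven cues built on Core Motion and SwiftUI; the MotionCueModel type, the scale factor and the dot layout are all made up for illustration, since the real feature is provided by the system rather than by apps:

```swift
import CoreMotion
import SwiftUI

// A hypothetical sketch: shift on-screen dots in proportion to the
// acceleration the device actually feels.
final class MotionCueModel: ObservableObject {
    @Published var offset: CGSize = .zero
    private let manager = CMMotionManager()

    func start() {
        guard manager.isDeviceMotionAvailable else { return }
        manager.deviceMotionUpdateInterval = 1.0 / 30.0
        manager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let a = motion?.userAcceleration else { return }
            // Arbitrary 60-point scale factor, purely for illustration.
            self?.offset = CGSize(width: a.x * 60, height: -a.y * 60)
        }
    }
}

struct MotionCueDots: View {
    @ObservedObject var model: MotionCueModel

    var body: some View {
        // Dots near the screen edges that drift with vehicle motion.
        HStack {
            Circle().frame(width: 10, height: 10)
            Spacer()
            Circle().frame(width: 10, height: 10)
        }
        .padding()
        .offset(model.offset)
        .onAppear { model.start() }
    }
}
```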

Apple has also added Music Haptics so that those who are deaf or hard of hearing can experience songs in Apple Music on the iPhone.

How does it work? It uses the iPhone’s Taptic Engine to play taps, textures and refined vibrations along to the audio of a song, so you can feel the music even if you can’t fully hear it.
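
For a flavour of how haptics can follow audio, here is a minimal Swift sketch using the Core Haptics framework; the beat timings and intensities are invented for illustration, whereas Music Haptics derives its taps and textures from the actual song:

```swift
import CoreHaptics

// A hypothetical sketch of music-style haptics with Core Haptics.
func playBeatHaptics() throws {
    let engine = try CHHapticEngine()
    try engine.start()

    // Four sharp transient "beats" half a second apart...
    var events: [CHHapticEvent] = []
    for beat in 0..<4 {
        events.append(CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
            ],
            relativeTime: Double(beat) * 0.5))
    }
    // ...over a softer continuous texture underneath.
    events.append(CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.3)],
        relativeTime: 0,
        duration: 2.0))

    let pattern = try CHHapticPattern(events: events, parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: 0)
    // In real code, keep the engine alive while the pattern plays.
}
```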

An important feature for those whose speech is affected by disabilities or medical conditions would be Vocal Shortcuts.

With it, users can create custom utterances that Siri can recognise and use to perform tasks or launch shortcuts.

Speech recognition will also be enhanced via Listen for Atypical Speech, which expands recognition to cover a wider range of speech patterns.
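
As a rough illustration of listening for a custom phrase, here is a minimal Swift sketch built on the Speech framework’s live recognition; the PhraseListener class and its trigger logic are hypothetical (and permission handling is omitted for brevity), while Vocal Shortcuts itself works at the system level:

```swift
import AVFoundation
import Speech

// A hypothetical sketch: run live speech recognition and fire an action
// when a custom phrase is heard. Assumes microphone and speech-recognition
// permissions have already been granted.
final class PhraseListener {
    private let recognizer = SFSpeechRecognizer()
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private let audioEngine = AVAudioEngine()
    private var task: SFSpeechRecognitionTask?

    func listen(for phrase: String, action: @escaping () -> Void) throws {
        // Feed microphone audio into the recognition request.
        let input = audioEngine.inputNode
        input.installTap(onBus: 0, bufferSize: 1024,
                         format: input.outputFormat(forBus: 0)) { buffer, _ in
            self.request.append(buffer)
        }
        try audioEngine.start()

        task = recognizer?.recognitionTask(with: request) { result, _ in
            guard let text = result?.bestTranscription.formattedString else { return }
            if text.localizedCaseInsensitiveContains(phrase) {
                action() // e.g. run a shortcut
            }
        }
    }
}
```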

Apple’s CarPlay will also include Sound Recognition to alert deaf or hard of hearing users to sounds such as sirens and car horns, while Voice Control will let users navigate CarPlay and control apps with spoken commands.
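
The underlying idea can be sketched with the SoundAnalysis framework’s built-in sound classifier, as below; the SirenDetector class and its confidence threshold are hypothetical, and the system feature itself requires no app code:

```swift
import AVFoundation
import SoundAnalysis

// A hypothetical sketch: classify environmental sounds from the microphone
// using the built-in version-1 sound classifier.
final class SirenDetector: NSObject, SNResultsObserving {
    private let audioEngine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)

        // Built-in classifier that identifies hundreds of everyday sounds.
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)
        self.analyzer = analyzer

        input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
            analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }
        try audioEngine.start()
    }

    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8 else { return }
        print("Heard: \(top.identifier)") // e.g. a siren or car horn label
    }
}
```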

Colour blind users will also benefit from new Colour Filters that make the CarPlay interface visually easier to use, with additional options such as Bold Text and Large Text.

Apple’s new Vision Pro headset will soon support Live Captions, including for FaceTime, along with updated vision settings such as the ability to dim flashing lights.

All these features will likely appear with the next iOS and iPadOS updates coming in a few months.