For years, Apple has been pretty great at baking accessibility features into its tech, whether that’s its VoiceOver screen reader or FaceTime, which gave users with hearing or speech difficulties a way to communicate visually using sign language and facial expressions.
On Wednesday, Apple continued building on this important legacy by announcing a slew of new accessibility features for its products.
Introducing SignTime
The first of these, set to launch Thursday, May 20, is a new feature called SignTime. It’s designed to help customers chat with AppleCare and Retail Customer Care by using American Sign Language (ASL) in the United States, British Sign Language (BSL) in the UK, or French Sign Language (LSF) in France. This can be done right from the web browser on Apple devices.
It’s also accessible to customers visiting Apple Store locations so they can remotely access sign language interpreters to assist with their shopping experience. While the feature will initially be available only in the US, UK, and France, Apple says that it will expand to other markets in the future.
In an Apple press release, Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, said:
“At Apple, we’ve long felt that the world’s best technology should respond to everyone’s needs, and our teams work relentlessly to build accessibility into everything we make. With these new features, we’re pushing the boundaries of innovation with next-generation technologies that bring the fun and function of Apple technology to even more people—and we can’t wait to share them with our users.”
AssistiveTouch, Eye-Tracking, and Beyond
Another new feature is AssistiveTouch for watchOS, which takes advantage of the Apple Watch’s built-in sensors, plus a bit of machine learning magic, to give users with limb differences a way to navigate a cursor with hand gestures. This can also be used to answer incoming calls, among other actions.
Then there’s eye-tracking support for iPad to allow users to control their tablets using third-party eye-tracking devices, an improved VoiceOver feature that can describe the contents of images, support for bi-directional hearing aids on iPhone, new Memoji customizations that represent users with “oxygen tubes, cochlear implants, and a soft helmet for headwear,” and more. These will roll out in upcoming software updates.
One other sure-to-be-popular addition is a new background sounds feature that plays ambient audio to help users stay focused, stay calm, or rest. The options will include ocean, rain, and stream sounds, along with “balanced, bright, or dark noise.” Apple notes that the feature is designed in “support of neurodiversity.”