Apple has revealed new features designed to improve cognitive, hearing, and vision accessibility across its products, aimed at helping people with disabilities. The first of these is Live Speech, which lets nonspeaking users type what they want to say during calls instead of speaking.
Personal Voice will create a synthesized model of the user's own voice, while Detection Mode is aimed at people who are blind or have low vision.
Apple said the features will arrive later this year but did not provide a detailed schedule. We expect them to make their way to iOS 17 and iPadOS 17, as well as some apps on the next-gen macOS. All of them are likely to be previewed at WWDC next month.
Assistive Access, the simplified interface aimed at users with cognitive disabilities, can be turned on or off through Settings. It includes a customized experience for Phone and FaceTime, which are unified into a single Calls app with big, bright shortcuts to contacts. Messages gets a large emoji keyboard, while Photos shows magnified thumbnails for easier browsing. Assistive Access will be available on both iOS and iPadOS.
Live Speech will also let users save commonly used phrases so they can chime into conversations quickly. Personal Voice will present a set of text prompts for the user to read aloud, recording about 15 minutes of audio. That audio is used to train the voice model, so if the user's ability to speak later declines, they can still communicate with loved ones in a voice that sounds like their own.
Detection Mode will recognize objects with text labels and read them aloud when the user points at them. It will be built into the Magnifier app, which will also offer People Detection, Door Detection, and Image Descriptions.
Once the features arrive, deaf or hard-of-hearing users will be able to pair Made for iPhone hearing devices directly with a Mac. Users who are sensitive to rapid images and animations will be able to pause GIFs in Messages and Safari. Another neat addition is the ability to adjust how quickly Siri speaks, with rates ranging from 0.8x to 2x.