Apple reveals upcoming accessibility features, including a voice replicator

The new features should make it easier for people with disabilities and impairments to use their iPhones and iPads.

Apple accessibility features

Apple has announced a slew of accessibility features for iOS 17 and iPadOS 17 in a move that the company says will make a positive impact on the lives of iPhone users with disabilities and impairments. The announcement comes just a couple of days ahead of the 12th annual Global Accessibility Awareness Day, which is set to be observed on Thursday, May 18.

One of the most notable new features is Assistive Access, which is meant for people with cognitive disabilities. The feature, which Apple states was designed with feedback from “people with cognitive disabilities and their trusted supporters,” streamlines the iOS and iPadOS interface, making it easier for people to talk to loved ones, share photos, and listen to music. As part of the feature, the Phone and FaceTime apps are merged into a single Calls app. Assistive Access also includes high-contrast buttons and large text labels, all of which are configurable to better serve people with accessibility needs.

Apple is also making it easier for non-verbal people to communicate with its Live Speech feature, which lets users type what they want to say and have it read aloud during phone calls, FaceTime chats, and in-person conversations. The feature also allows people to save commonly used words and phrases for quick access while talking to friends and family.

Another new feature is Personal Voice, which is aimed at people who are at risk of losing their ability to speak due to conditions like ALS. The feature uses on-device machine learning to generate a unique personal voice for each user. To create a Personal Voice, users read a set of words and phrases into their iPhone’s or iPad’s microphone for about fifteen minutes. Personal Voice integrates with Live Speech, letting users speak to their loved ones in their own voice. At launch, Personal Voice will only be available in English and will only be supported on devices powered by Apple silicon.

Another new feature is Point and Speak, which uses the iPhone’s camera and LiDAR scanner to help people with vision disabilities interact with physical objects that carry multiple text labels. As an example, Apple says the feature could make it easier to operate household appliances like a microwave oven, reading aloud the text on each button as users move their finger over the keypad. At launch, Point and Speak will be available in English, French, Italian, German, Spanish, Portuguese, Chinese, Cantonese, Korean, Japanese, and Ukrainian.

Another notable feature is an update to Voice Control, which adds “phonetic suggestions for text editing.” This is meant to help users who type with their voice choose the right word from several that sound alike, such as ‘do,’ ‘due,’ and ‘dew.’ Users will also be able to turn any switch into a virtual game controller using Switch Control, and adjusting text size is now easier in Mac apps such as Finder, Messages, Mail, Calendar, and Notes. As part of the new focus on accessibility, Apple is also expanding its SignTime sign-language interpreting service to Germany, Italy, Spain, and South Korea.

Apple is expected to preview iOS 17 and iPadOS 17 at its WWDC23 event next month. While the new operating systems were originally expected to be minor upgrades over their predecessors, more recent reports suggest they could include more new features than previously anticipated. The new accessibility options should make these operating systems all the more welcome, especially for people with cognitive disabilities and impairments.