Apple has published a press release announcing new cognitive, speech, and vision accessibility features expected to arrive in iOS 17 later this year. Among them is Personal Voice: by reading a series of randomized text prompts, users can create a voice that sounds like their own. Fifteen minutes of audio recorded on iPhone or iPad serves as the source material, and machine learning then reproduces the user's voice faithfully.
This feature is aimed at users who are at risk of losing their ability to speak, such as people with ALS (amyotrophic lateral sclerosis, known in French as Charcot disease). Live Speech, on iPhone, iPad, and Mac, lets users type the phrases they want the device to speak aloud during a phone call or a FaceTime conversation.
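Apple does not publish Live Speech's internals, but the underlying mechanism, typing a phrase and having the device speak it with an on-device voice, can be approximated with the public AVSpeechSynthesizer API from AVFoundation. A minimal sketch, with an illustrative phrase and a stock system voice standing in for a user's Personal Voice:

import AVFoundation

// Keep one synthesizer around; it manages the queue of spoken utterances.
let synthesizer = AVSpeechSynthesizer()

func speak(_ phrase: String) {
    let utterance = AVSpeechUtterance(string: phrase)
    // A stock system voice is used here; on iOS 17 a user-created Personal Voice
    // could be selected instead once the app is authorized to use it.
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}

// Example: a phrase typed during a call.
speak("I'll be there in five minutes.")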
iPhone will imitate the voices of speech-impaired users
The company’s engineers recognize that technology “often poses physical, visual, or cognitive challenges for these individuals.” The Magnifier app will therefore gain a very useful new feature: Point and Speak. It lets visually impaired users interact more easily with physical objects that carry multiple text labels. For example, when using a microwave, Point and Speak combines data from the Camera app, the LiDAR scanner, and the iPhone’s machine learning to speak aloud the text on each button as the user moves their finger across the keypad.
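Apple does not detail how Point and Speak is built, so the following is only a conceptual sketch of the same idea using the public Vision and AVFoundation frameworks: recognize printed text in a camera frame and speak whatever label lies under the user’s fingertip. The fingertip coordinate is assumed to come from elsewhere (hand-pose detection, for example), and the LiDAR depth fusion Apple mentions is not shown.

import AVFoundation
import CoreGraphics
import CoreVideo
import Vision

let synthesizer = AVSpeechSynthesizer()

// Recognize text in a camera frame and read aloud the label under the fingertip.
// `fingertip` is a point in Vision's normalized coordinates (origin bottom-left).
func speakText(near fingertip: CGPoint, in frame: CVPixelBuffer) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Pick the recognized text whose bounding box contains the fingertip.
        if let hit = observations.first(where: { $0.boundingBox.contains(fingertip) }),
           let label = hit.topCandidates(1).first?.string {
            synthesizer.speak(AVSpeechUtterance(string: label))
        }
    }
    request.recognitionLevel = .accurate
    let handler = VNImageRequestHandler(cvPixelBuffer: frame, options: [:])
    try? handler.perform([request])
}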

The Assistive Access feature will personalize and simplify the iPhone or iPad experience for users with cognitive disabilities. The interface is pared down to its essentials, with large buttons, high-contrast colors, and larger text. And for users who prefer to communicate visually, Messages offers an emoji-only keyboard.