Apple reveals new accessibility features, like custom text-to-speech voices


Apple previewed a set of new features today to improve cognitive, visual, and speech accessibility. These tools are scheduled to arrive on the iPhone, iPad, and Mac later this year. An established leader in mainstream technology accessibility, Apple emphasizes that these tools are built with feedback from disability communities.

Coming soon to iOS and iPadOS, Assistive Access is designed for people with cognitive disabilities. The feature streamlines the interface on iPhone and iPad, focusing specifically on making it easier to talk with loved ones, share photos, and listen to music. The Phone and FaceTime apps are merged into one, for example.

The design is also made more digestible with larger icons, higher contrast, and clearer text labels that keep the screen simple. Users can customize these visual features to their liking, and those preferences carry over to any app that supports Assistive Access.

As part of the existing Magnifier tool, blind and low-vision users can already use their phone to locate nearby doors, people, or signs. Now Apple is introducing a feature called Point and Speak, which uses the device's camera and LiDAR scanner to help visually impaired users interact with physical objects that carry multiple text labels.

Image Credits: Apple

So, if a user with low vision wanted to microwave food, they could use Point and Speak to tell the difference between the "popcorn," "pizza," and "power level" buttons: when the device identifies the text, it reads it aloud. Point and Speak will be available in English, French, Italian, German, Spanish, Portuguese, Chinese, Cantonese, Korean, Japanese, and Ukrainian.

One particularly cool feature of the bunch is Personal Voice, which creates an automated voice that sounds like you, rather than like Siri. The tool is designed for people who may be at risk of losing their ability to speak due to conditions such as ALS. To generate a Personal Voice, the user spends about fifteen minutes clearly reading randomly chosen text prompts into their microphone. Then, using machine learning, the audio is processed locally on the iPhone, iPad, or Mac to create the personal voice. It sounds similar to what Acapela has been doing with its "my own voice" service, which works with other assistive devices.

It’s easy to see how a repository of unique, highly realistic text-to-speech models could be dangerous in the wrong hands. But according to Apple, this personalized voice data is never shared with anyone, not even Apple itself. In fact, Apple says it doesn’t even connect your Personal Voice to your Apple ID, since some households share a login. Instead, users must choose whether they want a Personal Voice they create on their Mac to be accessible on their iPhone, or vice versa.

At launch, Personal Voice will only be available to English speakers and can only be created on Apple Silicon devices.

Whether you’re speaking in Siri’s voice or your AI voice twin, Apple wants to make it easier for nonverbal people to communicate. Live Speech, available on all Apple devices, lets people type what they want to say so it can be spoken aloud. The tool is available on the lock screen, but it can also be used in other apps, like FaceTime. And if users frequently need to repeat the same phrases, like a regular coffee order, they can store preset phrases in Live Speech.

Apple’s existing speech-to-text tools are also being updated. Voice Control will now add phonetic text editing, making it easier for people who type by voice to quickly correct mistakes. So if your computer transcribes "great" when you meant "grey," it will be easier to make that correction. This feature, Phonetic Suggestions, will be available in English, Spanish, French, and German for now.

Image Credits: Apple

These accessibility features are expected to roll out across a number of Apple products this year. Building on its existing offerings, Apple will expand SignTime to Germany, Italy, Spain, and South Korea on Thursday. SignTime provides Apple Store and Apple Support customers with on-demand sign language interpreters.
