The University of Illinois (UIUC) has partnered with Amazon, Apple, Google, Meta, Microsoft and nonprofit organizations on the Speech Accessibility Project. The aim is to improve voice recognition for people with disabilities and diverse speech patterns that AI algorithms often fail to account for, including people with Lou Gehrig's disease (ALS), Parkinson's, cerebral palsy, Down syndrome and other conditions that affect speech.
“Speech interfaces should be available to everybody, and that includes people with disabilities,” UIUC professor Mark Hasegawa-Johnson said. “This task has been difficult because it requires a lot of infrastructure, ideally the kind that can be supported by leading technology companies, so we’ve created a uniquely interdisciplinary team with expertise in linguistics, speech, AI, security and privacy.”
To include communities of people with disabilities, such as those with Parkinson's, the Speech Accessibility Project will collect speech samples from individuals representing a diversity of speech patterns. UIUC will recruit paid volunteers to contribute voice samples and help create a “private, de-identified” dataset that can be used to train machine learning models. The group will focus on American English at the start.
The Davis Phinney Foundation (Parkinson's) and Team Gleason (ALS) have pledged support for the project. “Parkinson’s affects motor symptoms, making typing difficult, so speech recognition is a critical tool for communication and expression,” said The Davis Phinney Foundation's executive director, Polly Dawkins. “Part of [our] commitment includes ensuring people with Parkinson’s have access to the tools, technologies, and resources needed to live their best lives.”
Via Engadget: https://ift.tt/6jvtXNq