Researcher Dr. Jordan Green collaborates with a Google team to create a new app that promises to help millions of people communicate better, and for longer.

A collaboration between researchers at Google and the MGH Institute promises to provide millions of people with speech disorders the chance to be better understood.

Dr. Jordan Green, the IHP’s chief scientific officer, the Matina Souretis Horner Professor, and director of the Speech and Feeding Disorders Lab, worked with Google’s Euphonia team to develop the technology behind Project Relate, an AI-powered Android phone app that works as a personalized speech recognizer for people who have impaired or unintelligible speech due to hearing impairment, stroke, traumatic brain injury, aphasia, neurologic disease, or laryngectomy.

“It’s a game changer,” said Green, whose research focuses on people with amyotrophic lateral sclerosis (ALS), a progressive neurodegenerative disease that over time robs people of the ability to initiate and control muscle movement and eventually the ability to speak, eat, move, and breathe. “This AI technology holds the potential to enable individuals with speech disorders to interact and actively participate in society.”

The app, currently in beta testing, has three features: Listen, Repeat, and Assistant. Listen transcribes a person’s speech to text, serving as live closed captioning. Repeat restates the words in a clear, computerized voice, which can help with tasks such as ordering coffee. Assistant uses Google’s voice commands so the user can ask to set a timer or play a song.

In many ways, Green noted, the new app is a logical extension of the technology that is now commonly used in modern translator apps. But in this instance, the app leverages AI to create a distinctive model for each speaker, mapping their highly individualized speech patterns to words and sentences. And with as many as 250 million people across the globe who have non-standard speech, the app has the potential to improve their lives immeasurably.

Here’s how it works: a user records 500 words or phrases so the program can learn their speech patterns. A few days later, using a personalized speech model created just for them, the user can have the app translate what they’re saying into recognizable language and vocalize it.
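For readers curious about the underlying concept, here is a minimal, hypothetical sketch in Python of what “personalizing” a recognizer to one speaker can mean: a model is trained only on that speaker’s own recordings and then maps new utterances from that speaker back to text. The phrases, the synthetic feature vectors, and the model choice below are all illustrative assumptions, not Google’s actual pipeline.

```python
# A toy illustration of speaker personalization, NOT Google's system:
# train a recognizer only on one speaker's recordings, then map that
# speaker's new utterances back to text.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Pretend each recorded phrase yields a 16-dimensional acoustic feature
# vector; a real system would get these from a speech encoder, not noise.
PHRASES = ["set a timer", "play a song", "call my daughter"]
SAMPLES_PER_PHRASE = 20

features, labels = [], []
for label, _ in enumerate(PHRASES):
    center = rng.normal(size=16)  # this speaker's pattern for the phrase
    features.append(center + 0.1 * rng.normal(size=(SAMPLES_PER_PHRASE, 16)))
    labels.append(np.full(SAMPLES_PER_PHRASE, label))

X = np.vstack(features)
y = np.concatenate(labels)

# The "personalized model": fitted exclusively on this speaker's samples.
model = KNeighborsClassifier(n_neighbors=3).fit(X, y)

# A new utterance from the same speaker is recognized as one of the phrases.
new_utterance = X[0] + 0.05 * rng.normal(size=16)
print(PHRASES[model.predict(new_utterance.reshape(1, -1))[0]])  # "set a timer"
```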

The MGH Institute has partnered with Google to create a new presentation for speech-language pathologists to learn more about the app, which they can use as an Augmentative and Alternative Communication (AAC) or Assistive Technology (AT) tool for their clients. Google’s Project Relate App: Using Automated Speech Recognition & Speech-to-Speech Conversion to Facilitate Conversation is a free, two-hour, online prerecorded presentation that is open to both speech-language pathologists (SLPs) and anyone else interested in the technology.

A Tool for Speech-Language Pathologists

While the app can be used by those with speech disorders on their own, SLPs can play a major role in helping people use it to maximum effectiveness – something that’s stressed in the presentation.

“Speech-language pathologists may look at this and say, ‘This kind of technology could replace me,’ but there's a lot that a speech-language pathologist needs to do to make this technology work in real-life environments,” Green said. “It’s important for speech-language pathologists to work with patients and their families so the app can be set up just right for their unique environment and optimized so they can get the most out of it.”

For example, physical limitations caused by ALS or a stroke may leave some people unable to hold a phone to their ear or lacking the dexterity to press the app’s activation button. Others may need assistance because of impaired language or cognitive abilities. Identifying noisy locations to avoid is another factor; using the app next to a window air conditioner, for example, will not work as well as using it in a quiet room.

“It's not as easy as just talking to it,” Green pointed out. “Anyone who has tried to use dictation systems in real-world settings to communicate will be familiar with these challenges.”

Green has been working with Google for several years on ways to improve speech recognition for people with impairments. A major component in developing Project Relate, Green said, has been Google’s collection of two million recordings from thousands of people with speech disorders, gathered to help make its products more accessible. Without that data collection, creating working technology would have remained a challenge.

"When I learned about this project led by Google, I was thrilled,” recalled Green. “With the resources and talent, they were willing to invest; it's not just a possibility; it's a reality in the making. The big bottleneck many times for developing AI applications is the need for massive training data sets. So, Google had the insight to start the Euphonia project and collect sample speech samples from people with a wide variety of speech disorders.  This treasure trove of recordings combined with the highly dedicated and talented team at Google is what makes this project so groundbreaking and unique.”

The app is still in its early stages, Green said, and much work needs to be done to make the system perform better during spontaneous conversational speech and in noisy real-world environments. At this time, it understands just a few languages, but more are expected to come online as the Euphonia team’s efforts expand. But for someone who has worked for decades helping patients with ALS articulate just a few words as their condition worsens, Project Relate is something Green could only dream about.

“When I first heard about the project, I thought that maybe it would provide someone with 50 words because that would be transformative for somebody who has no words to have 50 words,” he said. “But then the Google team came along and used their sophisticated pipelines. And they showed how the app can recognize thousands of words, taking somebody who's only 10% intelligible and making them up to 80-90% understandable. It’s remarkable.”

Do you have a story the Office of Strategic Communications should know about? If so, email us at ihposc@mghihp.edu.