[Image: A computerized green brain with directional lines radiating from it against a black, space-age background.]

Using AI to Treat Aphasia

Once the stuff of science fiction, artificial intelligence (AI) has become a valuable tool to help people with aphasia communicate more effectively. The possibilities for new solutions are limitless.

Aphasia is a condition that affects someone’s ability to use and understand language. Some people with aphasia have trouble speaking or understanding what people are saying. Others have difficulty reading or writing.

“About two years ago, the average person was not aware of the term ‘aphasia’ and the kinds of language problems people with aphasia struggle with,” said Swathi Kiran, PhD, professor in neurorehabilitation and director of the Center for Brain Recovery at Boston University. “Only after large language models came about did people understand how language is planned and produced, as the models are able to understand complex linguistic patterns and generate contextually relevant responses. The models generally tend to do this very quickly and efficiently, but they have also provided a window of insight into what may go wrong when something goes wrong.”

AI is technology that can mimic human intelligence. One type of AI is called generative AI, which can create content, whether that’s language, images or music. Breaking it down even more, a subset of generative AI focuses on language and is built using large language models (LLMs). An LLM is a type of machine-learning model that is “trained” on large data sets of text, which allows it to understand and create content in human language.

Currently, as one part of therapy, patients with aphasia may be given printed materials that ask them to describe pictures, write a poem, categorize items, or write and read scripts for conversation.

Now, clinicians can use generative AI tools such as ChatGPT, Google Gemini or Microsoft Copilot to create practice materials designed specifically for each aphasia patient. “Instead of developing words and sentences that each patient can practice, clinicians are often using AI to develop personalized therapy materials,” Dr. Kiran said.
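As a rough illustration of how that might look in code, here is a minimal sketch that asks an LLM for practice sentences built around a patient’s own interests. It uses the OpenAI Python SDK, which is an assumption for the example; the article describes clinicians working in the chat tools directly, and the model name, prompt wording and patient detail below are placeholders.

```python
# Minimal sketch: generate personalized practice sentences with an LLM.
# Assumptions: the OpenAI Python SDK is installed and OPENAI_API_KEY is set;
# the model name and prompt wording are placeholders, not a clinical protocol.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

patient_interest = "gardening"  # hypothetical detail supplied by the clinician
prompt = (
    f"Write five short, simple sentences about {patient_interest} "
    "for speech-language therapy practice. Use common, everyday words."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```

A clinician could get a comparable result by typing the same request into ChatGPT, Gemini or Copilot; the point is simply that materials can be regenerated quickly around each patient’s vocabulary and interests.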

Many people with aphasia rely on iPads or other tablets for practice and communication. The available apps have become more complex, enhancing the user experience. “There’s a lot of aphasia therapy software that you can download on an iPad,” Dr. Kiran said. “It’s easy to use, and the apps generally allow for a lot of practice at home.”

AI-powered apps that convert speech to text and text to speech help people with aphasia understand others and communicate more easily. When text is converted to spoken words, the AI-powered voice sounds natural, not robotic. Speech-to-text is just as useful.

“If somebody has an auditory comprehension problem, but they can understand written language better, it can convert speech to text and provide captions during conversation,” said Maya Henry, PhD, associate professor in the department of speech, language and hearing sciences and director of the Aphasia Research and Treatment Lab at the University of Texas at Austin. “I’m seeing people with aphasia doing that more and more.”
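Here is a minimal sketch of those two conversions, assuming two open-source Python libraries that the article does not name: openai-whisper for speech-to-text and pyttsx3 for text-to-speech. The audio file name is hypothetical.

```python
# Minimal sketch: speech-to-text captions and text-to-speech playback.
# Assumptions: openai-whisper and pyttsx3 are installed
# (pip install openai-whisper pyttsx3), and ffmpeg is available for whisper;
# "conversation_clip.wav" is a hypothetical recording.
import whisper
import pyttsx3

# Speech to text: transcribe a spoken remark so it can be read as a caption.
stt_model = whisper.load_model("base")
caption = stt_model.transcribe("conversation_clip.wav")["text"]
print("Caption:", caption)

# Text to speech: speak a typed reply aloud.
tts_engine = pyttsx3.init()
tts_engine.say("I would like a cup of coffee, please.")
tts_engine.runAndWait()
```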

Improvements in AI-powered speech-recognition technology have made it easier for people with aphasia to be understood by AI tools. “Five years ago, even the best speech-recognition technology could not recognize somebody who had aphasia,” Dr. Kiran said. “If they took a long time or said the wrong word, it would get confused and not know how to transcribe that speech. Today, the technology is better.”

What’s Next?

Researchers are studying the impact that AI-powered technology can have on people with aphasia. Some of the findings are encouraging, but much of this technology is still experimental and not yet available to patients.

AI-powered machine-learning models can be used for several applications, such as detecting changes in the brain, identifying primary progressive aphasia or dementia early, and determining when therapy may be useful.

“With primary progressive aphasia and dementia, change happens gradually, compared to changes from one sudden event, like a stroke,” Dr. Kiran said. “In such situations, AI-powered models can be trained to detect warning signs in behavior that can result in some earlier actions and interventions.”
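To make the idea concrete, here is a toy sketch of a model trained to flag warning signs from behavioral measures. The features, numbers and library choice (scikit-learn) are assumptions for illustration only; real systems would be trained on validated clinical speech and language assessments.

```python
# Toy sketch: flag visits that show possible warning signs in speech behavior.
# All features and data below are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per visit: [speaking rate (words/min),
# average pause length (s), word-finding errors per 100 words]
X = np.array([
    [110, 0.4, 2],   # typical
    [115, 0.3, 1],   # typical
    [ 85, 0.9, 6],   # possible warning signs
    [ 78, 1.1, 8],   # possible warning signs
])
y = np.array([0, 0, 1, 1])  # 1 = flag for clinical follow-up

model = LogisticRegression().fit(X, y)

# A new visit with slower speech and more word-finding errors
new_visit = np.array([[88, 0.8, 5]])
print("Flag for follow-up?", bool(model.predict(new_visit)[0]))
```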

Machine-learning tools may also help people with aphasia get “unstuck” during conversation. “If you’re at an impasse because you can’t come up with the words, AI could offer targeted suggestions and you could select a suggestion,” said Sara Pillay, PhD, a clinical neuropsychologist and associate professor of neurology at Froedtert and the Medical College of Wisconsin. “AI would be learning, based on your responses, so that it gets more tailored.”

Dr. Kiran and colleagues have built machine-learning models to predict how factors such as the length and intensity of therapy affect communication outcomes in people with aphasia. They created a web-based interface based on their model in which people can enter their information and receive outcome predictions. Models like this may become more sophisticated and widespread.

The models also address a practical question for patients and families. “We were trying to understand, does more therapy help?” Dr. Kiran said. “The answer is yes, but it’s not one-size-fits-all.” For example, she said, a 40-year-old who recently had a stroke might improve more, and more quickly, with less therapy than a 70-year-old who had a stroke several years ago.
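Here is a simplified sketch of what an outcome-prediction model of this kind might look like, assuming scikit-learn and a handful of invented inputs (age, time since stroke, hours of therapy). The data and features are illustrative only and are not Dr. Kiran’s actual model or data.

```python
# Simplified sketch: predict improvement from patient and therapy factors.
# Inputs and training data are synthetic, for illustration only.
import numpy as np
from sklearn.linear_model import LinearRegression

# Columns: age (years), years since stroke, hours of therapy
X = np.array([
    [40, 0.5, 20],
    [45, 1.0, 40],
    [60, 2.0, 40],
    [70, 5.0, 60],
    [72, 6.0, 80],
])
# Improvement on a hypothetical 0-100 communication score
y = np.array([18, 22, 14, 8, 10])

model = LinearRegression().fit(X, y)

# Predicted improvement for a 40-year-old, 6 months post-stroke, 30 hours of therapy
new_patient = np.array([[40, 0.5, 30]])
print("Predicted improvement:", round(model.predict(new_patient)[0], 1))
```

A web interface like the one the researchers describe would sit on top of a model of this general shape: the user enters their details, and the model returns a predicted outcome.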

Researchers have uncovered several other ways AI may be able to help people with aphasia. Let’s look at the types of tools that may one day be widely available:

  • Researchers at the University of Texas at Austin have shown that a language decoder can help translate a person’s thoughts into text, potentially helping people with aphasia communicate more effectively.
  • Some researchers have used AI in a research setting to identify different aphasia syndromes or the types of speech errors that affect people with aphasia.
  • AI or virtual reality may one day help improve speech fluency by making it easier for aphasia patients to view a mouth forming words. “For a long time, we’ve been recording humans producing scripted content for patients to rehearse with,” Dr. Henry said. “Now that we have ultra-realistic avatars, we could build an app that would allow people to have their own personal mouth model.”

While AI offers a lot of potential and hope for aphasia therapy, it won’t replace clinicians. Instead, it is one more tool clinicians can use to help their patients.

“There are some places where I don’t think it’s going to be useful,” Dr. Kiran said. “For example, I don’t think it’s going to replace a clinician because of the humanistic aspects that clinicians bring to therapy settings.”