AI and neuropsychology

Artificial Intelligence, or AI, is processing data in just about every industry, performing billions of tasks each day.

According to a 2024 report from market research company Grand View Research, the global AI market topped $196 billion in 2023 and is expected to expand at a compound annual growth rate of 36.6 percent from 2024 to 2030.

Neuropsychology is one field working to embrace this technology, incorporating AI into certain aspects of identifying and managing cognitive decline and other neuropsychological conditions.

AI holds promise in multiple medical applications, including interpreting radiological studies such as CT and MRI, transcribing medical documents, and helping focus medication development. In general, these technologies can enhance human judgment in data-centric and functional tasks, reduce human error by identifying patterns in clinical data that humans are prone to miss, leverage those patterns to support precision medicine, and streamline the production of exam data. This opens a wide window of potential applications for AI in neuropsychology, a field that depends heavily on precise exam data as well as accurate integration of medical data from other specialties.

AI/Neuropsych in the wild

Neuropsychologists around the world have already published peer-reviewed articles about how AI can be, and has been, incorporated into neuropsychology.

One of the most recent articles, published in 2024 in the Journal of Personalized Medicine, offered a systematic review of articles that spotlighted AI in neuropsychological assessment for the early detection and personalized treatment of mild cognitive impairment and Alzheimer’s disease. The review highlights several advancements, such as analyzing behavioral data from executive function tasks within virtual environments to identify biomarkers or distinguish subjects with mild cognitive impairment (MCI) from those who are healthy. The article also identifies a gap in developing tools that simplify clinicians’ workflows, and underscores the need for explainable AI in healthcare decision-making.

Another recent work: a 2024 article in PLoS Computational Biology that demonstrated an AI deep-learning model that can predict behavior from brain activity with 95 percent accuracy.

Older stories paint a similarly compelling picture.

In one piece published in Disability and Rehabilitation in 2023, five authors discussed how doctors and researchers are using AI to deliver innovative solutions in aphasia management and rehabilitation, including AI-powered conversational agents (i.e., chatbots) and affect recognition software. The takeaway: AI has not been used widely to help patients in these areas, but when it is used, it yields encouraging results.

Another article, published in the Nature Portfolio journal npj Digital Medicine in 2022, is an umbrella review of the performance of AI in diagnosing mental health disorders. The gist of this one was simple: AI offers great promise in diagnosing these conditions because it can automatically compare a patient's symptoms and evaluation scores against a database of symptoms and scores from thousands of prior cases.
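To make that idea concrete, here is a minimal, purely illustrative sketch, not drawn from the review itself, of the kind of comparison such a system performs: a toy nearest-neighbor lookup that matches a new patient's scores against a small database of prior cases. All measure names, scores, and labels below are hypothetical.

```python
# Illustrative sketch only: a toy nearest-neighbor comparison of a new
# patient's evaluation scores against a small database of prior cases.
# All feature names, scores, and diagnoses are hypothetical and are not
# taken from any study cited in this article.
import math
from collections import Counter

# Each prior case: (scores on hypothetical measures, recorded diagnosis)
prior_cases = [
    ([26, 7, 14], "healthy"),   # e.g., [screening total, delayed recall, verbal fluency]
    ([29, 9, 18], "healthy"),
    ([21, 3, 10], "MCI"),
    ([22, 4, 11], "MCI"),
    ([16, 1, 6],  "dementia"),
]

def euclidean(a, b):
    """Distance between two score profiles."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nearest_neighbor_label(new_scores, cases, k=3):
    """Return the majority diagnosis among the k most similar prior cases."""
    ranked = sorted(cases, key=lambda case: euclidean(new_scores, case[0]))
    top_labels = [label for _, label in ranked[:k]]
    return Counter(top_labels).most_common(1)[0][0]

if __name__ == "__main__":
    new_patient = [23, 4, 12]  # hypothetical scores for a new evaluation
    print(nearest_neighbor_label(new_patient, prior_cases))  # -> "MCI"
```

Real systems rely on far richer data, validated measures, and rigorously tested models, but the underlying idea is the same: quantify how similar a new profile is to previously evaluated cases and surface the closest matches for a clinician to interpret.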

A fourth article, published in Neuroscience & Biobehavioral Reviews in 2020, notes that AI and machine learning are among the more powerful approaches for extracting reliable predictors of different phenotypes of Alzheimer's disease, drawing on data and scores from neuropsychological exams and other evaluations. The report suggests that AI can help doctors and researchers identify the neuropsychological measures most useful for diagnosis and prognosis.

Of course, there are also more mainstream treatments of the topic, like one piece that focuses on how AI can help predict the onset of dementia using speech patterns over time.

Still other publications (and several high-level analysis pieces) focus on potential ethical and legal pitfalls with AI. Among them: lack of standards, privacy concerns and diversity concerns.

In other words, while AI has the potential to make a big impact, these are the early days.

Potential stumbling blocks

For these reasons, many neuropsychologists say the growing push for more AI in neuropsychology is controversial at best.

According to Dr. Bob Bilder, director of the Tennenbaum Center for the Biology of Creativity at the University of California Los Angeles, AI likely will never completely replace humans in this very human-oriented field, but it may become a useful tool.

Dr. Bilder, who is also a clinical neuropsychologist, said the technology is best applied to number crunching, though the human element is still required to observe behavior in the most useful ways.

“Even with the best video cameras, they don’t capture what the human eye can capture,” said Dr. Bilder. “In terms of the speed, the breadth, the changing field of view, the focus capacities of human observers are way beyond existing video technology.”

Dr. Munro Cullum, a former president of the National Academy of Neuropsychology who has studied the developing field of teleneuropsychology for the past decade or so, described AI as "helpful" but with "shortcomings."

Dr. Cullum is the Pam Blumenthal Distinguished Professor in Clinical Psychology and a neuropsychologist at UT Southwestern Medical Center. He agreed with Dr. Bilder's thoughts on patient interactions: "I want to see that person across the table from me, or across the table from my assistant, during that evaluation, to really observe them," he said.

Having these first-hand observations, Dr. Cullum continued, enables psychometric analysts and neuropsychologists to pick up nuance, body language, and other non-verbal communication during the interactions.

Dr. Bilder noted that the best opportunity to apply AI in neuropsychology might be interpreting lab test results, a historically mind-numbing task.

He said that while AI can give reports based on current data (and nothing else), the clinician’s job is to determine whether the AI-generated opinion actually matches the patient’s history, clinical course, and clinical presentation.

“You don’t need humans to be figuring out if you’ve got some rare disease based on lab test results—the machines are much better at that,” he said. “Wouldn’t it be great if you went to see your doctor and they talked to you and looked at you in the eye? Right now, they [must] look into the electronic health record and type stuff on the keyboard so they’re in compliance and can bill.”

Dr. Cullum shares a similar vision.

In his clinic of the future, patients might come in for neuropsychological evaluations wearing EEG caps that record the brain's electrical activity while they perform a battery of neuropsychological measures.

"Maybe there's a machine learning component—a simultaneous EEG with cognitive test performance with speech and behavioral analysis all at once," he said. "Once the system identifies your whereabouts [on the exam], it can give that feedback in a nice quantitative fashion to the examiner, who then interprets the findings for the patient."

Dr. Cullum also acknowledged that the possibility of third-party observers affecting neuropsychological data is a powder keg of an issue, one with legal implications for the field, particularly in forensic scenarios.
