Research into the development of Alzheimer’s disease has revealed how changes in speech patterns related to memory loss become detectable years prior to a clinical diagnosis.

These changes are too subtle for human ears to notice. However, scientists now report that artificial intelligence (AI) can come to the rescue, providing a way for doctors to “hear” Alzheimer’s at a very early stage.

A number of studies have already demonstrated specific speech differences between those with and without dementia.

For example, scientists are finding that the brain changes taking place with Alzheimer’s affect language in characteristic ways: the use of unique words declines, while non-specific nouns and other filler words increase over time.

As a result, it becomes more difficult for people developing Alzheimer’s disease to complete sentences or phrases. Speech slows down and becomes more repetitive. Pauses become longer as verbal fluency declines.

Developing a Speech-Based Alzheimer’s “Smart” Test 

These changes mean it should be possible to develop an inexpensive, easy-to-use, speech-based diagnostic test that could be administered in a medical facility or even at home.

Researchers turned to artificial intelligence. However, early attempts at using AI fell short of the mark.

Last year, for instance, Pfizer and IBM Research demonstrated that a machine learning model incorporating 87 linguistic variables could predict Alzheimer’s disease with 70 percent accuracy.

Not bad, but a team of Japanese researchers believed they could do better. They took up the challenge and struck gold.

XGBoost Gets Top Marks with 100 Percent Accuracy 

The Japanese research team analyzed audio recordings of telephone conversations from 24 seniors with Alzheimer’s disease and 99 others over the age of 65 who were dementia-free.

Each participant also underwent a standard cognitive test used in Japan that is administered as a recorded telephone interview.

Various vocal features obtained from some of the recordings were used to train machine-learning algorithms to spot the difference between the two groups of participants. The remaining recordings were used to gauge the performance of the models.
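As a rough illustration of that workflow, a train-and-hold-out setup with XGBoost might look something like the sketch below. The feature names, the CSV file, and the simple hold-out split are assumptions for illustration only; the study’s actual feature extraction and validation procedure may well differ.

```python
# Illustrative sketch only: the feature names, CSV layout, and simple
# hold-out split are assumptions, not the study's actual pipeline.
import pandas as pd
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Hypothetical table: one row per recording, with pre-extracted vocal
# features (e.g., pause length, speech rate, pitch statistics) and a
# label of 1 for Alzheimer's disease, 0 for dementia-free.
data = pd.read_csv("vocal_features.csv")
X = data.drop(columns=["label"])
y = data["label"]

# Hold back a portion of the recordings to gauge performance later.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42
)

# Train a gradient-boosted tree classifier to tell the two groups apart.
model = XGBClassifier(n_estimators=200, max_depth=3, eval_metric="logloss")
model.fit(X_train, y_train)

# Predict labels for the held-out recordings.
y_pred = model.predict(X_test)
```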

One of the machine learning models, XGBoost, came out on top with remarkable scores, achieving both a sensitivity and a specificity of 100 percent in diagnosing Alzheimer’s.

In other words, every person with Alzheimer’s was correctly identified as having the disease, and every individual without Alzheimer’s was correctly identified as being cognitively healthy. There were no false negatives or false positives.
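For readers who want to see how those two numbers are calculated, the following minimal sketch derives sensitivity and specificity from a confusion matrix. It continues the illustrative example above, so the variable names are again assumptions rather than the study’s own code.

```python
# Continuing the illustrative example above: derive sensitivity and
# specificity from the confusion matrix of the held-out predictions.
from sklearn.metrics import confusion_matrix

tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()

sensitivity = tp / (tp + fn)  # share of Alzheimer's cases correctly flagged
specificity = tn / (tn + fp)  # share of healthy participants correctly cleared

print(f"Sensitivity: {sensitivity:.2f}, Specificity: {specificity:.2f}")
# Scores of 1.00 on both would mean no false negatives and no false positives.
```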

That said, we shouldn’t get too carried away. The research has a number of limitations, such as the small number of participants and the fact that it compared only people with and without Alzheimer’s, with no one in between. The study didn’t include anyone with mild cognitive impairment, for instance, which would have posed a bigger challenge to the AI model.

Even so, researchers believe this type of “smart” speech-based approach will provide a wealth of new diagnostic opportunities in the future.

Identify Cognitive Problems on Your Smartphone 

The researchers wrote that “the XGBoost model demonstrated performance comparable to cognitive tests…” and could “represent a unique opportunity for early detection of Alzheimer’s.”

“Our achievement in predicting Alzheimer’s well using only vocal features from daily conversation indicates the possibility of developing a pre-screening tool for Alzheimer’s among the general population that is more accessible and lower-cost.”

The researchers recognize that more work needs to be done, but hope that, if further refinements and testing prove successful, the disease could one day be diagnosed early using these methods via a website or smartphone app.


  1. https://www.medicalnewstoday.com/articles/app-could-flag-up-alzheimers-from-phone-conversations
  2. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0253988