The onset of Alzheimer’s disease and dementia can manifest in many ways—but often, the accumulation of small, individually imperceptible changes in the patterns of a person’s everyday life can paint a picture that points toward cognitive decline, even if no single change is noticeable on its own.
By ingesting and analyzing information from smartphones, tablets, wearables, sleep monitors and more, Big Data firm Evidation Health—with help from Apple and Eli Lilly & Co.—was able to show that digital biomarkers could be useful in spotting these symptoms earlier.
Their study collected at least 16 terabytes of data from 113 participants over three months in real-world settings, including passive sensor readings from devices, as well as questionnaires about mood and energy and simple assessment activities.
Recent exploratory results showed that motor skill tests—such as rhythmically tapping a touchscreen or dragging and matching shapes in an app, alongside reading and typing tasks—were performed more slowly by symptomatic participants with mild cognitive impairment or Alzheimer’s-related dementia.
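The article doesn’t describe how the tapping tasks were scored, but a plausible minimal sketch is to summarize each trial by the speed and regularity of the inter-tap intervals—slower, more variable tapping being the kind of signal associated here with symptomatic participants. The function name and sample timestamps below are invented for illustration:

```python
from statistics import mean, stdev

def tap_metrics(timestamps):
    """Summarize a rhythmic-tapping trial from touch timestamps (in seconds).

    Returns the mean inter-tap interval (a speed measure) and its standard
    deviation (a rhythmic-consistency measure). Hypothetical metric, not the
    study's actual scoring method.
    """
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return {"mean_interval": mean(intervals), "interval_sd": stdev(intervals)}

# Example: a steady tapper vs. a slower, irregular one (synthetic data).
steady = tap_metrics([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])
irregular = tap_metrics([0.0, 0.7, 1.6, 2.2, 3.3, 4.0])
```

In this toy comparison, the irregular trial shows both a longer mean interval and a larger spread—exactly the two axes (speed and consistency) a motor-skill screen would plausibly track.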
In addition, these participants adhered less closely to a daily routine: the time of day at which they first picked up or last put down their cell phones, for example, varied more from one day to the next.
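One simple way to quantify that kind of routine adherence is to take, for each day, the hour of the first and last phone interaction and measure how much those times vary across days. The sketch below is illustrative only—the study’s actual adherence metric is not public, and all names and numbers here are invented:

```python
from statistics import pstdev

def routine_variability(daily_events):
    """Given per-day (first_pickup_hour, last_putdown_hour) pairs, return the
    population standard deviation of each (in hours). Larger values mean a
    less regular daily routine."""
    firsts = [first for first, last in daily_events]
    lasts = [last for first, last in daily_events]
    return pstdev(firsts), pstdev(lasts)

# One week of (first pickup, last putdown) hours — synthetic data.
regular = [(7.0, 22.0), (7.1, 22.2), (6.9, 21.9), (7.0, 22.1),
           (7.2, 22.0), (7.0, 21.8), (6.8, 22.0)]
erratic = [(6.0, 20.0), (9.5, 23.5), (7.0, 21.0), (11.0, 23.0),
           (5.5, 19.5), (8.0, 22.5), (10.0, 20.5)]
```

For the `regular` week the variability comes out well under half an hour on both ends of the day; for the `erratic` week it is measured in hours—the sort of day-to-day drift the study associated with symptomatic participants.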
“We know that insights from smart devices and digital applications can lead to improved health outcomes, but we don’t yet know how those resources can be used to identify and accelerate diagnoses,” first author Nikki Marinsek, a data scientist at Evidation, said after the study was presented in August. “The results of the trial set the groundwork for future research that may be able to help identify people with neurodegenerative conditions earlier than ever before.”
Evidation estimates that 5.7 million people in the U.S. and 46.8 million worldwide live with dementia, with global costs reaching $1 trillion annually, while early detection measures are lacking.
One large step forward could come from Evidation’s work with digital voice recordings, searching for similarly subtle changes in a person’s speech and diction over a period of years.
Parsing more than 7,000 audio recordings from the Framingham Heart Study—a multigenerational, longitudinal research project launched in 1948—Evidation compared spoken responses to neuropsychological tests over a period of 11 years among people who would eventually be diagnosed with dementia. The project is supported by DARPA and NIH grants, in collaboration with MIT and Boston University.
In 2018, early feasibility results showed that a machine learning-based classifier was able to predict outcomes of dementia or normal cognitive function by studying features such as a person’s talking speed, the length of their pauses or their delay before answering questions.
The AI program used both manually and automatically transcribed speech recordings, while noting a person’s acoustic pitch, jitter and harmonic-to-noise ratio—essentially the clarity of their speech. The classifier’s accuracy was improved further when linguistic features and natural language processing were added, along with health and demographic data.
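To make the idea concrete, here is a minimal sketch of classifying recordings from per-recording acoustic summaries like the ones the article names—speaking rate, pause length, response delay. The feature values, labels, and the nearest-centroid approach are all invented for illustration; the study’s actual machine learning classifier is not public and also incorporated linguistic, health, and demographic data:

```python
from statistics import mean

# Hypothetical acoustic summaries per recording: speaking rate (words/sec),
# mean pause length (sec), delay before answering (sec). Synthetic data,
# not study data. Label 1 = later dementia diagnosis, 0 = normal cognition.
FEATURES = ("speaking_rate", "mean_pause", "response_delay")
train = [
    ({"speaking_rate": 2.8, "mean_pause": 0.4, "response_delay": 0.6}, 0),
    ({"speaking_rate": 3.0, "mean_pause": 0.3, "response_delay": 0.5}, 0),
    ({"speaking_rate": 2.9, "mean_pause": 0.5, "response_delay": 0.7}, 0),
    ({"speaking_rate": 1.9, "mean_pause": 1.1, "response_delay": 1.8}, 1),
    ({"speaking_rate": 2.1, "mean_pause": 0.9, "response_delay": 1.5}, 1),
    ({"speaking_rate": 1.8, "mean_pause": 1.3, "response_delay": 2.0}, 1),
]

def centroid(rows):
    """Mean feature vector for a set of recordings."""
    return {f: mean(r[f] for r in rows) for f in FEATURES}

def classify(sample, centroids):
    """Nearest-centroid rule: assign the label whose mean feature vector is
    closest by squared Euclidean distance."""
    def dist(c):
        return sum((sample[f] - c[f]) ** 2 for f in FEATURES)
    return min(centroids, key=lambda label: dist(centroids[label]))

centroids = {label: centroid([r for r, y in train if y == label])
             for label in (0, 1)}
prediction = classify({"speaking_rate": 2.0, "mean_pause": 1.0,
                       "response_delay": 1.6}, centroids)  # → 1
```

A slow-speaking, long-pausing sample lands nearest the dementia-group centroid; adding more feature families (as the study did with linguistic and demographic data) is what pushed the real classifier’s accuracy higher.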
Evidation said that its future work will include larger sample sizes, as it aims to develop these acoustic biomarkers into a scalable tool to screen for people at risk.
Meanwhile, Eli Lilly and Evidation spun their digital biomarker collaboration into a multiyear project, with the goal of mining everyday data from smartphones and wearables across several disease areas, including diabetes.