Artificial Intelligence can help predict who will develop dementia, a new study finds

 

Photo by Lukas Budimaier

If you could find out years ahead that you were likely to develop Alzheimer’s, would you want to know?

Researchers from McGill University argue that patients and their families could better plan and manage care given this extra time. So the team has developed new artificial intelligence software that uses positron emission tomography (PET) scans to predict whether at-risk patients will develop Alzheimer’s within two years.

They retrospectively studied 273 individuals with mild cognitive impairment who participated in the Alzheimer’s Disease Neuroimaging Initiative, a global research study that collects imaging, genetics, cognitive, cerebrospinal fluid and blood data to help define the progression of Alzheimer’s disease.

Patients with mild cognitive impairment have noticeable problems with memory and thinking tasks that are not severe enough to interfere with daily life. Scientists know these patients have abnormal amounts of tau and beta-amyloid proteins in specific brain regions involved in memory, and this protein accumulation occurs years before the patients have dementia symptoms.

However, not everyone with mild cognitive impairment will go on to develop dementia, and the McGill researchers aimed to predict which ones will.

First, the team trained their artificial intelligence software to recognize, in the amyloid PET scans of the ADNI participants, key features associated with progression to Alzheimer’s. Next, they assessed the performance of the trained AI using an independent set of ADNI amyloid PET scans. It predicted progression to Alzheimer’s before symptom onset with an accuracy of 84 percent, as reported in a recent paper in Neurobiology of Aging.
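For readers curious about the machinery, here is a minimal sketch of that kind of train-then-evaluate workflow in Python with scikit-learn. The random feature matrix, labels and choice of classifier are placeholders for illustration only; they are not the McGill team’s actual model, features or data.

```python
# Minimal sketch of a train-then-evaluate workflow like the one described above.
# The feature matrix, labels, and classifier are illustrative assumptions,
# not the McGill team's actual model or data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(273, 50))    # stand-in for features extracted from amyloid PET scans
y = rng.integers(0, 2, size=273)  # 1 = progressed to Alzheimer's within two years, 0 = did not

# Hold out an independent set for evaluation, mirroring the study's design
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X_train, y_train)

print(f"Held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.2f}")
```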

The researchers hope their new AI tool will help improve patient care, as well as accelerate research to find a treatment for Alzheimer’s disease by identifying which patients to select for clinical trials.

“By using this tool, clinical trials could focus only on individuals with a higher likelihood of progressing to dementia within the time frame of the study. This will greatly reduce the cost and time necessary to conduct these studies,” said Serge Gauthier, MD, a senior author and professor of neurology and neurosurgery and of psychiatry at McGill, in a recent news release.

The new AI tool is now available to scientists and students, but the McGill researchers need to conduct further testing before it will be approved and available to clinicians.

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.

New study intervenes to help female collegiate distance runners eat enough

Photo by David Gonzalez

Like other at-risk athletes, female collegiate distance runners are predisposed to develop bone stress injuries from a condition known as the female athlete triad, said Michael Fredericson, MD, a professor of orthopaedic surgery and sports medicine at Stanford, who has worked with Stanford athletes for more than 25 years.

The triad stems from an energy deficiency, he explained:

“When your body isn’t getting enough food, then you stop producing normal levels of sex hormones, which leads to menstrual dysfunction. Your growth hormones go down, so you lose muscle mass. Your thyroid hormones go down, so your metabolism gets suppressed. And your stress hormones go up, which also leads to menstrual dysfunction and reduced muscle mass. And all of that leads to lower bone density, and eventually osteopenia [low bone strength] or even osteoporosis.”

The problem is common. “Based on our historical data, 38 percent of our female runners developed stress fractures over a three-year period from 2010-2013,” Fredericson said. “I knew the time had come to do something to prevent this.”

He is investigating the effectiveness of a nutritional intervention, in collaboration with Aurelia Nattiv, MD, from the University of California, Los Angeles. They have enrolled about 180 male and female runners from Stanford and UCLA in their study.

“The goal is to have our runners eat 45 kcal per kilogram of fat-free mass per day, which is really just a normal diet — so their energy input equals their energy output,” Fredericson said. “We found a third of the women were getting less than this and half of the men were getting less. So it’s fair to say that a significant number of our runners were not getting adequate nutrition or calories.”
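As a rough illustration of how that target scales with body composition (using made-up example numbers, not study data):

```python
# Back-of-the-envelope calculation of the 45 kcal per kg of fat-free mass target.
# The weight and body-fat values below are hypothetical example inputs.
def daily_energy_target(weight_kg: float, body_fat_fraction: float,
                        kcal_per_kg_ffm: float = 45.0) -> float:
    """Return a daily energy target (kcal) based on fat-free mass."""
    fat_free_mass_kg = weight_kg * (1.0 - body_fat_fraction)
    return kcal_per_kg_ffm * fat_free_mass_kg

# Example: a 55 kg runner with 20% body fat has 44 kg of fat-free mass,
# so the target works out to 45 * 44 = 1,980 kcal per day.
print(daily_energy_target(55.0, 0.20))  # 1980.0
```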

The runners met individually with a sports dietitian and filled out an extensive dietary questionnaire to estimate their food intake, Fredericson told me. A low-dose X-ray scan also measured their bone density, and basic blood work measured their vitamin D and thyroid levels, he said. Finally, their risk of the female athlete triad was assessed using an established point system.

After their health assessment, a dietitian helped each runner select individual nutrition goals, like adding a snack or increasing the energy density of a current meal, Fredericson said. “We typically want them to eat smaller, more frequent meals — particularly right before and immediately after exercising,” he said.

The runners also used an app developed by collaborators at UCLA, which provided an eight-week nutrition education curriculum, including handouts, video clips, recipes and behavior modifying exercises.

Although the researchers have only completed the first year of a three-year study, they have found their intervention is working. “A majority of the runners have increased their bone density over the one-year period by 2 to 5 percent,” Fredericson said. “Our preliminary findings also show for every one-point increase in risk score, there was a 17 percent increase in the time it took to recover after an injury. … Anecdotally, we are seeing less injuries and the type of injuries that we are seeing are less severe.”

He emphasized the importance of the work:

“We have a number of young women that are exercising at levels beyond their ability to support their nutritional requirements. By the time they enter college many of them have osteoporosis … Ours is the first attempt to address these issues in an organized study with elite athletes. We need to turn things around for these young women, and prevent more serious health problems later in life.”

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.

Strong association between vision loss and cognitive decline

Photo by Les Black

In a nationally representative sample of older adults in the United States, Stanford researchers found a strong relationship between visual impairment and cognitive decline, as recently reported in JAMA Ophthalmology.

The research team investigated this association in elderly populations by analyzing two large US population data sets — over 30,000 respondents from the National Health and Aging Trends Study (NHATS) and almost 3,000 respondents from the National Health and Nutrition Examination Survey (NHANES) — both of which included measurements of cognitive and vision function.

“After adjusting for hearing impairment, physical limitations, patient demographics, socioeconomic status and other clinical comorbidities, we found an over two-fold increase in odds of cognitive impairment among patients with poor vision,” said Suzann Pershing, MD, assistant professor of ophthalmology at Stanford and chief of ophthalmology for the VA Palo Alto Health Care System. “These results are highly relevant to an aging US population.”
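For readers wondering what an “adjusted” odds ratio means in practice, here is a toy illustration in Python: fit a logistic regression of cognitive impairment on vision impairment plus covariates, then exponentiate the vision coefficient. The data are simulated and the covariates are simplified assumptions; this is not the study’s actual analysis.

```python
# Toy illustration of where an adjusted odds ratio comes from, using simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
vision_impaired = rng.integers(0, 2, size=n)
hearing_impaired = rng.integers(0, 2, size=n)
age = rng.normal(75, 7, size=n)

# Simulate an outcome in which vision impairment roughly doubles the odds (log 2 ≈ 0.69)
log_odds = -3.0 + 0.69 * vision_impaired + 0.4 * hearing_impaired + 0.03 * (age - 75)
cognitive_impairment = rng.binomial(1, 1.0 / (1.0 + np.exp(-log_odds)))

# Logistic regression adjusting for hearing impairment and age
X = sm.add_constant(np.column_stack([vision_impaired, hearing_impaired, age]))
result = sm.Logit(cognitive_impairment, X).fit(disp=0)
print("Adjusted odds ratio for vision impairment:",
      round(float(np.exp(result.params[1])), 2))
```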

Previous studies have shown that vision impairment and dementia are conditions of aging, and their prevalence is increasing as our populations become older. However, the Stanford authors noted that their results are purely observational and do not establish a causative relationship.

The complexity of the relationship between vision and cognition was discussed in a related commentary by Jennifer Evans, PhD, an assistant professor of epidemiology at the London School of Hygiene and Tropical Medicine. She stated that the association could arise from problems with measuring vision and cognitive impairment in this population. “People with vision impairment may find it more difficult to complete the cognitive impairment tests and … people with cognitive impairment may struggle with visual acuity tests,” she wrote.

Assuming the association between vision and cognitive impairment holds, Evans also raised questions relevant to patient care, such as: Which impairment developed first? Would successful intervention for visual impairment reduce the risk of cognitive impairment? Is sensory impairment an early marker of decline?

Pershing said she plans to follow up on the study:

“I am drawn to better understand the interplay between neurosensory vision, hearing impairment and cognitive function, since these are likely synergistic and bidirectional in their detrimental effects. For instance, vision impairment may accelerate cognitive decline and cognitive decline may lead to worsening ability to perform visual tasks. Ultimately, we can aim to better identify impairment and deliver treatments to optimize all components of patients’ health.”

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.

Stanford researchers find a better way to predict ER wait times

Photo by Meredith Kuipers

Need to go to the emergency room, but not sure where to go? There’s an app for that. A growing number of hospitals are publishing their emergency department wait times on mobile apps, websites, billboards and screens within a hospital.

But according to researchers at the Stanford Graduate School of Business, these estimates aren’t that accurate.

The trouble with most wait time estimates is that the models these systems use are often oversimplified compared to the complicated reality on the ground, said study author Mohsen Bayati, PhD, an associate professor of operations, information and technology at Stanford, in a recent news story. The most common way of estimating a wait time is to simply use a rolling average of the time it took for the last several patients to be seen, but this only works well if patients arrive at a steady rate with similar ailments, he said.
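To make the rolling-average idea concrete, here is a minimal sketch in Python. The window size and sample wait times are arbitrary illustrations, not how any particular hospital app is actually implemented.

```python
# Minimal sketch of a rolling-average wait-time estimate: the prediction for the next
# arrival is simply the mean waiting time of the last few patients seen.
from collections import deque

class RollingAverageWaitEstimator:
    def __init__(self, window: int = 10):
        # waits (in minutes) of the last `window` patients
        self.recent_waits = deque(maxlen=window)

    def record(self, wait_minutes: float) -> None:
        self.recent_waits.append(wait_minutes)

    def estimate(self) -> float:
        """Predicted wait for the next arrival: average of recent observed waits."""
        if not self.recent_waits:
            return 0.0
        return sum(self.recent_waits) / len(self.recent_waits)

estimator = RollingAverageWaitEstimator(window=5)
for wait in [35, 50, 42, 60, 55]:
    estimator.record(wait)
print(estimator.estimate())  # 48.4 minutes
```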

Bayati and his colleagues studied five ways to predict ER wait times using data from four hospitals: three private teaching hospitals in New York City and the San Mateo Medical Center (SMMC), a public hospital that primarily serves low-income residents.

In particular, the researchers focused on wait times for less acute patients who often have to wait much longer than predicted, because patients with life-threatening illnesses are given priority and treated quickly. At SMMC, they found that some patients with less severe health needs had to wait more than an hour and a half longer than predicted using the standard rolling average method, according to their recent paper in Manufacturing and Service Operations Management.

The researchers determined that their new approach, called Q-Lasso, predicted ER wait times more accurately than the other methods — for instance, it reduced the margin of error by 33 percent for SMMC.

The team’s new Q-Lasso method combined two mathematical techniques: queueing theory and lasso statistical analysis. From queueing theory, they identified a large number of potential factors that could influence ER wait times. They then used lasso analysis to identify the combination of those factors that best predicts wait times at a given time.
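As a rough, simplified sketch of that general idea — queue-state features fed into lasso regression — the flow might look something like the Python below. The features and data are simulated assumptions of my own, not the authors’ actual Q-Lasso implementation.

```python
# Rough sketch: build queue-state features, then let lasso regression pick the combination
# that best predicts the next patient's wait. Data and features are illustrative only.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(42)
n = 500
queue_length = rng.integers(0, 30, size=n)            # patients currently waiting
patients_in_treatment = rng.integers(0, 20, size=n)   # patients occupying beds
hour_of_day = rng.integers(0, 24, size=n)
recent_avg_wait = rng.uniform(10, 120, size=n)        # rolling average of recent waits

X = np.column_stack([queue_length, patients_in_treatment, hour_of_day, recent_avg_wait])
# Synthetic "true" wait times driven by only some of the features, plus noise
y = 4.0 * queue_length + 0.5 * recent_avg_wait + rng.normal(0, 10, size=n)

model = Lasso(alpha=1.0)
model.fit(X, y)
# Uninformative features have coefficients shrunk toward zero
print("Selected coefficients:", np.round(model.coef_, 2))
```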

The authors were quick to qualify that their method was more accurate, but it still had errors — between 17 minutes and an hour. However, they said that it has the advantage of overestimating wait times rather than underestimating them, which leads to a more positive experience. Bayati explained in the news piece why this is important:

“If a patient is very satisfied with the service, they’re much more likely to follow the care advice that they receive. A good prediction that provides better patient satisfaction benefits everyone.”

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.

Socioeconomic status and food: A Stanford researcher observed families to learn more

Photo courtesy of Priya Fielding-Singh

Priya Fielding-Singh, a PhD candidate in sociology at Stanford, wanted to learn more about the relationship between socioeconomic status and diet. So she made observations and conducted in-depth interviews with parents and adolescents from 73 families across the socioeconomic spectrum throughout the San Francisco Bay Area. I recently spoke with her to learn about her study.

What inspired you to research the relationship between socioeconomic status and diet?

“Growing up, my family was a foster family and we took in many children that came from impoverished backgrounds. I think this early exposure to social inequality was formative in shaping my interests and propelling me into the field of sociology. I became interested in food the more that I learned about diet and disease prevention.

We have excellent large-scale, quantitative studies that show a socioeconomic gradient in diet quality in the United States. Thus, we know that socioeconomic status is one of a few key determinants of what and how people eat. But what we understand less well is why. I wanted to know: how do people’s socioeconomic conditions shape the way that they think about and consume food?”

How did you obtain your data?

“In almost every family, I interviewed, separately, at least one parent and one adolescent to better understand both family members’ perspectives. I also conducted 100 hours of observations with families across socioeconomic status, where I spent months with each family and went about daily life with them.

I saw very clearly that food choices are shaped by myriad different external and internal influences that I only gained exposure to when I spent hours with families on trips to supermarkets, birthday parties, church services, nail salons and back-to-school nights. Importantly, I was able to collect data on family members’ exchanges around food, including discussions and arguments. What families eat is often the product of negotiations and compromises.”

What was it like to observe the family dynamics first-hand?

 “I’m a very curious person, as well as a people person, so I felt in my element conducting ethnographic observations. I was touched by how generously families welcomed me into their lives and shared their experiences with me. Because families were so open with me — and in many cases, did not attempt to shelter me from the challenging aspects of family life — observations were an incredibly illuminative part of the research.”

Based on your study, how is diet transmitted from parents to children?

“I found that parents play a central role in shaping teenagers’ beliefs around food, but there was often a difference in how adolescents perceived their mothers and fathers in relation to diet. Adolescents generally saw their mothers as the healthy parent and their fathers as less invested in healthy eating. So, feeding families and monitoring the dietary health of families largely remains moms’ job, as I explained in a recent article.

In addition, I found that how mothers talked to adolescents about food varied across socioeconomic status. My Stanford colleague, Jennifer Wang, and I wrote a paper explaining these differences. More affluent families had discussions that highlighted the importance of consuming high quality food, which may strengthen messages about healthy eating. In contrast, less privileged families had more discussions about the price of food that highlighted the unaffordability of healthy eating.

Finally, I found that lower-income parents sometimes used food to buffer their children against the hardships of life in poverty. They often had to deny their children’s requests for bigger purchases because those purchases were out of financial reach, but they had enough money to say yes to their kids’ food requests. So low-income parents used food to nourish their children physically, but they also used food to nourish their children emotionally.”

What were your favorite foods as a child?

“My favorite food growing up is the same as my favorite food today: ice cream. Beyond that, the diet I ate as a child was very different than the one I follow now. I grew up in a family of carnivores, but I became a vegetarian in my early 20s and never looked back.”

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.