Ramadan: Advising clinicians on safe fasting practices

Photo by mohamed_hassan

If you are a basketball fan who recently watched the Portland Trail Blazers’ Enes Kanter play against the Warriors in the NBA Western Conference finals, you may have heard about Ramadan fasting. But most Americans haven’t — and that includes clinicians.

“Even those clinicians who are aware of Ramadan often do not fully understand the nuances of fasting,” explains Rania Awaad, MD, a clinical assistant professor of psychiatry and behavioral sciences and the director of the Muslims and Mental Health Lab at Stanford. “For example, there is no oral intake from sunup to sundown of food, liquids and also medications. For clinicians who may be alarmed by this, it’s important to remember that fasting is globally practiced safely by adjusting the timing and dosing of medications and by following best practices like consuming enough fluids to rehydrate after the fast.”

Ramadan is the ninth month of the Islamic calendar, which is 11 days shorter than the solar year. This year in the U.S., it began on May 5 and ends on June 4. During Ramadan, many of the nearly two billion Muslims around the world fast during the sunlight hours as a means of expressing self-control, gratitude and compassion for those in need.

Several groups are exempted from this religious requirement — including pregnant women, children, the elderly and people who are acutely or chronically ill — but some fast anyway because of the spiritual significance, Awaad says.

“Ramadan is a very spiritual and communal month. So when clinicians immediately advise their patients not to fast, they may not realize they’re inadvertently isolating their patients from the broader community and support system,” Awaad says. She notes this is particularly important for patients with mental health disorders.

Awaad says she strongly advises clinicians to encourage their patients to seek a dual consultation with both a faith leader and a medical professional at places like the Khalil Center, a professional counseling center specializing in Muslim mental health. Alternatively, patients observing Ramadan can consult their faith leader and physician individually and help facilitate a conversation between the two.

“Without a holistic treatment plan, patients are either fasting when they shouldn’t be — not taking their medications without telling their health care provider — or they are potentially not partaking in Ramadan when they can be,” Awaad says.

In a recent editorial in The Lancet Psychiatry, Awaad and her colleagues outline more clinical suggestions on the safety and advisability of Ramadan fasting that she hopes physicians will consider. For example, the editorial suggests that physicians working with patients with eating disorders should discuss the risks and benefits of fasting and consider close follow-up in this period and in the months following.

But the first step is knowing whether patients are Muslim. By co-teaching the “Culture and Religion in Psychiatry” class, Awaad says she helps Stanford psychiatry residents become comfortable asking about their patients’ religion, in the same way they are trained to ask other sensitive questions like sexual orientation.

“If we miss that our patient draws strength and support from their religion, then we miss the opportunity to support them holistically by incorporating their faith leader or faith community into their treatment plans,” Awaad explains. “The last Gallup poll revealed 87 percent of Americans believe in God, so it’s important to incorporate this into our patient care.”

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.

Creativity can jump or slump during middle childhood, a Stanford study shows

Photo by Free-Photos

As a postdoctoral fellow in psychiatry, Manish Saggar, PhD, stumbled across a paper published in 1968 by a creativity pioneer named E. Paul Torrance, PhD. The paper described an unexplained creativity slump occurring during fourth grade that was associated with underachievement and increased risk for mental health problems. He was intrigued and wondered what exactly was going on.  “It seemed like a profound problem to solve,” says Saggar, who is now a Stanford assistant professor of psychiatry and behavioral sciences.

Saggar’s latest research study, recently published in NeuroImage, provides new clues about creativity during middle childhood. The research team studied the creative thinking ability of 48 children — 21 starting third grade and 27 starting fourth grade — at three time points across one year. This allowed the researchers to piece together data from the two groups to estimate how creativity changes from 8 to 10 years of age.

At each of the time points, the students were assessed using an extensive set of standardized tests for intelligence, aberrant behavior, response inhibition, temperament and creativity. Their brains were also scanned using a functional near-infrared spectroscopy (fNIRS) cap, which imaged brain function as they performed a standardized Figural Torrance Test of Creative Thinking.

During this test, the children sat at a desk and used a pen and paper to complete three different incomplete figures to “tell an unusual story that no one else will think of.” Their brains were scanned during these creativity drawing tasks, as well as when they rested (looking at a picture of a plus sign) and when they completed a control drawing (connecting the dots on a grid).

Rather than using the conventional categories of age or grade level, the researchers grouped the participants based on the data — revealing three distinct patterns in how creativity could change during middle childhood.

The first group of kids slumped in creativity initially and then showed an increase after transitioning to the next grade, while the second group showed the inverse. The final group of children showed no change in creativity at first and then a boost after transitioning to the next grade.

“A key finding of our study is that we cannot group children together based on grade or age, because everybody is on their own trajectory,” says Saggar.
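To make the data-driven grouping concrete, here is a minimal sketch, assuming hypothetical creativity scores at the three time points and using k-means clustering on the changes between visits; the study’s actual statistical approach may differ.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical data: one row per child, columns are creativity scores
# at the three assessment time points across the school year.
rng = np.random.default_rng(0)
scores = np.vstack([
    rng.normal([60, 50, 65], 5, size=(16, 3)),   # slump, then rebound
    rng.normal([50, 62, 55], 5, size=(16, 3)),   # jump, then dip
    rng.normal([55, 55, 68], 5, size=(16, 3)),   # flat, then boost
])

# Group children by the shape of their trajectory rather than by grade
# or age: clustering the changes between visits focuses on how scores
# move, not on their absolute level.
trajectories = np.diff(scores, axis=1)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(trajectories)

for k in range(3):
    mean_scores = scores[labels == k].mean(axis=0)
    print(f"group {k}: mean creativity across time points = {np.round(mean_scores, 1)}")
```

Clustering on trajectories rather than raw scores is what lets a “slump then rebound” child and a “jump then dip” child land in different groups even when their average scores look similar.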

The researchers also found a correlation between creativity and rule-breaking or aggressive behaviors among the participating children, all of whom scored well within the normal range on the standard Child Behavior Checklist used to assess behavioral and emotional problems. As Saggar clarifies, these “problem behaviors” were things like arguing a lot or preferring to be with older kids, rather than actions like fighting.

“In our cohort, the aggression and rule-breaking behaviors point towards enhanced curiosity and to not conforming to societal rules, consistent with the lay notion of ‘thinking outside the box’ to create unusual and novel ideas,” Saggar explains. “Classic creative thinking tasks require people to break rules between cognitive elements to form new links between previously unassociated elements.”

They also found a correlation between creativity and increased functional segregation of the frontal regions of the brain. The brain performs some functions with regions working independently (segregation) and others through integration, when different brain regions work together on a task. For example, a relaxing walk in the park with a wandering mind might involve brain regions chattering in a segregated, independent fashion, while focusing intently to memorize a series of numbers might require integration. The brain needs to balance segregation and integration. In the study, increases in creativity tracked with increased segregation of the frontal regions.

“Having increased segregation in the frontal regions indicates that they weren’t really focusing on something specific,” Saggar says. “The hypothesis we have is perhaps you need more diffused attention to be more creative. Like when you get your best ideas while taking a shower or a walk.”
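As a rough illustration of what segregation means quantitatively, the sketch below compares how strongly a hypothetical set of frontal channels correlates within itself versus with the rest of the recorded channels; a higher within-group than between-group correlation indicates more segregation. The channel grouping and the measure itself are illustrative assumptions, not the study’s actual graph-theoretic analysis.

```python
import numpy as np

def segregation_index(timeseries, frontal_idx):
    """Within-group vs. between-group correlation for a set of channels.

    timeseries  : array of shape (n_timepoints, n_channels), e.g. fNIRS signals
    frontal_idx : indices of the channels treated as the "frontal" group
    """
    corr = np.corrcoef(timeseries.T)              # channel-by-channel correlations
    n = corr.shape[0]
    frontal = np.zeros(n, dtype=bool)
    frontal[frontal_idx] = True

    within = corr[np.ix_(frontal, frontal)]
    within_mean = within[~np.eye(within.shape[0], dtype=bool)].mean()
    between_mean = corr[np.ix_(frontal, ~frontal)].mean()

    # Larger values mean the frontal channels "talk" more among themselves
    # than with the rest of the brain, i.e. more segregation.
    return within_mean - between_mean

# Toy example with random signals standing in for real recordings.
rng = np.random.default_rng(1)
data = rng.normal(size=(300, 20))                 # 300 timepoints, 20 channels
print(segregation_index(data, frontal_idx=range(6)))
```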

Saggar hopes their findings will help develop new interventions for teachers and parents in the future, but he says that longer studies, with a larger and more diverse group of children, are first needed to validate their results.

Once they confirm that the profiles observed in their current study actually exist in larger samples, the next step will be to see if they can train kids to improvise and become more creative, similar to a neuroscience study that successfully trained adults to enhance their creativity.

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.

Pokémon experts’ brains shed light on neurological development

Photo by Colin Woodcock

Whether parents are dreading or looking forward to taking their kids to see the new “Pokémon” movie may depend on how their own brains developed in childhood. If they played the video game a lot growing up, a specific region of their visual cortex — the part of the brain that processes what we see — may preferentially respond to Pokémon characters, according to a new research study.

The Stanford psychologists studied the brains of Pokémon experts and novices to answer fundamental questions about how experience contributes to the brain’s development and organization.

Jesse Gomez, PhD, first author of the study and a former Stanford neuroscience graduate student, started playing a lot of Pokémon around first grade. He realized that this early exposure provided a natural neuroscience experiment: children who played the video game used the same tiny handheld device at roughly the same arm’s length, and they spent countless hours learning the hundreds of animated, pixelated characters, which represent a distinct category of stimuli that activates its own region of the brain.

The research team identified this specialized brain response by using functional magnetic resonance imaging (fMRI) to scan the brains of 11 Pokémon experts and 11 Pokémon novices, adults of similar age and education. During the fMRI scan, each participant viewed different kinds of stimuli in random order, including faces, animals, cartoons, bodies, pseudowords, cars, corridors and Pokémon characters.

“We find a big difference between people who played Pokémon in their childhood versus those who didn’t,” explained Gomez in the video below. “People who are Pokémon experts not only develop a unique brain representation for Pokémon in the visual cortex, but the most interesting part to us is that the location of that response to Pokémon is consistent across people.”

In the expert participants, Pokémon activated a specific region in the high-level visual cortex, the part of the brain involved in recognizing things like words and faces. “This helped us pinpoint which theory of brain organization might be the most responsible for determining how the visual cortex develops from childhood to adulthood,” Gomez said.

The study results support a theory called eccentricity bias, which suggests that the brain region activated by a stimulus is determined by the size and location of the stimulus’s image on the retina. For example, our ability to discriminate between faces is thought to activate the fusiform gyrus in the temporal lobe near the ears and to require the high visual acuity of the central field of vision. Similarly, the study showed that viewing Pokémon characters activates part of the fusiform gyrus and the neighboring region called the occipitotemporal sulcus — which both get input from the central part of the retina — but only in the expert participants.

The eccentricity bias theory implies that a different or larger region of the brain would be preferentially activated by early exposure to Pokémon played on a large computer monitor. However, this wasn’t an option for the 20-something participants when they were children.

These findings have applications well beyond Pokémon, as Gomez explained in the video:

“The findings suggest that the very way that you look at a visual stimulus, like Pokémon or words, determines why your brain is organized the way it is. And that’s useful going forward because it might suggest that visual deficits like dyslexia or face blindness might result simply from the way you look at stimuli. And so that’s a promising future avenue.”

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.

What healthy looks like: New study offers clues based on personalized tracking

Photo by DariuszSankowski

Everyone’s body is a little different, so it is important to understand our personal biological makeup while we are still healthy. Deviations from these healthy baselines can then be used to detect early signs of disease, which is the key idea behind precision health.

“We generally study people when they’re sick, rarely when they’re healthy, and it means we don’t really know what ‘healthy’ looks like at an individual biochemical level,” said Stanford geneticist Michael Snyder, PhD, in a recent Stanford news release.

As part of an international collaboration, Snyder used big data approaches to profile and track the health of more than 100 people at risk for diabetes for up to eight years. Participants underwent extensive testing each quarter, including clinical laboratory testing, exercise and physiological testing, microbial and molecular assessments, genetic sequencing, cardiovascular imaging and wearable sensor monitoring using smart watches or glucose monitors.

The goal of the study was to evaluate whether these emerging technologies could detect diseases early. During the study, the researchers discovered more than 67 major clinically actionable health issues, spanning metabolic disorders, cardiovascular disease, cancer, blood disorders and infectious diseases. In fact, most of the participants had a previously unknown potential health problem flagged by the study, as reported in a paper recently published in Nature Medicine.

“We caught a lot of health issues because we noticed their delta, or their change from baseline. For instance, we caught nine people with diabetes as it was developing by continuously monitoring their glucose and insulin levels,” Snyder explained in the release. He added, “We were able to catch a lot of things before they were even symptomatic. And in most cases, it either led to folks being followed more carefully or to a medical intervention.”
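A minimal sketch of the delta-from-baseline idea, assuming one participant’s series of quarterly fasting glucose values; the function name, baseline window and threshold are illustrative, not the study’s actual pipeline.

```python
import numpy as np

def flag_deviations(measurements, baseline_window=4, z_threshold=2.5):
    """Flag values that drift far from a person's own healthy baseline.

    measurements    : chronological lab values for one participant
    baseline_window : how many early visits define the personal baseline
    z_threshold     : how many baseline standard deviations counts as a flag
    """
    baseline = np.asarray(measurements[:baseline_window], dtype=float)
    mu, sigma = baseline.mean(), baseline.std(ddof=1)
    flags = []
    for visit, value in enumerate(measurements[baseline_window:], start=baseline_window):
        z = (value - mu) / sigma
        if abs(z) > z_threshold:
            flags.append((visit, value, round(z, 2)))
    return flags

# Hypothetical quarterly fasting glucose (mg/dL) for one participant:
# stable early visits, then a gradual upward drift.
glucose = [92, 95, 90, 94, 97, 101, 108, 118, 126]
print(flag_deviations(glucose))   # visits where glucose deviates from this person's baseline
```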

The research team also used the big datasets to discover new biomarkers that may be able to predict the risk of cardiovascular and certain other diseases. Although preliminary, these results have inspired them to conduct larger follow-up studies.

This approach of extensively tracking personal health is currently too expensive to implement into standard health care on a broad scale, according to Snyder. But he hopes the prices will drop as more researchers and physicians innovate in the space.

“Ultimately, we want to shift the practice of medicine from treating people when they are ill to a focus on keeping them healthy by predicting disease risk and catching disease before it is symptomatic,” Snyder said.

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.

Genetic roots of psychiatric disorders clearer now thanks to improved techniques

Photo by LionFive

New technology and access to large databases are fundamentally changing how researchers investigate the genetic roots of psychiatric disorders.

“In the past, a lot of the conditions that people knew to be genetic were found to have a relatively simple genetic cause. For example, Huntington’s disease is caused by mutations in just one gene,” said Laramie Duncan, PhD, an assistant professor of psychiatry and behavioral sciences at Stanford. “But the situation is entirely different for psychiatric disorders, because there are literally thousands of genetic influences on every psychiatric disorder. That’s been one of the really exciting findings that’s come out of modern genetic studies.”

These findings are possible thanks to genome-wide association studies (GWAS), which test for millions of genetic variations across the genome to identify the genes involved in human disease.
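Conceptually, a GWAS runs one association test per variant and then corrects for the huge number of tests. The sketch below illustrates the idea with a chi-square test of allele counts in cases versus controls and the conventional genome-wide significance threshold of 5e-8; a real analysis would repeat this for millions of variants and adjust for covariates such as ancestry.

```python
import numpy as np
from scipy.stats import chi2_contingency

GENOME_WIDE_SIGNIFICANCE = 5e-8   # conventional GWAS threshold

def test_variant(case_genotypes, control_genotypes):
    """Allelic chi-square test for one variant.

    Genotypes are coded as 0, 1 or 2 copies of the minor allele per person.
    Returns the p-value for a difference in allele frequency
    between cases and controls.
    """
    case_genotypes = np.asarray(case_genotypes)
    control_genotypes = np.asarray(control_genotypes)
    table = [
        [case_genotypes.sum(),    2 * case_genotypes.size - case_genotypes.sum()],
        [control_genotypes.sum(), 2 * control_genotypes.size - control_genotypes.sum()],
    ]
    _, p_value, _, _ = chi2_contingency(table)
    return p_value

# Toy example: one variant that is slightly more common in cases.
rng = np.random.default_rng(2)
cases = rng.binomial(2, 0.30, size=5000)
controls = rng.binomial(2, 0.25, size=5000)
p = test_variant(cases, controls)
print(p, p < GENOME_WIDE_SIGNIFICANCE)
```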

Duncan is the lead author of a recent commentary in Neuropsychopharmacology that explains how GWAS studies have demonstrated the inadequacy of previous methods. The paper also highlights new genetics findings for mental health.

Before the newer technologies and databases were available, scientists could only analyze a handful of genetic variations. So they had to guess that a specific genetic variation (a candidate) was associated with a disorder — based on what was known about the underlying biology — and then test their hypothesis. The body of research that has emerged from GWAS studies, however, shows that nearly all of these earlier “candidate study” results are incorrect for psychiatric disorders.

“There are actually so many genetic variations in the genome, it would have been almost impossible for people to guess correctly,” Duncan said. “It was a reasonable thing to do at the time. But we now have better technology that’s just as affordable as the old ways of doing things, so traditional candidate gene studies are no longer needed.”

Duncan said she began questioning the candidate gene studies as a graduate student. As she studied the scientific literature, she noticed a pattern in the data that suggested the results were wrong. “The larger studies tended to have null results and the very small studies tended to have positive results. And the only reason you’d see that pattern is if there was strong publication bias,” said Duncan. “Namely, positive results were published even if the study was small, and null results were only published when the study was very large.”
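Duncan’s point about that pattern can be illustrated with a small simulation: if a candidate variant has no real effect, but small studies are published only when they reach significance while large studies are published either way, the published literature shows exactly the pattern she describes. This is an illustrative sketch, not her actual analysis.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(3)
published = []

for _ in range(2000):
    n = rng.integers(20, 2000)                    # study sample size per group
    # No true effect: cases and controls drawn from the same distribution.
    cases, controls = rng.normal(size=n), rng.normal(size=n)
    significant = ttest_ind(cases, controls).pvalue < 0.05
    # Publication bias: small studies appear only if "positive",
    # large studies appear either way.
    if significant or n > 1000:
        published.append((n, significant))

small = [sig for n, sig in published if n <= 1000]
large = [sig for n, sig in published if n > 1000]
print(f"published small studies with positive results: {np.mean(small):.0%}")
print(f"published large studies with positive results: {np.mean(large):.0%}")
```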

In contrast, the findings from the GWAS studies become more and more precise as the sample size increases, she explained, which demonstrates their reliability.

Using GWAS, researchers now know that thousands of variations distributed across the genome likely contribute to any given mental disorder. By drawing on the statistical power of giant databases such as the UK Biobank or the Million Veteran Program, they have learned that most of these variations aren’t even in the regions of DNA that code for proteins, where scientists had expected them to be. For example, only 1.1 percent of schizophrenia risk variants are in these coding regions.
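Because risk is spread across thousands of variants, researchers often summarize it as a polygenic risk score: a weighted sum of the risk alleles a person carries, with the weights taken from GWAS effect sizes. The sketch below is a minimal, hypothetical illustration; real scores use thousands of variants plus quality control and ancestry adjustment.

```python
import numpy as np

def polygenic_risk_score(genotypes, effect_sizes):
    """Weighted sum of risk-allele counts.

    genotypes    : array of 0/1/2 risk-allele counts, one entry per variant
    effect_sizes : GWAS log-odds-ratio weights for the same variants
    """
    return float(np.dot(genotypes, effect_sizes))

# Hypothetical weights for a handful of variants (real scores use thousands).
effect_sizes = np.array([0.05, 0.02, -0.03, 0.04, 0.01])
person_a = np.array([2, 1, 0, 1, 2])
person_b = np.array([0, 1, 2, 0, 1])

print(polygenic_risk_score(person_a, effect_sizes))   # higher score, higher estimated risk
print(polygenic_risk_score(person_b, effect_sizes))
```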

“What’s so interesting about the modern genetic findings is that they are revealing entirely new clues about the underlying biology of psychiatric disorders,” Duncan said. “And this opens up lots of new avenues for treatment development.”

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.