Looking beyond drug treatments for parasitic disease

A Stanford study investigates the barriers to controlling parasitic disease and possible interventions beyond mass drug and education campaigns.

Just two doses of praziquantel can effectively treat adults and children for schistosomiasis — a disease caused by parasitic worms that develop and multiply inside infected freshwater snails, and then enter the water and penetrate the skin of people who are bathing or fishing.

So why are 200 million people worldwide, mostly in sub-Saharan Africa, still suffering from schistosomiasis despite widespread administration of this antiparasitic drug? Why are people in some “hot spot” regions re-infected over and over?

A new study explored these questions by surveying 74 residents from four rural villages along the Senegal River, a region with very high rates of schistosomiasis despite mass drug administration campaigns. In each village, the field team conducted focused discussion groups separately for adult men, adult women and mixed-gender youth to facilitate open conversation among peers. These groups are known to differ in the activities that bring them into contact with the parasite-infested water.

The study made three key findings. First, the researchers learned that many residents have a fairly sophisticated understanding of schistosomiasis risk, including knowing where and when infections occur, even though they don’t understand the underlying biological details. For example, the villagers realize infection risk increases midday — an observation supported by studies, which show snails tend to shed parasites in daily cycles that peak around noon.

Second, the scientists determined that residents use their knowledge to develop strategies to reduce their exposure to the parasite, such as avoiding the river at certain times or forbidding urination and defecation near the river or lake. In addition to personal strategies, some villages adopt written village-wide rules for water use that are enforced with fines.

Lastly, despite having translated knowledge into strategies to reduce disease risk, the researchers found that the residents are still consistently exposed to the parasite because their rural livelihoods depend on the river and lakes — even in villages with limited piped water. They use surface water to cultivate crops. They wade in the water to fish and do laundry. They harvest the cattail reeds to use for roofs, fences or floor mats. And children play in the water.

“There is a feeling of inevitability around schistosomiasis infection, given the constraints of poverty,” said Susanne Sokolow, PhD, a Stanford disease ecologist and study author with the Woods Institute for the Environment, in a recent Stanford news release. “That jibes with the experience of the many years of efforts to distribute pills and carry out educational campaigns in the regions without a huge drop in schisto transmission or infection.”

Instead of focusing exclusively on mass praziquantel distribution, the researchers also recommend using local community input to develop diverse environmental strategies for reducing infection risk. Possible interventions include chemical or biological snail control, provision of sanitation facilities and laundry platforms, removal of vegetation to reduce snail habitats and behavioral change interventions.

According to the study authors, the key is to work with the local communities to select interventions that take into account their specific social and environmental factors.

Photo by eutrophication&hypoxia / Fundraising | Wikimedia Commons   

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.

How yellow fever shaped 19th-century New Orleans: A Q&A

A Stanford historian explains how frequent yellow fever epidemics in 19th-century Louisiana generated cultural and social norms in their fatal wake.


I was intrigued when I came across the Stanford profile of Kathryn Olivarius, PhD, a historian of 19th-century America. Her research primarily explores how epidemic yellow fever disrupted society in the antebellum South, generating cultural and social norms in its fatal wake. To learn more, I spoke with her recently.

As a historian, what got you interested in yellow fever?

“When I embarked on my PhD, I wanted to write about how slavery changed in Louisiana after 1803 with the Louisiana Purchase, as the region shifted from Spanish and French to American rule. But while sitting in Tulane’s archives and perusing letters, diaries, plantation ledgers and ship manifests, what impressed me the most was how much people spoke about disease. And the disease they feared the most was undoubtedly yellow fever  — a disease that struck antebellum New Orleans at epidemic levels nearly every third summer.

Yellow fever victims experienced a sudden onset of headache, back pains, jaundice, nausea and chills. Within days, they oozed blood through their external orifices, writhed in pain and vomited up partly coagulated blood. About half of all people who contracted yellow fever in the 19th century died, while the survivors gained lifetime immunity.

In my view, yellow fever played a critical role in Louisiana’s asymmetrical social organization, in the schedule and character of the cotton market, in capitalism itself and in the entire system and ideology of racial slavery. So I decided to focus on the disease for my PhD and my forthcoming book.”

How did the disease impact the social structure of 19th-century Louisiana?

“Antebellum New Orleans sat at the heart of America’s slave and cotton kingdoms. But it was also the nation’s necropolis, the city of the dead, with yellow fever routinely killing about 8 percent of its population between July and October. In some neighborhoods — particularly those with high densities of immunologically naive recent immigrants from Germany, Ireland and the American North — yellow fever deaths could reach 20 or even 30 percent.

These repeated epidemics generated a hierarchy of ‘acclimated’ survivors who leveraged their immunity for social, economic and political power and ‘unacclimated’ recent immigrants who languished in social and professional purgatory. Until whites could prove they were acclimated, they struggled to find steady, well-paid employment, housing, spouses and a political voice. From the employer’s perspective, it wasted time and money to train someone for a detail-oriented job only to watch him sicken and die by the autumn.”

How did this affect slavery?

“Because of the disease, the commercial-civic elite of New Orleans argued that they required large-scale black slavery — publicly proclaiming that black people were naturally immune to the disease based on spurious and racially specific visions of medicine and biology. It became a powerful proslavery argument with many whites claiming that black slavery was natural, even humanitarian, as it distanced white people from labor, spaces and activities that would kill them. Some even argued black immunity signaled divine sanction for widespread slavery, with God creating black slaves specifically to labor in the cane and sugar fields of the Mississippi Valley.

But in private, most slavers would not buy an unacclimated slave. The slave market essentially shut down in August, September and October in order to protect the health of potential buyers and their valuable slave property. This inconsistency suggests that the widespread belief in black immunity was less a reflection of biological reality than a social tool, a means to justify racial slavery on epidemiological grounds.”

Do you believe anything similar is happening today?

“Yellow fever still kills thousands of people each year. It’s endemic in 47 countries, mostly in Africa and Central and South America. The Intergovernmental Panel on Climate Change’s report released last year also suggests that Americans may become more familiar with this disease again as ecologies change and mosquito populations migrate. Zika, spread by the same mosquito as yellow fever, has been an increasing problem in recent years.

In terms of the social impact of disease, there are certainly modern analogues of societies in the midst of terrifying epidemics rationalizing mass death or singling out certain marginalized groups as the cause. The most obvious comparison in the U.S. is probably HIV/AIDS in the 1980s, when gay people, intravenous drug users and Haitians were blamed for the disease’s spread and faced severe discrimination on the basis of their alleged vulnerability.”

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.

Community cooperation following disasters key to recovery, Stanford study finds

Photo of Norway by Vidar Nordli-Mathisen

Why are some communities resilient in the face of disasters such as epidemics, while others struggle to recover? You might think it is driven by the availability of economic resources, but a new study shows that community cooperation — admittedly challenging in the face of an infectious disease — is the key.

Recently published in Academy of Management Journal, the study led by Hayagreeva Rao, PhD, a Stanford Business professor, found that a community’s resilience primarily depends on two factors:

  • whether the cause of the disaster is attributed to other community members or an act of nature; and
  • whether the community includes diverse organizations that encourage collaboration.

The researchers analyzed and compared two well-documented disasters that occurred in Norway in the early 1900s: an outbreak of the highly contagious Spanish flu that caused many fatalities, and a severe spring frost that led to economic hardship for the predominantly farming community.

They found that disasters attributed to other community members — like contagious epidemics — weakened cooperation, increased distrust and led to a long-term reduction in organization building. By contrast, disasters attributed to an act of nature evoked a sense of shared fate that fostered cooperation.

Rao and colleague Henrich Greve, PhD, a professor of entrepreneurship at INSEAD, explained in the paper:

“The typical response to pandemics includes isolation and treatment, home quarantines, closure of schools, cancellation of large-scale public meetings, and other steps to reduce social density. While these immediate responses are entirely practical, policy planners should also consider how a pandemic impairs the social infrastructure of a community over the long term, and undertake initiatives to foster the building of community organizations.”

For instance, the Spanish flu kept the Norwegian communities from building new community organizations for 25 years, they wrote.

In contrast, Norway’s farming families pulled together when faced with natural agricultural disasters — motivating them to form retail cooperatives, mutual insurance organizations and savings banks to help share risk.

The researchers determined that successful disaster recovery also hinged on the existing social infrastructure: a community with diverse and cooperative voluntary organizations more effectively responded.

“The better the infrastructure, the better the recovery,” said Rao in a recent Stanford Business news piece. “A disaster is a shock. Think of those organizations as the shock absorbers.”

In the paper, they offered an example: the 1995 heat wave in Chicago led to far fewer deaths in a Latino neighborhood than in an adjacent African-American neighborhood. This was because the sheer variety of Latino neighborhood organizations created overlapping networks that allowed people to check on the elderly, they wrote.

The authors concluded with a call for more research on the effect of climate-related disasters like floods and droughts. We need to know how these impact the birth and sustainability of community volunteer organizations, they said.

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.