Stanford graduate student Aisulu Aitbekova wins 2021 Melvin P. Klein Award

Aisulu Aitbekova

Aisulu Aitbekova, a 2021 doctoral graduate from Stanford University, discovered her passion for research when she traveled from Kazakhstan to the U.S. for a summer internship as a chemical engineering undergraduate. She said that experience inspired her to go to graduate school.

After earning a master’s in chemical engineering at the Massachusetts Institute of Technology, she continued her studies at Stanford University under the supervision of Matteo Cargnello, an assistant professor of chemical engineering and Aitbekova’s doctoral advisor. Much of her thesis work involved beamline studies at the Stanford Synchrotron Radiation Lightsource (SSRL) at the Department of Energy’s SLAC National Accelerator Laboratory.  

Now, Aitbekova has been selected to receive the 2021 Melvin P. Klein Scientific Development Award, which recognizes outstanding research accomplishments by undergraduates, graduate students and postdoctoral fellows within three years of completing their doctoral degrees.

In a nomination letter for the award, SLAC Distinguished Staff Scientist Simon Bare praised Aitbekova’s initiative. “She has quickly become proficient in the application of X-ray techniques available at the synchrotron at SLAC. This proficiency and mastery include everything from operating the beamline to analyzing and interpreting the data,” he wrote.

Aitbekova said she felt “absolutely thrilled and grateful” to all of her mentors when she learned she had won the award.

“I’m so thankful for my PhD advisor Matteo Cargnello. My success would not have been possible without his mentorship,” Aitbekova said. “I’m also particularly grateful to Simon Bare, who I consider to be my second advisor. His continuous excitement about X-ray absorption spectroscopy has been the driving force for my work at SSRL.” 

Catalyzing change

Aitbekova said she is passionate about finding solutions to combat climate change. She designs materials to convert harmful pollutant gases into useful fuels and chemicals. To perform these chemical transformations, she develops catalysts and studies their properties using X-ray absorption spectroscopy (XAS). Catalysts are substances that increase rates of chemical reactions without being consumed themselves.

“I have identified that a catalyst’s size, shape and composition profoundly affect its performance in eliminating these gases,” but exactly how those properties affect performance remains unknown, she said. “This problem is further complicated by the dynamic nature of catalytic materials. As a catalyst performs chemical transformations, its structure changes, making it challenging to precisely map a catalyst’s properties to its performance.”

To overcome these barriers, she engineers materials one ten-thousandth the diameter of a human hair and then tracks how they change during reactions using XAS.
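
For scale, here is an illustrative calculation, assuming a typical human hair is roughly 100 micrometers across (a common ballpark figure, not a number from the story):

```latex
\frac{100\ \mu\mathrm{m}}{10^{4}} = 0.01\ \mu\mathrm{m} = 10\ \mathrm{nm}
```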

In one study, Aitbekova and her colleagues engineered a catalyst using a combination of ruthenium and iron oxide nanoparticles, which they think act in a tag-team fashion to improve the synthesis of fuels from carbon dioxide and hydrogen. Using a prototype in the lab, they achieved much higher yields of ethane, propane and butane than previous catalysts.

Switching gears

While engineering catalysts that convert carbon dioxide into chemicals, she developed a new approach for preparing materials in which small particles are encapsulated inside porous oxide materials – for example, ruthenium encapsulated within a sheath of iron oxide.

However, Aitbekova recognized a completely different application for this new approach: creating a palladium-platinum catalyst that works inside a car’s emission control system.

To eliminate the discharge of noxious emission gases, cars are equipped with a catalytic converter. Exhaust gases pass into the catalytic converter, where they are turned into less harmful gases. The catalysts inside these units are platinum and palladium metals, but these metals gradually lose their efficiency due to their extreme working conditions, she said.

“My platinum and palladium catalysts show excellent stability and performance after being subjected to air and steam at 1,100 degrees Celsius, the harshest operating environment automotive exhaust emission control catalysts could be subjected to,” explained Aitbekova. “Further improvements in these materials and successful testing under true exhaust conditions have a potential to revolutionize the field of automotive exhaust emission control.”

Her nominators agreed, citing it as the highlight of her graduate career.

“This work, currently under review for publication, is truly the remarkable result of Aisulu’s hard work and experience in pivoting from one area to another to make an impact and of her ability to connect multiple fields and solve important problems,” Cargnello wrote.

Amplifying impact

Despite this success, Aitbekova is already focused on how to make an even greater impact through mentoring and future research.

Her nominators all applauded her passion for and commitment to mentoring the next generation of STEM scholars, as demonstrated by her mentorship of “a countless number of undergraduates,” according to Cargnello, and by her exchange of letters with middle school students from underrepresented groups.

Part of this passion, Cargnello and others wrote, stems from her experiences growing up in a highly conservative environment with the understanding that homemaking would be her eventual job. Aitbekova’s nominators wrote that they admired the fact that she made her way to Stanford and has acted as an ambassador for the values and principles of diversity and inclusion.

Aitbekova said she embraces the role. “Since my first summer research experience in the USA, I’ve wanted to serve as a bridge to science and graduate school for those who, like me, didn’t have access to such knowledge and resources.”

She will continue to act as a bridge in her next endeavor as a Kavli Nanoscience Institute Prize Postdoctoral Fellow at Caltech, where she plans to expand her work of converting carbon dioxide into fuels by running the chemical transformations with solar energy. That will “bring society one step closer to sustainable energy sources,” she said.

Bare and others praised her drive to make an everyday impact. “She has a natural passion for wanting to understand the physical principles behind the phenomena that she has observed in her research. But this passion for understanding is nicely balanced by her desire to discover something new, and to make a real difference — the practicality that is often missing in someone early in their career,” wrote Bare.

The award will be presented to Aitbekova at the 2021 SSRL/LCLS Annual Users’ Meeting during the plenary session on September 24. 

This is a reposting of my news story, courtesy of SLAC National Accelerator Laboratory.

Scientists uncover surprising behavior of a fatty acid enzyme with potential biofuel applications

Derived from microscopic algae, the rare, light-driven enzyme converts fatty acids into starting ingredients for solvents and fuels.

A study using SLAC’s LCLS X-ray laser captured how light drives a series of complex structural changes in an enzyme called FAP, which catalyzes the transformation of fatty acids into starting ingredients for solvents and fuels. This drawing captures the starting state of the catalytic reaction. The dark green background represents the protein’s molecular structure. The enzyme’s light-sensing part, called the FAD cofactor, is shown at center right with its three rings absorbing a photon coming from bottom left. A fatty acid at upper left awaits transformation. The amino acid shown at middle left plays an important role in the catalytic cycle, and the red dot near the center is a water molecule. (Damien Sorigué/Université Aix-Marseille)

By Jennifer Huber

Although many organisms capture and respond to sunlight, it’s rare to find enzymes – proteins that promote chemical reactions in living things – that are driven by light. Scientists have identified only three so far. The newest one, discovered in 2017, is called fatty acid photodecarboxylase (FAP). Derived from microscopic algae, FAP uses blue light to convert fatty acids into hydrocarbons that are similar to those found in crude oil.

“A growing number of researchers envision using FAPs for green chemistry applications because they can efficiently produce important components of solvents and fuels, including gasoline and jet fuels,” says Martin Weik, the leader of a research group at the Institut de Biologie Structurale at the Université Grenoble Alpes.

Weik is one of the primary investigators in a new study that has captured the complex sequence of structural changes, or photocycle, that FAP undergoes in response to light, which drives this fatty acid transformation. Researchers had proposed a possible FAP photocycle, but the fundamental mechanism was not understood, partly because the process is so fast that it’s very difficult to measure. Specifically, scientists didn’t know how long it took FAP to split a fatty acid and release a hydrocarbon molecule.

Experiments at the Linac Coherent Light Source (LCLS) at the Department of Energy’s SLAC National Accelerator Laboratory helped answer many of these outstanding questions. The researchers describe their results in Science.

All the tools in a toolbox

To understand a light-sensitive enzyme like FAP, scientists use many different techniques to study processes that take place over a broad range of time scales. For instance, photon absorption happens in femtoseconds, or millionths of a billionth of a second, while biological responses on the molecular level often happen in thousandths of a second.
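
Written out in SI notation, the two ends of that range differ by roughly twelve orders of magnitude:

```latex
1\ \mathrm{fs} = 10^{-6} \times 10^{-9}\ \mathrm{s} = 10^{-15}\ \mathrm{s},
\qquad 1\ \mathrm{ms} = 10^{-3}\ \mathrm{s}
```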

“Our international, interdisciplinary consortium, led by Frédéric Beisson at the Université Aix-Marseille, used a wealth of techniques, including spectroscopy, crystallography and computational approaches,” Weik says. “It’s the sum of these different results that enabled us to get a first glimpse of how this unique enzyme works as a function of time and in space.”

The consortium first studied the complex steps of the catalytic process at their home labs using optical spectroscopy methods, which investigate the electronic and geometric structure of atoms in the samples, including chemical bonding and charge. Spectroscopic experiments identified the intermediate states of the enzyme that accompanied each step, measured their lifetimes and provided information on their chemical nature. These results revealed the need for the ultrafast capabilities of the LCLS X-ray free-electron laser (XFEL), which can track the molecular motion with atomic precision.

A structural view of changes in the FAP molecule during the catalytic process was provided by serial femtosecond crystallography (SFX) at the LCLS. During these experiments, a jet of tiny FAP microcrystals was hit with optical laser pulses to kick off the catalytic reaction. This ensured that all the molecules react at a similar time, synchronizing their behavior and making it possible to track the process in detail. Extremely brief, ultrabright X-ray pulses then measured the resulting changes in the enzyme’s structure.

By integrating thousands of these measurements – acquired using various time delays between the optical and X-ray pulses – the researchers were able to follow structural changes in the enzyme. They also determined the structure of the enzyme’s resting state by probing without the optical laser.
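
The bookkeeping behind this pump-probe averaging can be sketched in a few lines of code. This is a schematic illustration only; the variable names are hypothetical stand-ins, not part of the actual LCLS analysis pipeline:

```python
from collections import defaultdict

import numpy as np

def average_by_delay(frames):
    """Average diffraction images that share a pump-probe time delay.

    `frames` is a list of (delay_fs, image) pairs; both names are
    hypothetical stand-ins for real beamline data. Averaging many
    frames at each delay builds up one structural snapshot per
    time point.
    """
    bins = defaultdict(list)
    for delay_fs, image in frames:
        bins[delay_fs].append(image)
    # One averaged pattern per nominal delay, in time order
    return {d: np.mean(imgs, axis=0) for d, imgs in sorted(bins.items())}

# The resting-state reference described above comes from frames
# recorded with the optical pump laser off.
```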

Surprisingly, the researchers found that in the resting state, the light-sensing part of the enzyme has a bent shape. “This small molecule, called the FAD cofactor, is a derivative of vitamin B2 that acts like an antenna to capture photons,” Weik says. “It absorbs blue light and initiates the catalytic process. We thought the starting point of the FAD cofactor was planar, so this bent configuration was unexpected.”

The bent shape of the FAD cofactor was first discovered by X-ray crystallography at the European Synchrotron Radiation Facility, but the scientists had suspected this bend was an artifact of radiation damage, a common problem for crystallographic data collected at synchrotron light sources.

“Only SFX experiments could confirm this unusual configuration because of their unique ability to capture structural information before damaging the sample,” Weik says. “These experiments were complemented by computations. Without the high-level quantum calculations performed by Tatiana Domratcheva of Moscow State University, we wouldn’t have understood our experimental results.”

Next steps

Even with this improved understanding of FAP’s photocycle, unanswered questions remain. For example, researchers know carbon dioxide is formed during a certain step of the catalytic process at a specific time and location, but they don’t know if it is transformed into another molecule before leaving the enzyme.

“In future XFEL work, we want to identify the nature of the products and to take pictures of the process with a much smaller step size so as to resolve the process in much finer detail,” says Weik. “This is important for fundamental research, but it can also help scientists modify the enzyme to do a task for a specific application.”

Such precision experiments will be fully enabled by upcoming upgrades to the LCLS facility that will increase its pulse repetition rate from 120 pulses per second to 1 million pulses per second, transforming scientists’ ability to track complex processes like this.

Other researchers are already working towards industrial FAP applications, including a group that is designing an economical way to produce gases such as propane and butane.

The interdisciplinary consortium included researchers from the Institute of Structural Biology in Grenoble, Max Planck Institute for Medical Research in Heidelberg, Université Aix-Marseille, Ecole Polytechnique in Paris-Palaiseau, the Integrative Biology of the Cell Institute in Paris-Saclay, Moscow State University, the ESRF and SOLEIL synchrotrons in Grenoble and Paris-Saclay, and the team at SLAC National Accelerator Laboratory.

LCLS is a DOE Office of Science user facility. Major funding for this work came from the French National Research Agency (ANR).

Citation: D. Sorigué et al., Science, 9 April 2021 (https://doi.org/10.1126/science.abd5687)

For questions or comments, contact the SLAC Office of Communications at communications@slac.stanford.edu.

This is a reposting of my news release, courtesy of SLAC National Accelerator Laboratory.

Harnessing AMReX for Wind Turbine Simulations

ECP ExaWind Project Taps Berkeley Lab’s AMReX to Help Model Next-Generation Wind Farms

Driving along Highway 580 over the Altamont Pass in Northern California, you can’t help but marvel at the 4,000+ wind turbines slowly spinning on the summer-golden hillsides. Home to one of the earliest wind farms in the United States, Altamont Pass today remains one of the largest concentrations of wind turbines in the world. It is also a symbol of the future of clean energy.

Before utility grids can achieve wide-scale deployment of wind energy, however, they need more efficient wind plants. This requires advancing our fundamental understanding of the flow physics governing wind-plant performance.

ExaWind, a U.S. Department of Energy (DOE) Exascale Computing Project, is tackling this challenge by developing new simulation capabilities to more accurately predict the complex flow physics of wind farms. The project entails a collaboration between the National Renewable Energy Laboratory (NREL), Sandia National Laboratories, Oak Ridge National Laboratory, the University of Texas at Austin, Parallel Geometric Algorithms, and — as of a few months ago — Lawrence Berkeley National Laboratory (Berkeley Lab).

“Our ExaWind challenge problem is to simulate the air flow of nine wind turbines arranged as a three-by-three array inside a space five kilometers by five kilometers on the ground and a kilometer high,” said Shreyas Ananthan, a research software engineer at NREL and lead technical expert on the project. “And we need to run about a hundred seconds of real-time simulation.” 

By developing this virtual test bed, the researchers hope to revolutionize the design, operational control, and siting of wind plants, plus facilitate reliable grid integration. And this requires a combination of advanced supercomputers and unique simulation codes.

Unstructured + Structured Calculations

The principle behind a wind turbine is simple: energy in the wind turns the turbine blades, which causes an internal gearbox to rotate and spin a generator that produces electricity. But simulating this is complicated. The flexible turbine blades rotate, bend, and twist as the wind shifts direction and speed. The yaw and pitch of these blades are controlled in real time to extract as much energy as possible from a wind event. The air flow also entails complex dynamics — such as influences from the ground terrain, the formation of a turbulent wake downstream of the blades, and turbine-turbine interactions.

To improve on current simulations, scientists need more computing power and higher resolution models that better capture the crucial dynamics. The ExaWind team is developing a predictive, physics-based, and high-resolution computational model — progressively building from petascale simulations of a single turbine toward exascale simulations of a nine-turbine array in complex terrain.

A Nalu-Wind solution to the differential equations of motion for a wind turbine operating in uniform air flow (moving from left to right). Two of the wind turbine’s three blades are pictured (thin blue rectangles on left). The slice in the background represents the contours of the whirling air’s motion, showing the vortical structure of the wake behind the turbine blades (red indicates swirl in the counterclockwise direction around the blade tip, blue in the clockwise direction).

“We want to know things like the air velocity and air temperature across a big three-dimensional space,” said Ann Almgren, who leads the Center for Computational Sciences and Engineering in Berkeley Lab’s Computational Research Division. “But we care most about what’s happening right at the turbines where things are changing quickly. We want to focus our resources near these turbines, without neglecting what’s going on in the larger space.”

To achieve the desired accuracy, the researchers are solving fluid dynamics equations near the turbines using a computational code called Nalu-Wind, a fully unstructured code that gives users the flexibility to more accurately describe the complex geometries near the turbines, Ananthan explained.

But this flexibility comes at a price. Unstructured mesh calculations have to store information not just about the location of all the mesh points but also about which points are connected to which. Structured meshes, meanwhile, are “logically rectangular,” which makes a lot of operations much simpler and faster.
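
The tradeoff is easy to see in a toy sketch (illustrative only, not ExaWind source code): a structured field needs nothing but an array, because neighbor relationships are implied by the indices, while an unstructured mesh must store its connectivity explicitly:

```python
import numpy as np

# Structured ("logically rectangular") mesh: connectivity is implicit.
# Cell (i, j, k) always neighbors (i-1, j, k), (i+1, j, k), and so on,
# so only the array of field values needs to be stored.
velocity = np.zeros((64, 64, 16))  # nx-by-ny-by-nz cells

# Unstructured mesh: point coordinates AND element connectivity must
# be stored explicitly. A toy 2D example with two triangles:
points = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
triangles = np.array([[0, 1, 2], [0, 2, 3]])  # rows index into `points`
```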

“Originally, ExaWind planned to use Nalu-Wind everywhere, but coupling Nalu-Wind with a structured grid code may offer a much faster time-to-solution,” Almgren said.

Enter AMReX

Luckily, Ananthan knew about Berkeley Lab’s AMReX, a C++ software framework that supports block-structured adaptive-mesh algorithms for solving systems of partial differential equations. AMReX supports simulations on a structured mesh hierarchy; at each level the mesh is made up of regular boxes, but the different levels have different spatial resolution.
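
In block-structured AMR, each finer level typically refines the one below it by a fixed ratio, commonly 2, so cell size shrinks geometrically toward the regions of interest. A minimal sketch, assuming a hypothetical 10-meter coarse grid and factor-of-2 refinement:

```python
coarse_dx = 10.0  # meters; hypothetical coarsest cell size
for level in range(4):
    dx = coarse_dx / 2**level  # refinement ratio of 2 between levels
    print(f"level {level}: dx = {dx:.2f} m")
```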

Ananthan explained they actually want the best of both worlds: unstructured mesh near the turbines and structured mesh elsewhere in the domain. The unstructured mesh and structured mesh have to communicate with each other, so the ExaWind team validated an overset mesh approach with an unstructured mesh near the turbines and a background structured mesh. That’s when they reached out to Almgren to collaborate.

“AMReX allows you to zoom in to get fine resolution in the regions you care about but have coarse resolution everywhere else,” Almgren said. The plan is for ExaWind to use an AMReX-based code (AMR-Wind) to resolve the entire domain except right around the turbines, where the researchers will use Nalu-Wind. AMR-Wind will generate finer and finer cells as they get closer to the turbines, basically matching the Nalu-Wind resolution where the codes meet. Nalu-Wind and AMR-Wind will talk to each other using a coupling code called TIOGA.

Even with this strategy, the team needs high performance computing. Ananthan’s initial performance studies were conducted on up to 1,024 Cori Haswell nodes at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC) and 49,152 Mira nodes at the Argonne Leadership Computing Facility.

“For the last three years, we’ve been using NERSC’s Cori heavily, as well as NREL’s Peregrine and Eagle,” said Ananthan. Moving forward, they’ll also be using the Summit system at the Oak Ridge Leadership Computing Facility and, ultimately, the Aurora and Frontier exascale supercomputers — all of which feature different types of GPUs: NVIDIA on Summit (and NERSC’s next-generation Perlmutter system), Intel on Aurora, and AMD on Frontier. 

Although Berkeley Lab just started partnering with the ExaWind team this past fall, the collaboration has already made a lot of progress. “Right now we’re still doing proof-of-concept testing for coupling the AMR-Wind and Nalu-Wind codes, but we expect to have the coupled software running on the full domain by the end of FY20,” said Almgren.

NERSC is a DOE Office of Science user facility.

Top figure: Some of the 4,000+ wind turbines in Northern California’s Altamont Pass wind farm. Credit: David Laporte

This is a reposting of my news feature, courtesy of Berkeley Lab.

Searching for Photocathodes that Convert CO2 into Fuels

Figure
Six-step selection criteria used in the search for photocathodes for CO2 reduction. The search began with 68,860 inorganic compounds. The number of materials that satisfied the requirements of each step are shown in red, with 52 meeting all the requirements.

Carbon dioxide (CO2) has a bad reputation due to its pivotal role in the greenhouse effect at the Earth’s surface. But scientists at the Joint Center for Artificial Photosynthesis (JCAP), a U.S. Department of Energy (DOE) Innovation Hub, view CO2 as a promising raw material for clean, low-cost, renewable energy.

JCAP is a team led by the California Institute of Technology (Caltech) that brings together more than 100 world-class scientists and engineers, primarily from Caltech and its lead partner, Lawrence Berkeley National Laboratory (Berkeley Lab).

The JCAP team is developing new ways to produce transportation fuels from CO2, sunlight, and water using a process called artificial photosynthesis, which harvests solar energy and stores it in chemical bonds. If successful, they’ll be able to produce fuels while also eliminating some CO2 — a “win-win,” according to Arunima Singh, an assistant professor of physics at Arizona State University and a former member of the JCAP team.

Singh became involved in the research as a postdoctoral associate at Berkeley Lab, where she searched for new photocathodes to efficiently convert CO2 to chemical fuels — a major hurdle to realizing scalable artificial photosynthesis.

“There is a dire need to find new materials to enable the photocatalytic conversion of CO2. The existing photocathodes have very low efficiencies and product selectivity, which means the CO2 conversion yields many products that are expensive to distill,” said Singh. “Previous experimental attempts found new photocatalytic materials by trial and error, but we wanted to do a more directed search.”

Searching for Needles in a Materials Project Haystack

Using supercomputing resources at the National Energy Research Scientific Computing Center (NERSC), the Berkeley Lab team performed a massive photocathode search, starting with 68,860 materials and screening them for specific intrinsic properties. Their results were published in the January issue of Nature Communications.

“The candidate materials need to be thermodynamically stable so they can be synthesized in the lab. They need to absorb visible light. And they need to be stable in water under the highly reducing conditions of CO2 reduction,” said first author Singh. “These three key properties were already available through the Materials Project.”

The Materials Project is a DOE-funded database of materials properties calculated based on predictive quantum-mechanical simulations using supercomputing clusters at NERSC, which is a DOE Office of Science User Facility. The database includes both experimentally known materials and hypothetical structures predicted by machine learning algorithms or various other procedures. Of the 68,860 candidate materials screened in the Nature Communications study, about half had already been experimentally synthesized, while the remaining were hypothetical.

The researchers screened these materials in six steps. First they used the Materials Project to identify the materials that were thermodynamically stable, able to absorb visible light, stable in water, and electrochemically stable. This strategy reduced the candidate pool to 235 materials — dramatically narrowing the list for the final two steps, which required computationally intensive calculations.
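
Conceptually, the first four steps form a filter funnel over the database. In this schematic sketch, the property names and thresholds are hypothetical placeholders, not the study’s actual criteria or the Materials Project schema:

```python
# Each candidate is a dictionary of precomputed properties.
candidates = [
    {"id": "mp-0001", "e_above_hull_eV": 0.0, "band_gap_eV": 1.8,
     "stable_in_water": True, "electrochemically_stable": True},
    # ... tens of thousands more entries in the real search
]

def passes_first_four_steps(m):
    return (m["e_above_hull_eV"] < 0.05        # thermodynamically stable
            and 1.0 < m["band_gap_eV"] < 3.0   # absorbs visible light
            and m["stable_in_water"]           # stable in aqueous solution
            and m["electrochemically_stable"])

shortlist = [m for m in candidates if passes_first_four_steps(m)]
# In the study, this stage cut 68,860 materials down to 235; steps 5
# and 6 then required computationally intensive new calculations.
```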

“By leveraging a large amount of data already available in the Materials Project, we were able to cut the computational cost of the project by several millions of CPU hours,” said Kristin Persson, a faculty scientist in Berkeley Lab’s Energy Technologies Area and senior author on the paper.

Additional Screening with First-Principles Calculations

However, the Materials Project database did not have all the necessary data. So the final screening required new first-principles simulations of materials properties based on quantum mechanics to accurately estimate the electronic structures and understand the energy of the excited electrons. These calculations were performed at NERSC and the Texas Advanced Computing Center (TACC) for the remaining 235 candidate materials.

“NERSC is the backbone of the Materials Project computation and database. But we also used about two million NERSC core hours to do the step 5 and 6 calculations,” said Singh. “Without NERSC, we would have been running our simulations on 250 cores for 24 hours a day for a year, versus being able to do these calculations in parallel on NERSC in a matter of a few months.”

The team also used about half a million core hours for these calculations at TACC, which were allocated through the National Science Foundation’s Extreme Science and Engineering Discovery Environment (XSEDE).

These theoretical calculations showed that 52 materials met all of the stringent requirements of the screening process, but that only nine of these had been previously studied for CO2 reduction. Among the 43 newly identified photocathodes, 35 have previously been synthesized and eight are hypothetical materials.

“We performed the largest exploratory search for CO2 reduction photocathodes to date, covering 68,860 materials and identifying 43 new photocathode materials exhibiting promising properties,” Persson said.

Finally, the researchers narrowed the list to 39 promising candidates by examining the vibrational properties of the eight hypothetical materials and ruling out the four predicted to be unstable.

However, more work is needed before artificial photosynthesis becomes a reality, including working with experimental colleagues like Caltech’s John Gregoire (a leader of JCAP’s high-throughput experimentation laboratory) to validate the computational results.

“We have collaborators at Berkeley Lab and Caltech who are actively trying to grow these materials and test them,” Singh said. “I’m excited to see our study opening up new avenues of research.”

This is a reposting of my Computing Sciences news feature, courtesy of Berkeley Lab.

Low-cost “magic box” could decontaminate water in rural communities

Photo by Shawn

More than a billion people drink water that is contaminated and can spread deadly diseases such as cholera, dysentery, hepatitis A, typhoid, polio and diarrhea.

Most contaminated water could be purified by adding hydrogen peroxide, which safely kills many of the disease-causing organisms and oxidizes organic pollutants to make them less harmful. Hydrogen peroxide disinfects water in a similar way as standard water chlorination, but it leaves no harmful residual chemicals. Unfortunately, it’s difficult to make or obtain hydrogen peroxide in rural settings with limited energy sources.

Now, researchers from Stanford University and SLAC National Accelerator Laboratory have developed a portable device that produces hydrogen peroxide from oxygen gas and water — and it can be powered by a battery or conventional solar panels. You can hold the small device in one hand.

“The idea is to develop an electrochemical cell that generates hydrogen peroxide from oxygen and water on site, and then use that hydrogen peroxide in ground water to oxidize organic contaminants that are harmful for humans to ingest,” said Christopher Hahn, PhD, a SLAC associate staff scientist, in a recent news release.

First, the researchers designed and synthesized a catalyst that selectively speeds up the chemical reaction of converting oxygen gas into hydrogen peroxide. For this application, standard platinum-mercury or gold-plated catalysts were too expensive, so they investigated cheaper carbon-based materials.
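
The selectivity matters because oxygen reduction can follow two competing pathways, and a peroxide-producing catalyst must favor the two-electron route over the four-electron route to water. In acidic solution these are conventionally written as:

```latex
\mathrm{O_2 + 2H^+ + 2e^- \rightarrow H_2O_2} \quad \text{(two-electron, desired)}
\mathrm{O_2 + 4H^+ + 4e^- \rightarrow 2H_2O} \quad \text{(four-electron, competing)}
```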

Next, they used their carbon-based material to build a low-cost, simple and robust device that generates and stores hydrogen peroxide at the concentration needed for water purification, which is one-tenth the concentration of the hydrogen peroxide you buy at the drug store for cleaning a cut. Although this device uses materials not available in rural communities, it could be cheaply manufactured and shipped there.

Their results were recently reported in Reaction Chemistry & Engineering. However, more work needs to be done before a higher-capacity device will be available for use.

“Currently it’s just a prototype, but I personally think it will shine in the area of decentralized water purification for the developing world,” said Bill Chen, first author and a chemistry graduate student at Stanford. “It’s like a magic box. I hope it can become a reality.”

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine. For more details, please read my SLAC news release.

Berkeley Lab Tackles Vaccine Delivery Problem with Portable Solar-Powered Vaccine Fridge

LBNL Institute for Globally Transformative Technologies research team with prototype vaccine fridge and backpack for developing countries. (Berkeley Lab / Roy Kaltschmidt)

Vaccines are arguably one of the most important inventions of mankind. Unfortunately, vaccines must be produced and stored in an environment with very tight temperature regulation – between 36 °F and 46 °F – to keep the vaccine bugs alive. So vaccine delivery is a major problem due to the absence of reliable refrigeration in many remote areas of the world.

Approximately 30 million children worldwide – roughly one in five – do not receive immunizations, leaving them at significant risk of disease. As a result, 1.5 million children under the age of five die annually from vaccine-preventable diseases, such as pneumonia and diarrhea. Perhaps more surprisingly, almost half of the vaccines in developing countries are thrown away because they get too warm during delivery and are no longer viable. Some administered vaccines are also ineffective because they froze during transport, but there is no easy way to test this.

Scientists at Lawrence Berkeley National Laboratory (LBNL) are trying to solve this vaccine delivery problem by developing a portable solar-powered fridge. Fabricated entirely at LBNL, their portable solar-powered vaccine fridge will be transported by bicycle or motorcycle in remote areas of the developing world. Zach Friedman and Reshma Singh are leading the project as part of the LBNL Institute for Globally Transformative Technologies, which seeks to bring scientific and technological breakthroughs to address global poverty and related social ills.

The team’s first prototype portable fridge uses a thermoelectric heat pump, rather than a traditional vapor compression heat pump that relies on a circulating liquid refrigerant to absorb and remove heat. The thermoelectric chips were initially developed to keep laptops cool, so laptops could be made thinner without fans. The technology was adapted for this global application to reduce the size and weight of the fridge.

Their portable units have a one- to three-liter capacity, much smaller than standard solar fridges that are typically 50 liters or more. Once the fridge cools down to the right temperature (36 °F – 46 °F), it is designed to run within that temperature range for at least five days without any power, at an ambient outside temperature as hot as 110 °F.
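
Converted to Celsius, that window corresponds to the standard 2-8 °C vaccine cold-chain range:

```latex
T_{\mathrm{C}} = (T_{\mathrm{F}} - 32) \times \tfrac{5}{9}:
\quad 36\ ^{\circ}\mathrm{F} \approx 2.2\ ^{\circ}\mathrm{C},
\quad 46\ ^{\circ}\mathrm{F} \approx 7.8\ ^{\circ}\mathrm{C}
```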

Before the researchers can field test their first prototype fridge in Africa, they need to pass the World Health Organization’s Performance, Quality and Safety testing protocol for products used in immunization programs. They are currently busy performing in-house testing at LBNL to ensure that they pass the formal tests, which will be conducted by an independent laboratory in the UK.

“We aren’t in the process of field testing yet, but we have established field testing agreements in both Kenya and Nigeria and have locations identified,” said Friedman. “We expect to start testing this coming year.”

Meanwhile, they are continuing their portable fridge development. “Currently, we are pursuing both thermoelectric and vapor compression heat pumps, even for these smaller devices,” explained Jonathan Slack, lead engineer. “It is not clear which will win out in terms of manufacturability and affordability.”

They are also developing a backpack version of the vaccine fridge. However, human-carried devices have to meet stricter World Health Organization standards, so they are focusing at this stage on the small portable fridge instead.

Ultimately their goal is to make it easy for health care workers to deliver viable vaccines to children in remote areas, solving the “last mile” of vaccine delivery.

This is a repost of my KQED Science blog.

Solar Research Shines

sunshine
Courtesy of Creative Commons

Everyone loves the idea of solar power — heating and cooling your home using the sun as a clean, free source of power. It sounds like the ultimate way to lower your carbon footprint! However, solar cells are expensive and typically only about 15% efficient, as I discussed in an earlier blog.

In order to make solar power more practical on a wide scale, a lot of research is underway to increase solar power efficiency. Stanford researchers have just reported a significant breakthrough in such solar power research, as described in their new paper in Nature Materials. They have developed a novel solar technology that uses both the light and heat of the sun to generate electricity. This new technology could double solar power efficiency and make it more affordable.

When most people think of solar power, they think of rooftop solar panels. These sorts of solar panels (or arrays of photovoltaic solar cells) use expensive semiconductor materials to convert photons of light into electricity. The photons from sunlight are absorbed by the semiconductor material, transferring their energy to electrons in the semiconductor. The energy given to an electron can “excite” it from the valence band to the conduction band, where it is free to move around within the semiconductor to produce electricity. Solar panels basically convert solar energy into direct current electricity.

However, these types of solar panels aren’t very efficient. If an electron doesn’t absorb enough energy, it can’t make it to the conduction band to produce electricity. On the other hand, if an electron absorbs more energy than needed to reach the conduction band, the excess energy is lost as heat. In silicon solar panels, half of the solar energy that hits the panel is lost to these two processes. Ideally you would like to somehow harvest the energy that is lost as heat, in order to make solar cells more efficient.
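
A worked example makes these loss mechanisms concrete. A photon’s energy is set by its wavelength, and silicon’s band gap is about 1.1 eV, so photons with wavelengths longer than roughly 1,100 nm pass through unabsorbed, while shorter-wavelength photons shed any energy above the gap as heat:

```latex
E_{\mathrm{photon}} = \frac{hc}{\lambda} \approx \frac{1240\ \mathrm{eV\,nm}}{\lambda},
\qquad
\lambda_{\mathrm{max}} \approx \frac{1240\ \mathrm{eV\,nm}}{1.1\ \mathrm{eV}} \approx 1100\ \mathrm{nm}
```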

Solar power can also be generated by a thermionic energy converter, which directly converts heat into electricity. A thermionic converter produces electricity by causing a heat-induced flow of electrons from a hot cathode across a vacuum gap to a cooler anode. However, only a small fraction of the electrons gain sufficient thermal energy to generate this kind of electricity, and very high temperatures are needed for efficient thermionic conversion.
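
The need for very high temperatures follows from the Richardson-Dushman law, which gives the emitted current density as a steep exponential function of the cathode temperature T and its work function W (A is the Richardson constant and k_B is Boltzmann’s constant):

```latex
J = A\,T^{2}\,e^{-W / k_{\mathrm{B}} T}
```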

The Stanford researchers have recently developed a new process that exploits the benefits of both photovoltaic and thermionic conversion. The research was led by Nicholas Melosh, as a joint venture of Stanford and SLAC National Accelerator Laboratory. Melosh’s group coated a piece of semiconducting material with a thin layer of the metal cesium, demonstrating that this allowed the material to use both light and heat to generate electricity. This new PETE (photon-enhanced thermionic emission) device used the same basic architecture as a thermionic converter, except with this special semiconductor as the cathode.

Although the physical process of this PETE device is different from the standard solar cell mechanisms, the new device gives a similar response at very high temperatures. In fact, the PETE device is most efficient at over 200 °C. This means that PETE devices won’t replace rooftop solar panels, since they require higher temperatures to be efficient. Instead, they could be used in combination with solar concentrators as part of a large-scale solar power plant, for instance in the Mojave Desert.

Melosh’s initial “proof of concept” research was performed with the semiconductor gallium nitride to demonstrate that the new energy conversion process works, but gallium nitride isn’t suitable for solar applications. They plan to extend their research to other semiconductors, such as gallium arsenide, which is commonly used in household electronics. Based on theoretical calculations, they expect to develop PETE devices that operate with 50 percent efficiency at temperatures exceeding 200 °C. They hope to design the new PETE devices so they can be easily incorporated into existing solar power plants, significantly increasing the efficiency of solar power to make it competitive with oil.

Solar-Powered Drip Irrigation May Save Lives in Africa

Americans spend on average 12.4% of their paycheck on food, according to the U.S. Department of Labor’s latest survey. In contrast, sub-Saharan African communities spend 50-80% of their income on food, even though they are engaged in agricultural production as their main livelihood. These communities rely on rain-fed agriculture for crop production, despite having a short annual rainy season of only 3-6 months. Traditionally, women and girls are responsible for hauling water by hand over very long distances in order to grow some crops, particularly during the long dry season.

Only 4% of cropland is irrigated in sub-Saharan Africa. Clearly irrigation could help improve quality of life for these food-insecure communities, if a water source is available. The most efficient type of irrigation for such a dry climate is drip (micro) irrigation, which delivers water and fertilizer directly to the roots of a plant. Low-pressure drip irrigation systems require only 1 m of pressure head to irrigate plots of up to 1,000 square meters (0.25 acres). However, this irrigation technology requires access to a reliable water source.

One solution is a photovoltaic-powered drip irrigation system that combines the efficiency of drip irrigation with the reliability of a solar-powered water pump. In such a system, a photovoltaic solar array powers a surface or submersible pump (depending on the water source) that feeds water into a reservoir. The reservoir then gravity-distributes the water to the low-pressure drip irrigation system. Energy is stored via the height of the column of water in the reservoir. These systems can be configured so that no batteries are required. The pump only runs during the daytime, and the system passively self-regulates: the volume of water pumped increases on clear, hot days, when plants need the most water.
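
That 1 m requirement is modest: the pressure at the base of a water column is set by its height, so a reservoir raised one meter supplies about a tenth of an atmosphere:

```latex
P = \rho g h \approx 1000\ \mathrm{kg/m^3} \times 9.8\ \mathrm{m/s^2} \times 1\ \mathrm{m}
\approx 9.8\ \mathrm{kPa} \approx 0.1\ \mathrm{atm}
```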

This kind of solar-powered drip irrigation system was tested in two rural villages in Northern Benin. The systems were installed and financed by a non-governmental organization, the Solar Electric Light Fund, with the goal of boosting vegetable production from communal gardens in order to combat high malnutrition and poverty levels. The research was performed in collaboration with Stanford University. This NGO-academic research team scientifically evaluated the impact of the irrigation system on the community through a rigorously analyzed randomized controlled study. The results were recently published in the Proceedings of the National Academy of Sciences.

Three solar-powered drip irrigation systems were installed in the two villages. Each irrigation system was used collectively by an agricultural group of 30-35 women, each of whom farmed her own 120-square-meter plot plus some additional shared plots used for group expenses. Researchers monitored these communities, as well as two “control” villages in which women’s agricultural groups grew vegetables by hand watering. This allowed a comparison between the solar-powered drip irrigation system and the traditional watering method.

Each of the solar-powered irrigation systems supplied on average 1.9 tonnes of produce per month — including high-value crops such as tomatoes, okra, peppers, eggplants, carrots, and greens — without displacing other agricultural production. The women farmers kept on average 18% by weight of the vegetables and sold the rest at local markets. As a result, vegetable intake across all villages increased by about one serving (150 g raw weight) per day during the rainy season. For the villages with irrigation systems, vegetable intake rose to 3-5 servings per day even during the dry season. Overall, the users of the irrigation systems showed remarkable benefits even in the first year, compared with the control households. The article states, “Their standard of living increased relative to the non-beneficiaries (by 80% of the baseline), their consumption of vegetables increased to the Recommended Daily Allowance, and the income generated by production of market vegetables enabled them to purchase staples and protein during the dry season.”

Hardly anyone is going to argue against the potential benefit of irrigation in Africa. However, one question remains — is the expense of a solar-powered system really necessary? The Stanford researchers would argue that it is, despite the high up-front costs. They compared their irrigation system with a hypothetical alternative that used a liquid-fuel (gasoline, kerosene, or diesel) engine-driven pump instead of the photovoltaic array and pump. This alternative can have significant problems, because fuel supplies can be unreliable and fuel prices volatile. According to their analysis, the solar-powered irrigation system is actually more cost effective in the long run, particularly when fuel prices are high. It is also better for the environment, since it doesn’t cause carbon emissions.

The solar-powered drip irrigation system in the Benin project cost approximately $18,000 to install ($475 per 120-square-meter plot) and requires about $5,750 ($143 per plot) per year to maintain. Based on the projected earnings of the farmers, the system should pay for itself in about 2.3 years. In addition, the cost of the photovoltaic arrays is expected to drop for larger-scale projects.

The project in Benin isn’t the only one underway. Solar-powered drip irrigation systems are also being installed by other groups in different areas of the world. For instance, the Sustainable Agriculture Water Management Project has installed solar-powered drip irrigation systems for 5,000 farmers in Sri Lanka’s dry zones. The hope is that these international efforts can provide substantial economic, nutritional, and environmental benefits to food-insecure, impoverished communities.

Making Diesel at Solar Plants

Normally biofuels and solar power are considered to be competing alternative energy sources. However, some researchers are merging these technologies, trying to use the best of both to create “solar fuels.” This includes the researchers at Joule Unlimited, a small start-up company from Cambridge, Massachusetts, whose work was recently listed among the world’s ten most important emerging technologies in MIT Technology Review’s 2010 TR10. The company was also selected for the TR50 in February, the only company besides Google chosen for both honors.

Joule Unlimited has manipulated and designed genes to create photosynthetic microorganisms. These microorganisms use energy from the sun to convert carbon dioxide and water directly into ethanol or hydrocarbon fuels (such as diesel). The photosynthetic microorganisms are designed with a genetic switch that limits growth. They are allowed to multiply for a couple days, then the genetic switch is flipped to divert their energy into fuel production. The microorganisms excrete the fuel, which is chemically separated and collected using conventional technologies.

The goal of this direct, continuous process is to achieve high fuel production with minimal land use. The microorganisms are grown in water inside transparent bioreactors, where they are circulated to make sure that all the microorganisms are exposed to sunlight. Different kinds of non-potable water can be used in this process, including brackish water, waste water or seawater. The microorganisms are fed concentrated carbon dioxide and other nutrients. The long term hope is to use carbon dioxide from polluting facilities such as coal plants.

Joule Unlimited claims to have specifically designed both their microorganisms and bioreactors to work in harmony together, in order to maximize fuel production. For instance, the company carefully designed the bioreactor to keep the heat within the limits required by their microorganism. In the long term, the company is hoping to produce 25,000 gallons per acre per year of ethanol and 15,000 gallons per acre per year of diesel at the competitive price of $30 per barrel. They are planning to scale up from demonstration facilities to building a commercial facility in 2012, in order to start producing diesel in 2013. However, their engineers still need to improve the performance of the microorganism to meet these targets, as well as address whatever issues arise during scale-up.

Joule Unlimited isn’t the only one working in this research area. Others working on solar fuels include:  (1) Synthetic Genomics in La Jolla, CA, (2) BioCee in Minneapolis, MN, and (3) University of Minnesota BioTechnology Institute, St. Paul, MN. Hopefully the race is on, and the winner will be all of us.

Joule facility
A diagram of how a Joule facility would work, with bioreactors growing microorganisms using sunlight and CO2 in water. A separator removes the end product: liquid fuel or chemicals. (Courtesy of Joule Unlimited)