Superconductivity and charge density waves caught intertwining at the nanoscale

The team aimed infrared laser pulses at the YBCO sample to switch off its superconducting state, then used X-ray laser pulses to illuminate the sample and examined the X-ray light scattered from it. Their results revealed that regions of superconductivity and charge density waves were arranged in unexpected ways. (Courtesy Giacomo Coslovich/SLAC National Accelerator Laboratory)

Room-temperature superconductors could transform everything from electrical grids to particle accelerators to computers – but before they can be realized, researchers need to better understand how existing high-temperature superconductors work.

Now, researchers from the Department of Energy’s SLAC National Accelerator Laboratory, the University of British Columbia, Yale University and others have taken a step in that direction by studying the fast dynamics of a material called yttrium barium copper oxide, or YBCO.

The team reports May 20 in Science that YBCO’s superconductivity is intertwined in unexpected ways with another phenomenon known as charge density waves (CDWs), or ripples in the density of electrons in the material. As the researchers expected, the CDWs got stronger when they turned off YBCO’s superconductivity. However, they were surprised to find the CDWs also suddenly became more spatially organized, suggesting superconductivity somehow fundamentally shapes the form of the CDWs at the nanoscale.

“A big part of what we don’t know is the relationship between charge density waves and superconductivity,” said Giacomo Coslovich, a staff scientist at the Department of Energy’s SLAC National Accelerator Laboratory, who led the study. “As one of the cleanest high-temperature superconductors that can be grown, YBCO offers us the opportunity to understand this physics in a very direct way, minimizing the effects of disorder.”

He added, “If we can better understand these materials, we can make new superconductors that work at higher temperatures, enabling many more applications and potentially addressing a lot of societal challenges – from climate change to energy efficiency to availability of fresh water.”

Observing fast dynamics

The researchers studied YBCO’s dynamics at SLAC’s Linac Coherent Light Source (LCLS) X-ray laser. They switched off superconductivity in the YBCO samples with infrared laser pulses, and then bounced X-ray pulses off those samples. Each shot of X-rays produced a kind of snapshot of the CDWs’ electron ripples, and by stitching those snapshots together the team recreated the CDWs’ rapid evolution.

“We did these experiments at the LCLS because we needed ultrashort pulses of X-rays, which can be made at very few places in the world. And we also needed soft X-rays, which have longer wavelengths than typical X-rays, to directly detect the CDWs,” said staff scientist and study co-author Joshua Turner, who is also a researcher at the Stanford Institute for Materials and Energy Sciences. “Plus, the people at LCLS are really great to work with.”

These LCLS runs generated terabytes of data, a challenge for processing. “Using many hours of supercomputing time, LCLS beamline scientists binned our huge amounts of data into a more manageable form so our algorithms could extract the feature characteristics,” said MengXing (Ketty) Na, a University of British Columbia graduate student and co-author on the project.
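
To give a sense of what that kind of data reduction involves, here is a minimal Python sketch of binning per-shot scattering intensities by pump-probe delay, so that downstream algorithms work with compact averages instead of terabytes of raw shots. The variable names, delay range and bin width are illustrative assumptions, not the actual LCLS analysis code.

```python
import numpy as np

# Illustrative stand-ins (assumptions, not the real LCLS data layout):
# delays_ps   - pump-probe delay for each X-ray shot, in picoseconds
# intensities - integrated CDW diffraction-peak intensity for each shot
rng = np.random.default_rng(0)
delays_ps = rng.uniform(-1.0, 5.0, size=100_000)
intensities = 1.0 + 0.5 * (delays_ps > 0) + 0.1 * rng.standard_normal(100_000)

# Bin shots into 0.25 ps wide delay bins and average within each bin,
# reducing a huge pile of shots to a short, manageable time trace.
bin_edges = np.arange(-1.0, 5.25, 0.25)
sums, _ = np.histogram(delays_ps, bins=bin_edges, weights=intensities)
counts, _ = np.histogram(delays_ps, bins=bin_edges)
mean_intensity = sums / np.maximum(counts, 1)  # avoid division by zero

for edge, value in zip(bin_edges[:-1], mean_intensity):
    print(f"delay {edge:+.2f} ps: mean peak intensity {value:.3f}")
```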

The team found that charge density waves within the YBCO samples became more correlated – that is, more electron ripples were periodic or spatially synchronized – after lasers switched off the superconductivity.

“Doubling the number of waves that are correlated with just a flash of light is quite remarkable, because light typically would produce the opposite effect. We can use light to completely disorder the charge density waves if we push too hard,” Coslovich said.

Blue areas are superconducting regions, and yellow areas represent charge density waves. After a laser pulse (red), the superconducting regions are rapidly turned off and the charge density waves react by rearranging their pattern, becoming more orderly and coherent. (Greg Stewart/SLAC National Accelerator Laboratory)

To explain these experimental observations, the researchers then modeled how regions of CDWs and superconductivity ought to interact given a variety of underlying assumptions about how YBCO works. For example, their initial model assumed that a uniform region of superconductivity, when shut off with light, would become a uniform CDW region – but of course that didn’t agree with their results.

“The model that best fits our data so far indicates that superconductivity is acting like a defect within a pattern of the waves. This suggests that superconductivity and charge density waves like to be arranged in a very specific, nanoscopic way,” explained Coslovich. “They are intertwined orders at the length scale of the waves themselves.”

Illuminating the future

Coslovich said that being able to turn superconductivity off with light pulses was a significant advance, enabling observations on the time scale of less than a trillionth of a second, with major advantages over previous approaches.

“When you use other methods, like applying a high magnetic field, you have to wait a long time before making measurements, so CDWs rearrange around disorder and other phenomena can take place in the sample,” he said. “Using light allowed us to show this is an intrinsic effect, a real connection between superconductivity and charge density waves.”

The research team is excited to expand on this pivotal work, Turner said. First, they want to study how the CDWs become more organized when the superconductivity is shut off with light. They are also planning to tune the laser’s wavelength or polarization in future LCLS experiments in hopes of also using light to enhance, instead of quench, the superconducting state, so they could readily turn the superconducting state off and on.

“There is an overall interest in trying to do this with pulses of light on very fast timescales, because that can potentially lead to the development of superconducting, light-controlled devices for the new generation of electronics and computing,” said Coslovich. “Ultimately, this work can also help guide people who are trying to build room-temperature superconductors.”

This research is part of a collaboration between researchers from LCLS, SLAC’s Stanford Synchrotron Radiation Lightsource (SSRL), UBC, Yale University, the Institut National de la Recherche Scientifique in Canada, North Carolina State University, Università Cattolica di Brescia and other institutions. This work was funded in part by the DOE Office of Science. LCLS and SSRL are DOE Office of Science user facilities.

Citation: Scott Wandel et al., Science, 20 May 2022 (10.1126/science.abd7213)

This is a reposting of my news feature, courtesy of SLAC National Accelerator Laboratory.

Revitalizing batteries by bringing ‘dead’ lithium back to life


An animation shows how charging and discharging a lithium battery test cell causes an island of “dead,” or detached, lithium metal to creep back and forth between the electrodes. The movement of lithium ions back and forth through the electrolyte creates areas of negative (blue) and positive (red) charge at the ends of the island, which swap places as the battery charges and discharges. Lithium metal accumulates at the negative end of the island and dissolves at the positive end; this continual growth and dissolution causes the back-and-forth movement seen here. SLAC and Stanford researchers discovered that adding a brief, high-current discharging step right after charging the battery nudges the island to grow in the direction of the anode, or negative electrode. Reconnecting with the anode brings the island’s dead lithium back to life and increases the battery’s lifetime by nearly 30%. (Greg Stewart/SLAC National Accelerator Laboratory.)

Menlo Park, Calif. — Researchers at the Department of Energy’s SLAC National Accelerator Laboratory and Stanford University may have found a way to revitalize rechargeable lithium batteries, potentially boosting the range of electric vehicles and battery life in next-gen electronic devices.

As lithium batteries cycle, they accumulate little islands of inactive lithium that are cut off from the electrodes, decreasing the battery’s capacity to store charge. But the research team discovered that they could make this “dead” lithium creep like a worm toward one of the electrodes until it reconnects, partially reversing the unwanted process.

Adding this extra step slowed the degradation of their test battery and increased its lifetime by nearly 30%.

“We are now exploring the potential recovery of lost capacity in lithium-ion batteries using an extremely fast discharging step,” said Stanford postdoctoral fellow Fang Liu, the lead author of a study published Dec. 22 in Nature.

Lost connection

A great deal of research is looking for ways to make rechargeable batteries with lighter weight, longer lifetimes, improved safety, and faster charging speeds than the lithium-ion technology currently used in cellphones, laptops and electric vehicles. A particular focus is on developing lithium-metal batteries, which could store more energy per volume or weight. For example, in electric cars, these next-generation batteries could increase the mileage per charge and possibly take up less trunk space.

Both battery types use positively charged lithium ions that shuttle back and forth between the electrodes. Over time, some of the metallic lithium becomes electrochemically inactive, forming isolated islands of lithium that no longer connect with the electrodes. This results in a loss of capacity and is a particular problem for lithium-metal technology and for the fast charging of lithium-ion batteries.

However, in the new study, the researchers demonstrated that they could mobilize and recover the isolated lithium to extend battery life.

“I always thought of isolated lithium as bad, since it causes batteries to decay and even catch on fire,” said Yi Cui, a professor at Stanford and SLAC and investigator with the Stanford Institute for Materials and Energy Sciences (SIMES) who led the research. “But we have discovered how to electrically reconnect this ‘dead’ lithium with the negative electrode to reactivate it.”

Creeping, not dead

The idea for the study was born when Cui speculated that applying a voltage to a battery’s cathode and anode could make an isolated island of lithium physically move between the electrodes – a process his team has now confirmed with their experiments.

The scientists fabricated an optical cell with a lithium-nickel-manganese-cobalt-oxide (NMC) cathode, a lithium anode and an isolated lithium island in between. This test device allowed them to track in real time what happens inside a battery when in use.

They discovered that the isolated lithium island wasn’t “dead” at all but responded to battery operations. When charging the cell, the island slowly moved towards the cathode; when discharging, it crept in the opposite direction.

“It’s like a very slow worm that inches its head forward and pulls its tail in to move nanometer by nanometer,” Cui said. “In this case, it transports by dissolving away on one end and depositing material to the other end. If we can keep the lithium worm moving, it will eventually touch the anode and reestablish the electrical connection.”

Boosting lifetime

The results, which the scientists validated with other test batteries and through computer simulations, also demonstrate how isolated lithium could be recovered in a real battery by modifying the charging protocol.

“We found that we can move the detached lithium toward the anode during discharging, and these motions are faster under higher currents,” said Liu. “So we added a fast, high-current discharging step right after the battery charges, which moved the isolated lithium far enough to reconnect it with the anode. This reactivates the lithium so it can participate in the life of the battery.”
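
To make the idea concrete, here is a minimal Python sketch of what such a modified cycling protocol could look like, with a short high-current discharge step inserted right after charging. All rates, voltages and durations are placeholder values for illustration, not the protocol used in the study.

```python
# Minimal sketch of a cycling protocol with a brief high-current discharge
# pulse inserted right after charging. All rates, voltages and durations are
# illustrative placeholders, not the values used in the published study.

standard_cycle = [
    {"step": "charge",    "rate_C": 0.5, "cutoff": "4.3 V"},
    {"step": "discharge", "rate_C": 0.5, "cutoff": "2.8 V"},
]

modified_cycle = [
    {"step": "charge",         "rate_C": 0.5, "cutoff": "4.3 V"},
    # Extra step: short, high-current discharge pulse that nudges isolated
    # lithium islands toward the anode so they can reconnect.
    {"step": "fast_discharge", "rate_C": 5.0, "duration_s": 30},
    {"step": "discharge",      "rate_C": 0.5, "cutoff": "2.8 V"},
]

def describe(protocol):
    """Print a human-readable summary of each step in a cycling protocol."""
    for step in protocol:
        details = ", ".join(f"{k}={v}" for k, v in step.items() if k != "step")
        print(f"{step['step']:>15}: {details}")

describe(standard_cycle)
print("---")
describe(modified_cycle)
```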

She added, “Our findings also have wide implications for the design and development of more robust lithium-metal batteries.”

This work was funded by the DOE Office of Energy Efficiency and Renewable Energy, Office of Vehicle Technologies under the Battery Materials Research (BMR), Battery 500 Consortium and eXtreme Fast Charge Cell Evaluation of Li-ion batteries (XCEL) programs.

This is a reposting of a press release, courtesy of SLAC National Accelerator Laboratory.

Stanford graduate student Aisulu Aitbekova wins 2021 Melvin P. Klein Award

Aisulu Aitbekova

Aisulu Aitbekova, a 2021 doctoral graduate from Stanford University, discovered her passion for research when she traveled from Kazakhstan to the U.S. for a summer internship as a chemical engineering undergraduate. She said that experience inspired her to go to graduate school.

After earning a master’s in chemical engineering at the Massachusetts Institute of Technology, she continued her studies at Stanford University under the supervision of Matteo Cargnello, an assistant professor of chemical engineering and Aitbekova’s doctoral advisor. Much of her thesis work involved beamline studies at the Stanford Synchrotron Radiation Lightsource (SSRL) at the Department of Energy’s SLAC National Accelerator Laboratory.  

Now, Aitbekova has been selected to receive the 2021 Melvin P. Klein Scientific Development Award, which recognizes outstanding research accomplishments by undergraduates, graduate students and postdoctoral fellows within three years of completing their doctoral degrees.

In a nomination letter for the award, SLAC Distinguished Staff Scientist Simon Bare praised Aitbekova’s initiative. “She has quickly become proficient in the application of X-ray techniques available at the synchrotron at SLAC. This proficiency and mastery include everything from operating the beamline to analyzing and interpreting the data,” he wrote.

Aitbekova said she felt “absolutely thrilled and grateful” to all of her mentors when she found out about winning the award.

“I’m so thankful for my PhD advisor Matteo Cargnello. My success would not have been possible without his mentorship,” Aitbekova said. “I’m also particularly grateful to Simon Bare, who I consider to be my second advisor. His continuous excitement about X-ray absorption spectroscopy has been the driving force for my work at SSRL.” 

Catalyzing change

Aitbekova said she is passionate about finding solutions to combat climate change. She designs materials to convert harmful pollutant gases into useful fuels and chemicals. To perform these chemical transformations, she develops catalysts and studies their properties using X-ray absorption spectroscopy (XAS). Catalysts are substances that increase rates of chemical reactions without being consumed themselves.

“I have identified that a catalyst’s size, shape and composition profoundly affect its performance in eliminating these gases,” she said, but exactly how those properties affect performance remains unknown. “This problem is further complicated by the dynamic nature of catalytic materials. As a catalyst performs chemical transformations, its structure changes, making it challenging to precisely map a catalyst’s properties to its performance.”

To overcome these barriers, she engineers materials roughly one ten-thousandth the diameter of a human hair and then tracks how they change during reactions using XAS.

In one study, Aitbekova and her colleagues engineered a catalyst using a combination of ruthenium and iron oxide nanoparticles, which they think act in a tag-team fashion to improve the synthesis of fuels from carbon dioxide and hydrogen. Using a prototype in the lab, they achieved much higher yields of ethane, propane and butane than previous catalysts.

Switching gears

While engineering catalysts that convert carbon dioxide into chemicals, she developed a new approach for preparing materials, where small particles are encapsulated inside porous oxide materials – for example, encapsulating ruthenium within a sheath of iron.

However, Aitbekova recognized a completely different application for this new approach: creating a palladium-platinum catalyst that works inside a car’s emission control system.

To eliminate the discharge of noxious emission gases, cars are equipped with a catalytic converter. Exhaust gases pass into the catalytic converter, where they are turned into less harmful gases. The catalysts inside these units are platinum and palladium metals, but these metals gradually lose their efficiency due to their extreme working conditions, she said.

“My platinum and palladium catalysts show excellent stability and performance after being subjected to air and steam at 1,100 degrees Celsius, the harshest operating environment automotive exhaust emission control catalysts could be subjected to,” explained Aitbekova. “Further improvements in these materials and successful testing under true exhaust conditions have a potential to revolutionize the field of automotive exhaust emission control.”

Her nominators agreed, citing it as the highlight of her graduate career.

“This work, currently under review for publication, is truly the remarkable result of Aisulu’s hard work and experience in pivoting from one area to another to make an impact and of her ability to connect multiple fields and solve important problems,” Cargnello wrote.

Amplifying impact

Despite this success, Aitbekova is already focused on how to make an even greater impact through mentoring and future research.

Her nominators all applauded her passion and commitment to mentor the next generation of STEM scholars, as demonstrated by mentoring “a countless number of undergraduates” according to Cargnello and by exchanging letters with middle school students from underrepresented groups.

Part of this passion, Cargnello and others wrote, stems from her experiences growing up in a highly conservative environment with the understanding that homemaking would be her eventual job. Aitbekova’s nominators wrote that they admired the fact that she made her way to Stanford and has acted as an ambassador for the values and principles of diversity and inclusion.

Aitbekova said she embraces the role. “Since my first summer research experience in the USA, I’ve wanted to serve as a bridge to science and graduate school to those who, like me, didn’t have access to such knowledge and resources.”

She will continue to act as a bridge in her next endeavor as a Kavli Nanoscience Institute Prize Postdoctoral Fellow at Caltech, where she plans to expand her work of converting carbon dioxide into fuels by running the chemical transformations with solar energy. That will “bring society one step closer to sustainable energy sources,” she said.

Bare and others praised her drive to make an everyday impact. “She has a natural passion for wanting to understand the physical principles behind the phenomena that she has observed in her research. But this passion for understanding is nicely balanced by her desire to discover something new, and to make a real difference — the practicality that is often missing in someone early in their career,” wrote Bare.

The award will be presented to Aitbekova at the 2021 SSRL/LCLS Annual Users’ Meeting during the plenary session on September 24. 

This is a reposting of my news story, courtesy of SLAC National Accelerator Laboratory.

Scientists uncover surprising behavior of a fatty acid enzyme with potential biofuel applications

Derived from microscopic algae, the rare, light-driven enzyme converts fatty acids into starting ingredients for solvents and fuels.

A study using SLAC’s LCLS X-ray laser captured how light drives a series of complex structural changes in an enzyme called FAP, which catalyzes the transformation of fatty acids into starting ingredients for solvents and fuels. This drawing captures the starting state of the catalytic reaction. The dark green background represents the protein’s molecular structure. The enzyme’s light-sensing part, called the FAD cofactor, is shown at center right with its three rings absorbing a photon coming from bottom left. A fatty acid at upper left awaits transformation. The amino acid shown at middle left plays an important role in the catalytic cycle, and the red dot near the center is a water molecule. (Damien Sorigué/Université Aix-Marseille)

By Jennifer Huber

Although many organisms capture and respond to sunlight, it’s rare to find enzymes – proteins that promote chemical reactions in living things – that are driven by light. Scientists have identified only three so far. The newest one, discovered in 2017, is called fatty acid photodecarboxylase (FAP). Derived from microscopic algae, FAP uses blue light to convert fatty acids into hydrocarbons that are similar to those found in crude oil.

“A growing number of researchers envision using FAPs for green chemistry applications because they can efficiently produce important components of solvents and fuels, including gasoline and jet fuels,” says Martin Weik, the leader of a research group at the Institut de Biologie Structurale at the Université Grenoble Alpes.

Weik is one of the primary investigators in a new study that has captured the complex sequence of structural changes, or photocycle, that FAP undergoes in response to light, which drives this fatty acid transformation. Researchers had proposed a possible FAP photocycle, but the fundamental mechanism was not understood, partly because the process is so fast that it’s very difficult to measure. Specifically, scientists didn’t know how long it took FAP to split a fatty acid and release a hydrocarbon molecule.

Experiments at the Linac Coherent Light Source (LCLS) at the Department of Energy’s SLAC National Accelerator Laboratory helped answer many of these outstanding questions. The researchers describe their results in Science.

All the tools in a toolbox

To understand a light-sensitive enzyme like FAP, scientists use many different techniques to study processes that take place over a broad range of time scales. For instance, photon absorption happens in femtoseconds, or millionths of a billionth of a second, while biological responses on the molecular level often happen in thousandths of a second.

“Our international, interdisciplinary consortium, led by Frédéric Beisson at the Université Aix-Marseille, used a wealth of techniques, including spectroscopy, crystallography and computational approaches,” Weik says. “It’s the sum of these different results that enabled us to get a first glimpse of how this unique enzyme works as a function of time and in space.”

The consortium first studied the complex steps of the catalytic process at their home labs using optical spectroscopy methods, which investigate the electronic and geometric structure of atoms in the samples, including chemical bonding and charge. Spectroscopic experiments identified the intermediate states of the enzyme that accompanied each step, measured their lifetimes and provided information on their chemical nature. These results revealed the need for the ultrafast capabilities of the LCLS X-ray free-electron laser (XFEL), which can track the molecular motion with atomic precision.

A structural view of changes in the FAP molecule during the catalytic process was provided by serial femtosecond crystallography (SFX) at the LCLS. During these experiments, a jet of tiny FAP microcrystals was hit with optical laser pulses to kick off the catalytic reaction. This ensured that all the molecules react at a similar time, synchronizing their behavior and making it possible to track the process in detail. Extremely brief, ultrabright X-ray pulses then measured the resulting changes in the enzyme’s structure.

By integrating thousands of these measurements – acquired using various time delays between the optical and X-ray pulses – the researchers were able to follow structural changes in the enzyme. They also determined the structure of the enzyme’s resting state by probing without the optical laser.
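
As a rough illustration of how such a pump-probe time series is assembled, the Python sketch below averages the “laser-on” shots at each time delay and compares them with the “laser-off” resting-state reference. The array shapes and variable names are assumptions for illustration, not the actual analysis pipeline.

```python
import numpy as np

# Illustrative sketch of assembling a pump-probe time series from serial
# femtosecond crystallography shots. Shapes and variable names are assumptions.
rng = np.random.default_rng(1)
n_shots, n_pixels = 5_000, 256
delays_fs = rng.choice([0, 100, 200, 500, 1000], size=n_shots)  # pump-probe delay per shot
laser_on = rng.random(n_shots) < 0.5                            # half the shots are "dark"
patterns = rng.standard_normal((n_shots, n_pixels))             # stand-in diffraction data

# Resting-state reference: average of all shots taken without the optical laser.
dark_reference = patterns[~laser_on].mean(axis=0)

# For each delay, average the laser-on shots and subtract the dark reference
# to isolate the light-induced structural change.
for delay in sorted(set(delays_fs)):
    mask = laser_on & (delays_fs == delay)
    difference = patterns[mask].mean(axis=0) - dark_reference
    print(f"{delay:5d} fs: mean |difference| = {np.abs(difference).mean():.4f}")
```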

Surprisingly, the researchers found that in the resting state, the light-sensing part of the enzyme has a bent shape. “This small molecule, called the FAD cofactor, is a derivative of vitamin B2 that acts like an antenna to capture photons,” Weik says. “It absorbs blue light and initiates the catalytic process. We thought the starting point of the FAD cofactor was planar, so this bent configuration was unexpected.”

The bent shape of the FAD cofactor was first discovered by X-ray crystallography at the European Synchrotron Radiation Facility, but the scientists had suspected this bend was an artifact of radiation damage, a common problem for crystallographic data collected at synchrotron light sources.

“Only SFX experiments could confirm this unusual configuration because of their unique ability to capture structural information before damaging the sample,” Weik says. “These experiments were complemented by computations. Without the high-level quantum calculations performed by Tatiana Domratcheva of Moscow State University, we wouldn’t have understood our experimental results.”

Next steps

Even with this improved understanding of FAP’s photocycle, unanswered questions remain. For example, researchers know carbon dioxide is formed during a certain step of the catalytic process at a specific time and location, but they don’t know if it is transformed into another molecule before leaving the enzyme.

“In future XFEL work, we want to identify the nature of the products and to take pictures of the process with a much smaller step size so as to resolve the process in much finer detail,” says Weik. “This is important for fundamental research, but it can also help scientists modify the enzyme to do a task for a specific application.”

Such precision experiments will be fully enabled by upcoming upgrades to the LCLS facility that will increase its pulse repetition rate from 120 pulses per second to 1 million pulses per second, transforming scientists’ ability to track complex processes like this.

Other researchers are already working towards industrial FAP applications, including a group that is designing an economic way to produce gases such as propane and butane.

The interdisciplinary consortium included researchers from the Institute of Structural Biology in Grenoble, Max Planck Institute for Medical Research in Heidelberg, Université Aix-Marseille, Ecole Polytechnique in Paris-Palaiseau, the Integrative Biology of the Cell Institute in Paris-Saclay, Moscow State University, the ESRF and SOLEIL synchrotrons in Grenoble and Paris-Saclay, and the team at SLAC National Accelerator Laboratory.

LCLS is a DOE Office of Science user facility. Major funding for this work came from the French National Research Agency (ANR).

Citation: D. Sorigué et al., Science, 9 April 2021 (https://doi.org/10.1126/science.abd5687)

For questions or comments, contact the SLAC Office of Communications at communications@slac.stanford.edu.

This is a reposting of my news release, courtesy of SLAC National Accelerator Laboratory.

Harnessing AMReX for Wind Turbine Simulations

ECP ExaWind Project Taps Berkeley Lab’s AMReX to Help Model Next-Generation Wind Farms

Driving along Highway 580 over the Altamont Pass in Northern California, you can’t help but marvel at the 4,000+ wind turbines slowly spinning on the summer-golden hillsides. Home to one of the earliest wind farms in the United States, Altamont Pass today remains one of the largest concentrations of wind turbines in the world. It is also a symbol of the future of clean energy.

Before utility grids can achieve wide-scale deployment of wind energy, however, they need more efficient wind plants. This requires advancing our fundamental understanding of the flow physics governing wind-plant performance.

ExaWind, a U.S. Department of Energy (DOE) Exascale Computing Project, is tackling this challenge by developing new simulation capabilities to more accurately predict the complex flow physics of wind farms. The project entails a collaboration between the National Renewable Energy Laboratory (NREL), Sandia National Laboratories, Oak Ridge National Laboratory, the University of Texas at Austin, Parallel Geometric Algorithms, and — as of a few months ago — Lawrence Berkeley National Laboratory (Berkeley Lab).

“Our ExaWind challenge problem is to simulate the air flow of nine wind turbines arranged as a three-by-three array inside a space five kilometers by five kilometers on the ground and a kilometer high,” said Shreyas Ananthan, a research software engineer at NREL and lead technical expert on the project. “And we need to run about a hundred seconds of real-time simulation.” 

By developing this virtual test bed, the researchers hope to revolutionize the design, operational control, and siting of wind plants, plus facilitate reliable grid integration. And this requires a combination of advanced supercomputers and unique simulation codes.

Unstructured + Structured Calculations

The principle behind a wind turbine is simple: energy in the wind turns the turbine blades, which causes an internal gearbox to rotate and spin a generator that produces electricity. But simulating this is complicated. The flexible turbine blades rotate, bend, and twist as the wind shifts direction and speed. The yaw and pitch of these blades are controlled in real time to extract as much energy as possible from a wind event. The air flow also entails complex dynamics  — such as influences from the ground terrain, formation of a turbulent wakefield downstream from the blades, and turbine-turbine interactions.
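
For a feel of the energy scales involved, the power available to a single turbine follows the textbook relation P = ½ρAv³Cp: it grows with the cube of the wind speed and is capped by the Betz limit. The short Python sketch below illustrates that relation; the rotor diameter and wind speeds are illustrative assumptions, not ExaWind parameters.

```python
import math

def wind_power_kw(rotor_diameter_m, wind_speed_m_s, air_density=1.225, power_coeff=0.45):
    """Estimate turbine output from the textbook relation P = 1/2 * rho * A * v^3 * Cp.

    power_coeff is the fraction of the wind's kinetic energy actually captured;
    the theoretical (Betz) maximum is about 0.59, and real turbines reach ~0.4-0.5.
    """
    swept_area = math.pi * (rotor_diameter_m / 2) ** 2
    return 0.5 * air_density * swept_area * wind_speed_m_s ** 3 * power_coeff / 1000.0

# Illustrative numbers only: a 100 m rotor in a 6 m/s vs. a 12 m/s wind.
for v in (6.0, 12.0):
    print(f"{v:4.1f} m/s -> {wind_power_kw(100.0, v):8.0f} kW")
```

Doubling the wind speed raises the available power roughly eightfold, which is one reason resolving the flow field accurately matters so much for predicting wind-plant performance.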

To improve on current simulations, scientists need more computing power and higher resolution models that better capture the crucial dynamics. The ExaWind team is developing a predictive, physics-based, and high-resolution computational model — progressively building from petascale simulations of a single turbine toward exascale simulations of a nine-turbine array in complex terrain.

A Nalu-Wind solution to the differential equations of motion for a wind turbine operating in uniform air flow (moving from left to right). Two of the three wind turbine blades are pictured (thin blue rectangles on left). The slice in the background represents the contours of the whirling air’s motion, showing the vortical structure of the wake behind the turbine blades (red indicates swirl in the counterclockwise direction around the blade tip, blue in the clockwise direction).

“We want to know things like the air velocity and air temperature across a big three-dimensional space,” said Ann Almgren, who leads the Center for Computational Sciences and Engineering in Berkeley Lab’s Computational Research Division. “But we care most about what’s happening right at the turbines where things are changing quickly. We want to focus our resources near these turbines, without neglecting what’s going on in the larger space.”

To achieve the desired accuracy, the researchers are solving fluid dynamics equations near the turbines using a computational code called Nalu-Wind, a fully unstructured code that gives users the flexibility to more accurately describe the complex geometries near the turbines, Ananthan explained.

But this flexibility comes at a price. Unstructured mesh calculations have to store information not just about the location of all the mesh points but also about which points are connected to which. Structured meshes, meanwhile, are “logically rectangular,” which makes a lot of operations much simpler and faster.

“Originally, ExaWind planned to use Nalu-Wind everywhere, but coupling Nalu-Wind with a structured grid code may offer a much faster time-to-solution,” Almgren said.

Enter AMReX

Luckily, Ananthan knew about Berkeley Lab’s AMReX, a C++ software framework that supports block-structured adaptive-mesh algorithms for solving systems of partial differential equations. AMReX supports simulations on a structured mesh hierarchy; at each level the mesh is made up of regular boxes, but the different levels have different spatial resolution.
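
AMReX itself is a C++ framework, but the core idea of a block-structured mesh hierarchy can be illustrated in a few lines of Python: each level refines the cell size by a fixed ratio and covers only the sub-region that needs extra resolution. The domain size, refinement ratio and coverage fractions below are illustrative assumptions, not ExaWind settings or the AMReX API.

```python
# Conceptual sketch of a block-structured AMR hierarchy (illustrative only;
# AMReX is a C++ framework and its real API looks nothing like this).

DOMAIN_KM = 5.0          # horizontal extent of the simulation box
BASE_CELLS = 256         # number of cells across the coarse (level 0) grid
REFINEMENT_RATIO = 2     # each finer level halves the cell size

def cell_size_m(level):
    """Cell size on a given refinement level, in meters."""
    return DOMAIN_KM * 1000.0 / (BASE_CELLS * REFINEMENT_RATIO ** level)

# Level 0 covers the whole domain; finer levels cover ever smaller boxes
# around the turbines, where the flow changes most rapidly.
refined_fraction_of_domain = [1.0, 0.25, 0.05, 0.01]   # illustrative coverage per level

for level, coverage in enumerate(refined_fraction_of_domain):
    print(f"level {level}: cell size {cell_size_m(level):7.2f} m, "
          f"covers ~{coverage:.0%} of the domain")
```

The real AMReX machinery handles the boxes, parallel data distribution and inter-level communication that this toy loop glosses over.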

Ananthan explained they actually want the best of both worlds: unstructured mesh near the turbines and structured mesh elsewhere in the domain. The unstructured mesh and structured mesh have to communicate with each other, so the ExaWind team validated an overset mesh approach with an unstructured mesh near the turbines and a background structured mesh. That’s when they reached out to Almgren to collaborate.

“AMReX allows you to zoom in to get fine resolution in the regions you care about but have coarse resolution everywhere else,” Almgren said. The plan is for ExaWind to use an AMReX-based code (AMR-Wind) to resolve the entire domain except right around the turbines, where the researchers will use Nalu-Wind. AMR-Wind will generate finer and finer cells as they get closer to the turbines, basically matching the Nalu-Wind resolution where the codes meet. Nalu-Wind and AMR-Wind will talk to each other using a coupling code called TIOGA.

Even with this strategy, the team needs high performance computing. Ananthan’s initial performance studies were conducted on up to 1,024 Cori Haswell nodes at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC) and 49,152 Mira nodes at the Argonne Leadership Computing Facility.

“For the last three years, we’ve been using NERSC’s Cori heavily, as well as NREL’s Peregrine and Eagle,” said Ananthan. Moving forward, they’ll also be using the Summit system at the Oak Ridge Leadership Computing Facility and, ultimately, the Aurora and Frontier exascale supercomputers — all of which feature different types of GPUs: NVIDIA on Summit (and NERSC’s next-generation Perlmutter system), Intel on Aurora, and AMD on Frontier. 

Although Berkeley Lab just started partnering with the ExaWind team this past fall, the collaboration has already made a lot of progress. “Right now we’re still doing proof-of-concept testing for coupling the AMR-Wind and Nalu-Wind codes, but we expect to have the coupled software running on the full domain by the end of FY20,” said Almgren.

NERSC is a DOE Office of Science user facility.

Top figure: Some of the 4000+ wind turbines in Northern California’s Altamont Pass wind farm. Credit: David Laporte

This is a reposting of my news feature, courtesy of Berkeley Lab.

Searching for Photocathodes that Convert CO2 into Fuels

Figure: Six-step selection criteria used in the search for photocathodes for CO2 reduction. The search began with 68,860 inorganic compounds. The number of materials that satisfied the requirements of each step is shown in red, with 52 meeting all the requirements.

Carbon dioxide (CO2) has a bad reputation due to its pivotal role in the greenhouse gas effect at the Earth’s surface. But scientists at the Joint Center for Artificial Photosynthesis (JCAP), a U.S. Department of Energy (DOE) Innovation Hub, view CO2 as a promising route to clean, low-cost, renewable energy.

JCAP is a team led by the California Institute of Technology (Caltech) that brings together more than 100 world-class scientists and engineers, primarily from Caltech and its lead partner, Lawrence Berkeley National Laboratory (Berkeley Lab).

The JCAP team is developing new ways to produce transportation fuels from CO2, sunlight, and water using a process called artificial photosynthesis, which harvests solar energy and stores it in chemical bonds. If successful, they’ll be able to produce fuels while also eliminating some CO2 — a “win-win,” according to Arunima Singh, an assistant professor of physics at Arizona State University and a former member of the JCAP team.

Singh became involved in the research as a postdoctoral associate at Berkeley Lab, where she searched for new photocathodes to efficiently convert CO2 to chemical fuels — a major hurdle to realizing scalable artificial photosynthesis.

“There is a dire need to find new materials to enable the photocatalytic conversion of CO2. The existing photocathodes have very low efficiencies and product selectivity, which means the CO2 produces many products that are expensive to distill,” said Singh. “Previous experimental attempts found new photocatalytic materials by trial and error, but we wanted to do a more directed search.”

Searching for Needles in a Materials Project Haystack

Using supercomputing resources at the National Energy Research Scientific Computing Center (NERSC), the Berkeley Lab team performed a massive photocathode search, starting with 68,860 materials and screening them for specific intrinsic properties. Their results were published in the January issue of Nature Communications.

“The candidate materials need to be thermodynamically stable so they can be synthesized in the lab. They need to absorb visible light. And they need to be stable in water under the highly reducing conditions of CO2 reduction,” said first author Singh. “These three key properties were already available through the Materials Project.”

The Materials Project is a DOE-funded database of materials properties calculated based on predictive quantum-mechanical simulations using supercomputing clusters at NERSC, which is a DOE Office of Science User Facility. The database includes both experimentally known materials and hypothetical structures predicted by machine learning algorithms or various other procedures. Of the 68,860 candidate materials screened in the Nature Communications study, about half had already been experimentally synthesized, while the remaining were hypothetical.

The researchers screened these materials in six steps. First they used the Materials Project to identify the materials that were thermodynamically stable, able to absorb visible light, stable in water, and electrochemically stable. This strategy reduced the candidate pool to 235 materials — dramatically narrowing the list for the final two steps, which required computationally intensive calculations.
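
The screening itself is essentially a funnel of successive filters. The Python sketch below mimics that funnel with a handful of made-up candidate entries; the property names loosely follow Materials Project conventions, and the values and thresholds are illustrative rather than the criteria used in the study.

```python
# Sketch of a multi-step materials screening funnel. The candidate entries and
# thresholds are illustrative placeholders, not the criteria used in the study.

candidates = [
    {"formula": "CuO",  "e_above_hull_eV": 0.00, "band_gap_eV": 1.4, "stable_in_water": True},
    {"formula": "ZnTe", "e_above_hull_eV": 0.01, "band_gap_eV": 2.3, "stable_in_water": True},
    {"formula": "XyZ3", "e_above_hull_eV": 0.40, "band_gap_eV": 0.1, "stable_in_water": False},
]

screening_steps = [
    ("thermodynamically stable", lambda m: m["e_above_hull_eV"] <= 0.05),
    ("absorbs visible light",    lambda m: 1.0 <= m["band_gap_eV"] <= 3.0),
    ("stable in water",          lambda m: m["stable_in_water"]),
]

survivors = candidates
for name, passes in screening_steps:
    survivors = [m for m in survivors if passes(m)]
    print(f"after '{name}': {len(survivors)} material(s) remain")
```

In the real study this funnel started with 68,860 compounds and left 235 for the computationally expensive final steps.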

“By leveraging a large amount of data already available in the Materials Project, we were able to cut the computational cost of the project by several millions of CPU hours,” said Kristin Persson, a faculty scientist in Berkeley Lab’s Energy Technologies Area and senior author on the paper.

Additional Screening with First-Principles Calculations

However, the Materials Project database did not have all the necessary data. So the final screening required new first-principles simulations of materials properties based on quantum mechanics to accurately estimate the electronic structures and understand the energy of the excited electrons. These calculations were computed at NERSC and the Texas Advanced Computing Center (TACC) for the remaining 235 candidate materials.

“NERSC is the backbone of the Materials Project computation and database. But we also used about two million NERSC core hours to do the step 5 and 6 calculations,” said Singh. “Without NERSC, we would have been running our simulations on 250 cores for 24 hours a day for a year, versus being able to do these calculations in parallel on NERSC in a matter of a few months.”

The team also used about half a million core hours for these calculations at TACC, which were allocated through the National Science Foundation’s Extreme Science and Engineering Discovery Environment (XSEDE).

These theoretical calculations showed that 52 materials met all of the stringent requirements of the screening process, but that only nine of these had been previously studied for CO2 reduction. Among the 43 newly identified photocathodes, 35 have previously been synthesized and eight are hypothetical materials.

“We performed the largest exploratory search for CO2 reduction photocathodes to date, covering 68,860 materials and identifying 43 new photocathode materials exhibiting promising properties,” Persson said.

Finally, the researchers narrowed the list down to 39 promising candidates by looking at the vibrational properties of the eight hypothetical materials and ruling out the four predicted to be unstable by themselves.

However, more work is needed before artificial photosynthesis becomes a reality, including working with their experimental colleagues like Caltech’s John Gregoire (a leader of JCAP’s high-throughput experimentation laboratory) to validate their computational results.

“We have collaborators at Berkeley Lab and Caltech who are actively trying to grow these materials and test them,” Singh said. “I’m excited to see our study opening up new avenues of research.”

This is a reposting of my Computing Sciences news feature, courtesy of Berkeley Lab.

Low-cost “magic box” could decontaminate water in rural communities

Photo by Shawn

More than a billion people drink water that is contaminated and can spread deadly diseases such as cholera, dysentery, hepatitis A, typhoid, polio and diarrhea.

Most contaminated water could be purified by adding hydrogen peroxide, which safely kills many of the disease-causing organisms and oxidizes organic pollutants to make them less harmful. Hydrogen peroxide disinfects water in a similar way as standard water chlorination, but it leaves no harmful residual chemicals. Unfortunately, it’s difficult to make or obtain hydrogen peroxide in rural settings with limited energy sources.

Now, researchers from Stanford University and SLAC National Accelerator Laboratory have developed a portable device that produces hydrogen peroxide from oxygen gas and water — and it can be powered by a battery or conventional solar panels. You can hold the small device in one hand.

“The idea is to develop an electrochemical cell that generates hydrogen peroxide from oxygen and water on site, and then use that hydrogen peroxide in ground water to oxidize organic contaminants that are harmful for humans to ingest,” said Christopher Hahn, PhD, a SLAC associate staff scientist, in a recent news release.

First, the researchers designed and synthesized a catalyst that selectively speeds up the chemical reaction of converting oxygen gas into hydrogen peroxide. For this application, standard platinum-mercury or gold-plated catalysts were too expensive, so they investigated cheaper carbon-based materials.

Next, they used their carbon-based material to build a low-cost, simple and robust device that generates and stores hydrogen peroxide at the concentration needed for water purification, which is one-tenth the concentration of the hydrogen peroxide you buy at the drug store for cleaning a cut. Although this device uses materials not available in rural communities, it could be cheaply manufactured and shipped there.

Their results were recently reported in Reaction Chemistry and Engineering. However, more work needs to be done before a higher-capacity device will be available for use.

“Currently it’s just a prototype, but I personally think it will shine in the area of decentralized water purification for the developing world,” said Bill Chen, first author and a chemistry graduate student at Stanford. “It’s like a magic box. I hope it can become a reality.”

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine. For more details, please read my SLAC news release.

Berkeley Lab Tackles Vaccine Delivery Problem with Portable Solar-Powered Vaccine Fridge

LBNL Institute for Globally Transformative Technologies research team with prototype vaccine fridge and backpack for developing countries. (Berkeley Lab / Roy Kaltschmidt)

Vaccines are arguably one of the most important inventions of mankind. Unfortunately, vaccines must be produced and stored in an environment with very tight temperature regulation – between 36 °F and 46 °F (about 2 °C to 8 °C) – to keep the vaccine bugs alive. So vaccine delivery is a major problem in remote regions that lack reliable refrigeration.

Approximately 30 million children worldwide – roughly one in five – do not receive immunizations, leaving them at significant risk of disease. As a result, 1.5 million children under the age of five die annually from vaccine-preventable diseases, such as pneumonia and diarrhea. Perhaps more surprising, almost half of the vaccines in developing countries are thrown away because they get too warm during delivery and are no longer viable. Some administered vaccines are also ineffective because they froze during transport, but there is no easy way to test this.

Scientists at Lawrence Berkeley National Laboratory (LBNL) are trying to solve this vaccine delivery problem by developing a portable solar-powered fridge. Fabricated entirely at LBNL, their portable solar-powered vaccine fridge will be transported by bicycle or motorcycle in remote areas of the developing world. Zach Friedman and Reshma Singh are leading the project as part of the LBNL Institute for Globally Transformative Technologies, which seeks to bring scientific and technological breakthroughs to address global poverty and related social ills.

The team’s first prototype portable fridge uses a thermoelectric heat pump, rather than a traditional vapor compression heat pump that relies on a circulating liquid refrigerant to absorb and remove heat. The thermoelectric chips were initially developed to keep laptops cool, so laptops could be made thinner without fans. The technology was adapted for this global application to reduce the size and weight of the fridge.

Their portable units have a one- to three-liter capacity, much smaller than standard solar fridges that are typically 50 liters or more. Once the fridge cools down to the right temperature (36 °F – 46 °F), it is designed to stay within that temperature range for at least five days without any power, at an ambient outside temperature as hot as 110 °F.

Before the researchers can field test their first prototype fridge in Africa, they need to pass the World Health Organization’s Performance, Quality and Safety testing protocol for products used in immunization programs. They are currently busy performing in-house testing at LBNL to ensure that they pass the formal tests, which will be conducted by an independent laboratory in the UK.

“We aren’t in the process of field testing yet, but we have established field testing agreements in both Kenya and Nigeria and have locations identified,” said Friedman. “We expect to start testing this coming year.”

Meanwhile, they are continuing their portable fridge development. “Currently, we are pursuing both thermoelectric and vapor compression heat pumps, even for these smaller devices,” explained Jonathan Slack, lead engineer. “It is not clear which will win out in terms of manufacturability and affordability.”

They are also developing a backpack version of the vaccine fridge. However, human-carried devices have to meet stricter World Health Organization standards, so they are focusing at this stage on the small portable fridge instead.

Ultimately their goal is to make it easy for health care workers to deliver viable vaccines to children in remote areas, solving the “last miles” of vaccine delivery.

This is a repost of my KQED Science blog.

Solar Research Shines

Courtesy of Creative Commons

Everyone loves the idea of solar power — heating and cooling your home using the sun as a clean, free source of power. It sounds like the ultimate way to lower your carbon footprint! However, solar cells are expensive and typically only about 15% efficient, as I discussed in an earlier blog.

In order to make solar power more practical on a wide scale, a lot of research is underway to increase solar power efficiency. Stanford researchers have just reported a significant breakthrough in such solar power research, as described in their new paper in Nature Materials. They have developed a novel solar technology that uses both the light and heat of the sun to generate electricity. This new technology could double solar power efficiency and make it more affordable.

When most people think of solar power, they think of rooftop solar panels. These sorts of solar panels (or arrays of photovoltaic solar cells) use expensive semiconductor materials to convert photons of light into electricity. The photons from sunlight are absorbed by the semiconductor material, and their energy is transferred to electrons in the semiconductor. The energy given to an electron can “excite” it from the valence band to the conduction band, where it is free to move around within the semiconductor to produce electricity. Solar panels basically convert solar energy into direct current electricity.

However, these types of solar panels aren’t very efficient. If an electron doesn’t absorb enough energy, it can’t reach the conduction band to produce electricity. On the other hand, if an electron absorbs more energy than it needs to reach the conduction band, the excess energy is lost as heat. In silicon solar panels, half of the solar energy that hits the panel is lost to these two processes. Ideally you would like to somehow harvest the energy that is lost as heat, in order to make solar cells more efficient.
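
To make the two loss channels concrete, here is a small Python sketch comparing photon energies at a few wavelengths with silicon’s band gap of roughly 1.1 eV: photons below the gap are not absorbed, and for photons above the gap the excess energy ends up as heat. The wavelengths chosen are just illustrative.

```python
# Photon energy vs. the ~1.1 eV band gap of silicon: sub-gap photons are not
# absorbed, and any energy above the gap is lost as heat in a simple solar cell.

PLANCK_EV_NM = 1239.84    # h*c expressed in eV*nm, so E(eV) = 1239.84 / wavelength(nm)
SILICON_GAP_EV = 1.1      # approximate band gap of silicon

for wavelength_nm in (400, 700, 1000, 1500):   # illustrative wavelengths
    energy_ev = PLANCK_EV_NM / wavelength_nm
    if energy_ev < SILICON_GAP_EV:
        outcome = "not absorbed (below the band gap)"
    else:
        outcome = f"absorbed, {energy_ev - SILICON_GAP_EV:.2f} eV wasted as heat"
    print(f"{wavelength_nm:5d} nm photon = {energy_ev:.2f} eV -> {outcome}")
```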

Solar power can also be generated by a thermionic energy convertor, which directly converts heat into electricity. A thermionic converter produces electricity by causing a heat-induced flow of electrons from a hot cathode across a vacuum gap to a cooler anode. However, only a small fraction of the electrons gain sufficient thermal energy to generate this kind of electricity, and very high temperatures are needed for efficient thermionic conversion.
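
The steep temperature dependence of thermionic emission is described by the Richardson-Dushman equation, sketched below in Python; the work function value is an illustrative assumption, not a measured property of any particular device.

```python
import math

# Richardson-Dushman equation for thermionic emission current density:
#   J = A * T^2 * exp(-W / (k_B * T))
RICHARDSON_A = 1.20e6      # A m^-2 K^-2, theoretical Richardson constant
K_B_EV = 8.617e-5          # Boltzmann constant in eV/K
WORK_FUNCTION_EV = 2.0     # illustrative cathode work function (assumption)

def emission_current_density(temperature_k):
    """Thermionic emission current density in A/m^2 at a given temperature."""
    return RICHARDSON_A * temperature_k ** 2 * math.exp(
        -WORK_FUNCTION_EV / (K_B_EV * temperature_k)
    )

# Emission rises steeply with temperature, which is why conventional thermionic
# converters need very hot cathodes to be practical.
for t in (800, 1200, 1600, 2000):
    print(f"{t:5d} K -> {emission_current_density(t):10.3e} A/m^2")
```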

The Stanford researchers have recently developed a new process that exploits the benefits of both solar and thermal cell conversion. The research was led by Nicholas Melosh, as a joint venture of Stanford and SLAC National Accelerator Laboratory. Melosh’s group coated a piece of semiconducting material with a thin layer of metal cesium, demonstrating that this allowed the material to use both light and heat to generate electricity. This new PETE (photon-enhanced thermionic emission) device used the same basic architecture as a thermionic converter except with this special semiconductor as the cathode.

Although the physical process in this PETE device is different from the standard solar cell mechanisms, the new device gives a similar response at very high temperatures. In fact, the PETE device is most efficient above 200 °C. This means that PETE devices won’t replace rooftop solar panels, since they require higher temperatures to be efficient. Instead, they could be used in combination with solar concentrators as part of a large-scale solar power plant, for instance in the Mojave Desert.

Melosh’s initial “proof of concept” research was performed with the semiconductor gallium nitride to demonstrate that the new energy conversion process works, but gallium nitride isn’t suitable for solar applications. They plan to extend their research to other semiconductors, such as gallium arsenide, which is commonly used in household electronics. Based on theoretical calculations, they expect to develop PETE devices that operate with 50 percent efficiency at temperatures exceeding 200 °C. They hope to design the new PETE devices so they can be easily incorporated into existing solar power plants, significantly increasing the efficiency of solar power to make it competitive with oil.