Superconductivity and charge density waves caught intertwining at the nanoscale

The team aimed infrared laser pulses at the YBCO sample to switch off its superconducting state, then used X-ray laser pulses to illuminate the sample and examined the X-ray light scattered from it. Their results revealed that regions of superconductivity and charge density waves were arranged in unexpected ways. (Courtesy Giacomo Coslovich/SLAC National Accelerator Laboratory)

Room-temperature superconductors could transform everything from electrical grids to particle accelerators to computers – but before they can be realized, researchers need to better understand how existing high-temperature superconductors work.

Now, researchers from the Department of Energy’s SLAC National Accelerator Laboratory, the University of British Columbia, Yale University and others have taken a step in that direction by studying the fast dynamics of a material called yttrium barium copper oxide, or YBCO.

The team reports May 20 in Science that YBCO’s superconductivity is intertwined in unexpected ways with another phenomenon known as charge density waves (CDWs), or ripples in the density of electrons in the material. As the researchers expected, the CDWs got stronger when they turned off YBCO’s superconductivity. However, they were surprised to find that the CDWs also suddenly became more spatially organized, suggesting that superconductivity somehow fundamentally shapes the form of the CDWs at the nanoscale.

“A big part of what we don’t know is the relationship between charge density waves and superconductivity,” said Giacomo Coslovich, a staff scientist at the Department of Energy’s SLAC National Accelerator Laboratory, who led the study. “As one of the cleanest high-temperature superconductors that can be grown, YBCO offers us the opportunity to understand this physics in a very direct way, minimizing the effects of disorder.”

He added, “If we can better understand these materials, we can make new superconductors that work at higher temperatures, enabling many more applications and potentially addressing a lot of societal challenges – from climate change to energy efficiency to availability of fresh water.”

Observing fast dynamics

The researchers studied YBCO’s dynamics at SLAC’s Linac Coherent Light Source (LCLS) X-ray laser. They switched off superconductivity in the YBCO samples with infrared laser pulses, and then bounced X-ray pulses off those samples. Each shot of X-rays produced a kind of snapshot of the CDWs’ electron ripples; by stitching those snapshots together, the team recreated the CDWs’ rapid evolution.

“We did these experiments at the LCLS because we needed ultrashort pulses of X-rays, which can be made at very few places in the world. And we also needed soft X-rays, which have longer wavelengths than typical X-rays, to directly detect the CDWs,” said staff scientist and study co-author Joshua Turner, who is also a researcher at the Stanford Institute for Materials and Energy Sciences. “Plus, the people at LCLS are really great to work with.”

These LCLS runs generated terabytes of data, a challenge for processing. “Using many hours of supercomputing time, LCLS beamline scientists binned our huge amounts of data into a more manageable form so our algorithms could extract the feature characteristics,” said MengXing (Ketty) Na, a University of British Columbia graduate student and co-author on the project.
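As a rough sketch of that binning step (with made-up array sizes, delay values and bin widths, not the actual LCLS analysis code), each X-ray shot can be grouped by its pump-probe delay and averaged, so that every time bin yields one scattering snapshot:

```python
import numpy as np

# Hypothetical inputs: one detector image per X-ray shot, plus the
# pump-probe delay (in picoseconds) recorded for that shot.
shots = np.random.rand(1_000, 64, 64)            # stand-in for raw detector images
delays_ps = np.random.uniform(-1.0, 5.0, 1_000)  # stand-in for measured delays

# Bin shots by delay so each bin averages many noisy images into one snapshot.
bin_edges = np.arange(-1.0, 5.5, 0.5)
bin_index = np.digitize(delays_ps, bin_edges)

snapshots = {}
for b in range(1, len(bin_edges)):
    mask = bin_index == b
    if mask.any():
        # One averaged scattering pattern per delay bin; stringing these
        # together gives a movie of how the CDW signal evolves in time.
        snapshots[bin_edges[b - 1]] = shots[mask].mean(axis=0)
```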

The team found that charge density waves within the YBCO samples became more correlated – that is, more electron ripples were periodic or spatially synchronized – after lasers switched off the superconductivity.

“Doubling the number of waves that are correlated with just a flash of light is quite remarkable, because light typically would produce the opposite effect. We can use light to completely disorder the charge density waves if we push too hard,” Coslovich said.

Blue areas are superconducting regions, and yellow areas represent charge density waves. After a laser pulse (red), the superconducting regions are rapidly turned off and the charge density waves react by rearranging their pattern, becoming more orderly and coherent. (Greg Stewart/SLAC National Accelerator Laboratory)

To explain these experimental observations, the researchers then modeled how regions of CDWs and superconductivity ought to interact given a variety of underlying assumptions about how YBCO works. For example, their initial model assumed that a uniform superconducting region, when shut off with light, would become a uniform CDW region – but of course that didn’t agree with their results.

“The model that best fits our data so far indicates that superconductivity is acting like a defect within a pattern of the waves. This suggests that superconductivity and charge density waves like to be arranged in a very specific, nanoscopic way,” explained Coslovich. “They are intertwined orders at the length scale of the waves themselves.”

Illuminating the future

Coslovich said that being able to turn superconductivity off with light pulses was a significant advance, enabling observations on the time scale of less than a trillionth of a second, with major advantages over previous approaches.

“When you use other methods, like applying a high magnetic field, you have to wait a long time before making measurements, so CDWs rearrange around disorder and other phenomena can take place in the sample,” he said. “Using light allowed us to show this is an intrinsic effect, a real connection between superconductivity and charge density waves.”

The research team is excited to expand on this pivotal work, Turner said. First, they want to study how the CDWs become more organized when the superconductivity is shut off with light. They are also planning to tune the laser’s wavelength or polarization in future LCLS experiments, in hopes of using light to enhance, rather than quench, the superconducting state so they could readily switch it off and on.

“There is an overall interest in trying to do this with pulses of light on very fast timescales, because that can potentially lead to the development of superconducting, light-controlled devices for the new generation of electronics and computing,” said Coslovich. “Ultimately, this work can also help guide people who are trying to build room-temperature superconductors.”

This research is part of a collaboration between researchers from LCLS, SLAC’s Stanford Synchrotron Radiation Lightsource (SSRL), UBC, Yale University, the Institut National de la Recherche Scientifique in Canada, North Carolina State University, Università Cattolica di Brescia and other institutions. This work was funded in part by the DOE Office of Science. LCLS and SSRL are DOE Office of Science user facilities.

Citation: Scott Wandel et al., Science, 20 May 2022 (10.1126/science.abd7213)

This is a reposting of my news feature, courtesy of SLAC National Accelerator Laboratory.

Revitalizing batteries by bringing ‘dead’ lithium back to life


An animation shows how charging and discharging a lithium battery test cell causes an island of “dead,” or detached, lithium metal to creep back and forth between the electrodes. The movement of lithium ions back and forth through the electrolyte creates areas of negative (blue) and positive (red) charge at the ends of the island, which swap places as the battery charges and discharges. Lithium metal accumulates at the negative end of the island and dissolves at the positive end; this continual growth and dissolution causes the back-and-forth movement seen here. SLAC and Stanford researchers discovered that adding a brief, high-current discharging step right after charging the battery nudges the island to grow in the direction of the anode, or negative electrode. Reconnecting with the anode brings the island’s dead lithium back to life and increases the battery’s lifetime by nearly 30%. (Greg Stewart/SLAC National Accelerator Laboratory)

Menlo Park, Calif. — Researchers at the Department of Energy’s SLAC National Accelerator Laboratory and Stanford University may have found a way to revitalize rechargeable lithium batteries, potentially boosting the range of electric vehicles and battery life in next-gen electronic devices.

As lithium batteries cycle, they accumulate little islands of inactive lithium that are cut off from the electrodes, decreasing the battery’s capacity to store charge. But the research team discovered that they could make this “dead” lithium creep like a worm toward one of the electrodes until it reconnects, partially reversing the unwanted process.

Adding this extra step slowed the degradation of their test battery and increased its lifetime by nearly 30%.

“We are now exploring the potential recovery of lost capacity in lithium-ion batteries using an extremely fast discharging step,” said Stanford postdoctoral fellow Fang Liu, the lead author of a study published Dec. 22 in Nature.

Lost connection

A great deal of research is looking for ways to make rechargeable batteries with lighter weight, longer lifetimes, improved safety, and faster charging speeds than the lithium-ion technology currently used in cellphones, laptops and electric vehicles. A particular focus is on developing lithium-metal batteries, which could store more energy per volume or weight. For example, in electric cars, these next-generation batteries could increase the mileage per charge and possibly take up less trunk space.

Both battery types use positively charged lithium ions that shuttle back and forth between the electrodes. Over time, some of the metallic lithium becomes electrochemically inactive, forming isolated islands of lithium that no longer connect with the electrodes. This results in a loss of capacity and is a particular problem for lithium-metal technology and for the fast charging of lithium-ion batteries.

However, in the new study, the researchers demonstrated that they could mobilize and recover the isolated lithium to extend battery life.

“I always thought of isolated lithium as bad, since it causes batteries to decay and even catch on fire,” said Yi Cui, a professor at Stanford and SLAC and investigator with the Stanford Institute for Materials and Energy Sciences (SIMES) who led the research. “But we have discovered how to electrically reconnect this ‘dead’ lithium with the negative electrode to reactivate it.”

Creeping, not dead

The idea for the study was born when Cui speculated that applying a voltage to a battery’s cathode and anode could make an isolated island of lithium physically move between the electrodes – a process his team has now confirmed with their experiments.

The scientists fabricated an optical cell with a lithium-nickel-manganese-cobalt-oxide (NMC) cathode, a lithium anode and an isolated lithium island in between. This test device allowed them to track in real time what happens inside a battery when in use.

They discovered that the isolated lithium island wasn’t “dead” at all but responded to battery operations. When charging the cell, the island slowly moved towards the cathode; when discharging, it crept in the opposite direction.

“It’s like a very slow worm that inches its head forward and pulls its tail in to move nanometer by nanometer,” Cui said. “In this case, it transports by dissolving away on one end and depositing material to the other end. If we can keep the lithium worm moving, it will eventually touch the anode and reestablish the electrical connection.”

Boosting lifetime

The results, which the scientists validated with other test batteries and through computer simulations, also demonstrate how isolated lithium could be recovered in a real battery by modifying the charging protocol.

“We found that we can move the detached lithium toward the anode during discharging, and these motions are faster under higher currents,” said Liu. “So we added a fast, high-current discharging step right after the battery charges, which moved the isolated lithium far enough to reconnect it with the anode. This reactivates the lithium so it can participate in the life of the battery.”

She added, “Our findings also have wide implications for the design and development of more robust lithium-metal batteries.”
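As a rough sketch of the idea Liu describes (not the actual protocol from the study; the C-rates, cutoff voltages and pulse duration below are placeholders), a modified cycling recipe simply inserts a short, high-current discharge step right after the charge step:

```python
# Hypothetical cycling protocol illustrating a brief, high-current discharge
# pulse added right after charging to nudge isolated lithium back toward the
# anode. All currents, voltages and durations are placeholders.

def standard_cycle(c_rate=0.5):
    return [
        {"step": "charge",    "current_C": c_rate, "until": "4.2 V"},
        {"step": "rest",      "minutes": 10},
        {"step": "discharge", "current_C": c_rate, "until": "3.0 V"},
    ]

def modified_cycle(c_rate=0.5, pulse_C=5.0, pulse_seconds=30):
    cycle = standard_cycle(c_rate)
    # Insert the short, high-current discharge pulse immediately after charging.
    cycle.insert(1, {"step": "discharge_pulse",
                     "current_C": pulse_C,
                     "seconds": pulse_seconds})
    return cycle

for step in modified_cycle():
    print(step)
```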

This work was funded by the DOE Office of Energy Efficiency and Renewable Energy, Office of Vehicle Technologies under the Battery Materials Research (BMR), Battery 500 Consortium and eXtreme Fast Charge Cell Evaluation of Li-ion batteries (XCEL) programs.

This is a reposting of a press release, courtesy of SLAC National Accelerator Laboratory.

Stanford graduate student Aisulu Aitbekova wins 2021 Melvin P. Klein Award

Aisulu Aitbekova

Aisulu Aitbekova, a 2021 doctoral graduate from Stanford University, discovered her passion for research when she traveled from Kazakhstan to the U.S. for a summer internship as a chemical engineering undergraduate. She said that experience inspired her to go to graduate school.

After earning a master’s in chemical engineering at the Massachusetts Institute of Technology, she continued her studies at Stanford University under the supervision of Matteo Cargnello, an assistant professor of chemical engineering and Aitbekova’s doctoral advisor. Much of her thesis work involved beamline studies at the Stanford Synchrotron Radiation Lightsource (SSRL) at the Department of Energy’s SLAC National Accelerator Laboratory.  

Now, Aitbekova has been selected to receive the 2021 Melvin P. Klein Scientific Development Award, which recognizes outstanding research accomplishments by undergraduates, graduate students and postdoctoral fellows within three years of completing their doctoral degrees.

In a nomination letter for the award, SLAC Distinguished Staff Scientist Simon Bare praised Aitbekova’s initiative. “She has quickly become proficient in the application of X-ray techniques available at the synchrotron at SLAC. This proficiency and mastery include everything from operating the beamline to analyzing and interpreting the data,” he wrote.

Aitbekova said she felt “absolutely thrilled and grateful” to all of her mentors when she found out about winning the award.

“I’m so thankful for my PhD advisor Matteo Cargnello. My success would not have been possible without his mentorship,” Aitbekova said. “I’m also particularly grateful to Simon Bare, who I consider to be my second advisor. His continuous excitement about X-ray absorption spectroscopy has been the driving force for my work at SSRL.” 

Catalyzing change

Aitbekova said she is passionate about finding solutions to combat climate change. She designs materials to convert harmful pollutant gases into useful fuels and chemicals. To perform these chemical transformations, she develops catalysts and studies their properties using X-ray absorption spectroscopy (XAS). Catalysts are substances that increase rates of chemical reactions without being consumed themselves.

“I have identified that a catalyst’s size, shape and composition profoundly affect its performance in eliminating these gases,” she said, though exactly how those properties affect performance remains unknown. “This problem is further complicated by the dynamic nature of catalytic materials. As a catalyst performs chemical transformations, its structure changes, making it challenging to precisely map a catalyst’s properties to its performance.”

To overcome these barriers, she engineers materials the size of one ten-thousandth the diameter of a human hair and then tracks how they change during reactions using XAS.

In one study, Aitbekova and her colleagues engineered a catalyst using a combination of ruthenium and iron oxide nanoparticles, which they think act in a tag-team fashion to improve the synthesis of fuels from carbon dioxide and hydrogen. Using a prototype in the lab, they achieved much higher yields of ethane, propane and butane than previous catalysts.

Switching gears

While engineering catalysts that convert carbon dioxide into chemicals, she developed a new approach for preparing materials, where small particles are encapsulated inside porous oxide materials – for example, encapsulating ruthenium within a sheath of iron.

However, Aitbekova recognized a completely different application for this new approach: creating a palladium-platinum catalyst that works inside a car’s emission control system.

To eliminate the discharge of noxious emission gases, cars are equipped with a catalytic converter. Exhaust gases pass into the catalytic converter, where they are turned into less harmful gases. The catalysts inside these units are platinum and palladium metals, but these metals gradually lose their efficiency due to their extreme working conditions, she said.

“My platinum and palladium catalysts show excellent stability and performance after being subjected to air and steam at 1,100 degrees Celsius, the harshest operating environment automotive exhaust emission control catalysts could be subjected to,” explained Aitbekova. “Further improvements in these materials and successful testing under true exhaust conditions have a potential to revolutionize the field of automotive exhaust emission control.”

Her nominators agreed, citing it as the highlight of her graduate career.

“This work, currently under review for publication, is truly the remarkable result of Aisulu’s hard work and experience in pivoting from one area to another to make an impact and of her ability to connect multiple fields and solve important problems,” Cargnello wrote.

Amplifying impact

Despite this success, Aitbekova is already focused on how to make an even greater impact through mentoring and future research.

Her nominators all applauded her passion and commitment to mentor the next generation of STEM scholars, as demonstrated by mentoring “a countless number of undergraduates” according to Cargnello and by exchanging letters with middle school students from underrepresented groups.

Part of this passion, Cargnello and others wrote, stems from her experiences growing up in a highly conservative environment with the understanding that homemaking would be her eventual job. Aitbekova’s nominators wrote that they admired the fact that she made her way to Stanford and has acted as an ambassador for the values and principles of diversity and inclusion.

Aitbekova said she embraces the role. “Since my first summer research experience in the USA, I’ve wanted to serve as a bridge to science and graduate school to those who, like me, didn’t have access to such knowledge and resources.”

She will continue to act as a bridge in her next endeavor as a Kavli Nanoscience Institute Prize Postdoctoral Fellow at Caltech, where she plans to expand her work of converting carbon dioxide into fuels by running the chemical transformations with solar energy. That will “bring society one step closer to sustainable energy sources,” she said.

Bare and others praised her drive to make an everyday impact. “She has a natural passion for wanting to understand the physical principles behind the phenomena that she has observed in her research. But this passion for understanding is nicely balanced by her desire to discover something new, and to make a real difference — the practicality that is often missing in someone early in their career,” wrote Bare.

The award will be presented to Aitbekova at the 2021 SSRL/LCLS Annual Users’ Meeting during the plenary session on September 24. 

This is a reposting of my news story, courtesy of SLAC National Accelerator Laboratory.

Harnessing AMReX for Wind Turbine Simulations

ECP ExaWind Project Taps Berkeley Lab’s AMReX to Help Model Next-Generation Wind Farms

Driving along Highway 580 over the Altamont Pass in Northern California, you can’t help but marvel at the 4,000+ wind turbines slowly spinning on the summer-golden hillsides. Home to one of the earliest wind farms in the United States, Altamont Pass today remains one of the largest concentrations of wind turbines in the world. It is also a symbol of the future of clean energy.

Before utility grids can achieve wide-scale deployment of wind energy, however, they need more efficient wind plants. This requires advancing our fundamental understanding of the flow physics governing wind-plant performance.

ExaWind, a U.S. Department of Energy (DOE) Exascale Computing Project, is tackling this challenge by developing new simulation capabilities to more accurately predict the complex flow physics of wind farms. The project entails a collaboration between the National Renewable Energy Laboratory (NREL), Sandia National Laboratories, Oak Ridge National Laboratory, the University of Texas at Austin, Parallel Geometric Algorithms, and — as of a few months ago — Lawrence Berkeley National Laboratory (Berkeley Lab).

“Our ExaWind challenge problem is to simulate the air flow of nine wind turbines arranged as a three-by-three array inside a space five kilometers by five kilometers on the ground and a kilometer high,” said Shreyas Ananthan, a research software engineer at NREL and lead technical expert on the project. “And we need to run about a hundred seconds of real-time simulation.” 

By developing this virtual test bed, the researchers hope to revolutionize the design, operational control, and siting of wind plants, plus facilitate reliable grid integration. And this requires a combination of advanced supercomputers and unique simulation codes.

Unstructured + Structured Calculations

The principle behind a wind turbine is simple: energy in the wind turns the turbine blades, which causes an internal gearbox to rotate and spin a generator that produces electricity. But simulating this is complicated. The flexible turbine blades rotate, bend, and twist as the wind shifts direction and speed. The yaw and pitch of these blades are controlled in real time to extract as much energy as possible from a wind event. The air flow also entails complex dynamics — such as influences from the ground terrain, formation of a turbulent wake downstream of the blades, and turbine-turbine interactions.

To improve on current simulations, scientists need more computing power and higher resolution models that better capture the crucial dynamics. The ExaWind team is developing a predictive, physics-based, and high-resolution computational model — progressively building from petascale simulations of a single turbine toward exascale simulations of a nine-turbine array in complex terrain.

A Nalu-Wind solution to the differential equations of motion for a wind turbine operating in uniform air flow (moving from left to right). Two of the three wind turbine blades are pictured (thin blue rectangles on left). The slice in the background represents the contours of the whirling air’s motion, showing the vortical structure of the wake behind the turbine blades (red indicates swirl in the counterclockwise direction and blue in the clockwise direction around the blade tip).

“We want to know things like the air velocity and air temperature across a big three-dimensional space,” said Ann Almgren, who leads the Center for Computational Sciences and Engineering in Berkeley Lab’s Computational Research Division. “But we care most about what’s happening right at the turbines where things are changing quickly. We want to focus our resources near these turbines, without neglecting what’s going on in the larger space.”

To achieve the desired accuracy, the researchers are solving fluid dynamics equations near the turbines using a computational code called Nalu-Wind, a fully unstructured code that gives users the flexibility to more accurately describe the complex geometries near the turbines, Ananthan explained.

But this flexibility comes at a price. Unstructured mesh calculations have to store information not just about the location of all the mesh points but also about which points are connected to which. Structured meshes, meanwhile, are “logically rectangular,” which makes a lot of operations much simpler and faster.
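To make that distinction concrete, here is a small, purely illustrative comparison (plain Python, not ExaWind or AMReX code): on a structured grid a cell’s neighbors follow directly from its indices, while an unstructured mesh has to carry an explicit connectivity table.

```python
# Structured grid: neighbors are implicit in the (i, j) indices,
# so no connectivity table needs to be stored.
def structured_neighbors(i, j, nx, ny):
    candidates = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return [(a, b) for a, b in candidates if 0 <= a < nx and 0 <= b < ny]

# Unstructured mesh: connectivity must be stored explicitly for every cell
# (a toy five-cell mesh, purely for illustration).
unstructured_connectivity = {
    0: [1, 3],
    1: [0, 2, 4],
    2: [1, 4],
    3: [0, 4],
    4: [1, 2, 3],
}

print(structured_neighbors(2, 2, nx=4, ny=4))  # computed on the fly from indices
print(unstructured_connectivity[1])            # looked up from stored lists
```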

“Originally, ExaWind planned to use Nalu-Wind everywhere, but coupling Nalu-Wind with a structured grid code may offer a much faster time-to-solution,” Almgren said.

Enter AMReX

Luckily, Ananthan knew about Berkeley Lab’s AMReX, a C++ software framework that supports block-structured adaptive-mesh algorithms for solving systems of partial differential equations. AMReX supports simulations on a structured mesh hierarchy; at each level the mesh is made up of regular boxes, but the different levels have different spatial resolution.

Ananthan explained they actually want the best of both worlds: unstructured mesh near the turbines and structured mesh elsewhere in the domain. The unstructured mesh and structured mesh have to communicate with each other, so the ExaWind team validated an overset mesh approach with an unstructured mesh near the turbines and a background structured mesh. That’s when they reached out to Almgren to collaborate.

“AMReX allows you to zoom in to get fine resolution in the regions you care about but have coarse resolution everywhere else,” Almgren said. The plan is for ExaWind to use an AMReX-based code (AMR-Wind) to resolve the entire domain except right around the turbines, where the researchers will use Nalu-Wind. AMR-Wind will generate finer and finer cells as they get closer to the turbines, basically matching the Nalu-Wind resolution where the codes meet. Nalu-Wind and AMR-Wind will talk to each other using a coupling code called TIOGA.
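Conceptually, the refinement side of that strategy can be sketched with the toy loop below (plain Python for illustration only; the real AMR-Wind code is built on the C++ AMReX framework, and the cell sizes, tagging radius and turbine layout here are placeholders): cells are tagged whenever they lie within a set distance of a turbine, so the resolution doubles level by level near the turbines while the rest of the domain stays coarse.

```python
import itertools

# Toy block-structured refinement over a 5 km x 5 km plan view with a
# three-by-three turbine array, echoing the challenge problem described above.
domain_m = 5000.0
turbines = [(x, y) for x in (1250, 2500, 3750) for y in (1250, 2500, 3750)]

def tagged_cells(cell_size_m, radius_m):
    """Return centers of cells lying within radius_m of any turbine."""
    n = int(domain_m / cell_size_m)
    tagged = []
    for cx, cy in itertools.product(range(n), repeat=2):
        x, y = (cx + 0.5) * cell_size_m, (cy + 0.5) * cell_size_m
        if any((x - tx) ** 2 + (y - ty) ** 2 <= radius_m ** 2 for tx, ty in turbines):
            tagged.append((x, y))
    return tagged

cell = 160.0  # illustrative coarse-level cell size in meters
for level in range(3):
    tagged = tagged_cells(cell, radius_m=400.0)
    print(f"level {level}: {cell:.0f} m cells, {len(tagged)} tagged near turbines")
    cell /= 2.0  # each finer level doubles the resolution near the turbines
```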

Even with this strategy, the team needs high performance computing. Ananthan’s initial performance studies were conducted on up to 1,024 Cori Haswell nodes at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC) and 49,152 Mira nodes at the Argonne Leadership Computing Facility.

“For the last three years, we’ve been using NERSC’s Cori heavily, as well as NREL’s Peregrine and Eagle,” said Ananthan. Moving forward, they’ll also be using the Summit system at the Oak Ridge Leadership Computing Facility and, ultimately, the Aurora and Frontier exascale supercomputers — all of which feature different types of GPUs: NVIDIA on Summit (and NERSC’s next-generation Perlmutter system), Intel on Aurora, and AMD on Frontier. 

Although Berkeley Lab just started partnering with the ExaWind team this past fall, the collaboration has already made a lot of progress. “Right now we’re still doing proof-of-concept testing for coupling the AMR-Wind and Nalu-Wind codes, but we expect to have the coupled software running on the full domain by the end of FY20,” said Almgren.

NERSC is a DOE Office of Science user facility.

Top figure: Some of the 4,000+ wind turbines in Northern California’s Altamont Pass wind farm. Credit: David Laporte

This is a reposting of my news feature, courtesy of Berkeley Lab.

“Poor air quality affects everyone” — How to protect yourself and clean the air

I remember when you could ride BART for free on a “Spare the Air” day, when smog was expected to reach unhealthy levels based on standards set by the Environmental Protection Agency. Now, there are too many of these days — 26 in the Bay Area last year — to enjoy that perk.

This bad air is making us sick, according to Stanford allergy specialist and clinical associate professor Sharon Chinthrajah, MD. In a recent episode of the Sirius radio show “The Future of Everything,” she spoke with Stanford professor and radio host Russ Altman, MD, PhD, about how we can combat the negative health impacts of air pollution.

“Poor air quality affects everybody: healthy people and people with chronic heart and lung conditions,” said Chinthrajah. “And you know, in my lung clinic I see people coming in with exacerbations of their underlying lung diseases like asthma or COPD.”

On Spare the Air days, Chinthrajah said even healthy people can suffer from eye, nose, throat and skin irritations caused by air pollution. And the health impacts can be far more serious for her patients. So she tells them to prepare for bad air quality days and to monitor the air quality index (AQI) in their area, she said.

The AQI measures the levels of ozone and other tiny pollutants in the air. The air is considered unhealthy when the AQI is above 100 for sensitive groups — like people with chronic illnesses, older adults and children. It’s unhealthy for everyone when the AQI is above 150.
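For reference, the EPA’s AQI categories can be written as a simple lookup; the small function below is only an illustration (it is not from the radio show or the article):

```python
def aqi_category(aqi):
    """Map an Air Quality Index value to the EPA's descriptive category."""
    if aqi <= 50:
        return "Good"
    if aqi <= 100:
        return "Moderate"
    if aqi <= 150:
        return "Unhealthy for Sensitive Groups"  # chronic illness, older adults, children
    if aqi <= 200:
        return "Unhealthy"
    if aqi <= 300:
        return "Very Unhealthy"
    return "Hazardous"

print(aqi_category(120))  # Unhealthy for Sensitive Groups
print(aqi_category(160))  # Unhealthy
```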

On these unhealthy air days, Chinthrajah recommends taking precautions:

  • Limit the time you spend outdoors.
  • When outside, use a well-fitted mask that filters out fine particulates, including particles as small as 2.5 microns across (about 20 times smaller than the thickness of an average human hair).
  • When driving, recirculate the air in your car and keep your windows closed.
  • Stay hydrated.
  • Once inside, change your clothes and take a quick shower before you go to bed, removing any air particulates that collected on you during the day.

In the radio show, Chinthrajah explained that published studies by the World Health Organization and others demonstrate that people who live in developing regions like India and other parts of Asia — where they suffer poor air quality many days of the year — have a shortened life span.

“You know, there’s premature deaths. There’s exacerbation of underlying lung issues and cardiovascular issues. There’s more deaths from heart attacks and strokes in countries where there is poor air quality,” she said.

She admitted that it is difficult to definitively say these health problems are due to poor air quality — given the other problems in developing countries, like limited access to clean water, food and health care — but she thinks poor air quality is a major contributor.

Chinthrajah said she believes we need to address the problem of air pollution at a societal level. And that means we need to target cars that burn fossil fuel, which account for much of the air pollution in California, she said. Instead, we need to move towards using public transportation and electric vehicles, as well as generating electricity from clean energy sources like solar, wind and water.

She noted that California is now offering a $9,500 subsidy to qualifying low-income families to purchase low-emission vehicles like all-electric cars or plug-in hybrids, on top of the standard federal and state rebates.

“So it seems like an overwhelming, daunting task, right? But I think we each have to take ownership of what we can do to reduce our carbon footprint. And then lobby within our local organizations to create practices that are sustainable,” she said.

Chinthrajah said she hopes that addressing air pollution and energy consumption at a societal level will lead to less asthma and fewer other health problems.

Image by U.S. Environmental Protection Agency 

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.

Designing buildings to improve health

Are the buildings that we live and work in stressing us out?

The answer is probably yes, according to Stanford engineer Sarah Billington, PhD, and her colleagues. They also believe this stress is taking a significant toll on our mental and physical health because Americans typically spend almost 90% of their lives indoors.

During a recent talk at a Stanford Reunion Homecoming alumni celebration, Billington described a typical noisy office cut off from nature and filled with artificial light and artificial materials. This built environment makes workers feel stress, anxiety and distraction, which reduces their productivity and their ability to collaborate with others, she explained.

Now, Billington’s multidisciplinary research team is working to design buildings that instead reduce stress and increase a sense of belonging, physical activity and creativity.

Their first step is to measure how building features — such as airflow, lighting and views of nature — affect human well-being. They are quantifying well-being by measuring levels of stress, belonging, creativity, physical activity and environmental behavior.

In a preliminary online study, the team showed about 300 participants pictures of different office environments and asked them to envision working there at a new job. Across the board, the fictitious work environment was viewed as important to well-being.

“In eight out of the nine things that we were looking at, there were statistically significant increases in their sense of belonging, their self-efficacy and their environmental efficacy when they believed they were going to be working in an environment that had natural materials, natural light or diverse representations,” said Billington.

The researchers are now expanding this work by performing larger lab studies and designing future field studies. They plan to collect data from “smart buildings,” which use high-tech sensors to control the heating, air conditioning, ventilation, lighting, security and other systems. The team also plans to collect data from personal devices such as smartwatches, smartphones and laptops.

By analyzing all of this data, they plan to infer the participants’ behaviors, emotions and physiological states. For example, the researchers will use the building’s occupancy sensors to detect if a worker is interacting with other people who are nearby. Or they will figure out someone’s stress level based on how he or she uses a laptop trackpad and mouse, Billington said.

Stanford computer scientist Pablo Paredes, PhD, who collaborates on the project, explained in a paper how their simple model of arm-hand dynamics can detect stress from mouse motion. Basically, your muscles get tense and stiff when you’re stressed, which changes how you move a computer mouse.

Next, the team plans to use statistical modeling and machine learning to connect these human states to specific building features. They believe this will allow them to design better buildings that improve the occupants’ health.

The researchers said they intend to bring nature indoors by engineering living walls with adaptable acoustic and thermal properties.

They also plan to incorporate dynamic digital displays — such as a large art display on the wall or a small one on an individual’s personal devices — that reflect occupant activity and well-being. For example, a digital image of a flower might represent the energy level of a working group based on how open the petals are, and this could nudge their behavior, Billington said in the talk.

“Our idea is, what if we could make our buildings shape us in a positive way and keep improving over time?” Billington said.

Photo by Nastuh Abootalebi

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.

Searching for Photocathodes that Convert CO2 into Fuels

Figure: Six-step selection criteria used in the search for photocathodes for CO2 reduction. The search began with 68,860 inorganic compounds. The number of materials that satisfied the requirements of each step is shown in red, with 52 meeting all the requirements.

Carbon dioxide (CO2) has a bad reputation due to its pivotal role in the greenhouse effect at the Earth’s surface. But scientists at the Joint Center for Artificial Photosynthesis (JCAP), a U.S. Department of Energy (DOE) Innovation Hub, view CO2 as a promising feedstock for clean, low-cost, renewable fuels.

JCAP is a team led by the California Institute of Technology (Caltech) that brings together more than 100 world-class scientists and engineers, primarily from Caltech and its lead partner, Lawrence Berkeley National Laboratory (Berkeley Lab).

The JCAP team is developing new ways to produce transportation fuels from CO2, sunlight, and water using a process called artificial photosynthesis, which harvests solar energy and stores it in chemical bonds. If successful, they’ll be able to produce fuels while also eliminating some CO2 — a “win-win,” according to Arunima Singh, an assistant professor of physics at Arizona State University and a former member of the JCAP team.

Singh became involved in the research as a postdoctoral associate at Berkeley Lab, where she searched for new photocathodes to efficiently convert CO2 to chemical fuels — a major hurdle to realizing scalable artificial photosynthesis.

“There is a dire need to find new materials to enable the photocatalytic conversion of CO2. The existing photocathodes have very low efficiencies and product selectivity, which means the CO2 produces many products that are expensive to distill,” said Singh. “Previous experimental attempts found new photocatalytic materials by trial and error, but we wanted to do a more directed search.”

Searching for Needles in a Materials Project Haystack

Using supercomputing resources at the National Energy Research Scientific Computing Center (NERSC), the Berkeley Lab team performed a massive photocathode search, starting with 68,860 materials and screening them for specific intrinsic properties. Their results were published in the January issue of Nature Communications.

“The candidate materials need to be thermodynamically stable so they can be synthesized in the lab. They need to absorb visible light. And they need to be stable in water under the highly reducing conditions of CO2 reduction,” said first author Singh. “These three key properties were already available through the Materials Project.”

The Materials Project is a DOE-funded database of materials properties calculated based on predictive quantum-mechanical simulations using supercomputing clusters at NERSC, which is a DOE Office of Science User Facility. The database includes both experimentally known materials and hypothetical structures predicted by machine learning algorithms or various other procedures. Of the 68,860 candidate materials screened in the Nature Communications study, about half had already been experimentally synthesized, while the remaining were hypothetical.

The researchers screened these materials in six steps. First they used the Materials Project to identify the materials that were thermodynamically stable, able to absorb visible light, stable in water, and electrochemically stable. This strategy reduced the candidate pool to 235 materials — dramatically narrowing the list for the final two steps, which required computationally intensive calculations.
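The screening amounts to a funnel of property filters applied in order of increasing cost. A schematic version might look like the sketch below, where the property names, thresholds, filter functions and example record are placeholders rather than the actual Materials Project queries:

```python
# Schematic screening funnel: each step keeps only the candidates that pass,
# mirroring the strategy of applying cheap database filters first and saving
# expensive first-principles calculations for the survivors. All fields and
# thresholds are illustrative placeholders.

def screen(candidates, filters):
    pool = list(candidates)
    for name, keep in filters:
        pool = [m for m in pool if keep(m)]
        print(f"after '{name}': {len(pool)} candidates remain")
    return pool

filters = [
    ("thermodynamic stability",   lambda m: m["energy_above_hull_eV"] < 0.05),
    ("visible-light absorption",  lambda m: 1.0 < m["band_gap_eV"] < 3.0),
    ("aqueous stability",         lambda m: m["stable_in_water"]),
    ("electrochemical stability", lambda m: m["stable_at_reduction_potential"]),
    # The final, computationally intensive steps (e.g. band-edge alignment and
    # excited-state energetics) would use new first-principles results rather
    # than database lookups.
]

# Example candidate record with the hypothetical fields used above.
example = {"energy_above_hull_eV": 0.01, "band_gap_eV": 1.8,
           "stable_in_water": True, "stable_at_reduction_potential": True}
screen([example], filters)
```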

“By leveraging a large amount of data already available in the Materials Project, we were able to cut the computational cost of the project by several millions of CPU hours,” said Kristin Persson, a faculty scientist in Berkeley Lab’s Energy Technologies Area and senior author on the paper.

Additional Screening with First-Principles Calculations

However, the Materials Project database did not have all the necessary data. So the final screening required new first-principles simulations of materials properties based on quantum mechanics to accurately estimate the electronic structures and understand the energy of the excited electrons. These calculations were computed at NERSC and the Texas Advanced Computing Center (TACC) for the remaining 235 candidate materials.

“NERSC is the backbone of the Materials Project computation and database. But we also used about two million NERSC core hours to do the step 5 and 6 calculations,” said Singh. “Without NERSC, we would have been running our simulations on 250 cores for 24 hours a day for a year, versus being able to do these calculations in parallel on NERSC in a matter of a few months.”

The team also used about half a million core hours for these calculations at TACC, which were allocated through the National Science Foundation’s Extreme Science and Engineering Discovery Environment (XSEDE).

These theoretical calculations showed that 52 materials met all of the stringent requirements of the screening process, but that only nine of these had been previously studied for CO2 reduction. Among the 43 newly identified photocathodes, 35 have previously been synthesized and eight are hypothetical materials.

“We performed the largest exploratory search for CO2 reduction photocathodes to date, covering 68,860 materials and identifying 43 new photocathode materials exhibiting promising properties,” Persson said.

Finally, the researchers narrowed the list down to 39 promising candidates by looking at the vibrational properties of the eight hypothetical materials and ruling out the four predicted to be unstable by themselves.

However, more work is needed before artificial photosynthesis becomes a reality, including working with their experimental colleagues like Caltech’s John Gregoire (a leader of JCAP’s high-throughput experimentation laboratory) to validate their computational results.

“We have collaborators at Berkeley Lab and Caltech who are actively trying to grow these materials and test them,” Singh said. “I’m excited to see our study opening up new avenues of research.”

This is a reposting of my Computing Sciences news feature, courtesy of Berkeley Lab.

Sick people are worse for the environment, a study shows

Photo by ryan harvey

Environmental degradation is widely recognized to contribute to human illness. However, little research has been done to investigate the impact of human illness on the environment. This is a critical question particularly for the millions of people around the world who depend on natural resources for food and income while coping with high burdens of infectious diseases.

When people are sick, they often alter their use of natural resources in ways that harm the environment, according to a new study reported in the Proceedings of the National Academy of Sciences.

Specifically, the researchers examined how illness influenced fishing practices in the community around Lake Victoria, Kenya, which has high rates of HIV and other illnesses. They interviewed about 300 households several times over 16 months, collecting and analyzing data about household fishing habits and mental and physical health. They found that healthy people are better for the environment.

“Studies suggest that people will spend less time on their livelihoods when they are sick, but we didn’t see that trend in our study. Instead, we saw a shift toward more destructive fishing methods when people were ill,” said lead author Kathryn Fiorella, PhD, a postdoctoral scholar at Cornell University, in a recent news release.

The study found that sick fishermen were less likely to legally fish in deep waters or overnight to target the more sustainable mature fish. Instead, they used destructive fishing methods that were concentrated along the shoreline — such as using a beach dragnet that captures a high proportion of juvenile fish and disturbs shallow fish breeding habitats.

Basically, sick fishermen just wanted to get their catch quickly with less energy. They were focused on their short-term goal and not worried about depleting the fish stock.

In light of this study, the authors suggest that institutions and organizations focused on protecting the environment may need to more deeply consider the health of communities. The paper concludes, “Our study emphasizes the importance of considering health, governance, and ecosystems through an integrative lens.”

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.

Berkeley Lab Tackles Vaccine Delivery Problem with Portable Solar-Powered Vaccine Fridge

LBNL Institute for Globally Transformative Technologies research team with prototype vaccine fridge and backpack for developing countries. (Berkeley Lab / Roy Kaltschmidt)

Vaccines are arguably one of the most important inventions of mankind. Unfortunately, vaccines must be produced and stored in an environment with very tight temperature regulation – between 36 °F and 46 °F – to keep them effective. So vaccine delivery is a major problem in many remote regions that lack reliable refrigeration.

Approximately 30 million children worldwide – roughly one in five – do not receive immunizations, leaving them at significant risk of disease. As a result, 1.5 million children under the age of five die annually from vaccine-preventable diseases, such as pneumonia and diarrhea. Perhaps more surprising, almost half of the vaccines in developing countries are thrown away because they get too warm during delivery and are no longer viable. Some administered vaccines are also ineffective because they froze during transport, but there is no easy way to test for this.

Scientists at Lawrence Berkeley National Laboratory (LBNL) are trying to solve this vaccine delivery problem by developing a portable solar-powered fridge. Fabricated entirely at LBNL, their portable solar-powered vaccine fridge will be transported by bicycle or motorcycle in remote areas of the developing world. Zach Friedman and Reshma Singh are leading the project as part of the LBNL Institute for Globally Transformative Technologies, which seeks to bring scientific and technological breakthroughs to address global poverty and related social ills.

The team’s first prototype portable fridge uses a thermoelectric heat pump, rather than a traditional vapor compression heat pump that relies on a circulating liquid refrigerant to absorb and remove heat. The thermoelectric chips were initially developed to keep laptops cool, so laptops could be made thinner without fans. The technology was adapted for this global application to reduce the size and weight of the fridge.

Their portable units have a one to three-liter capacity, much smaller than standard solar fridges that are typically 50 liters or more. Once the fridge cools down to the right temperature (36 °F – 46 °F), it is designed to run within that temperature range for at least five days without any power, at an ambient outside temperature as hot as 110 °F.

Before the researchers can field test their first prototype fridge in Africa, they need to pass the World Health Organization’s Performance, Quality and Safety testing protocol for products used in immunization programs. They are currently busy performing in-house testing at LBNL to ensure that they pass the formal tests, which will be conducted by an independent laboratory in the UK.

“We aren’t in the process of field testing yet, but we have established field testing agreements in both Kenya and Nigeria and have locations identified,” said Friedman. “We expect to start testing this coming year.”

Meanwhile, they are continuing their portable fridge development. “Currently, we are pursuing both thermoelectric and vapor compression heat pumps, even for these smaller devices,” explained Jonathan Slack, lead engineer. “It is not clear which will win out in terms of manufacturability and affordability.”

They are also developing a backpack version of the vaccine fridge. However, human-carried devices have to meet stricter World Health Organization standards, so they are focusing at this stage on the small portable fridge instead.

Ultimately, their goal is to make it easy for health care workers to deliver viable vaccines to children in remote areas, solving the “last mile” of vaccine delivery.

This is a repost of my KQED Science blog post.

Time to Invest in Delta Levees

U.S. Army Corps of Engineers staff inspect a Sacramento River levee (U.S. Army photo by Chris Gray-Garcia, Flickr).

Two hundred years ago most of the Sacramento-San Joaquin Delta (Delta) was a vast wetland. Early settlers built an intricate levee system to create dry “islands” suitable for farming.

Today, these levees help protect people, property, natural resources, and infrastructure of statewide importance. The Delta is home to more than 515,000 people and 750 animal and plant species; supplies drinking water to 25 million Californians and irrigation water for the majority of California’s agricultural industry; and attracts 12 million recreational visits annually.

Unfortunately, the Delta levees are vulnerable to damage caused by floods, wave action, seepage, subsidence, earthquakes, and sea-level rise. While the occasional levee break is a fact of Delta life, a catastrophic levee failure could cause injury to people or loss of life. It could also damage property, highways, energy utilities, water supply systems, and the environment — all with regional and statewide consequences.

A variety of actions can be used to reduce flood risk in the Delta. The Delta Levees Council is developing a strategy to evaluate and guide future California investments to reduce the likelihood and consequences of levee failures. Interested? Learn more about this project and get involved by attending public meetings.