Eos: The Dawn of a New Era of Neutrino Detection

The bright yellow forklift crept forward, gracefully maneuvering the 20-ton steel tank through the entrance of Etcheverry Hall’s basement with only two millimeters to spare. The tight squeeze was by design: relying on the expertise of Berkeley Lab riggers, the team maximized the size of the outer vessel of the Eos experiment.

“Named for the Titan goddess of dawn, Eos represents the dawn of a new era of neutrino detection technology,” says Gabriel Orebi Gann, a Berkeley Physics associate professor, Berkeley Lab faculty scientist, and the leader of Eos, an international collaboration of 24 institutions jointly led by UC Berkeley Physics and Berkeley Lab Nuclear Science.

Neutrinos are abundant, neutral, almost massless subatomic “ghost particles” created whenever atomic nuclei come together or break apart, including during fusion reactions at the core of the Sun and fission reactions inside nuclear reactors on Earth. Neutrinos are difficult to detect because they rarely interact with matter—about 100 trillion neutrinos harmlessly pass through the Earth and our bodies every second as if we don’t exist.

Berkeley researchers are using Eos as a testbed to explore advanced, hybrid technologies for detecting these mysterious particles.

“While at Berkeley, we’re characterizing the response of the detector using deployable optical and radioactive sources to understand how well our technologies are performing. And we’re developing detailed simulations of our detector performance to make sure they agree with the data,” says Berkeley Physics Postdoctoral Fellow Tanner Kaptanoglu. “Once we complete this validation, we hope to move Eos to a neutrino source for further testing.”

Ultimately, the team hopes to use their experimental results and simulations to design a much larger successor to Eos, named Theia after the Titan goddess who was mother of Eos, to realize an astonishing breadth of nuclear physics, high energy physics, and astrophysics research.

The Eos collaboration is also investigating whether these technologies could someday detect nuclear security threats, in partnership with the project’s funding sponsor, the National Nuclear Security Administration.

“One nonproliferation application is using the absence of a neutrino signature to demonstrate that stockpile verification experiments are not nuclear,” says Orebi Gann. “A second application is verifying that nuclear-powered marine vessels are operating correctly.”

Like a nesting doll, Eos comprises several detector layers. The inner layer is a 4-ton acrylic tank, filled in stages during testing with air, then deionized water, and finally a water-based liquid scintillator (WbLS).

The barrel of this inner vessel is surrounded by 168 fast, high-performance, 8-inch photomultiplier tubes (PMTs) with electromagnetic shielding. Attached above the vessel are two dozen 12-inch PMTs. And attached below it are three dozen 8-inch “front-row” PMTs, with another dozen 10-inch PMTs below them.

In January, this detector assembly was gently lowered inside the 20-ton steel outer vessel, with Berkeley Physics Assistant Project Scientist Leon Pickard operating the crane as other team members anxiously watched.

“The big lift was nerve-wracking. It represented more than a year’s worth of work, dedication, and time from lots of people, and I was lifting it all together into the outer tank,” recalls Pickard. “I knew the Berkeley Lab riggers taught me well, so I was confident, excited, and definitely nervous.”

The buffer region between the acrylic and steel vessels is filled with water, submerging the PMTs. The outermost Eos layer is a muon tracker system consisting of solid scintillator paddles with PMTs.

By combining several novel detector technologies, Eos measures both Cherenkov radiation and scintillation light simultaneously. Its main challenge is to separate the faint Cherenkov signal from the overwhelming scintillation signal.

When neutrinos pass through Eos, one very occasionally interacts with the detector’s water or scintillator, transferring its energy to a charged particle. This charged particle then travels through the medium, emitting light that is detected by the PMTs.

When the charged particle travels faster than the speed of light in the medium, it creates a photonic boom, similar to the sonic boom created by a plane traveling faster than the speed of sound. This cone of Cherenkov light travels in the direction of the charged particle, making a ring-like image that is detected by the PMTs. In contrast, the scintillation light is emitted equally in all directions. Reconstructing the pattern of PMT hits helps distinguish between the two signals.
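
For the technically curious, the geometry of that light cone follows a textbook relation (my gloss, not spelled out in the article): the Cherenkov emission angle depends only on the particle’s speed and the medium’s refractive index.

```latex
% Cherenkov emission angle for a particle moving at speed v = beta*c
% through a medium of refractive index n. Light is emitted only when
% beta > 1/n, i.e., when the particle outruns light's phase speed there.
\[
  \cos\theta_{\mathrm{C}} = \frac{1}{n\beta},
  \qquad
  \text{e.g., water: } n \approx 1.33,\ \beta \approx 1
  \;\Rightarrow\; \theta_{\mathrm{C}} \approx 41^{\circ}.
\]
```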

In addition to topological differences, Cherenkov radiation is emitted almost instantaneously in a picosecond burst, whereas scintillation light lasts for nanoseconds. The PMTs detect this time difference.
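
To make the timing distinction concrete, here is a minimal toy sketch in Python. The time constants are purely illustrative, not Eos’s measured values, and real analyses also fold in PMT timing resolution and the scintillator’s rise time; the point is simply that a prompt-time cut can tag Cherenkov-like hits.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy photon arrival times for one event, in nanoseconds.
# Cherenkov: a prompt burst (~100 ps spread); scintillation: an
# exponential decay of a few ns. Both constants are illustrative.
t_cherenkov = rng.normal(loc=0.0, scale=0.1, size=200)
t_scint = rng.exponential(scale=4.0, size=2000)

times = np.concatenate([t_cherenkov, t_scint])
is_cherenkov = np.concatenate([np.ones(200, bool), np.zeros(2000, bool)])

# Tag every hit inside the prompt window as a Cherenkov candidate.
prompt_cut_ns = 0.3
tagged = times < prompt_cut_ns

purity = is_cherenkov[tagged].mean()      # true Cherenkov among tagged hits
efficiency = tagged[is_cherenkov].mean()  # Cherenkov hits that get tagged
print(f"purity: {purity:.2f}, efficiency: {efficiency:.2f}")
```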

Finally, the observable Cherenkov radiation has a longer, redder wavelength spectrum than the bluer scintillation light, which inspired the creation of dichroic photosensors that sort photons by wavelength. These dichroicons consist of an 8-inch PMT with a long-pass optical filter above the bulb and a crown of short-pass filters surrounding it. A dozen of the 8-inch, front-row PMTs attached to the bottom of the inner vessel are dichroicons. The concept for these novel photosensors was developed under the leadership of Eos collaborator Professor Joshua Klein, with Kaptanoglu playing a central role as part of his PhD thesis at the University of Pennsylvania.

If the light’s wavelength is above a certain threshold, the dichroicon guides it (mostly Cherenkov light) onto the central PMT. Light below that threshold passes through instead and is detected by the 10-inch, back-row PMTs.

“You effectively guide the Cherenkov light to specific PMTs and the scintillation light to other PMTs without losing light,” says Orebi Gann. “This gives us an additional way to separate Cherenkov and scintillation light.”
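
The routing logic itself is simple to state in code. Here is a toy sketch, assuming a hypothetical 480 nm cutoff and illustrative photon spectra; the actual Eos filter specification isn’t given in the article.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical long-pass cutoff for the dichroic filter (illustrative).
CUTOFF_NM = 480.0

def route_photon(wavelength_nm: float) -> str:
    """Idealized dichroicon routing: long wavelengths (the red tail of
    the Cherenkov spectrum) go to the central PMT; short wavelengths
    (mostly blue scintillation light) pass to the back-row PMTs."""
    return "central_pmt" if wavelength_nm >= CUTOFF_NM else "back_row_pmt"

# Illustrative samples: bluish scintillation light clustered near
# 420 nm, plus a flatter Cherenkov component reaching long wavelengths.
scint = rng.normal(420.0, 15.0, size=1000)
cherenkov = rng.uniform(350.0, 650.0, size=100)

for name, sample in [("scintillation", scint), ("cherenkov", cherenkov)]:
    to_central = np.mean([route_photon(w) == "central_pmt" for w in sample])
    print(f"{name}: {to_central:.0%} routed to central PMT")
```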

Another unusual feature of Eos is its location.

“Although Eos is a Berkeley Physics project, the Nuclear Engineering department let us work in their space in the Etcheverry basement,” says Orebi Gann. “It’s unusual to work across departmental boundaries in this way. It’s a sign of how great and supportive Nuclear Engineering has been.”

Delivering the outer vessel into the building wasn’t the only tight squeeze: the Eos installation was tight on both time and space.

Neutrino experiments often struggle to get their steel tanks manufactured, so everyone was excited last June when the tank headed towards Berkeley. Unfortunately, Orebi Gann received an email the next morning saying the tank had been destroyed in a non-injury accident when the truck collided with an overpass in St. Louis. After immediately calling her sponsor with the bad news, she mobilized.

“I started sweating. They would have killed our three-year project if we had to wait for the insurance claim,” says Orebi Gann. “Luckily, Berkeley Lab Nuclear Science Division Director Reiner Kruecken and others were really supportive, and we had enough contingency in the budget to buy another one. Within two weeks, we were under contract for a replacement. And the steel tank arrived three months later.”

Despite this delay, the collaboration assembled the detector, acquired and analyzed the data, and finished developing the detector simulations during the last year of funding.

“That’s the biggest setback you can have—your tank is crumpled. But with prudent planning, preparation, and scheduling agility, we were able to get right back on track,” says Pickard, also the installation manager.

In addition to Orebi Gann, Pickard, and Kaptanoglu, the Berkeley Physics installation team included former Project Scientists Zara Bagdasarian, Morgan Askins, and Guang Yang, Junior Specialist Sawyer Kaplan, graduate students Max Smiley, Ed Callaghan, and Martina Hebert, and undergraduate students Joseph Koplowitz, Ashley Rincon, and Hong Joo Ryoo. They were assisted by Berkeley Lab Staff Scientist Richard Bonventre, Senior Scientific Engineer Associate Joe Wallig, mechanical engineer Joseph Saba, and machinist James Daniel Boldi.

Given the tight timeline and limited space, another installation challenge was where to put all the detector components. Eos collaborators across the country coordinated to bring everything in at just the right time, fully tested and ready to go for the build.

“Some of the deliveries stayed temporarily at Berkeley Lab. Gabriel let us use her office to store hundreds of PMTs for a while. And the Nuclear Science folks were phenomenally accommodating, allowing us to store muon paddles, PMTs, and other parts on the Etcheverry mezzanine,” Pickard says. “We played a huge game of Tetris to get the detector put together.”

Once assembled, Eos acquired and analyzed data in three phases.

This March, it measured “first light” by flashing a blue LED into an optical fiber that points into the detector and then detecting this light with the PMTs. During these initial tests, the inner vessel contained air while the team verified that all the detector channels were working and that the PMTs could measure single photons.

Next, they filled the inner tank with optically pure deionized water and took data using various radioactive sources, optical sources, a laser injection system, and cosmic muons to fully evaluate detector performance. During this phase, Eos operated as a water Cherenkov detector.

“In a water Cherenkov detector, you have only Cherenkov light so you can do a precise directional reconstruction of the event. This helps with particle identification at high energies and background discrimination at low energies,” says Kaptanoglu, also the commissioning manager who helps identify the data needed. Among his other roles, he co-leads the simulations and analysis team with Marc Bergevin, a staff scientist at Lawrence Livermore National Lab.

Lastly, the researchers turned Eos into a hybrid detector by injecting into the water a water-based liquid scintillator, supplied by Eos collaborator Minfang Yeh at Brookhaven National Laboratory. This allowed the team to explore the stability and neutrino detection capabilities of the novel scintillator. Adding WbLS improves energy and position reconstruction, but it makes reconstructing an event’s direction more difficult. A key goal was to show that Eos could still reconstruct the event direction with the WbLS, demonstrating that WbLS is a viable and effective neutrino detection medium.

“Our hybrid detector gives us the best of both worlds. We measure event directionality with the Cherenkov light, and we achieve excellent energy and position resolution and low detector thresholds using the scintillation light,” says Kaptanoglu. “But by combining Cherenkov and scintillation, we get additional benefits. For example, we can better tell what type of particle is interacting in our detector—whether it’s an electron, neutron, or gamma.”

Eos data analysis combines traditional likelihood methods and machine learning algorithms to reconstruct events. These novel reconstruction algorithms use the Cherenkov and scintillation light simultaneously, finding a ring of PMTs hit by the Cherenkov light on top of the much larger isotropic scintillation background. The team also compared the two methods to see whether machine learning offered any advantages.
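
As an illustration of the likelihood side of this approach, here is a self-contained toy, not the collaboration’s code: it models PMT occupancy as an isotropic scintillation term plus a Cherenkov ring at a water-like emission angle, then grid-searches for the direction that maximizes the Poisson likelihood. All photon yields and widths are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def fibonacci_sphere(n):
    """Roughly uniform points on the unit sphere (stand-ins for PMTs)."""
    i = np.arange(n)
    phi = np.pi * (3.0 - np.sqrt(5.0)) * i
    z = 1.0 - 2.0 * (i + 0.5) / n
    r = np.sqrt(1.0 - z * z)
    return np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)

pmts = fibonacci_sphere(500)

def expected_hits(direction, n_scint=2000.0, n_cher=300.0,
                  cher_angle=np.radians(42.0), ring_width=np.radians(6.0)):
    """Mean PMT occupancy: isotropic scintillation plus a Cherenkov ring
    ~42 degrees from the particle direction (illustrative numbers)."""
    cos_t = np.clip(pmts @ direction, -1.0, 1.0)
    ring = np.exp(-0.5 * ((np.arccos(cos_t) - cher_angle) / ring_width) ** 2)
    return n_scint / len(pmts) + n_cher * ring / ring.sum()

# Simulate one event along a known true direction.
true_dir = np.array([0.3, -0.5, 0.81])
true_dir /= np.linalg.norm(true_dir)
hits = rng.poisson(expected_hits(true_dir))

# Maximum-likelihood grid search over candidate directions.
def loglike(d):
    mu = expected_hits(d)
    return np.sum(hits * np.log(mu) - mu)  # Poisson log-likelihood

best = max(fibonacci_sphere(2000), key=loglike)
err = np.degrees(np.arccos(np.clip(best @ true_dir, -1.0, 1.0)))
print(f"reconstruction error: {err:.1f} degrees")
```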

“Our goal was to show that we can do this hybrid reconstruction and that we can simulate it well to match the experimental data,” says Kaptanoglu.

Their simulations entail microphysical modeling of every aspect of the Eos detector, characterizing in detail how the light is created, propagated, and detected. In addition to producing cool 3D renderings of the detector, Eos simulations will be used to help design future neutrino experiments.

“Our Monte Carlo simulations make predictions, and we compare those to our experimental data. That allows us to validate and improve the Monte Carlo simulations,” says Orebi Gann. “We can use that improved Monte Carlo to predict performance in other scenarios. It’s the step that allows us to go from the measurements we make at Berkeley to predicting how this technology would perform in different application scenarios.”

Although their three-year project recently concluded, Orebi Gann has applied for another three years of funding to extend Eos testing at Berkeley.

If funded, the team plans to explore different WbLS cocktails and various photosensor parameters. They are also considering upgrading to custom electronics.

During the additional three years, the team would also devise a plan for moving Eos to a neutrino source if they get follow-on funding. A likely location is the Spallation Neutron Source at Oak Ridge National Laboratory, a facility that smashes protons into a target to produce neutrons and, as a byproduct, a huge number of neutrinos.

“Moving Eos to the Spallation Neutron Source would allow us to demonstrate that we can see neutrinos with this technology, in a regime where it’s not as subject to the low energy backgrounds that make reactor neutrino or fission neutrino detection challenging. It’s a step on the road,” says Orebi Gann.

According to Orebi Gann, the next step after that would be to move Eos to a nuclear reactor to prove it can detect neutrino signals in an operational environment with all relevant backgrounds.

However, the ultimate plan is to use Eos experimental results and simulation models to guide how to design Theia-25 (or Theia-100), a massive hybrid neutrino detector with a 25-kiloton (or 100-kiloton) WbLS tank and tens of thousands of ultrafast photosensors.

Orebi Gann is a lead proponent of Theia, a Berkeley-led “experiment in the making.” If funded, Theia will likely reside alongside the Deep Underground Neutrino Experiment (DUNE) in a former gold mine in South Dakota.

Theia has two potential areas of fundamental physics research. The first is understanding the neutrinos themselves.

“In particle physics, we don’t know of any fundamental property that differentiates neutrinos from antineutrinos, so they could in fact be incarnations of the same particle,” she explains. “Understanding their fundamental properties and how they differ could, for example, help explain how the Universe evolved, including offering insights into why it is dominated by matter.”

The second area of fundamental physics research uses the very weakly interacting neutrinos to probe the world around us.

“A large WbLS detector would enable us to look at solar neutrinos, supernova neutrinos, geo-neutrinos naturally produced in the Earth, and a vast array of other measurements,” says Orebi Gann. “For example, solar neutrinos would give us a real-time monitor of the Sun.”

“What’s interesting about Theia is the breadth of its program. I can go on for an hour about the physics of Theia,” Orebi Gann adds. “I think Eos, and the other R&D technology demonstrators around the world, will allow us to realize something like Theia, which would have a rich program of world-leading physics across nuclear physics, high energy physics, and astrophysics.”

This is a reposting of my magazine feature, courtesy of UC Berkeley’s 2024 Berkeley Physics Magazine.

Islands Protruding from Black Holes Are Key to Solving Paradox

Stephen Hawking showed that a black hole constantly emits radiation containing almost no information about its interior, causing the black hole to slowly evaporate. This suggests some information is irretrievably lost when the black hole dies. Theorists have struggled ever since to resolve the Hawking information paradox: seemingly, information can neither escape a black hole nor be preserved inside it forever.

Based on Hawking’s calculations, the radiation and the black hole are quantum mechanically linked, and this entanglement keeps rising until the black hole finishes evaporating, taking the quantum information with it. But theorists later determined that the entanglement peaks while the black hole is still massive and then drops to zero, which means information can escape.
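
This rise, peak, and fall is often summarized as the Page curve. Schematically (a standard textbook form, not an equation from the article):

```latex
% Schematic Page curve: the entanglement entropy of the radiation,
% S_rad, follows Hawking's rising curve only until it would exceed the
% shrinking Bekenstein-Hawking entropy of the black hole itself.
\[
  S_{\mathrm{rad}}(t) \;\approx\;
  \min\!\bigl( S_{\mathrm{Hawking}}(t),\; S_{\mathrm{BH}}(t) \bigr),
  \qquad
  S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 G \hbar},
\]
% where A is the horizon area: S_Hawking grows as radiation is emitted,
% while S_BH shrinks as the black hole evaporates, so S_rad peaks at
% the "Page time" and then falls back to zero.
```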

As part of this work, Berkeley Physics Assistant Professor Geoff Penington co-discovered “entanglement islands” sticking out of black holes, created when particles deep inside a black hole are reassigned to the radiation. Why this rearrangement occurs is a mystery, but entanglement islands may be the key to identifying how information escapes.

“Complementarity” theory hypothesizes that information is stored on the black hole’s surface while also passing inside, creating two copies of the information that represent different viewpoints and can’t be simultaneously observed. “Firewall” theory hypothesizes that everything falling into a black hole is incinerated by a physical firewall of energy surrounding an empty black hole, contradicting general relativity.

New research by Berkeley Physics Professor Raphael Bousso and Penington suggests entanglement islands protrude further than initially thought—as much as an atom beyond a black hole’s surface.

“Getting a scientific instrument within an atom’s width of a black hole horizon requires far more advanced technology than our current spaceships,” says Bousso. “But in principle we can tell which theory is correct by experimentally probing a black hole from the outside. This was a huge surprise.”

This is a reposting of my magazine research highlight, courtesy of UC Berkeley’s 2024 Berkeley Physics Magazine.

Image: Science Lab/Alamy

How does radiation in space affect the brain?

Exposure to deep space poses many potential risks to the health of astronauts, and one of the biggest dangers is space radiation. Beyond Earth’s protective shielding, astronauts are exposed to radiation from energetic charged particles that increases their risk of cancer, central nervous system damage, and a host of other health problems.

A new study has now investigated how chronic, space-like irradiation impacts the brain function of mice. To learn more, I spoke with Ivan Soltesz, PhD, a senior author on the study and a professor of neurosurgery and neurosciences at Stanford.

What was the goal of your study?

“Our basic question was ‘what happens to your brain during a mission to Mars?’ So far, only the Apollo astronauts have traveled far enough beyond the Earth’s protective magnetic field to be exposed to similar galactic cosmic radiation levels, albeit only for short durations.

In previous rodent studies, my lab observed that neuronal function is disrupted by low levels of radiation, a fraction of the dose used for cancer therapy. However, technical constraints required us to deliver the entire radiation dose within minutes, rather than across several months as during a mission to Mars. In the current study, we are the first to investigate the impact of prolonged radiation exposure, at Mars-relevant doses and dose rates, on neurological function. We used a new neutron irradiation facility at Colorado State University.”

What part of the brain did you study?

“The hippocampus, which is critical for several important brain functions, including the formation of new memories and spatial navigation. And the medial prefrontal cortex, which is important for retrieving preexisting memories, making decisions and processing social information. Thus, deficits in either of these two brain regions could detrimentally impact the ability of astronauts to safely and successfully carry out a mission to Mars.”

What did you find?

“My lab at Stanford measured electrical properties of individual neurons from mice that were exposed to six months of chronic neutron radiation. We determined that after chronic radiation exposure, neurons in the hippocampus were less likely to respond to incoming stimuli and they received a reduced frequency of communication from neighboring neurons.

Our collaborators at UC Irvine found that chronic neutron radiation also caused neuronal circuits in both the hippocampus and medial prefrontal cortex to no longer show long-lasting strengthening of their responses to electrical stimulation, normally referred to as long-term potentiation. Long-term potentiation is a cellular mechanism that allows memory formation.

Our collaborators also conducted behavioral tests. The mice displayed lasting deficits in learning, memory, anxiety and social behavior — even months after radiation exposure. Based on these results, our team predicts that nearly 1 in 5 astronauts would experience elevated anxiety behavior during a mission to Mars, while 1 in every 3 astronauts would struggle with memory recall.”

How can these findings facilitate safe space exploration?

“By understanding radiation risks, future missions can plan practical changes — such as locating astronaut sleeping spaces towards the center of the spacecraft where intervening material blocks more incoming radiation — that may help to mitigate the risks associated with interplanetary travel.

However, my lab believes the best way to protect astronauts from the harmful effects of space radiation is to understand at a basic science level how neuronal activity is disrupted by chronic radiation exposures.

One promising sign is that radiation exposures that occur in space rarely cause neurons in the brain to die, but rather cause smaller scale cellular changes. Thus, we should be able to develop strategies to modulate neuronal activity to compensate for radiation-induced changes. Our team is already starting a new set of chronic space-radiation experiments to test a candidate countermeasure drug.”

Would you ever go to space, given how harmful it is on the human body?

“The radiation risks we discovered are mostly a concern for travel beyond low Earth orbit, such as months-long missions to Mars. Shorter trips to the moon — such as the Apollo missions — or months spent in Earth orbit aboard the International Space Station appear to pose a much lower risk of radiation-induced cognitive deficits. I would definitely like to go into space for at least a few quick orbits.

I’m also confident that my lab and others will expand our understanding of how chronic radiation impacts the nervous system and develop the effective countermeasures needed to enable safe missions to the moon or Mars within the next decade. However, I’m not sure I’m ready to leave my lab unattended for two years while I take a sabbatical to Mars.”

Photo by ColiN00B

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.

Cleaning cosmic microwave background data to measure gravitational lensing

NERSC facilitates development of new analysis filter to better map the invisible universe

A set of cosmic microwave background 2D images with no lensing effects (top row) and with exaggerated cosmic microwave background gravitational lensing effects (bottom row). Image: Wayne Hu and Takemi Okamoto/University of Chicago

Cosmic microwave background (CMB) radiation is everywhere in the universe, but its frigid (about -455° F, or 2.7 kelvins), low-energy microwaves are invisible to the human eye. So cosmologists use specialized telescopes to map out the temperature spectrum of this relic radiation — left over from the Big Bang — to learn about the origin and structure of galaxy clusters and dark matter.

Gravity from distant galaxies causes tiny distortions in the CMB temperature maps, a process called gravitational lensing. These distortions are detected by data analysis software run on supercomputers like the Cori system at Lawrence Berkeley National Laboratory’s (Berkeley Lab’s) National Energy Research Scientific Computing Center (NERSC). Unfortunately, the temperature data is often corrupted by “foreground emissions” from extragalactic dust, gas, and other noise sources that are challenging to model.

“CMB images get distorted by gravitational lensing. This distortion is not a nuisance, it’s the signal we’re trying to measure,” said Emmanuel Schaan, a postdoctoral researcher in the Physics Division at Berkeley Lab. “However, various foreground emissions always contaminate CMB maps. These foregrounds are nuisances because they can mimic the effect of lensing and bias our lensing measurements. So we developed a new method for analyzing CMB data that is largely immune to the foreground noise effects.”
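
The distortion Schaan describes can be illustrated with a toy remapping: lensing shifts the observed map by the gradient of a lensing potential, so the lensed temperature is the unlensed temperature read off at a slightly deflected position. Here is a minimal sketch with an invented Gaussian lens; nothing here reproduces the authors’ estimator.

```python
import numpy as np
from scipy.ndimage import map_coordinates

rng = np.random.default_rng(0)

# Toy unlensed "CMB" map: a Gaussian random field, low-pass filtered
# in Fourier space to mimic smooth large-scale fluctuations.
n = 256
tmap = rng.normal(size=(n, n))
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
k2 = kx**2 + ky**2
tmap = np.fft.ifft2(np.fft.fft2(tmap) * np.exp(-k2 / (2 * 0.02**2))).real

# Toy lensing potential: a single Gaussian "lens" in the map center.
y, x = np.mgrid[0:n, 0:n]
phi = 50.0 * np.exp(-((x - n / 2) ** 2 + (y - n / 2) ** 2) / (2 * 30.0**2))
dy, dx = np.gradient(phi)  # deflection field = gradient of the potential

# Lensing remaps the map: T_lensed(x) = T_unlensed(x + grad(phi)).
coords = np.array([y + dy, x + dx])
tlensed = map_coordinates(tmap, coords, order=1, mode="wrap")
print("rms distortion:", np.std(tlensed - tmap))
```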

Schaan collaborated with Simone Ferraro, a Divisional Fellow in Berkeley Lab’s Physics Division, to develop their new statistical method, which is described in a paper published May 8, 2019 in Physical Review Letters.

“Our paper is mostly theoretical, but we also demonstrated that the method works on realistic simulations of the microwave sky previously generated by Neelima Sehgal and her collaborators,” Schaan said.

These publicly available simulations were originally generated using computing resources at the National Science Foundation’s TeraGrid project and Princeton University’s TIGRESS file system. Sehgal’s team ran three-dimensional N-body simulations of the gravitational evolution of dark matter in the universe, which they then converted into two-dimensional (2D) simulated maps of various components of the microwave sky at different frequencies — including 2D temperature maps of foreground emissions.

These 2D images show different types of foreground emissions that can interfere with CMB lensing measurements, as simulated by Neelima Sehgal and collaborators. From left to right: The cosmic infrared background, composed of intergalactic dust; radio point sources, or radio emission from other galaxies; the kinematic Sunyaev-Zel’dovich effect, a product of gas in other galaxies; and the thermal Sunyaev-Zel’dovich effect, which also relates to gas in other galaxies. Image: Emmanuel Schaan and Simone Ferraro/Berkeley Lab

Testing Theory On Simulated Data

NERSC provided resources that weren’t otherwise available to the team. Schaan and Ferraro applied their new analysis method to the existing 2D CMB temperature maps using NERSC. They wrote their analysis code in Python and used a library called pathos to run across multiple nodes in parallel. The final run that generated all the published results was performed on NERSC’s Cori supercomputer.
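
The article doesn’t show the analysis code, but the pathos parallel-map pattern it refers to looks roughly like this; the per-patch function below is a hypothetical placeholder, not the authors’ pipeline.

```python
# Minimal sketch of the pathos parallel-map pattern mentioned above.
from pathos.multiprocessing import ProcessingPool as Pool

def analyze_patch(patch_id: int) -> float:
    """Hypothetical placeholder for one piece of the map analysis."""
    return float(patch_id) ** 0.5

if __name__ == "__main__":
    pool = Pool(nodes=4)  # number of worker processes
    results = pool.map(analyze_patch, range(100))
    pool.close()
    pool.join()
    print(sum(results))
```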

“As we progressively improved our analysis, we had to test the improved methods,” Schaan said. “Having access to NERSC was very useful for us.”

The Berkeley Lab researchers did many preliminary runs on NERSC’s Edison supercomputer before it was decommissioned, because the wait time for the Edison queue was much shorter than for the Cori queues. Schaan said they haven’t yet optimized the code for Cori’s energy-efficient, many-core KNL nodes, but they need to do that soon.

It might be time to speed up that code given their future research plans. Schaan and Ferraro are still perfecting their analysis, so they may need to run an improved method on the same 2D CMB simulations using NERSC. They also hope to begin working with real CMB data.

“In the future, we want to apply our method to CMB data from the Simons Observatory and CMB-S4, two upcoming CMB experiments that will have data in a few years. For that data, the processing will very likely be done on NERSC,” Schaan said.

NERSC is a U.S. Department of Energy Office of Science User Facility.

For more information, see this Berkeley Lab news release: A New Filter to Better Map the Dark Universe.

This is a reposting of my news feature originally published by Berkeley Lab’s Computing Sciences.

Shedding New Light on Luminous Blue Variable Stars

3D Simulations Disperse Some of the Mystery Surrounding Massive Stars

A snapshot from a simulation of the churning gas that blankets a star 80 times the sun’s mass. Intense light from the star’s core pushes against helium-rich pockets in the star’s exterior, launching material outward in spectacular geyser-like eruptions. The solid colors denote radiation intensity, with bluer colors representing regions of larger intensity. The translucent purplish colors represent the gas density, with lighter colors denoting denser regions. Image: Joseph Insley, Argonne Leadership Computing Facility.

Three-dimensional (3D) simulations run at two of the U.S. Department of Energy’s national laboratory supercomputing facilities and the National Aeronautics and Space Administration (NASA) have provided new insights into the behavior of a unique class of celestial bodies known as luminous blue variables (LBVs) — rare, massive stars that can shine up to a million times brighter than the Sun.

Astrophysicists are intrigued by LBVs because their luminosity and size dramatically fluctuate on a timescale of months. They also periodically undergo giant eruptions, violently ejecting gaseous material into space. Although scientists have long observed the variability of LBVs, the physical processes causing their behavior are still largely unknown. According to Yan-Fei Jiang, an astrophysicist at UC Santa Barbara’s Kavli Institute for Theoretical Physics, the traditional one-dimensional (1D) models of star structure are inadequate for LBVs.

“This special class of massive stars cycles between two phases: a quiescent phase when they’re not doing anything interesting, and an outburst phase when they suddenly become bigger and brighter and then eject their outer envelope,” said Jiang. “People have been seeing this for years, but 1D, spherically-symmetric models can’t determine what is going on in this complex situation.”

Instead, Jiang is leading an effort to run first-principles, 3D simulations to understand the physics behind LBV outbursts — using large-scale computing facilities provided by Lawrence Berkeley National Laboratory’s National Energy Research Scientific Computing Center (NERSC), the Argonne Leadership Computing Facility (ALCF), and NASA. NERSC and ALCF are DOE Office of Science User Facilities.

Physics Revealed by 3D

In a study published in Nature, Jiang and his colleagues from UC Santa Barbara, UC Berkeley, and Princeton University ran three 3D simulations to study three different LBV configurations. All the simulations included convection, the process in which a warm gas or liquid rises while its colder counterpart sinks. For instance, convection causes hot water at the bottom of a pot on a stove to rise to the top surface. It also causes gas from a star’s hot core to push toward its outer layers.

During the outburst phase, the new 3D simulations predict that convection causes a massive star’s radius to irregularly oscillate and its brightness to vary by 10 to 30 percent on a timescale of just a few days — in agreement with current observations.

“Convection causes the star to expand significantly to a much larger size than predicted by our 1D model without convection. As the star expands, its outer layers become cooler and more opaque,” Jiang said.

Opacity describes how a gas interacts with photons. The researchers discovered that the helium opacity in the star’s outer envelope doubles during the outburst phase, making it more difficult for photons to escape. This leads the star to reach an effective temperature of about 9,000 kelvins (about 16,000 degrees Fahrenheit) and triggers the ejection of mass.

“The radiation force is basically a product of the opacity and the fixed luminosity coming from the star’s core. When the helium opacity doubles, this increases the radiation force that is pushing material out until it overcomes the gravitational force that is pulling the material in,” said Jiang. “The star then generates a strong stellar wind, blowing away its outer envelope.”
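
The balance Jiang describes is the classical Eddington condition. In standard form (my gloss, not an equation from the paper):

```latex
% Radiative acceleration vs. gravity for material at radius r around a
% star of mass M and luminosity L, with opacity kappa:
\[
  g_{\mathrm{rad}} = \frac{\kappa L}{4 \pi r^2 c},
  \qquad
  g_{\mathrm{grav}} = \frac{G M}{r^2},
  \qquad
  \Gamma \equiv \frac{g_{\mathrm{rad}}}{g_{\mathrm{grav}}}
        = \frac{\kappa L}{4 \pi G M c}.
\]
% When the helium opacity kappa doubles, Gamma doubles; once Gamma > 1,
% radiation overcomes gravity and drives the outflow Jiang describes.
```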

Massive Simulations Required

Massive stars require massive and expensive 3D simulations, according to Jiang. So he and his colleagues needed all the computing resources available to them, including about 15 million CPU hours at NERSC, 60 million CPU hours at ALCF, and 10 million CPU hours at NASA. In addition, NERSC played a special role in the project.

“The Cori supercomputer at NERSC was essential to us in the beginning because it is very flexible,” Jiang said. “We did all of the earlier exploration at NERSC, figuring out the right parameters to use and submissions to do. We also got a lot of support from the NERSC team to speed up our input/output and solve problems.”

In addition to spending about 5 million CPU hours at NERSC on the early phase of the project, Jiang’s team used another 10 million CPU hours running part of the 3D simulations.

“We used NERSC to run half of one of the 3D simulations described in the Nature paper and the other half was run at NASA. Our other two simulations were run at Argonne, which has very different machines,” said Jiang. “These are quite expensive simulations, because even half a run takes a lot of time.”

Even so, Jiang believes that 3D simulations are worth the expense because illuminating the fundamental processes behind LBV outbursts is critical to many areas of astrophysics — including understanding the evolution of these massive stars that become black holes when they die, as well as understanding how their stellar winds and supernova explosions affect galaxies.

Jiang also used NERSC for earlier studies, and his collaboration is already running follow-up 3D simulations based on their latest results. These new simulations incorporate additional parameters — including the LBV star’s rotation and metallicity — varying the value of one parameter per run. For example, the surface speed from rotation is larger at the star’s equator than at its poles. The same is true on Earth, which is one of the reasons NASA launches rockets from sites in Florida and southern California, relatively close to the equator.

“A massive star has a strong rotation, which is very different at the poles and the equator. So rotation is expected to affect the symmetry of the mass loss rate,” said Jiang.

The team is also exploring metallicity, which in astrophysics refers to any element heavier than helium.

“Metallicity is important because it affects opacity. In our previous simulations, we assumed a constant metallicity, but massive stars can have very different metallicities,” said Jiang. “So we need to explore the parameter space to see how the structure of the stars change with metallicity. We’re currently running a simulation with one metallicity at NERSC, another at Argonne, and a third at NASA. Each set of calculations will take about three months to run.”

Meanwhile, Jiang and his colleagues already have new 2018 data to analyze. And they have a lot more simulations planned due to their recent allocation awards from INCITE, NERSC, and NASA.

“We need to do a lot more simulations to understand the physics of these special massive stars, and I think NERSC will be very helpful for this purpose,” he said.

This is a reposting of my news feature originally published by Berkeley Lab’s Computing Sciences.

Space, the new surgical frontier? A Q&A

Photo by WikiImages

Many medical treatments — in their current form — would be unfeasible on deep space missions, such as a journey to Mars.

How will we diagnose and treat the ailments of future space travelers? And what medical issues will they likely encounter? I posed these questions to Sandip Panesar, MD, a postdoctoral research fellow at Stanford who wrote a recent article about surgery in space in the British Journal of Surgery.

What inspired you to research surgery in space?

“From a young age, I’ve always been interested in space travel. I also have a background in surgery, trauma and emergency medicine. So it just clicked one day when I was reading about SpaceX. I realized they may actually send people to Mars, so we need to consider the medical implications of that. Specifically, how would you perform surgery?

The need for surgical care in space in the near future will likely revolve around emergency situations — such as crushes, impacts, falls and burns — since the possibility of trauma occurring during exploratory missions can never be ruled out. In cases of severe trauma, significant internal bleeding may necessitate invasive surgical procedures.”

What adverse conditions do space travelers face?

“People are exposed to a few key physical conditions in space — solar particle radiation, temperature extremes and a lack of gravity. Solar particle radiation is a lot different than the particles people are exposed to on Earth. It has a higher chance of causing DNA damage, leading to an increased risk of high-grade cancers prone to metastasize. However, a lack of gravity causes a whole host of even more critical changes in the human body.”

How does this extraterrestrial environment impact human physiology?

“One of the biggest changes is the redistribution of bodily fluids. On Earth, gravity and walking upright pull most of our fluids down to our legs. In space, these fluids distribute evenly throughout the body. This affects heart rate and blood pressure, increases intracranial pressure and causes face swelling. And it decreases leg size, a phenomenon called ‘chicken legs.’

An absence of gravity also causes the bones and muscles to atrophy.

In addition, the makeup of white blood cells changes in space. Plus, the body produces more stress hormones, called glucocorticoids, which further weaken the immune system. This may negatively affect wound healing, which is critical to surgical recovery.

Microbes are also known to be more pathological in space, making the risk of a serious infection after surgery even higher.”

How can surgery be adapted for space?

“One idea is to include a trauma pod, an enclosed medical suite, in the space station or vessel — a concept that originated in military medicine.

We’ve also proposed minimally invasive keyhole surgery, but it has limited use in trauma situations and a pretty large learning curve. So open surgery is likely but challenging in space. For instance, the bowel is free-floating within the abdominal cavity, so it can float out when you open the abdomen if there’s no gravity. This carries a risk of infection, contamination and perforation. One potential solution is to use a hermetically sealed enclosure — placing a clear plastic box over the wound and working essentially in a glove box with a pressure differential.”

Could surgical robots or other equipment help?

“Mars is 48 million miles away and the radio signal delay is 20 minutes, so using robots controlled by surgeons on Earth isn’t feasible. Instead, researchers are developing robots that can perform surgery by themselves or with really minimal human assistance. There have already been trials of robots that can suture together pig bowels with minimal assistance.

Finally, the size and weight of the payload is a huge barrier, and surgical specialties all use different tools. A feasible solution is to bring a 3D printer that can print bandages, casts, surgical tools and maybe even pharmaceuticals. Also, you could diagnose with an ultrasound scanner and a compact CT scanner like the ones used in ambulances in the UK.”

Would you want to be an on-board surgeon?

“Not just yet. I still have a lot of things I want to do on Earth.”

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.

NASA videos highlight using omics to study what happens to a body in space

Space is a hostile place, even inside a spacecraft. Radiation, weightlessness and isolation are only a few of the unique stressors faced by astronauts during space travel.

As NASA prepares for a manned journey to Mars, researchers are studying what happens to the human body in space to determine the health risks of a several-year mission. This research includes a unique study of identical twin astronauts to investigate the effects of spaceflight at a molecular level — comparing data from Scott Kelly, who recently completed a one-year space mission, with data from his brother who led a normal life on Earth.

NASA recently produced a series of web videos, “Omics: Exploring Space Through You,” that discusses its twins study and features Michael Snyder, MD, professor and chair of genetics at Stanford and principal investigator on one of the projects. Omics is a field of study that integrates different types of molecular information and, as Snyder explains in the introductory video:

“In many respects, it’s like a jigsaw puzzle. A jigsaw puzzle can be made of 1000 pieces but you don’t really see the picture until you put all those pieces together. That’s the same for omics; you basically try and understand all of the individual pieces so you can see the whole picture.”

NASA is making billions of measurements of both twins to see what space really does to the human body. And researchers hope that one day omics profiles will be conducted on a large scale in clinics, not just on astronauts, so we can switch from a “one size fits all” approach to personalized medicine.

“OMICS is really an amazing field where we can look at people and their health at a level that’s never been possible before,” Snyder comments. “And with that we’ll be able to better manage people’s health and try and keep them healthy long before they get sick.”

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.

Perseids – Nature’s Annual Light Show

Photograph courtesy of reway2007 via a Creative Commons license.

Whenever I think of meteor showers, I think years back to a perfect moment. I was crashed out with friends on a sandy beach alongside the Tuolumne River during a 2-day white water rafting trip. We were enjoying a balmy summer night as we lay on top of our sleeping bags, looking up at the amazing display of stars in a sky free of city light pollution. As we chatted and sipped wine, I noticed an incredibly bright “shooting star” flaming across the sky. Then another. And another. I’d never seen so many “shooting stars” (meteors). I stayed up most of the night to watch the nearly continuous celestial display. When I got home, I learned that it was actually an annual event – the prolific Perseid meteor shower.

Meteor showers can appear anywhere in the sky. But if you trace their path, the meteors appear to come from the same region in the sky. In the case of the Perseids, the meteors appear to originate from the constellation Perseus.

Meteor showers are caused by comets. As a comet orbits the Sun, it sheds a debris stream of ice and dust along its orbit. When Earth travels through this cloud of debris, the bits of interplanetary rock strike the Earth’s upper atmosphere where they are heated by friction and ignited.

The Perseid meteor shower comes from Swift-Tuttle, a huge comet with a nucleus 26 km across whose meteoroids hit our atmosphere at 132,000 mph. According to new research by NASA, the Perseids are the most prolific meteor shower, with the number of resulting meteors topping 100 per hour.

Although the meteor shower is active for several days, the peak will happen tonight through the early hours of tomorrow morning. A crescent moon will set shortly after midnight, leaving the skies dark for optimal viewing until pre-dawn. You just need to search out a secluded spot away from the glow of city lights, like a state or city park, then lie back and enjoy the show.

Curiosity and Nervousness over the Mars Landing

Artist’s animation depicting the moment that NASA’s Curiosity rover touches down onto Mars. (NASA/JPL-Caltech image)

When I tried to make lunch plans with a friend for next week, he didn’t know yet whether he could meet me. That’s because his plans depend on how smoothly the Curiosity rover lands on Mars tonight. His research team put together the Radiation Assessment Detector that is mounted on the top deck of the Curiosity rover.

NASA’s Mars Science Laboratory spacecraft, carrying the Curiosity rover, is approaching Mars at this moment. It’s expected to land tonight at 10:31 p.m. PDT (Pacific Daylight Time). The technical challenges involved in Curiosity’s landing are daunting. The final minutes to landing are described beautifully in the NASA Jet Propulsion Laboratory’s popular video dubbed “The Seven Minutes of Terror.”

We still aren’t sure if life ever existed on Mars. From past missions, researchers know that there used to be water there. Now they want to determine if Mars once had the kind of environment that could be habitable or conducive for the formation of microbial life.

Curiosity is a car-sized rover that will search Mars for past or present conditions favorable for life. It is basically a science lab on wheels, with 10 complex scientific instruments designed to study the chemistry of rocks, soil and atmosphere, searching for evidence of environments that could have supported microbial life.

One of those scientific instruments is the Radiation Assessment Detector, which is designed to characterize the energetic particle spectrum at the surface of Mars. This will allow researchers to determine the radiation dose for humans at the planet surface, as well as provide input on the effects of particles on the surface and atmosphere. The surface is thought to have much higher radiation than Earth, since Mars has a thinner atmosphere and no global magnetic shield to divert charged particles.

Although all research requires patience, hurling your research instrument at a faraway planet requires both patience and guts. The landing may cause seven minutes of terror, but the days of waiting must bring their own nail-biting nervousness. When I get together with my friend for lunch, I’ll check his nails. Hopefully the landing will be a success, even though that means he’ll be at the Jet Propulsion Laboratory for the next couple of weeks. I can wait.

The Eclipse is Coming!

Photograph courtesy of the Exploratorium via Creative Commons.

A solar eclipse happens when the Moon passes between the Sun and Earth, blocking the Sun as seen from Earth. If the Moon blocks only part of the Sun, it is a partial solar eclipse. If the Sun is fully obscured by the Moon, it is a total solar eclipse. Total eclipses are rare at any one location because the Moon fully blocks the Sun only along a narrow path on the Earth’s surface traced by the Moon’s shadow.

According to the National Aeronautics and Space Administration (NASA), a solar eclipse will occur on May 20, 2012, the first annular solar eclipse visible from the United States since 1994. In San Francisco, the eclipse will be partial: it will begin at 5:15 pm and end at 7:40 pm, with maximum eclipse at 6:32 pm, when 85% of the sun will be obscured. Along the eclipse’s central path, the Moon will look like it has a ring of fire surrounding it.

Although it is tempting, you shouldn’t view a solar eclipse with the naked eye. The lens of your eye will concentrate the sun’s light onto your retina, which can cause permanent eye damage. You can safely view a solar eclipse wearing inexpensive solar glasses (with a “CE” label), which have filters that block out 99.99% of the sun’s light and 100% of the harmful ultraviolet rays. Don’t have solar glasses? You can also safely view a solar eclipse by indirect projection: using a pinhole camera to project the image of the sun onto a white piece of paper. The San Francisco Exploratorium has directions on how to make a pinhole camera.
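
As a rough sizing guide for pinhole projection (standard geometry, not from the Exploratorium’s directions): the Sun’s projected image grows with the distance between pinhole and paper.

```latex
% The Sun subtends about half a degree (~0.0093 rad), so a pinhole
% projects a solar image whose diameter d scales with the
% pinhole-to-paper distance L:
\[
  d \approx \theta_{\odot}\, L \approx 0.0093\,L,
  \qquad
  \text{e.g., } L = 1\ \mathrm{m} \;\Rightarrow\; d \approx 9\ \mathrm{mm}.
\]
```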

You can also view the partial solar eclipse at science centers, such as the Lawrence Hall of Science in Berkeley and the Chabot Space and Science Center in Oakland.