Cleaning cosmic microwave background data to measure gravitational lensing

NERSC facilitates development of new analysis filter to better map the invisible universe

A set of cosmic microwave background 2D images with no lensing effects (top row) and with exaggerated cosmic microwave background gravitational lensing effects (bottom row). Image: Wayne Hu and Takemi Okamoto/University of Chicago

Cosmic microwave background (CMB) radiation is everywhere in the universe, but its frigid (about -455° F, just a few degrees above absolute zero), low-energy microwaves are invisible to the human eye. So cosmologists use specialized telescopes to map out the temperature spectrum of this relic radiation — left over from the Big Bang — to learn about the origin and structure of galaxy clusters and dark matter.

Gravity from distant galaxies causes tiny distortions in CMB temperature maps, a process called gravitational lensing. These distortions are detected by data analysis software run on supercomputers like the Cori system at Lawrence Berkeley National Laboratory’s (Berkeley Lab’s) National Energy Research Scientific Computing Center (NERSC). Unfortunately, this temperature data is often corrupted by “foreground emissions” from extragalactic dust, gas, and other noise sources that are challenging to model.

“CMB images get distorted by gravitational lensing. This distortion is not a nuisance, it’s the signal we’re trying to measure,” said Emmanuel Schaan, a postdoctoral researcher in the Physics Division at Berkeley Lab. “However, various foreground emissions always contaminate CMB maps. These foregrounds are nuisances because they can mimic the effect of lensing and bias our lensing measurements. So we developed a new method for analyzing CMB data that is largely immune to the foreground noise effects.”

Schaan collaborated with Simone Ferraro, a Divisional Fellow in Berkeley Lab’s Physics Division, to develop their new statistical method, which is described in a paper published May 8, 2019 in Physical Review Letters.

“Our paper is mostly theoretical, but we also demonstrated that the method works on realistic simulations of the microwave sky previously generated by Neelima Sehgal and her collaborators,” Schaan said.

These publicly available simulations were originally generated using computing resources at the National Science Foundation’s TeraGrid project and Princeton University’s TIGRESS file system. Sehgal’s team ran three-dimensional N-body simulations of the gravitational evolution of dark matter in the universe, which they then converted into two-dimensional (2D) simulated maps of various components of the microwave sky at different frequencies — including 2D temperature maps of foreground emissions.

These 2D images show different types of foreground emissions that can interfere with CMB lensing measurements, as simulated by Neelima Sehgal and collaborators. From left to right: The cosmic infrared background, composed of intergalactic dust; radio point sources, or radio emission from other galaxies; the kinematic Sunyaev-Zel’dovich effect, a product of gas in other galaxies; and the thermal Sunyaev-Zel’dovich effect, which also relates to gas in other galaxies. Image: Emmanuel Schaan and Simone Ferraro/Berkeley Lab

Testing Theory On Simulated Data

NERSC provided resources that weren’t otherwise available to the team. Schaan and Ferraro applied their new analysis method to the existing 2D CMB temperature maps at NERSC. They wrote their analysis code in Python and used a library called pathos to run across multiple nodes in parallel. The final runs that generated all the published results were performed on NERSC’s Cori supercomputer.
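The pathos library exposes worker pools with a familiar map() interface, which suits this kind of embarrassingly parallel, patch-by-patch sky analysis. A minimal sketch of the pattern, using the standard library’s thread-based pool as a stand-in for pathos’s `ProcessingPool` (the patch-analysis function is a hypothetical placeholder, not the authors’ code):

```python
from multiprocessing.dummy import Pool  # thread pool; pathos's ProcessingPool
                                        # offers the same map() interface over
                                        # processes spread across nodes

def analyze_patch(patch_id):
    """Hypothetical stand-in for the per-patch CMB lensing analysis."""
    return patch_id ** 2  # placeholder for the real reconstruction step

def parallel_analysis(n_patches, n_workers=4):
    # Each sky patch is processed independently, so a simple pool.map
    # suffices; results come back in input order.
    with Pool(n_workers) as pool:
        return pool.map(analyze_patch, range(n_patches))

print(parallel_analysis(8))
```

Because the patches are independent, the same code scales by simply enlarging the pool, which is what makes a multi-node run on a system like Cori attractive.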

“As we progressively improved our analysis, we had to test the improved methods,” Schaan said. “Having access to NERSC was very useful for us.”

The Berkeley Lab researchers did many preliminary runs on NERSC’s Edison supercomputer before it was decommissioned, because the wait times in the Edison queue were much shorter than in the Cori queues. Schaan said they haven’t yet optimized the code for Cori’s energy-efficient many-core KNL nodes, but they need to do so soon.

It might be time to speed up that code given their future research plans. Schaan and Ferraro are still perfecting their analysis, so they may need to run an improved method on the same 2D CMB simulations using NERSC. They also hope to begin working with real CMB data.

“In the future, we want to apply our method to CMB data from Simons Observatory and CMB S4, two upcoming CMB experiments that will have data in a few years. For that data, the processing will very likely be done on NERSC,” Schaan said.

NERSC is a U.S. Department of Energy Office of Science User Facility.

For more information, see this Berkeley Lab news release: A New Filter to Better Map the Dark Universe.

This is a reposting of my news feature originally published by Berkeley Lab’s Computing Sciences.


Shedding New Light on Luminous Blue Variable Stars

3D Simulations Disperse Some of the Mystery Surrounding Massive Stars

A snapshot from a simulation of the churning gas that blankets a star 80 times the sun’s mass. Intense light from the star’s core pushes against helium-rich pockets in the star’s exterior, launching material outward in spectacular geyser-like eruptions. The solid colors denote radiation intensity, with bluer colors representing regions of larger intensity. The translucent purplish colors represent the gas density, with lighter colors denoting denser regions. Image: Joseph Insley, Argonne Leadership Computing Facility.

Three-dimensional (3D) simulations run at two of the U.S. Department of Energy’s national laboratory supercomputing facilities and the National Aeronautics and Space Administration (NASA) have provided new insights into the behavior of a unique class of celestial bodies known as luminous blue variables (LBVs) — rare, massive stars that can shine up to a million times brighter than the Sun.

Astrophysicists are intrigued by LBVs because their luminosity and size dramatically fluctuate on a timescale of months. They also periodically undergo giant eruptions, violently ejecting gaseous material into space. Although scientists have long observed the variability of LBVs, the physical processes causing their behavior are still largely unknown. According to Yan-Fei Jiang, an astrophysicist at UC Santa Barbara’s Kavli Institute for Theoretical Physics, the traditional one-dimensional (1D) models of star structure are inadequate for LBVs.

“This special class of massive stars cycles between two phases: a quiescent phase when they’re not doing anything interesting, and an outburst phase when they suddenly become bigger and brighter and then eject their outer envelope,” said Jiang. “People have been seeing this for years, but 1D, spherically-symmetric models can’t determine what is going on in this complex situation.”

Instead, Jiang is leading an effort to run first-principles, 3D simulations to understand the physics behind LBV outbursts — using large-scale computing facilities provided by Lawrence Berkeley National Laboratory’s National Energy Research Scientific Computing Center (NERSC), the Argonne Leadership Computing Facility (ALCF), and NASA. NERSC and ALCF are DOE Office of Science User Facilities.

Physics Revealed by 3D

In a study published in Nature, Jiang and his colleagues from UC Santa Barbara, UC Berkeley, and Princeton University ran three 3D simulations to study three different LBV configurations. All the simulations included convection, the process by which warm gas or liquid rises while its colder counterpart sinks. For instance, convection causes hot water at the bottom of a pot on a stove to rise to the top surface. It also causes gas from a star’s hot core to push toward its outer layers.

During the outburst phase, the new 3D simulations predict that convection causes a massive star’s radius to irregularly oscillate and its brightness to vary by 10 to 30 percent on a timescale of just a few days — in agreement with current observations.

“Convection causes the star to expand significantly to a much larger size than predicted by our 1D model without convection. As the star expands, its outer layers become cooler and more opaque,” Jiang said.

Opacity describes how a gas interacts with photons. The researchers discovered that the helium opacity in the star’s outer envelope doubles during the outburst phase, making it more difficult for photons to escape. This drives the star to an effective temperature of about 9,000 Kelvin (16,000 degrees Fahrenheit) and triggers the ejection of mass.

“The radiation force is basically a product of the opacity and the fixed luminosity coming from the star’s core. When the helium opacity doubles, this increases the radiation force that is pushing material out until it overcomes the gravitational force that is pulling the material in,” said Jiang. “The star then generates a strong stellar wind, blowing away its outer envelope.”
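A back-of-the-envelope way to see this balance is the Eddington luminosity, the luminosity at which radiation force on gas of a given opacity equals gravity. This is a rough sketch with an assumed electron-scattering opacity, not the paper’s calculation:

```python
import math

# Eddington luminosity: L_Edd = 4*pi*G*M*c / kappa (cgs units throughout).
# Doubling the opacity kappa halves L_Edd, so a fixed core luminosity can
# suddenly exceed the local limit and start pushing material outward.
G = 6.674e-8          # gravitational constant, cm^3 g^-1 s^-2
c = 2.998e10          # speed of light, cm/s
M_sun = 1.989e33      # solar mass, g
L_sun = 3.828e33      # solar luminosity, erg/s

def eddington_luminosity(mass_g, kappa):
    """Luminosity (erg/s) at which radiation force balances gravity."""
    return 4 * math.pi * G * mass_g * c / kappa

M = 80 * M_sun                 # star like the one in the simulation snapshot
kappa_quiet = 0.34             # cm^2/g, assumed electron-scattering opacity
L_quiet = eddington_luminosity(M, kappa_quiet)
L_outburst = eddington_luminosity(M, 2 * kappa_quiet)  # opacity doubles

print(f"L_Edd (quiescent): {L_quiet / L_sun:.2e} L_sun")
print(f"L_Edd (outburst):  {L_outburst / L_sun:.2e} L_sun")
```

The halved limit during outburst illustrates, in simplified form, why a doubled helium opacity can tip the star into ejecting its envelope.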

Massive Simulations Required

Massive stars require massive and expensive 3D simulations, according to Jiang. So he and his colleagues needed all the computing resources available to them, including about 15 million CPU hours at NERSC, 60 million CPU hours at ALCF, and 10 million CPU hours at NASA. In addition, NERSC played a special role in the project.

“The Cori supercomputer at NERSC was essential to us in the beginning because it is very flexible,” Jiang said. “We did all of the earlier exploration at NERSC, figuring out the right parameters to use and submissions to do. We also got a lot of support from the NERSC team to speed up our input/output and solve problems.”

In addition to spending about 5 million CPU hours at NERSC on the early phase of the project, Jiang’s team used another 10 million CPU hours running part of the 3D simulations.

“We used NERSC to run half of one of the 3D simulations described in the Nature paper and the other half was run at NASA. Our other two simulations were run at Argonne, which has very different machines,” said Jiang. “These are quite expensive simulations, because even half a run takes a lot of time.”

Even so, Jiang believes that 3D simulations are worth the expense because illuminating the fundamental processes behind LBV outbursts is critical to many areas of astrophysics — including understanding the evolution of these massive stars that become black holes when they die, as well as understanding how their stellar winds and supernova explosions affect galaxies.

Jiang also used NERSC for earlier studies, and his collaboration is already running follow-up 3D simulations based on their latest results. These new simulations incorporate additional parameters — including the LBV star’s rotation and metallicity — varying the value of one parameter per run. For example, the surface speed due to rotation is larger at a star’s equator than at its poles. The same is true on Earth, which is one of the reasons NASA launches rockets from Florida, relatively close to the equator.

“A massive star has a strong rotation, which is very different at the poles and the equator. So rotation is expected to affect the symmetry of the mass loss rate,” said Jiang.

The team is also exploring metallicity, which in astrophysics refers to any element heavier than helium.

“Metallicity is important because it affects opacity. In our previous simulations, we assumed a constant metallicity, but massive stars can have very different metallicities,” said Jiang. “So we need to explore the parameter space to see how the structure of the stars changes with metallicity. We’re currently running a simulation with one metallicity at NERSC, another at Argonne, and a third at NASA. Each set of calculations will take about three months to run.”

Meanwhile, Jiang and his colleagues already have new 2018 data to analyze. And they have a lot more simulations planned due to their recent allocation awards from INCITE, NERSC, and NASA.

“We need to do a lot more simulations to understand the physics of these special massive stars, and I think NERSC will be very helpful for this purpose,” he said.

This is a reposting of my news feature originally published by Berkeley Lab’s Computing Sciences.

Space, the new surgical frontier? A Q&A

Photo by WikiImages

Many medical treatments — in their current form — would be unfeasible on deep space missions, such as a journey to Mars.

How will we diagnose and treat the ailments of future space travelers? And what medical issues will they likely encounter? I posed these questions to Sandip Panesar, MD, a postdoctoral research fellow at Stanford who wrote a recent article about surgery in space in the British Journal of Surgery.

What inspired you to research surgery in space?

“From a young age, I’ve always been interested in space travel. I also have a background in surgery, trauma and emergency medicine. So it just clicked one day when I was reading about SpaceX. I realized they may actually send people to Mars, so we need to consider the medical implications of that. Specifically, how would you perform surgery?

The need for surgical care in space in the near future will likely revolve around emergency situations — such as crushes, impacts, falls and burns — since the possibility of trauma occurring during exploratory missions can never be ruled out. In cases of severe trauma, significant internal bleeding may necessitate invasive surgical procedures.”

What adverse conditions do space travelers face?

“People are exposed to a few key physical conditions in space — solar particle radiation, temperature extremes and a lack of gravity. Solar particle radiation is a lot different than the particles people are exposed to on Earth. It has a higher chance of causing DNA damage, leading to an increased risk of high-grade cancers prone to metastasize. However, a lack of gravity causes a whole host of even more critical changes in the human body.”

How does this extraterrestrial environment impact human physiology?

“One of the biggest changes is the redistribution of bodily fluids. On Earth, gravity and our upright posture pull most of our fluids down to our legs. In space, these fluids distribute evenly throughout the body. This affects heart rate and blood pressure, increases intracranial pressure and causes face swelling. And it decreases leg size, a phenomenon called ‘chicken legs.’

An absence of gravity also causes the bones and muscles to atrophy.

In addition, the makeup of white blood cells changes in space. Plus, the body produces more stress hormones, called glucocorticoids, which further weaken the immune system. This may negatively affect wound healing, which is critical to surgical recovery.

Microbes are also known to be more pathological in space, making the risk of a serious infection after surgery even higher.”

How can surgery be adapted for space?

“One idea is to include a trauma pod, an enclosed medical suite, in the space station or vessel — a concept that originated in military medicine.

We’ve also proposed minimally-invasive keyhole surgery, but it has limited use in trauma situations and a pretty large learning curve. So open surgery is likely but challenging in space. For instance, the bowel is free-floating within the abdominal cavity, so it can float out when you open the stomach if there’s no gravity. This carries a risk of infection, contamination and perforation. One potential solution is to use a hermetically sealed enclosure — placing a clear plastic box over the wound and working essentially in a glove box with a pressure differential.”

Could surgical robots or other equipment help?

“Mars is 48 million miles away and the radio signal delay is 20 minutes, so using robots controlled by surgeons on Earth isn’t feasible. Instead, researchers are developing robots that can perform surgery by themselves or with really minimal human assistance. There have already been trials of robots that can suture together pig bowels with minimal assistance.

Finally, the size and weight of the payload is a huge barrier, and surgical specialties all use different tools. A feasible solution is to bring a 3D printer that can print bandages, casts, surgical tools and maybe even pharmaceuticals. Also, you could diagnose with an ultrasound scanner and a compact CT scanner like the ones used in ambulances in the UK.”
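As a rough sanity check on quoted signal-delay figures, the one-way light-travel time scales linearly with the Earth-Mars distance, which varies from about 34 to 250 million miles over the orbital cycle:

```python
# One-way radio delay to Mars at light speed. The quoted delay depends
# heavily on where Mars is in its orbit when you ask.
SPEED_OF_LIGHT_MPS = 186_282  # miles per second

def one_way_delay_minutes(distance_miles):
    return distance_miles / SPEED_OF_LIGHT_MPS / 60

for label, miles in [("closest", 34e6), ("average", 140e6), ("farthest", 250e6)]:
    print(f"{label}: {one_way_delay_minutes(miles):.1f} min")
```

So the delay runs from roughly 3 minutes at closest approach to over 20 minutes near maximum separation, far too long for a surgeon on Earth to teleoperate a robot in real time.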

Would you want to be an on-board surgeon?

“Not just yet. I still have a lot of things I want to do on Earth.”

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.

NASA videos highlight using omics to study what happens to a body in space

Space is a hostile place, even inside a spacecraft. Radiation, weightlessness and isolation are only a few of the unique stressors faced by astronauts during space travel.

As NASA prepares for a manned journey to Mars, researchers are studying what happens to the human body in space to determine the health risks of a several-year mission. This research includes a unique study of identical twin astronauts to investigate the effects of spaceflight at a molecular level — comparing data from Scott Kelly, who recently completed a one-year space mission, with data from his brother who led a normal life on Earth.

NASA recently produced a series of web videos, “Omics: Exploring Space Through You,” that discusses its twins study and features Michael Snyder, MD, professor and chair of genetics at Stanford and principal investigator on one of the projects. Omics is a field of study that integrates different types of molecular information and, as Snyder explains in the introductory video:

“In many respects, it’s like a jigsaw puzzle. A jigsaw puzzle can be made of 1000 pieces but you don’t really see the picture until you put all those pieces together. That’s the same for omics; you basically try and understand all of the individual pieces so you can see the whole picture.”

NASA is making billions of measurements of both twins to see what space really does to the human body. And researchers hope that one day omics profiles will be conducted on a large scale in clinics, not just on astronauts, so we can switch from a “one size fits all” approach to personalized medicine.

“OMICS is really an amazing field where we can look at people and their health at a level that’s never been possible before,” Snyder comments. “And with that we’ll be able to better manage people’s health and try and keep them healthy long before they get sick.”

This is a reposting of my Scope blog story, courtesy of Stanford School of Medicine.

Perseids – Nature’s Annual Light Show

Photograph courtesy of reway2007 via a Creative Commons license.

Whenever I think of meteor showers, I think years back to a perfect moment. I was crashed out with friends on a sandy beach alongside the Tuolumne River during a 2-day white water rafting trip. We were enjoying a balmy summer night as we lay on top of our sleeping bags, looking up at the amazing display of stars in a sky free of city light pollution. As we chatted and sipped wine, I noticed an incredibly bright “shooting star” flaming across the sky. Then another. And another. I’d never seen so many “shooting stars” (meteors). I stayed up most of the night to watch the nearly continuous celestial display. When I got home, I learned that it was actually an annual event – the prolific Perseid meteor shower.

Meteors in a shower can appear anywhere in the sky. But if you trace their paths backward, they appear to radiate from the same region; the Perseids appear to originate from the constellation Perseus.

Meteor showers are caused by comets. As a comet orbits the Sun, it sheds a debris stream of ice and dust along its orbit. When Earth travels through this cloud of debris, the bits of interplanetary rock strike the Earth’s upper atmosphere where they are heated by friction and ignited.

The Perseid meteor shower comes from Swift-Tuttle, a huge comet with a 26-km nucleus, whose meteoroids hit our atmosphere at 132,000 mph. According to new research by NASA, the Perseids are the most prolific meteor shower. The number of resulting meteors can top 100 per hour.
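A quick unit conversion shows that the quoted entry speed matches the roughly 59 km/s usually cited for Perseid meteoroids:

```python
# Convert the quoted atmospheric entry speed from mph to km/s.
MILES_TO_KM = 1.609344

def mph_to_km_per_s(mph):
    return mph * MILES_TO_KM / 3600  # miles/h -> km/h -> km/s

print(f"{mph_to_km_per_s(132_000):.0f} km/s")
```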

Although the meteor shower is active for several days, the peak will happen tonight through the early hours of tomorrow morning. A crescent moon will set shortly after midnight, leaving the skies dark for optimal viewing until pre-dawn. You just need to search out a secluded spot away from the glow of city lights, like a state or city park, then lie back and enjoy the show.

Curiosity and Nervousness over the Mars Landing

Artist’s animation depicting the moment that NASA’s Curiosity rover touches down onto Mars. (NASA/JPL-Caltech image)

When I tried to make lunch plans with a friend for next week, he didn’t know yet whether he could meet me. That’s because his plans depend on how smoothly the Curiosity rover lands on Mars tonight. His research team put together the Radiation Assessment Detector that is mounted on the top deck of the Curiosity rover.

NASA’s Mars Science Laboratory spacecraft, carrying the Curiosity rover, is approaching Mars at this moment. It’s expected to land tonight at 10:31 p.m. PDT (Pacific Daylight Time). The technical challenges involved in Curiosity’s landing are daunting. The final minutes to landing are described beautifully in the NASA Jet Propulsion Laboratory’s popular video dubbed “The Seven Minutes of Terror.”

We still aren’t sure if life ever existed on Mars. From past missions, researchers know that there used to be water there. Now they want to determine if Mars once had the kind of environment that could be habitable or conducive for the formation of microbial life.

Curiosity is a car-sized rover that will search Mars for past or present conditions favorable for life. It is basically a science lab on wheels, carrying 10 complex scientific instruments designed to study the chemistry of rocks, soil and atmosphere — searching for signs of past life on Mars.

One of those scientific instruments is the Radiation Assessment Detector, which is designed to characterize the energetic particle spectrum at the surface of Mars. This will allow researchers to determine the radiation dose for humans at the planet surface, as well as provide input on the effects of particles on the surface and atmosphere. The surface is thought to have much higher radiation than Earth, since Mars has a thinner atmosphere and no global magnetic shield to divert charged particles.

Although all research requires patience, hurling your research instrument at a faraway planet requires both patience and guts. The landing may cause seven minutes of terror, but the days of waiting must bring their own nail-biting nervousness. When I get together with my friend for lunch, I’ll check his nails. Hopefully the landing will be a success; if so, he’ll be at the Jet Propulsion Laboratory for the next couple of weeks. I can wait.

The Eclipse is Coming!

solar eclipse
Photograph courtesy of the Exploratorium via Creative Commons.

A solar eclipse happens when the Moon passes between the Sun and Earth, blocking the Sun as viewed from Earth. If the Moon only blocks part of the Sun, it is a partial solar eclipse. If the Sun is fully obscured by the Moon, it is a total solar eclipse. Total eclipses are rare at any one location, because the Moon fully blocks the Sun along only a narrow path on the Earth’s surface traced by the Moon’s shadow.
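The reason total eclipses are possible at all is a geometric coincidence: the Moon and the Sun subtend nearly the same angle from Earth. A quick check with mean sizes and distances (both angles vary a few percent because the orbits are elliptical):

```python
import math

def angular_diameter_deg(radius_km, distance_km):
    """Apparent angular diameter of a sphere seen from a given distance."""
    return math.degrees(2 * math.atan(radius_km / distance_km))

sun_deg = angular_diameter_deg(696_000, 149_600_000)  # Sun, mean distance
moon_deg = angular_diameter_deg(1_737, 384_400)       # Moon, mean distance

print(f"Sun: {sun_deg:.2f} deg, Moon: {moon_deg:.2f} deg")
```

Both come out near half a degree, which is why the Moon can just barely cover the Sun's disk when the alignment is right.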

According to the National Aeronautics and Space Administration (NASA), a solar eclipse will occur on May 20, 2012, the first annular (“ring of fire”) eclipse visible from the United States since 1994. In San Francisco the eclipse will be partial, beginning at 5:15 pm and ending at 7:40 pm; at maximum eclipse, at 6:32 pm, 85% of the sun will be obscured. Along the eclipse’s central path, the sun will look like a ring of fire surrounding the Moon.

Although it is tempting, you shouldn’t view a solar eclipse with the naked eye. Your eye’s lens will concentrate the sun’s light onto your retina, which can cause permanent eye damage. You can safely view a solar eclipse wearing inexpensive solar glasses (with a “CE” label), which have filters that block out 99.99% of the sun’s light and 100% of the harmful ultraviolet rays. Don’t have solar glasses? You can also safely view a solar eclipse by indirect projection – projecting the image of the sun onto a white piece of paper using a pinhole camera. The San Francisco Exploratorium has directions on how to make a pinhole camera.
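For a pinhole projector, the projected image of the Sun is small: its diameter is roughly the pinhole-to-screen distance times the tangent of the Sun’s angular size, about 0.53 degrees. A quick estimate:

```python
import math

# Diameter of the Sun's projected image in a pinhole camera as a
# function of the pinhole-to-screen distance.
SUN_ANGULAR_DIAMETER_DEG = 0.53

def image_diameter_mm(screen_distance_m):
    return screen_distance_m * 1000 * math.tan(
        math.radians(SUN_ANGULAR_DIAMETER_DEG))

print(f"{image_diameter_mm(1.0):.1f} mm")
```

A box about one meter deep thus projects a Sun image a little under a centimeter across, so a longer box gives a bigger (though dimmer) image.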

You can also view the partial solar eclipse at science centers, such as the Lawrence Hall of Science in Berkeley and the Chabot Space and Science Center in Oakland.