Figure 1. Radial velocity at 37.7 days of star time. Orange-red is outward flow, blue-turquoise is inward-directed flow. The giant dipole mode is directed from the top-left to the bottom-right. (Used with permission of Professor Herwig and University of Victoria.)

Lately, it seems that not a month goes by without news of the discovery of yet another extrasolar planet (or exoplanet)—a planet orbiting a distant star. As of May 1, 2018, 3,767 exoplanets had been confirmed in 2,816 star systems.

To help drive the extensive search for extrasolar planets, NASA equipped the Kepler and TESS space telescopes with sensors that can measure very small fluctuations in a star's light. As a planet transits a star, the telescopes detect the slight dimming of the star. And while these measurements were intended to find planets, they also carry a scientific byproduct that excites scientists like Professor Falk Herwig of the University of Victoria in British Columbia and Professor Paul Woodward of the University of Minnesota in the United States.

“On Earth, seismologists look at the acoustic oscillations of materials in the mantle to determine the structure of the strata hidden below,” said Professor Herwig. “All materials oscillate at their own frequencies when excited; in stars, those oscillations modulate the emitted light. So, in asteroseismology we look at the light oscillations from a star, and they can tell us about its mechanical makeup. The search for planets has given astrophysicists an incredible new tool.”

At the core of a star more massive than our sun, large amounts of energy are released by nuclear fusion of hydrogen to helium at temperatures of many million degrees Celsius. Heat energy travels outward through the sphere, first through convection and the mixing that accompanies it (in what's called the convection zone) and then through radiation (in a stable radiative zone). The motions of stellar fluid in the convective core excite waves in the stable envelope, and these can ultimately be detected by the space telescopes as tiny fluctuations of the light emerging from the star's surface.
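The division into a convective core and a stable radiative envelope described above is conventionally decided by the Schwarzschild criterion: a layer convects when its radiative temperature gradient exceeds the adiabatic one. The sketch below illustrates that criterion only; the gradient values are made-up examples, not numbers from the article or from PPMstar.

```python
# Hedged illustration of the Schwarzschild criterion (not part of PPMstar):
# a stellar layer is convectively unstable when the radiative temperature
# gradient grad_rad exceeds the adiabatic gradient grad_ad.
def is_convective(grad_rad: float, grad_ad: float = 0.4) -> bool:
    """grad_ad ~ 0.4 is the adiabatic gradient of an ideal monatomic gas."""
    return grad_rad > grad_ad

# Hypothetical values: a steep core-like gradient convects,
# a shallow envelope-like gradient transports energy by radiation.
print(is_convective(0.8))   # True  -> convection zone
print(is_convective(0.25))  # False -> stable radiative zone
```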

“One aspect that is still poorly known about a star’s structure,” added Professor Herwig, “is the transition region at the boundary between the star’s convective core and the stable envelope.  Inside the convection zone, plasma mixes, but it’s not clear how the mixing process turns off in the transition from the convective core to the stable envelope.”

To understand these processes, researchers must simulate them on a supercomputer using complex hydrodynamics calculations over a long series of time steps.
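The "long series of time steps" means repeatedly advancing a discretized fluid state by a small time increment. As a minimal sketch of that idea, here is a first-order upwind update for a 1D advection equation, the simplest relative of the equations such codes solve; this is an illustration only, not the PPM algorithm used by PPMstar.

```python
import numpy as np

def advect(u, c=1.0, dx=1.0, dt=0.5, steps=100):
    """Advance u_t + c * u_x = 0 with a first-order upwind scheme
    and periodic boundaries; each loop iteration is one time step."""
    for _ in range(steps):
        u = u - c * dt / dx * (u - np.roll(u, 1))
    return u

u0 = np.zeros(64)
u0[10:20] = 1.0      # a square pulse of fluid
u = advect(u0)       # after 100 steps the pulse has drifted rightward
```

Note that with periodic boundaries the scheme conserves the total of `u` exactly, mirroring how production hydrodynamics codes are built around conservation laws.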

Simulating Stars

Professor Herwig, along with Dr. Robert Andrassy and PhD candidate Ondrea Clarkson, is part of the Computational Stellar Astrophysics group at the University of Victoria. Professor Woodward is the author of the PPMstar simulation code that Herwig's group uses. Together the team is investigating how atomic elements are formed in stars, specifically looking at how stellar convection, nuclear reactions, and atomic element formation processes work together in the final stages of the lives of stars.

In March of 2018, they were granted access to Niagara, SciNet's newest 60,000-core supercomputer at the University of Toronto, to simulate core convection and the mixing processes and help unlock some of these mysteries. The Canada Foundation for Innovation and the provincial governments of Ontario and British Columbia provided funding through Compute Canada for four new systems across the country, including Niagara, which was built by Lenovo with Intel Xeon Gold 6148 processors. The supercomputer is a game changer for science.

“We had three main objectives,” commented Professor Herwig. “We wanted to improve the general understanding of convection in stars, provide simulations and visualizations of the mixing process at the core convection boundary, and start providing detailed 3D simulation-based understanding of the light oscillations the scientists were seeing from the Kepler telescope, and would see from TESS.”

Herwig and Woodward have worked together for many years, using the PPMstar code. Over the last two years, Professor Woodward rewrote the code, and the arrival of Niagara earlier this year presented an opportune time to run the new code on the largest supercomputer in Canada.

“In the past,” stated Herwig, “we were limited in what we could do on computers in Canada. We ran low-resolution calculations on 384³ grids of a star to set up the system, and then 768³ grids for the full simulation over millions of time steps, but it would take up to two weeks for a single run on a high-performance computer in Canada.”

Waiting for results really slowed down how much the team could do in terms of scientific experimentation and discovery. They couldn't define the next problem to solve until a run completed, after which they would need to request more time on the system for another simulation. The other challenge was that they needed very high-resolution simulations on 1,536³ grids to simulate the outer envelope of the star, where the structure can host internal gravity waves, and to determine the properties of the narrow transition region from convective core to stable envelope.
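The jump from 768³ to 1,536³ is steeper than it may sound. Doubling the resolution in each dimension multiplies the cell count by eight, and an explicit scheme must also roughly halve its time step to stay stable (the CFL condition), so the total work grows by about a factor of sixteen. The factor-of-two time-step assumption here is standard reasoning, not a figure from the article:

```python
# Back-of-the-envelope cost scaling for the grid sizes mentioned above.
cells_lo = 768 ** 3        # lower-resolution run
cells_hi = 1536 ** 3       # high-resolution run

cell_factor = cells_hi // cells_lo       # 8x more cells
work_factor = cell_factor * 2            # ~16x, assuming the time step halves (CFL)

print(cells_hi)      # 3,623,878,656 cells -- over 3.6 billion
print(cell_factor)   # 8
print(work_factor)   # 16
```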

“Scientists have already carried out one-dimensional simulations to help interpret the observable oscillations in stars,” explained Herwig. “But these oscillations provide us much more information and data than we can interpret with 1D simulations and 2D approximations. To really understand the oscillations in the envelope, we need to first simulate convection in as high resolution as possible and portray it as accurately as possible in a three-dimensional 4π sphere.”

Runs at this high a resolution could not realistically be done on a supercomputer in Canada. Until now, the team had to turn to the Blue Waters machine at the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign. But with Niagara, they can run these calculations in-country.

Star Revelations

The calculations Professors Herwig and Woodward ran on Niagara with the PPMstar 2.0 code were of the turbulent core convection in a massive star that is 25 times the mass of our sun (25 solar masses). The simulations were performed on a Cartesian 3D grid of 1,536³ cells over 2.02 million time steps covering 57.69 days of the star's life, and essentially solved the mass, momentum, and energy conservation equations for stellar conditions.
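The mass, momentum, and energy conservation equations referred to here are, in their inviscid form, the compressible Euler equations with gravity and a nuclear energy source. A standard textbook formulation is sketched below; the exact terms and closures PPMstar uses may differ:

```latex
\begin{align}
\partial_t \rho + \nabla\cdot(\rho\,\mathbf{u}) &= 0
  && \text{(mass)} \\
\partial_t(\rho\,\mathbf{u}) + \nabla\cdot(\rho\,\mathbf{u}\otimes\mathbf{u}) + \nabla p
  &= \rho\,\mathbf{g}
  && \text{(momentum)} \\
\partial_t E + \nabla\cdot\big[(E + p)\,\mathbf{u}\big]
  &= \rho\,\mathbf{u}\cdot\mathbf{g} + \rho\,\epsilon_{\mathrm{nuc}}
  && \text{(energy)}
\end{align}
```

Here $\rho$ is density, $\mathbf{u}$ velocity, $p$ pressure, $E$ total energy density, $\mathbf{g}$ the gravitational acceleration, and $\epsilon_{\mathrm{nuc}}$ the nuclear energy generation rate per unit mass.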

“These simulations provide unprecedented detail of the convective flow in its natural three-dimensional, full-sphere geometry,” said Herwig. “They also show a picture of the physics of mixing at the convective boundary and the oscillations in the stable envelope above the core.”

According to Professor Herwig, the convective simulations show a large-scale, global dipole mode flow pattern in the star, as seen in a snapshot of the radial velocities in Figure 1 at star time 37.7 days. The flow consists of two giant cells in which the heated fluid travels outward from the center of the sphere (shown in orange/red) toward the bottom-right, and the cooling fluid flows from the top-left toward the center of the sphere (shown in blue). Once the rising flow approaches the convective boundary, the stability of the stratification beyond the boundary prevents any substantial convective fluid motion from proceeding past it, so the flow has to turn around.

At the boundary, the flow travels horizontally as seen by the orange component in a snapshot of the tangential velocities in Figure 2 at the same 37.7-day star time. The material flows along the convective boundary in a sweeping motion and approaches the antipode (top-left) from all sides. Here, the cooling flows converge and again are confined by the stable layers above the convection zone. The flows must separate from the boundary and turn inward again.

The internal gravity waves in the stable layer are very clear in Figure 2. These waves have implications for asteroseismology observations of massive stars. Contrary to these well-ordered waves, the motions in the convective core are highly turbulent. This is revealed by the fine detail that the 1,536³-grid simulations can show. The high resolving power of the smaller scales will be needed as Professor Herwig and his team perform more detailed analyses of the mixing processes at the very narrow convective boundary transition layer.

While the figures here show unprecedented views of convective flows and the stable layer, movie renderings show the incredible detail of these and other simulations run on Niagara during the same time period. They can be found on Professor Herwig's university web page.


Figure 2. Magnitude of the tangential velocity at 37.7 days of star time. The outer stable region is characterized by oscillations associated with internal gravity waves. The inner region is dominated by a global dipole mode oriented diagonally from top-left to bottom-right. (Used with permission of Professor Herwig and University of Victoria.)

New Discovery

“Our team has extensive experience running on the largest supercomputers available,” explained Herwig. “While we could do 768³-grid runs on machines in Canada, they would take nearly two weeks to complete. We can do the same runs in one or two 8-hour jobs on 288 nodes on Niagara. Our team is now, for the first time, able to perform simulations on Niagara that are second to none internationally in our own research area.”

Having the massive resources of Niagara available to scientists like Herwig creates entirely new research directions. Herwig and Woodward experienced this first-hand as they puzzled out a problem during their time on Niagara, arriving at a possible solution, which they could not have found on a more limited supercomputer in the same allotted time.

“Although the stable envelope and the convective core appear to be separated by a rather stiff boundary,” explained Herwig, “material from the stable layer appears to mix into the core, which would provide fresh nuclear fuel for the star and extend its lifetime—with potentially significant observable consequences.”

Woodward's code tracks the mixing, or entrainment, of material from the envelope into the convection zone with high accuracy. The simulations reveal a mass entrainment rate so high that, sustained over the lifetime of the core convection phase, it would entrain two to three orders of magnitude more mass than the envelope actually has available.
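The contradiction is a simple budget check: total entrained mass is the rate multiplied by the duration of the phase, and that product cannot exceed the mass the envelope can supply. The sketch below mirrors that reasoning with entirely made-up illustrative numbers; the article does not give the actual rate, lifetime, or envelope mass.

```python
# Hedged budget check with hypothetical numbers (not values from the study):
# if rate * lifetime far exceeds the available mass, the measured
# entrainment rate cannot persist and something must limit it.
def entrained_mass(rate_msun_per_yr: float, lifetime_yr: float) -> float:
    """Total mass entrained if the rate were sustained for the whole phase."""
    return rate_msun_per_yr * lifetime_yr

available = 10.0                       # hypothetical envelope mass budget (solar masses)
total = entrained_mass(1e-3, 7e6)      # hypothetical rate x core-convection lifetime
print(total)                           # 7000.0 solar masses
print(total / available)               # budget exceeded by a factor of 700
```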

“This is obviously impossible,” concluded Herwig. “We considered possible mechanisms that could squelch this high entrainment rate. Our simulations showed that the numerical grid resolution was high enough, and not a problem in the simulation. We then zeroed in on the possible effect of star rotation.”

Woodward introduced a rotating initial state in the calculations, and then they ran several simulations with different rotation rates over a few days. They discovered that rotation does indeed affect the convective flow substantially, but it would not reduce the mass entrainment sufficiently.

“Maybe this is the most important initial result of our simulations on Niagara. We have found a new puzzle and a new question to investigate.”

Making Science Interactive

“Niagara allows us to really accelerate science,” said Herwig. “We are able to complete in a few weeks work that would normally take a year to do. That allows us to become more interactive with our research. As one simulation runs, we can consider the next problem to give to the computer, analyze the run when it completes after a couple days, and immediately set up and re-run the simulation with new parameters. We get a lot done in a very short period of time, and that allows us to more quickly expand what we know in our field.”

Ken Strandberg is a technical storyteller. He writes articles, white papers, seminars, web-based training, video and animation scripts, and technical marketing and interactive collateral for emerging technology companies, Fortune 100 enterprises, and multi-national corporations. Mr. Strandberg's technology areas include software, HPC, industrial technologies, design automation, networking, medical technologies, semiconductors, and telecom. He can be reached at

This article was produced as part of Intel’s HPC editorial program, with the goal of highlighting cutting-edge science, research and innovation driven by the HPC community through advanced technology. The publisher of the content has final editing rights and determines what articles are published.