The National Institute for Computational Sciences

Let's Get Ready to Rumble

Simulation Results

The blue shaded region in this image of Southern California shows the 600 km x 300 km area in which the ShakeOut earthquake wave propagation simulation was run. The heavily populated Los Angeles basin is identified by a smaller interior box that includes Los Angeles, Long Beach, Westwood, and Whittier Narrows. Warmer colors show cumulative peak accelerations throughout Southern California produced by the simulated M7.8 ShakeOut rupture. The high peak ground accelerations in the Los Angeles basin indicate significant hazard to buildings and people there from large but distant southern San Andreas earthquakes.

(Image Credit: Kim Olsen, San Diego State University; Yifeng Cui and Amit Chourasia, San Diego Supercomputer Center)


NSF supercomputer helps SoCal prepare for the Big One

by Gregory Scott Jones

Very few things in life are certain. If you live in Southern California, however, rest assured that some time in the next few decades you will experience an earthquake of significant magnitude.

And while the disaster itself is probably unavoidable, knowing which areas will be most affected can do a great deal to mitigate the aftermath. For example, where will the strongest ground movement occur, and how long will the shaking last? Obviously, when it comes to new construction in an area with a high probability of an earthquake in the relatively near future, this knowledge is invaluable. Engineers crave this sort of data when they are designing the buildings of tomorrow.

Enter the Southern California Earthquake Center (SCEC), an organization funded by the National Science Foundation (NSF) and the U.S. Geological Survey in recognition of the seismic hazard that looms over Southern California like the heavy Los Angeles smog. In an effort to better understand when and where the next Big One will hit hardest, SCEC is taking advantage of the recently launched, NSF-funded supercomputer known as Kraken.

Located at the National Institute for Computational Sciences (NICS) at Oak Ridge National Laboratory and managed by the University of Tennessee, Kraken is the world’s fastest academic supercomputer with a peak performance of more than 607 teraflops, or 607 trillion calculations per second. To simulate the Big One, scientists need a big computer.

Simulations like those conducted on Kraken help SCEC create both scenario and probabilistic seismic-hazard maps. The former show the distribution of ground motions for a possible future earthquake; the latter reveal peak ground motions a site is likely to experience during a specified period in the future. Data gleaned from these maps is directly incorporated into building codes, prepping future structures for the next great quake and possibly saving lives and dollars when the ground finally decides to shake things up.
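For a rough sense of what a scenario map encodes, here is a minimal Python sketch (not SCEC's actual pipeline; the data and names are invented for illustration) that reduces each site's simulated ground-motion time history to the single peak value a scenario map would plot:

import numpy as np

def peak_ground_motion(history):
    """Return the peak absolute amplitude of one site's time series."""
    return np.max(np.abs(history))

# Hypothetical data: 4 sites x 1,000 time steps of simulated ground velocity.
rng = np.random.default_rng(0)
velocities = rng.normal(scale=0.05, size=(4, 1000))  # m/s, synthetic

# A scenario map assigns each site its peak motion from one simulated quake.
scenario_map = [float(peak_ground_motion(v)) for v in velocities]
print(scenario_map)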

Understanding ground motion and the physics of fault ruptures, the areas along a fault that slip and thus cause seismic events, is crucial to determining the potential aftermath of an earthquake. Specifically, SCEC is modeling both phenomena in a hypothetical 7.8-magnitude earthquake, noting where in Southern California the ground is most sensitive to movement. Scenario seismic-hazard maps produced from these large-scale simulations have been used for emergency preparedness training in Southern California and could potentially help identify everything from at-risk bridges to casualties to the number of people displaced.

The fault-rupture and earthquake-wave-propagation simulations (one for each phenomenon) use seven different rupture models, each of which produces a different ground-motion scenario. Each rupture model is a physically realistic representation of a magnitude-7.8 earthquake, and each produces essentially the same final surface slip. The models differ in how slip is distributed on the fault beneath the surface and in the velocity at which the rupture propagates along the fault. By simulating each of these potential ruptures and averaging the seven outcomes, said SCEC's Phil Maechling, researchers can get a good picture of the areas most prone to strong motion.
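The averaging step Maechling describes is easy to picture in code. Below is a hedged sketch, with invented array names and a synthetic stand-in for the seven peak-ground-motion maps, that averages the scenarios cell by cell:

import numpy as np

n_models = 7             # one map per rupture model
grid_shape = (300, 600)  # e.g., one cell per km over the 300 km x 600 km box

# Synthetic stand-in for seven peak-ground-acceleration maps.
rng = np.random.default_rng(1)
pga_maps = rng.lognormal(mean=-1.0, sigma=0.5, size=(n_models, *grid_shape))

# A cell-by-cell mean across the seven outcomes highlights motion-prone areas.
mean_pga = pga_maps.mean(axis=0)
print(mean_pga.shape, float(mean_pga.max()))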

Most recently, SCEC used Kraken to ensure that the organization’s software (known as AWP, for Anelastic Wave Propagation) could run at the next level. And run it did. In fact, the simulations used more than 65,000 compute cores (out of 66,048 total), pushing the machine to its very limit and setting a new record for sheer scale in seismological simulations. “We wanted to verify that we could run at this scale,” said Maechling. Now that the researchers are confident of AWP’s scalability, they can begin to take their science to the next level.
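What does running on tens of thousands of cores involve? As an illustration only (a generic decomposition for a regular mesh, not a description of AWP's internals), the sketch below picks a balanced three-dimensional processor grid for a hypothetical power-of-two core count, so that each core works on one block of the mesh:

def best_3d_grid(n_cores):
    """Pick the most cube-like px * py * pz factorization of n_cores."""
    best = None
    for px in range(1, int(n_cores ** (1 / 3)) + 2):
        if n_cores % px:
            continue
        rest = n_cores // px
        for py in range(px, int(rest ** 0.5) + 1):
            if rest % py:
                continue
            cand = (px, py, rest // py)
            if best is None or max(cand) - min(cand) < max(best) - min(best):
                best = cand
    return best

# Hypothetical 65,536-core run: one mesh subdomain per core.
print(best_3d_grid(65536))  # -> (32, 32, 64)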

Simulations thus far have proved to be very accurate—so much so that if you compare the graph of a simulated quake (using data from a historical quake) with that of the observed data from the real earthquake, “they are nearly identical,” said Maechling.
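One simple way to put a number on “nearly identical” is a normalized misfit between the two traces. The sketch below uses a synthetic waveform and a root-mean-square metric of our own choosing; the source does not specify which comparison SCEC uses:

import numpy as np

def nrms_misfit(simulated, observed):
    """Root-mean-square difference, normalized by the observed RMS."""
    diff = simulated - observed
    return np.sqrt(np.mean(diff ** 2)) / np.sqrt(np.mean(observed ** 2))

# Synthetic 60-second record sampled at 100 Hz; the "simulation" is the
# observation plus a little noise, mimicking a near-identical match.
t = np.linspace(0.0, 60.0, 6001)
observed = np.sin(2 * np.pi * 0.5 * t) * np.exp(-t / 20.0)
simulated = observed + 0.02 * np.random.default_rng(2).normal(size=t.size)

print(f"normalized RMS misfit: {nrms_misfit(simulated, observed):.3f}")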

Simulation Results

This image shows the results of the two types of simulations combined. The fault surface divides the map of Southern California into two parts. The colors on the fault surface show where slip is occurring, as calculated by the dynamic rupture simulation. Colors on the map surface show earthquake waves propagating away from the fault and the ground velocities produced by the fault slip.

(Image Credit: Kim Olsen, San Diego State University; Yifeng Cui and Amit Chourasia, San Diego Supercomputer Center)


Unfortunately, this accuracy pertains only to lower frequencies, in this case 1 Hz and below. Lower frequencies affect only larger buildings, namely those 16 stories or taller. Anyone who has ever seen a picture of Los Angeles knows the city is composed largely of shorter buildings, extending outward more than upward.
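A back-of-the-envelope calculation shows why. A common rule of thumb puts a building's fundamental period at roughly 0.1 second per story (the exact coefficient varies with construction), so taller buildings resonate at lower frequencies:

def resonant_frequency_hz(stories, seconds_per_story=0.1):
    """Approximate fundamental frequency of an N-story building."""
    return 1.0 / (seconds_per_story * stories)

for stories in (2, 5, 16, 40):
    f = resonant_frequency_hz(stories)
    print(f"{stories:>3} stories -> ~{f:.2f} Hz fundamental frequency")

# Only the taller buildings resonate at or below the 1 Hz the simulations
# currently resolve; shorter ones respond to higher-frequency shaking.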

So to get a really good picture of how a 7.8 quake would shake up the shorter buildings that make up most of Southern California, the team needs to simulate higher frequencies, and that means raising the resolution. (A finer mesh, the three-dimensional grid on which the simulations run, is needed to resolve shorter wavelengths, that is, higher frequencies.) Because the mesh is three-dimensional and the time step must shrink along with the grid spacing, a twofold increase in resolution requires a 16-fold increase in computing power. Before computers such as Kraken, this was little more than wishful thinking. (Currently, higher frequencies are calculated using a stochastic, or random, approach; while not included in the simulations, they are incorporated into the final outcome products.)

With Kraken, however, refined resolution is not only possible, it’s forthcoming. Specifically, SCEC would like the next round of simulations to be in the neighborhood of 2 Hz, double the current resolution. Eventually, said Maechling, the team has its sights on the 10-Hz scale, but that would require a dramatic increase in both software efficiency and computing power. Given the rate at which supercomputers are accelerating, it might not be that far off. In fact, a whole new era of seismic science may soon be possible.
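The cost of those jumps can be made concrete. Assuming, per the scaling described above, that compute cost grows as the fourth power of the target frequency (the grid spacing halves in three dimensions and the time step halves along with it), a short calculation shows what 2 Hz and 10 Hz demand relative to today's 1-Hz runs:

def relative_cost(target_hz, base_hz=1.0):
    """Cost relative to a base-frequency run, assuming (f / f0)**4 scaling."""
    return (target_hz / base_hz) ** 4

for f in (1.0, 2.0, 10.0):
    print(f"{f:>4.1f} Hz -> {relative_cost(f):>8.0f}x the 1 Hz cost")

# 2 Hz costs 16x; the eventual 10-Hz goal costs 10,000x, which is why it
# awaits big gains in both software efficiency and machine power.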

With the help of these newly enhanced simulations and other computational tools, SCEC would like to begin exploring the predictive side of seismology, eventually producing earthquake forecasts. However, unlike climate and weather, seismology has never readily lent itself to prediction. In fact, said Maechling, the science (or art) of predicting earthquakes is estimated by some scientists to be 100 years behind weather prediction. SCEC aims to close the gap. “We want to transform seismology into a more predictive science,” said Maechling, “like that of climate and weather.”

The difficulties in predicting seismic events are numerous. For instance, earthquakes occur on very short timescales, making long-term observation impossible. Furthermore, because seismologists, geologists, and other researchers don’t know when they will occur, they cannot make preparations to study them as meteorologists do a thunderstorm.

Also, the initial conditions behind these frenetic phenomena are harder to observe than, say, the conditions behind lightning. After all, except for possible displacements of the ground surface, faults are hidden deep underground, beyond the bounds of traditional observation, which is precisely why simulation is so useful. However, future earthquake forecasts will probably never be as targeted as current weather forecasts, said Maechling, citing a common misconception. For instance, in the foreseeable future seismologists will not be able to tell us whether an earthquake will occur tomorrow. Instead, earthquake forecasts will be more like current climate projections, revealing a range of probability over longer time spans, such as 50 years.
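To see what such a long-window forecast statement means in practice, here is a minimal sketch using a simple Poisson (memoryless) occurrence model and a made-up annual event rate; real forecasts are far more sophisticated:

import math

def probability_in_window(annual_rate, years):
    """P(at least one event in the window) under a Poisson model."""
    return 1.0 - math.exp(-annual_rate * years)

annual_rate = 0.02  # hypothetical: one M7.8-class event per ~50 years on average
for years in (1, 10, 50):
    p = probability_in_window(annual_rate, years)
    print(f"{years:>2}-year window: {p:.1%} chance of at least one event")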

“As the science gets better,” said Maechling, “the time periods will get shorter.” Depending on the progression of knowledge and the increased power of supercomputers, current 50-year forecasts could eventually be reduced to annual predictions, better preparing everyone for the Big One—and even smaller ones.

For the time being, however, SCEC will continue to refine its ground-motion and fault-rupture models. “Our investigation into ground motion and the physics of fault rupture is not done,” said Maechling.