
Unraveling a Twister

University of Oklahoma researchers use Kraken to detail the inner workings of tornadoes

by Gregory Scott Jones

Anyone from Nebraska to Nashville knows a tornado when they see one. And, hopefully, they know to duck for cover. Tornadoes are among nature’s most powerful weather weapons.

Just last month, in a span of 24 hours beginning on April 27, the southeastern United States saw a rare outbreak of tornadoes that resulted in a combined 344 deaths, according to estimates by the National Weather Service and the National Oceanic and Atmospheric Administration. Not since 1936 had more people been killed by tornadoes in a two-day period. And that’s not to mention the likely billions of dollars in property damage.

Despite their prevalence in this country, however, and especially in the central United States in an area known as “tornado alley,” there is still much we don’t know about these much-feared funnels from the sky.

For starters, gathering any sort of data from actual tornadoes is risky business, with chasers physically following storms into the heart of harm’s way. These chasers might witness a handful of tornadoes a year, and their mobile radar systems measure only certain variables, such as wind velocity and precipitation intensity. To truly understand tornadoes, and maybe even one day predict them, researchers need data out of the reach of the chasers, such as pressure and three-dimensional wind structure, and for that they need far more tornadoes than the atmosphere actually produces.

“I don’t need three, I need three hundred,” said Amy McGovern, an assistant professor in the School of Computer Science at the University of Oklahoma, located in the heart of tornado alley, and the principal investigator of a project that is using the University of Tennessee’s (UT’s) Kraken supercomputer to better understand, and hopefully one day predict, tornadoes.

To do that, McGovern’s team uses data from on-the-ground observations and other monitoring systems to create a complete set of variables describing the conditions that may, or may not, produce a tornado.

The research is funded by the National Science Foundation’s (NSF’s) Faculty Early Career Development (CAREER) Program, a Foundation-wide activity that offers the NSF’s most prestigious awards in support of junior faculty who exemplify the role of teacher-scholars through outstanding research, excellent education and the integration of education and research within the context of the mission of their organizations.

The roots of the project go back five years. Back then, said McGovern, the team used observational data from a 20-year-old storm and tweaked a few environmental variables to create more than 250 simulated storms at a 500-meter resolution, meaning that values for all 40 variables were recorded every 500 meters.

The team quickly realized that a higher resolution was needed to achieve the accuracy they sought. Thanks to supercomputers such as Kraken, currently the eighth fastest in the world, this enhanced resolution is now possible. Funded by the National Science Foundation and managed by UT’s National Institute for Computational Sciences, Kraken is a Cray XT5 housed at Oak Ridge National Laboratory.

McGovern’s team is now generating 150 simulations of possible tornado-precursor storms at 75-meter resolution, with each simulation producing two to three storms and consuming 30 hours on 3,000 of Kraken’s more than 112,000 cores. Of the resulting storms, says McGovern, approximately 50 to 75 will produce tornadoes, supplying researchers with a sufficient sample with which to unravel the mysteries of one of Mother Nature’s most common terrors.
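
For a rough sense of scale, the planned campaign works out to roughly 13.5 million core-hours. The back-of-the-envelope calculation below uses only the figures quoted above and is not the project’s actual accounting:

```python
# Back-of-the-envelope compute budget, using only the figures quoted above.
simulations = 150          # planned simulations
hours_per_simulation = 30  # wall-clock hours each
cores_per_simulation = 3_000

core_hours = simulations * hours_per_simulation * cores_per_simulation
print(f"{core_hours:,} core-hours")                       # 13,500,000
print(f"{cores_per_simulation / 112_000:.1%} of Kraken")  # each run uses ~2.7% of the machine
```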

These simulations delve into the most complex players in tornadic storms: rotating updrafts (upward-moving currents of air that are tilted and rotating), downdrafts, vorticity (a measure of the instantaneous spin), tilt (a measure of how much horizontal vorticity has been tilted toward the vertical), and the various relationships among these factors. The important thing, said McGovern, is understanding how these variables interact. If a storm does in fact generate a tornado, the team begins the process of “relational” data mining. Whereas in the past these variables have been studied individually, McGovern’s relational approach studies the relationships between them (more than 40 variables in all, of which 20 are examined intensively). In other words, the researchers aren’t looking at individual factors, “but how they change over space and time.”
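
The article doesn’t describe the team’s software, but the kinds of derived quantities it mentions are straightforward to illustrate. Below is a minimal numpy sketch, assuming wind components u, v, and w stored on a regular (z, y, x) grid with uniform spacing; the variable names, grid layout, and threshold values are illustrative assumptions, not the project’s actual data format. It computes vertical vorticity (the “spin”) from the horizontal wind and then pairs updraft strength with spin inside the updraft core, the sort of variable-to-variable relationship the relational approach tracks as it evolves over space and time.

```python
import numpy as np

def vertical_vorticity(u, v, dx, dy):
    """Vertical vorticity (dv/dx - du/dy) on a regular grid ordered (z, y, x)."""
    dv_dx = np.gradient(v, dx, axis=2)
    du_dy = np.gradient(u, dy, axis=1)
    return dv_dx - du_dy

def updraft_spin(u, v, w, dx, dy, w_threshold=10.0):
    """Mean updraft speed and mean vorticity inside the updraft core
    (grid points where vertical velocity w exceeds w_threshold, in m/s)."""
    zeta = vertical_vorticity(u, v, dx, dy)
    core = w > w_threshold
    if not core.any():
        return None
    return float(w[core].mean()), float(zeta[core].mean())

# Illustrative call on synthetic (z, y, x) fields; a real storm simulation
# would supply u, v, w at every output time, letting the updraft/spin
# relationship be followed through the storm's life.
rng = np.random.default_rng(0)
u, v, w = (rng.standard_normal((40, 200, 200)) for _ in range(3))
print(updraft_spin(u, v, w, dx=75.0, dy=75.0, w_threshold=1.5))
```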

Figure 1: Movie: A simulated storm showing reflectivity. Redder colors indicate regions of more intense precipitation. This simulation also shows a hook echo, a region indicative of a tornado, in the southwest quadrant.

Data mining is necessary because each simulation generates approximately a terabyte of data, far too much to investigate by traditional means. For example, while an updraft is just one of the variables being studied, the team will investigate all of the variables inside the updraft, such as the pressure gradient and the tilt of the updraft itself, to name a couple. With simulations this complex, at multiple space and time scales, the amount of data generated is unmanageable without computers to quickly locate the important figures in a sea of numbers.
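
As a rough illustration of the reduction involved: rather than inspecting a terabyte of raw fields number by number, each three-dimensional variable can be collapsed to a handful of summary statistics inside a region of interest such as the updraft. The sketch below is hedged accordingly; the array names, the cubic-grid assumption, and the 10 m/s threshold are illustrative, not the team’s actual pipeline.

```python
import numpy as np

def summarize_in_updraft(pressure, w, dx, w_threshold=10.0):
    """Collapse full 3-D fields to a few numbers describing the updraft region
    (grid points where vertical velocity w exceeds w_threshold, in m/s).
    Assumes a uniform grid spacing dx (meters) in all three directions."""
    core = w > w_threshold
    if not core.any():
        return None
    # Pressure-gradient magnitude, one of the quantities the article mentions
    # being examined inside the updraft.
    gz, gy, gx = np.gradient(pressure, dx)
    grad_mag = np.sqrt(gx**2 + gy**2 + gz**2)
    return {
        "updraft_volume_km3": float(core.sum()) * (dx / 1000.0) ** 3,
        "max_w": float(w[core].max()),
        "min_pressure": float(pressure[core].min()),
        "mean_pressure_gradient": float(grad_mag[core].mean()),
    }
```

Applied once per output time, a simulation’s terabyte of raw fields shrinks to a short table of numbers that mining algorithms can compare across hundreds of storms.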

While the simulations are being performed on Kraken, the majority of the data mining is being performed on Nautilus, an SGI Altix UV 1000 system that serves as the centerpiece of UT’s new Remote Data Analysis and Visualization Center, likewise located at Oak Ridge National Laboratory. Nautilus’s unique architecture provides an excellent platform for relational data mining. “Nautilus is fabulous,” said McGovern, adding that the innovative system allowed her team to do three months of work in approximately 12 hours.

Overall, the team hopes their work will significantly reduce the false alarm rate for tornado warnings, currently about 75 percent, and increase the warning lead time, currently around 12-14 minutes. “If we can change the understanding of how tornadoes form,” said McGovern, “then hopefully that will lead to better prediction algorithms.”
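
For context, the false alarm statistic is typically computed from a simple tally of warnings versus observed tornadoes: the fraction of issued warnings that were not followed by a tornado. A minimal sketch with made-up counts (not National Weather Service data) follows.

```python
def false_alarm_ratio(hits, false_alarms):
    """Fraction of issued warnings that were not followed by a tornado."""
    return false_alarms / (hits + false_alarms)

# Made-up counts: of 100 warnings, 25 verified and 75 did not,
# matching the roughly 75 percent figure quoted above.
print(false_alarm_ratio(hits=25, false_alarms=75))  # 0.75
```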

For example, if the team’s simulations reveal that a certain set of storm conditions usually causes an F5 tornado (among the worst possible), perhaps observers on the ground should look for those conditions in actual storms. And even if those conditions only cause an F5 half of the time, said McGovern, it still might be worth it to sound a warning.
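
That intuition can be framed with the classic cost-loss reasoning used in forecasting: warn whenever the probability of the event exceeds the ratio of the cost of warning to the loss a warning would avert. The sketch below uses made-up costs purely for illustration; it is not drawn from the study.

```python
def should_warn(probability, cost_of_warning, loss_if_unwarned):
    """Classic cost-loss rule: warn when probability exceeds the cost/loss ratio."""
    return probability > cost_of_warning / loss_if_unwarned

# If a false alarm "costs" 1 unit of disruption while an unwarned F5 tornado
# costs 20 units, warning is justified even when the conditions produce an
# F5 only half the time, and indeed well below that.
print(should_warn(probability=0.5, cost_of_warning=1.0, loss_if_unwarned=20.0))   # True
print(should_warn(probability=0.04, cost_of_warning=1.0, loss_if_unwarned=20.0))  # False
```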

So far, the team has generated 30 of the planned 150 simulations. With Kraken’s recent upgrade to 1.17 petaflops, the team should be able to forge ahead even faster than before. But tornadoes are just the tip of the iceberg for the mining algorithms developed and employed by McGovern’s team; they could potentially be applied in other areas as well, such as predicting atmospheric turbulence across the U.S., or even robotics.

For now the team will continue to analyze the enormous volumes of data from their tornado simulations, providing the scientific community with a new understanding of twisters and hopefully enabling an enhanced prediction capability that could give everyone from Nashville to Nebraska a little more time to duck for cover.