Friday, March 29, 2013
Meeting the computing challenges of next-generation climate models
Lawrence Berkeley National Laboratory News Center: ...As global climate models improve, they are generating ever larger amounts of data. For Michael Wehner, a climate scientist in the Computational Research Division of Lawrence Berkeley National Laboratory (Berkeley Lab) who focuses on extreme weather—such as intense hurricanes, derechos, and atmospheric rivers (or the “pineapple express”) like the one California saw last December—computing challenges are key to his work.
“In order to simulate these kinds of storms, you really do need high-resolution climate models,” he said. “A model run can produce 100 terabytes of model output. The reason it’s so high is that in order to look at extreme weather you need high-frequency data. It’s a challenge to analyze all this data.”
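To see how a single run reaches that scale, here is a rough back-of-envelope estimate; the grid dimensions, variable count, output frequency, and run length below are illustrative assumptions, not the configuration of Wehner's actual simulations.

    # Back-of-envelope estimate of high-frequency model output volume.
    # All parameters are assumed for illustration, not taken from Wehner's runs.
    nlon, nlat, nlev = 1600, 800, 30     # roughly a 25 km global grid, 30 vertical levels (assumed)
    n_vars = 10                          # number of 3-D fields written out (assumed)
    snapshots_per_year = 365 * 8         # 3-hourly output (assumed)
    years = 25                           # length of the simulation (assumed)
    bytes_per_value = 4                  # single-precision floats

    total_bytes = nlon * nlat * nlev * n_vars * snapshots_per_year * years * bytes_per_value
    print(f"{total_bytes / 1e12:.0f} TB")   # prints ~112 TB, i.e. on the order of 100 TB

With those assumptions a single run lands right around the 100-terabyte figure quoted above; coarser output frequency or fewer saved fields would shrink it, which is exactly the trade-off that high-frequency extreme-weather analysis cannot afford to make.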
For Wehner, a dataset that would take 411 days to crunch on a single-processor computer takes just 12 days on Hopper, a massively parallel supercomputer at the National Energy Research Scientific Computing Center (NERSC) at Berkeley Lab. Despite this advance, Wehner feels it should take only about an hour.
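The arithmetic behind those numbers: going from 411 days to 12 days is roughly a 34x speedup, and reaching Wehner's one-hour target would require roughly another factor of 300 beyond the 12-day run, or about 10,000x over the single-processor baseline. A quick check:

    # Speedup factors implied by the quoted run times.
    serial_days, hopper_days = 411, 12
    target_hours = 1

    hopper_speedup = serial_days / hopper_days            # ~34x over a single processor
    needed_vs_serial = serial_days * 24 / target_hours    # ~9,864x to reach one hour
    needed_vs_hopper = hopper_days * 24 / target_hours    # ~288x beyond the Hopper run

    print(f"{hopper_speedup:.0f}x, {needed_vs_serial:.0f}x, {needed_vs_hopper:.0f}x")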
Berkeley Lab recently hosted an international workshop that brought together top climatologists, computer scientists and engineers from Japan and the United States to exchange ideas for the next generation of climate models as well as the high-performance computing environments that will be needed to process the data from those models. It was the 15th in a series of such workshops that have been taking place around the world since 1999. “The Japanese have big machines, as does the U.S.,” Wehner said. “They’re also leaders in high-performance computing for climate science.”
...Wehner offered an example of getting completely opposite results when running simulations with a lower-resolution climate model versus a high-resolution version of the same model. “My conclusion from a 100-kilometer model is that in the future we will see an increased number of hurricanes, but from this more realistic simulation from the 25-kilometer model, we draw the conclusion that the total number of hurricanes will decrease but the number of very intense storms will increase.”...
A computer simulation of hurricanes from category 1 through 5 over 18 years generated nearly 100 terabytes of data. Image from the LBL News Center website
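As an illustration of the kind of comparison behind that statement, the sketch below tallies tracked storms by Saffir-Simpson category for two model runs. The storm lists are made-up placeholders and the wind thresholds (in m/s) are approximate; none of it is Wehner's data, only the shape of the analysis.

    # Hypothetical comparison of tracked-storm counts at two model resolutions.
    # Peak wind speeds are illustrative placeholders, not model output.
    from collections import Counter

    CATEGORY_THRESHOLDS = [(5, 70), (4, 58), (3, 50), (2, 43), (1, 33)]  # m/s, approximate

    def category(max_wind_ms):
        """Return the Saffir-Simpson category for a storm's peak sustained wind."""
        for cat, threshold in CATEGORY_THRESHOLDS:
            if max_wind_ms >= threshold:
                return cat
        return 0  # below hurricane strength

    # Peak winds (m/s) of storms detected in each run -- made-up numbers.
    run_100km = [35, 38, 36, 44, 40, 34, 37, 45, 39, 36]
    run_25km  = [35, 48, 62, 71, 36, 55, 74]

    for name, storms in [("100 km", run_100km), ("25 km", run_25km)]:
        counts = Counter(category(w) for w in storms if category(w) >= 1)
        print(name, dict(sorted(counts.items())), "total:", sum(counts.values()))

With these placeholder numbers the coarser run produces more hurricanes overall but none above category 2, while the finer run produces fewer storms with several reaching categories 3 to 5, mirroring the qualitative difference Wehner describes.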
Labels: modeling, science, technology