By harnessing the power of supercomputers, research scientists can now predict how an earthquake will unfold and the scale of damage it might cause. The project is led by David McCallen, a senior research scientist, in collaboration with teams from the Lawrence Berkeley and Oak Ridge national laboratories.
The team is developing the most advanced earthquake simulations attempted to date. These simulations have revealed new insights into how geological conditions shape an earthquake’s intensity and its impact on buildings and infrastructure.
“Our goal is to model earthquakes from beginning to end and track the seismic waves as they propagate through the Earth,” said McCallen.
“We want to understand how those waves interact with buildings and critical energy infrastructure to assess their vulnerability so they can be as prepared as possible before the next earthquake strikes,” he continued.
Birth of EQSIM
Traditional earthquake simulations relied on coarse, generalized data to study earthquake ground motions because scientists lacked the computational power to model earthquakes at specific locations with sufficient fidelity.
Since the launch of the Exascale Computing Project (ECP), McCallen and his team have developed EQSIM, an earthquake simulation code built to close that gap.
EQSIM shows how seismic waves interact with soils, mountains, and valleys, amplifying or dampening earthquake energy. These simulations reveal how buildings and critical infrastructure (like water and power systems) may respond or fail during quakes.
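EQSIM itself is a massively parallel production code, but the core idea, stepping seismic waves through a model of the ground, can be sketched in miniature. The Python below is a one-dimensional finite-difference toy (not EQSIM’s actual method or scale; every material value here is invented for illustration) showing how a soft, slow soil layer amplifies a wave arriving from stiff rock:

```python
import numpy as np

# Toy 1-D model: 20 km of stiff rock with 1 km of soft soil on top.
nx, dx = 2000, 10.0                  # grid points and spacing (m)
rho = 2000.0                         # density (kg/m^3), constant for simplicity
c = np.full(nx, 3000.0)              # shear-wave speed in rock (m/s)
c[-100:] = 500.0                     # soft soil in the top 1 km
mu = rho * c**2                      # shear modulus
dt = 0.4 * dx / c.max()              # CFL-stable time step

x = np.arange(nx) * dx
u = np.exp(-((x - 8000.0) / 300.0) ** 2)  # smooth pulse at 8 km depth
u_prev = u.copy()                         # zero initial velocity

mu_half = 0.5 * (mu[1:] + mu[:-1])   # modulus at staggered half-points
peak = np.zeros(nx)                  # peak displacement seen at each point

for _ in range(8000):
    stress = mu_half * (u[1:] - u[:-1]) / dx           # mu * du/dx
    accel = np.zeros(nx)
    accel[1:-1] = (stress[1:] - stress[:-1]) / (dx * rho)
    u, u_prev = 2 * u - u_prev + dt**2 * accel, u      # central-time update
    peak = np.maximum(peak, np.abs(u))

print("peak motion in rock:", peak[1200:1800].max())   # ~0.5
print("peak motion in soil:", peak[-100:].max())       # ~0.85, amplified
```

Because the soil’s impedance (density times wave speed) is far lower than the rock’s, the transmitted pulse grows by roughly 70 percent. This is the same basin-amplification effect that EQSIM resolves in three dimensions with real geology.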
Surprising finds
McCallen also noted that, depending on the underlying geological conditions, a smaller earthquake can sometimes cause more damage than a larger one.
To understand why, it helps to look at how ground motion, the intense shaking felt during an earthquake, arises. It is shaped by three geological factors: fault type, soil composition, and surface topography.
EQSIM is currently being used to model earthquake activity in three major US fault zones: the San Francisco Bay Area, the Los Angeles Basin, and the New Madrid region of the central United States. Comparing the three helps the team understand how earthquakes behave across different geologies.
Aided by Frontier
The California-based EQSIM team runs its simulations on the Frontier supercomputer at Oak Ridge National Laboratory in Tennessee. Powered by AMD GPUs, Frontier is capable of roughly two exaflops, two quintillion calculations per second, making it about 1,000 times faster than older petascale systems.
EQSIM’s models span hundreds of kilometers and use up to 500 billion grid points to capture geology and infrastructure in stunning detail.
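A quick back-of-envelope calculation shows why a model that size demands an exascale machine. The per-point field count below is an assumption for illustration, not EQSIM’s actual memory layout, but the order of magnitude is the point:

```python
# Rough memory estimate for a 500-billion-point model.
# values_per_point is an assumption, not EQSIM's actual layout.
points = 500e9             # grid points in the regional model
bytes_per_value = 8        # one double-precision number
values_per_point = 10      # assumed: velocity/stress components + material data

total_tb = points * bytes_per_value * values_per_point / 1e12
print(f"~{total_tb:,.0f} TB of state in memory")   # ~40 TB
```

Tens of terabytes of live state is far beyond any single workstation, yet it spreads comfortably across the memory of a GPU-accelerated system like Frontier.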
“The incredible computing power behind the simulations allows us to see the hot spots of the seismic waves and where the energy gets directed through the different layers of rock and soil,” McCallen said.
“We can see clearly how and where the waves can stack up and how those ground motions translate into building risk and damage. And we can see they are exceedingly different at each location,” he added.
Decoding the results
Each simulation models around 90 seconds of physical time, the duration of the shaking itself, and delivers about 3 petabytes of output. This is similar in size to around 750,000 feature-length films or 1.5 trillion pages of text.
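Those comparisons hold up under common assumptions about file sizes; the roughly 4 GB-per-film and 2 KB-per-page figures in this quick check are assumed, not taken from the project:

```python
# Sanity check of the data-volume comparisons (per-item sizes are assumed).
output_bytes = 3e15       # ~3 petabytes per simulation
film_bytes = 4e9          # assumed: ~4 GB for a feature-length HD film
page_bytes = 2e3          # assumed: ~2 KB of plain text per page

print(f"{output_bytes / film_bytes:,.0f} films")   # 750,000
print(f"{output_bytes / page_bytes:,.0f} pages")   # 1,500,000,000,000
```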
“The best thing about these simulations is that we don’t have to wait for the next ‘Big One’ to strike to understand how it will impact us. If anyone needs information about a magnitude 7.5 earthquake in these critical areas, we can provide them with the comprehensive data that is being generated,” McCallen said.
This research is backed by the DOE’s Office of Cybersecurity, Energy Security, and Emergency Response, which leads efforts to safeguard and strengthen the resilience of U.S. energy infrastructure against all hazards.