What happens when a magnitude 9 earthquake rocks the ocean floor in the Pacific Northwest?
According to the first-ever three-dimensional simulations of such a "megathrust" earthquake, it could well bring Seattle's high-rises crumbling to the ground as the earth shakes at up to 3 feet per second. Cities from Seattle to Tacoma and Olympia could be swamped by a massive tsunami.
The supercomputer research, led by seismologist Kim Olsen of San Diego State University, was published in the Journal of Seismology.
Megathrust earthquakes are expected to occur, on average, about once every 400 to 500 years.
The researchers cautioned that modeling such a complex and large event is riddled with uncertainties, but also stressed that their results should help guide emergency planning in the region of 9 million people.
"One thing these studies will hopefully do is to raise awareness of the possibility of megathrust earthquakes happening at any given time in the Pacific Northwest," said Olsen. Because these events will tend to occur several hundred kilometers from major cities, the study also implies that the region could benefit from an early warning system that would allow time for protective actions before the brunt of the shaking starts.
Even an early warning system would give only a few seconds, or at most a few tens of seconds, of warning, perhaps enough time to stop trains and elevators, Olsen said.
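As a rough illustration of why the window is so short: an early warning relies on the fast-but-weak P wave outrunning the damaging S wave. The wave speeds below are assumed typical crustal values, not figures from the study.

```python
# Estimate the early-warning window: the P wave arrives first and can trigger
# an alert; the damaging S wave follows. Speeds are assumed typical values.
P_SPEED_KM_S = 6.5   # assumed average P-wave speed
S_SPEED_KM_S = 3.7   # assumed average S-wave speed

def warning_seconds(distance_km: float) -> float:
    """Seconds between P-wave and S-wave arrival at a given distance."""
    return distance_km / S_SPEED_KM_S - distance_km / P_SPEED_KM_S

# For a rupture a couple hundred kilometers offshore:
print(round(warning_seconds(200)))  # → 23, i.e. a few tens of seconds
```

Under these assumed speeds, a rupture 200 km away yields roughly 23 seconds of lead time, consistent with Olsen's "at most a few tens of seconds."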
Here's how UC San Diego described the research:
To help prepare for the next megathrust earthquake, a team of researchers led by seismologist Kim Olsen of San Diego State University (SDSU) used a supercomputer-powered virtual earthquake program to calculate for the first time realistic three-dimensional simulations that describe the possible impacts of megathrust quakes on the Pacific Northwest region. Also participating in the study were researchers from the San Diego Supercomputer Center at UC San Diego and the U.S. Geological Survey.
What the scientists learned from this simulation, as reported in the Journal of Seismology, is not reassuring, particularly for residents of downtown Seattle.
With a rupture scenario beginning in the north and propagating toward the south along the 600-mile-long Cascadia Subduction Zone, the ground moved about 1.5 feet per second in Seattle; nearly 6 inches per second in Tacoma, Olympia and Vancouver; and 3 inches per second in Portland, Oregon. Additional simulations, especially of earthquakes that begin in the southern part of the rupture zone, suggest that the ground motion under some conditions can be up to twice as large.
The long-duration shaking, combined with high ground velocities, raises the possibility that such an earthquake could inflict major damage on metropolitan areas, especially on high-rise buildings in downtown Seattle. Compounding the risks, like Los Angeles to the south, Seattle, Tacoma, and Olympia sit on top of sediment-filled geological basins that are prone to greatly amplifying the waves generated by major earthquakes.
In addition to increasing scientific understanding of these massive earthquakes, the results of the simulations can also be used to guide emergency planners, to improve building codes, and to help engineers design safer structures -- potentially saving lives and property in this region of some 9 million people.
Even with the large supercomputing and data resources at SDSC, creating virtual earthquakes is a daunting task. The computations to prepare initial conditions were carried out on SDSC's DataStar supercomputer, and then the resulting information was transferred for the main simulations to the center's Blue Gene Data supercomputer via SDSC's advanced virtual file system, or GPFS-WAN, which makes data seamlessly available on different, sometimes distant, supercomputers.
Coordinating the simulations required a complex choreography of moving information into and out of the supercomputer as Olsen's sophisticated Anelastic Wave Model simulation code was running. Completing just one of several simulations, running on 2,000 supercomputer processors, required some 80,000 processor hours, equal to running one program continuously on your PC for more than 9 years.
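The processor-hours figure above can be sanity-checked with a few lines of arithmetic (the numbers are taken from the article; the calculation itself is just an illustration):

```python
# Sanity check on the quoted compute cost: 80,000 processor-hours spread
# across 2,000 processors, and the single-PC equivalent in years.
processor_hours = 80_000
processors = 2_000

wall_clock_hours = processor_hours / processors     # actual runtime per simulation
pc_years = processor_hours / (24 * 365.25)          # one PC running nonstop

print(wall_clock_hours)       # → 40.0 hours per run
print(round(pc_years, 1))     # → 9.1, matching "more than 9 years"
```

So each simulation finished in under two days of wall-clock time, but only because the work was spread across 2,000 processors at once.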
In order to run the simulations, the scientists must recreate in their model the components that encompass all the important aspects of the earthquake. One component is an accurate representation of the earth's subsurface layering, and how its structure will bend, reflect, and change the size and direction of the traveling earthquake waves. Co-author William Stephenson of the USGS worked with Olsen and Andreas Geisselmeyer, from Ulm University in Germany, to create the first unified velocity model of the layering for this entire region, extending from British Columbia to Northern California.
Another component is a model of the earthquake source from the slipping of the Juan de Fuca plate underneath the North American plate. Making use of the extensive measurements of the massive 2004 Sumatra-Andaman earthquake in Indonesia, the scientists developed a model of the earthquake source for similar megathrust earthquakes in the Pacific Northwest.
The sheer physical size of the region in the study was also challenging. The scientists included in their virtual model an immense slab of the earth more than 650 miles long by 340 miles wide by 30 miles deep -- more than 7 million cubic miles -- and used a computer mesh spacing of 250 meters to divide the volume into some 2 billion cubes. This mesh size allows the simulations to model frequencies up to 0.5 Hertz, which especially affect tall buildings.
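The cube count above follows directly from the stated dimensions and mesh spacing. A quick check (the mile-to-meter conversion and the points-per-wavelength rule of thumb are general assumptions, not figures from the study):

```python
# Back-of-the-envelope check: the stated slab dimensions divided into
# 250-meter cubes should give roughly 2 billion cells.
MILE_M = 1609.344  # meters per mile

length_m = 650 * MILE_M
width_m = 340 * MILE_M
depth_m = 30 * MILE_M

cells = (length_m / 250) * (width_m / 250) * (depth_m / 250)
print(f"{cells:.2e}")  # → 1.77e+09, i.e. "some 2 billion cubes"

# A common rule of thumb is ~5 grid points per minimum wavelength, so a
# 250 m spacing supports 0.5 Hz wherever shear speeds stay above roughly:
min_shear_speed = 0.5 * 5 * 250  # → 625 m/s (assumed rule of thumb)
```

The result, about 1.8 billion cells, is consistent with the article's "some 2 billion cubes."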
The researchers are planning to conduct additional simulations to explore the range of impacts that depend on where the earthquake starts, the direction of travel of the rupture along the fault, and other factors that can vary.