Scientists create a detailed 3D map of Earth's interior

29th March 2017
Posted By : Enaie Azambuja

Using advanced modeling and simulation, seismic data generated by earthquakes, and one of the world's fastest supercomputers, a team led by Jeroen Tromp of Princeton University is creating a detailed 3D picture of Earth's interior. Currently, the team is focused on imaging the entire globe from the surface to the core-mantle boundary, a depth of 1,800 miles.

These high-fidelity simulations add context to ongoing debates related to Earth's geologic history and dynamics, bringing prominent features like tectonic plates, magma plumes, and hotspots into view.

In 2016, the team released its first-generation global model. Created using data from 253 earthquakes captured by seismograms scattered around the world, the team's model is notable for its global scope and high scalability.

"This is the first global seismic model where no approximations - other than the chosen numerical method - were used to simulate how seismic waves travel through Earth and how they sense heterogeneities," said Ebru Bozdag, a coprincipal investigator of the project and an assistant professor of geophysics at the University of Nice Sophia Antipolis. "That's a milestone for the seismology community. For the first time, we showed people the value and feasibility of running these kinds of tools for global seismic imaging."

The project's genesis can be traced to a seismic imaging theory first proposed in the 1980s. To fill in gaps within seismic data maps, the theory posited a method called adjoint tomography, an iterative full-waveform inversion technique.

This technique leverages more information than competing methods, using forward waves that travel from the quake's origin to the seismic receiver and adjoint waves, which are mathematically derived waves that travel from the receiver to the quake.
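
The pairing of forward and adjoint waves has a simple mathematical core: the adjoint operator "back-projects" the data residual into model space, yielding the gradient of the misfit. The toy sketch below (a generic linear operator standing in for a wave solver - all names and sizes are illustrative, not the team's code) shows the defining adjoint identity and checks the resulting gradient against finite differences.

```python
import numpy as np

# Toy illustration of the forward/adjoint pairing behind adjoint tomography.
# G maps a model m to predicted data; G.T maps data residuals back into
# model space. Purely illustrative - a stand-in for a real wave solver.
rng = np.random.default_rng(0)
G = rng.standard_normal((6, 4))      # forward operator: model -> data
m = rng.standard_normal(4)           # current model
d_obs = rng.standard_normal(6)       # "observed" data

residual = G @ m - d_obs             # synthetic-minus-observed misfit
gradient = G.T @ residual            # adjoint back-projects the residual

# The defining adjoint identity: <G x, y> = <x, G^T y> for all x, y
x, y = rng.standard_normal(4), rng.standard_normal(6)
assert np.isclose((G @ x) @ y, x @ (G.T @ y))

# gradient of J(m) = 0.5 * ||G m - d||^2, verified by finite differences
eps = 1e-6
J = lambda m_: 0.5 * np.sum((G @ m_ - d_obs) ** 2)
fd = np.array([(J(m + eps * e) - J(m - eps * e)) / (2 * eps)
               for e in np.eye(4)])
assert np.allclose(fd, gradient, atol=1e-5)
```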

The problem with testing this theory? "You need really big computers to do this," Bozdag said, "because both forward and adjoint wave simulations are performed in 3D numerically." 

In 2012, just such a machine arrived in the form of the Titan supercomputer, a 27-petaflop Cray XK7 managed by the US Department of Energy's (DOE's) Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at DOE's Oak Ridge National Laboratory.

After trying out its method on smaller machines, Tromp's team gained access to Titan in 2013 through the Innovative and Novel Computational Impact on Theory and Experiment, or INCITE, program. Working with OLCF staff, the team continues to push the limits of computational seismology to greater depths.

When an earthquake strikes, the release of energy creates seismic waves that often wreak havoc for life at the surface. Those same waves, however, present an opportunity for scientists to peer into the subsurface by measuring vibrations passing through Earth.

As seismic waves travel, seismograms can detect variations in their speed. These changes provide clues about the composition, density, and temperature of the medium the wave is passing through.

For example, waves move slower when passing through hot magma, such as mantle plumes and hotspots, than they do when passing through colder subduction zones, locations where one tectonic plate slides beneath another.
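
Because travel time is the path integral of slowness (the reciprocal of velocity), even a modest slow anomaly leaves a measurable delay. A minimal sketch, with hypothetical numbers for a shear wave crossing a 100 km plume segment along a 1,000 km path:

```python
# Travel time along a ray is the sum of (segment length / wave speed).
# Hypothetical values: hot plume material slows the wave from 4.5 km/s
# to 4.2 km/s over a 100 km stretch of an otherwise "cold" 1000 km path.
def travel_time(segments):
    """segments: list of (length_km, velocity_km_s) pairs."""
    return sum(length / v for length, v in segments)

cold_path = [(1000.0, 4.5)]
plume_path = [(900.0, 4.5), (100.0, 4.2)]

delay = travel_time(plume_path) - travel_time(cold_path)
print(f"arrival delayed by {delay:.2f} s")  # ~1.59 s late
```

Delays of this size are well within what a seismogram resolves, which is what makes such anomalies mappable.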

Each seismogram represents a narrow slice of the planet's interior. By stitching many seismograms together, researchers can produce a 3D global image, capturing everything from magma plumes feeding the Ring of Fire, to Yellowstone's hotspots, to subducted plates under New Zealand.

This process, called seismic tomography, works in a manner similar to imaging techniques employed in medicine, where 2D x-ray images taken from many perspectives are combined to create 3D images of areas inside the body.
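
The CT-scan analogy can be made concrete with a tiny linear system: each ray contributes one equation whose coefficients are the lengths it spends in each cell, and crossing rays make the slowness field recoverable. A miniature sketch (a 2x2 grid and straight rays - an assumption far simpler than real curved ray paths):

```python
import numpy as np

# Tiny tomography sketch: a 2x2 grid of cells with unknown slowness,
# crossed by straight rays. Row i of G holds the length of ray i inside
# each cell; observed travel times are d = G @ s_true. With enough
# crossing rays, least squares recovers the slowness field.
# Cell layout (indices):  [0 1]
#                         [2 3]   (each cell 1 x 1, arbitrary units)
G = np.array([
    [1.0, 1.0, 0.0, 0.0],                # ray through the top row
    [0.0, 0.0, 1.0, 1.0],                # ray through the bottom row
    [1.0, 0.0, 1.0, 0.0],                # ray through the left column
    [0.0, 1.0, 0.0, 1.0],                # ray through the right column
    [np.sqrt(2), 0.0, 0.0, np.sqrt(2)],  # diagonal ray
])
s_true = np.array([0.25, 0.25, 0.25, 0.30])  # one "slow" (hot) cell
d = G @ s_true                               # synthetic travel times

s_est, *_ = np.linalg.lstsq(G, d, rcond=None)
assert np.allclose(s_est, s_true, atol=1e-8)  # slow cell recovered
```

The diagonal ray matters: without it the four axis-parallel rays are linearly dependent and the inversion is underdetermined - the same reason medical CT needs projections from many angles.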

In the past, seismic tomography techniques have been limited in the amount of seismic data they can use. Traditional methods forced researchers to make approximations in their wave simulations and restrict observational data to major seismic phases only. The adjoint tomography employed by Tromp's team, based on 3D numerical simulations, isn't constrained in this way. "We can use the entire data - anything and everything," Bozdag said.

Running its GPU version of the SPECFEM3D_GLOBE code, Tromp's team used Titan to apply full-waveform inversion at a global scale. The team then compared these "synthetic seismograms" with observed seismic data supplied by the Incorporated Research Institutions for Seismology (IRIS), calculating the difference and feeding that information back into the model for further optimisation. Each repetition of this process improves global models.
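
The compare-and-feed-back cycle is, at heart, gradient-based optimisation of a misfit between synthetic and observed data. The loop below sketches that structure with a generic linear operator standing in for the wave solver (SPECFEM3D_GLOBE itself is a large spectral-element code; everything here is illustrative):

```python
import numpy as np

# Sketch of the iterative inversion loop: simulate synthetics from the
# current model, measure the misfit against "observed" data, and feed
# the adjoint-derived gradient back in as a model update.
rng = np.random.default_rng(42)
G = rng.standard_normal((20, 5))   # stand-in for the forward wave solver
m_true = rng.standard_normal(5)
d_obs = G @ m_true                 # "observed seismograms"

m = np.zeros(5)                    # starting model
step = 1.0 / np.linalg.norm(G, 2) ** 2   # stable gradient-descent step
for iteration in range(500):
    d_syn = G @ m                  # synthetic seismograms
    residual = d_syn - d_obs       # difference from observations
    misfit = 0.5 * residual @ residual
    m -= step * (G.T @ residual)   # adjoint back-projection as update

assert misfit < 1e-6               # each repetition improved the model
```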

For its initial global model, Tromp's team selected earthquake events that registered between 5.8 and 7 on the Richter scale - a standard for measuring earthquake magnitude. That range can be extended slightly to include more than 6,000 earthquakes in the IRIS database - about 20 times the amount of data used in the original model.
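
Even that narrow magnitude window spans a wide range of source strengths, since radiated energy grows exponentially with magnitude (the Gutenberg-Richter energy relation, log10 E = 1.5 M + 4.8 with E in joules - a standard empirical formula, not something specific to this project):

```python
# Radiated seismic energy from magnitude via the Gutenberg-Richter
# relation: log10(E) = 1.5 * M + 4.8, with E in joules.
def radiated_energy_joules(magnitude):
    return 10 ** (1.5 * magnitude + 4.8)

ratio = radiated_energy_joules(7.0) / radiated_energy_joules(5.8)
print(f"an M7.0 releases ~{ratio:.0f}x the energy of an M5.8")  # ~63x
```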

Getting the most out of all the available data requires a robust automated workflow capable of accelerating the team's iterative process. Collaborating with OLCF staff, Tromp's team has made progress toward this goal.

For the team's first-generation model, Bozdag carried out each step of the workflow manually, taking about a month to complete one model update. Team members Matthieu Lefebvre, Wenjie Lei, and Youyi Ruan of Princeton University and the OLCF's Judy Hill developed new automated workflow processes that hold the promise of reducing that cycle to a matter of days.

Additional support from OLCF staff has contributed to the efficient use and accessibility of project data. Early in the project's life, Tromp's team worked with the OLCF's Norbert Podhorszki to improve data movement and flexibility.

The end result, called Adaptable Seismic Data Format (ASDF), leverages the Adaptable I/O System (ADIOS) parallel library and gives Tromp's team a superior file format to record, reproduce, and analyse data on large-scale parallel computing resources.

In addition, the OLCF's David Pugmire helped the team implement in situ visualisation tools. These tools enabled team members to check their work more easily from local workstations by allowing visualisations to be produced in conjunction with simulation on Titan, eliminating the need for costly file transfers.

With visualisation, the magnitude of the team's project comes to light. The billion-year cycle of molten rock rising from the core-mantle boundary and falling from the crust - not unlike the motion of globules in a lava lamp - takes form, as do other geologic features of interest.

At this stage, the resolution of the team's global model is becoming advanced enough to inform continental studies, particularly in regions with dense data coverage. Making it useful at the regional level or smaller, such as the mantle activity beneath Southern California or the earthquake-prone crust of Istanbul, will require additional work.

