The National Computational Infrastructure has generated new optimisation solutions to help resolve important details emerging from magnetotelluric studies of the Australian continent.

Magnetotellurics (MT) is a technique that measures natural variations in the Earth's magnetic and electric fields to reveal the geological structure of the subsurface. This information helps researchers understand the composition of the Earth's interior and locate valuable resources.

A new study, the Australian Lithospheric Architecture Magnetotelluric Project (AusLAMP), led by Geoscience Australia, will see the placement of 2800 magnetotelluric instruments across Australia, allowing researchers to create a much more detailed map of the Earth beneath our feet.

While past surveys have mapped the underground in 2D, the high performance computing facilities at NCI will allow geophysicists to study the structure of the Australian continent in 3D. These new 3D models of the Earth's crust will generate valuable information about the Australian continental geodynamic framework and help identify features in the crust and the upper mantle.

Although a three-dimensional approach has been available in the past, this is the first time such a large-scale model can be computed. This has only become possible by combining groundbreaking research from Geoscience Australia with the experience of high performance computing experts at NCI.

The software used to analyse the MT survey data requires both the petaflop infrastructure at NCI and access to the datasets housed at the facility. However, critical bottlenecks in the code limited both its performance and scalability. NCI's HPC Scaling and Optimisation team was brought in to analyse the code and improve its scaling to the resolution needed for this work.

Senior HPC Specialist Dr Dale Roberts was able to identify and adjust key components of the code to address these issues. "The key concern with this software initially was its high memory usage, which limited the ability to run even the smallest test cases. The examples we were given could only be run on the large memory nodes of Raijin, which make up 2% of the system, meaning that full scale jobs could not be run on Raijin at all," explains Dr Roberts. "By adjusting the way the code used disk instead of memory during processing, we were able both to take advantage of Raijin's high performance parallel file system and to reduce memory usage."
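The general idea of trading memory for disk can be sketched as follows. This is a minimal illustration, not Geoscience Australia's actual code: the array sizes and file name are invented, and NumPy's `memmap` stands in for whatever out-of-core mechanism the MT software uses. A disk-backed array lets the operating system page data in and out, so resident memory stays far below the size of the full intermediate result.

```python
import os
import tempfile
import numpy as np

# Small stand-in dimensions for what would be a much larger model grid.
rows, cols = 1000, 500

# Hypothetical scratch file on a (parallel) file system.
path = os.path.join(tempfile.mkdtemp(), "intermediate.dat")

# Create a disk-backed array instead of allocating rows*cols in RAM.
block = np.memmap(path, dtype=np.float64, mode="w+", shape=(rows, cols))

# Fill it chunk by chunk rather than all at once.
chunk = 100
for start in range(0, rows, chunk):
    block[start:start + chunk, :] = np.ones((chunk, cols))
block.flush()

# A later processing stage re-opens the map read-only and reduces it
# in chunks, again without holding the whole array in memory.
readback = np.memmap(path, dtype=np.float64, mode="r", shape=(rows, cols))
total = sum(float(readback[s:s + chunk, :].sum()) for s in range(0, rows, chunk))
print(int(total))
```

On a fast parallel file system like Raijin's, the cost of these disk reads and writes is far lower than on ordinary storage, which is why this trade-off pays off there.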

"The modified code can now scale with more CPUs, which means that instead of taking a couple of weeks, we can now generate a solution in less than a week," explains Geoscience Australia Section Leader Tristan Kemp. "We couldn't run the large models required for AusLAMP without using NCI or the expert assistance from the HPC Scaling and Optimisation team."
