The National Computational Infrastructure (NCI), hosted at The Australian National University (ANU), is the nation's most highly integrated and highest performing supercomputing centre. It supports much of the computational and data-intensive research undertaken in Australia.
Funded by the Australian Research Council (ARC), the National Collaborative Research Infrastructure Strategy (NCRIS) and a variety of partner organisations, it is arguably the single piece of infrastructure that serves the widest variety of research across Australia's universities, government science agencies and industry.
The computer itself is housed in a single large open-plan room taking up the whole first floor of the ANU's purpose-built facility. From row after row of large black cabinets, the Fujitsu computer 'Raijin', named after the Japanese thunder god, roars around the clock with the sound of cooling fans.
Behind mesh screens, flickering diodes articulate the pulse of calculations, and thickets of coloured wires spill out of rack upon rack of warm processing units. A large part of the floor below is dedicated to cooling equipment: the inner recesses of the computer are water-cooled, bringing temperatures from 45°C down to 26°C.
The challenges of keeping such a complex piece of computer hardware alive are considerable.
"Sometimes the thunder god gets angry," said Professor Lindsay Botten, who has headed NCI operations for the past seven years. He has negotiated Raijin through good and bad—from the first power-up to an emergency power failure.
Technology develops so rapidly that what is cutting-edge supercomputing today becomes pocket-sized and ubiquitous within 20 years.
"The moving parts like spinning hard drives just wear out…in fact much of this equipment will be scrap metal in three years," said Professor Botten.
"The average iPad of today has more processing power than the NCI's predecessor of the early 90s."
For today's supercomputers, this means that an unrelenting regime of upgrading is the norm.
The integration with the research sector is so great, and the turnover of equipment so fast, that running NCI is like running a small business, said Professor Botten.
"We have to raise about $18m to cover running costs annually. One third of this is from NCRIS, some from the ARC and universities, and the rest from partner organisations including ANU, the Bureau of Meteorology, Commonwealth Scientific and Industrial Research Organisation (CSIRO) and Geoscience Australia."
"At every moment we have to be a step ahead, designing the next iteration…we are transitioning into a heterogeneous world where the needs of researchers really drive what we do."
"The excitement for me has been watching the whole enterprise here at NCI coalesce into a national asset," said Professor Botten.
Putting researchers in the driver's seat of NCI is the vision of Professor Botten, whose background is with the ARC Centre of Excellence for Ultrahigh bandwidth Devices for Optical Systems (CUDOS).
Many other ARC-funded researchers also benefit from computing time at NCI.
"We reserve about 20% of the computer's time for use by individual researchers at no cost. Many of these are themselves ARC DP (Discovery Projects) grant recipients or have ARC fellowships."
In 2014, nearly 90% of the research conducted using NCI was co-funded through ARC grants. NCI services support 191 ARC-funded Centres of Excellence, research projects and fellowships, which together generate more than 700 publications each year and receive more than $40 million annually from the ARC.
NCI also allocates high-performance computing (HPC), data-intensive and storage services to projects identified as being of high-impact and of national strategic importance.
In 2015, five ARC Centres of Excellence are being supported through this flagship scheme, including the ARC Centre of Excellence for Climate System Science and the ARC Centre of Excellence for All-Sky Astrophysics (CAASTRO), which consume the largest allocations.
NCI computing power also underpins one of the ARC's newest Industrial Transformation Research Hubs, the $5.4 million ARC Research Hub for Basin Geodynamics and Evolution of Sedimentary Systems (GENESIS). The primary focus of this new Hub is modelling enormous ocean basins right down to the level of individual sand grains.
"Without NCI the kind of detailed modelling that we run at GENESIS would be impossible," said Professor Dietmar Müller, Hub Director. "NCI has given us and our industry partners the edge to keep our research competitive at an international level."
NCI also applies itself to processing the massive volumes of data produced by the SkyMapper survey, downloaded from the telescope at Coonabarabran after every night's observations, and to the Bureau of Meteorology and CSIRO's climate and earth system research, which, according to Professor Botten, "really gets the machine running hot". But many of the most interesting projects that make use of the facility occur in unexpected areas.
For instance, a project led by Professor Joseph Lai and supported by an ARC Discovery Projects grant aims to discover the origin of the squeal that we hear when a car's brakes are applied. The squeal is produced by a combination of design factors interacting with the brake disc itself, and car manufacturers are keen to avoid expensive mistakes when developing new brake systems. Professor Lai is using NCI to run detailed simulations that have uncovered just how complex the relationship between a brake's vibration and its sound generation really is.
This project highlights that big models, such as those of ocean basins or climate systems, do not necessarily take up any more computing power than models of short-lived, small-scale phenomena. One analysis of a brake model took eight days of continuous computation on Raijin to create a two-second simulation.
Researchers at The University of New South Wales (UNSW) have also used Raijin to run the world's first simulated model of a turbulent lifted hydrocarbon flame. Such flames occur in gas turbines and industrial furnaces.
"You don't want to build a hundred or a thousand engines to find the best design. What we're aiming to do is develop computational models that are cheap enough that they can be used by industry," said ARC Future Fellow, Professor Evatt Hawkes, from the UNSW team behind the modelling.
Image: One frame from a 3D simulation of a flame, of the kind found in gas turbines and industrial furnaces, part of a project conducted by ARC Future Fellow Professor Evatt Hawkes using the Raijin supercomputer. Image courtesy Professor Hawkes, UNSW. Data rendering by Asst. Prof. Hongfeng Yu, University of Nebraska-Lincoln.
The incredible processing power of Raijin allows a flame to be simulated right down to the molecular level, with all of its intricacies mapped in detail, giving researchers an insight into the combustion process that would be unattainable through experimentation alone.
"One of these runs takes about one million hours on Raijin. We also generate a lot of data; almost 10 TB (Terabyte) per run," said Professor Hawkes, whose research team has received ARC LIEF and Discovery Projects grants to support the analysis.
"One example of this analysis recently reported in the Journal of Fluid Mechanics demonstrates how lifted flames are stabilised, which has direct implications for burner design as well as for the selection of industrially-applicable models of lifted flames." .
From projects in the biosciences, astronomy and environment, to industry applications and real-time Twitter data mining, an enormous range of research projects find NCI resources indispensable. And as big data keeps getting bigger and high-end computing faster, support for the NCI's supercomputer Raijin is more important than ever.
This story was first published on the ARC website.