The project is funded by the Australian Research Council and industry partners, and its project partners include the California Institute of Technology, Curtin University, the University of Melbourne and Geoscience Australia.
With these partners spread across the globe, a crucial requirement for the project was the ability to access and share massive amounts of data between its national and international partners – or nodes.
The project’s modelling is extremely data intensive: for example, its mantle convection models go back 410 million years and require large input files – such as plate velocities and the ages of the seafloor and continents – at 1-million-year intervals.
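To give a sense of how such time-stepped inputs are organised, the sketch below iterates over one set of grids per 1-million-year interval back to 410 Ma. This is an illustration only, not the project’s actual pipeline: the directory layout, file names and variables are assumptions.

```python
# A minimal sketch (hypothetical, not the project's actual pipeline) of
# iterating over time-stepped model inputs: one set of grids per
# 1-million-year interval, from 410 Ma to the present day.
from pathlib import Path

INPUT_DIR = Path("model_inputs")  # hypothetical input directory
OLDEST_MA = 410                   # oldest reconstruction time, in Ma

def input_files_for(time_ma: int) -> dict:
    """Return the per-timestep input grids (names are illustrative)."""
    return {
        "plate_velocity": INPUT_DIR / f"velocity_{time_ma}Ma.nc",
        "seafloor_age": INPUT_DIR / f"seafloor_age_{time_ma}Ma.nc",
    }

# Walk from 410 Ma to 0 Ma in 1-Myr steps and report any missing inputs.
for time_ma in range(OLDEST_MA, -1, -1):
    files = input_files_for(time_ma)
    missing = [p.name for p in files.values() if not p.exists()]
    if missing:
        print(f"{time_ma} Ma: missing {', '.join(missing)}")
```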
Managing massive volumes of research data
To manage such large volumes of data, the project relies on some of the most powerful network and computing resources available in the world today.
Thanks to research and education networks, high-speed connections link the national and international nodes, as well as the main supercomputer resource at the National Computational Infrastructure (NCI) facility, located at the Australian National University.
In Australia, the University of Sydney and the NCI supercomputing facility are both connected to AARNet, Australia’s research and education network.
AARNet’s national connectivity between research and education institutions in Australia, together with its interconnections to international research and education networks, helps ensure seamless and reliable connectivity between the nodes.
Sharing very large files
Finally, the project needed efficient ways to transfer very large files – for example, to share raw datasets, model outputs and visualisations with international colleagues.
Dr Sabin Zahirovic, a Post-Doctoral Researcher at the University of Sydney, explains some of the initial challenges.
“The proprietary solutions we first used had very slow transfer speeds, and an unworkable file size limitation (usually several gigabytes), short file expiry timeframes, as well as obscure policies on how our sensitive data may be treated on their servers.”
The project turned to AARNet, whose CloudStor service enables researchers and staff to quickly and securely sync, share and store files using its high-speed network.
“CloudStor is essential for sharing very large files – ones that would be impossible to send in e-mail, and would very quickly clog up other cloud-based accounts. As CloudStor is part of the AARNet network, I can very quickly upload huge files, and then conveniently send a plain hyperlink to collaborators.”
“I have not come across any platform that comes even close to the functionality and speed of the CloudStor service, especially with all the new features launched in November 2016, such as end-to-end encryption with password protection, usability improvements and better performance stability,” Dr Zahirovic explained.
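CloudStor was built on ownCloud, which exposes a WebDAV interface, so large transfers like those Dr Zahirovic describes can also be scripted rather than done through a browser. The sketch below shows what such an upload might look like; the endpoint URL, credentials and file names are assumptions for illustration, not the project’s actual setup.

```python
# A minimal sketch of scripting a large-file upload to an ownCloud-style
# WebDAV endpoint such as the one CloudStor exposed. The URL, credentials
# and file names below are illustrative placeholders.
import requests

WEBDAV_URL = "https://cloudstor.aarnet.edu.au/plus/remote.php/webdav"  # assumed endpoint
USERNAME = "researcher@example.edu.au"  # placeholder account
APP_PASSWORD = "app-specific-password"  # placeholder credential

def upload(local_path: str, remote_name: str) -> None:
    """Stream a file to the WebDAV endpoint without loading it into memory."""
    with open(local_path, "rb") as f:
        resp = requests.put(
            f"{WEBDAV_URL}/{remote_name}",
            data=f,  # file object is streamed as the request body
            auth=(USERNAME, APP_PASSWORD),
            timeout=3600,  # generous timeout for multi-gigabyte transfers
        )
    resp.raise_for_status()

upload("mantle_model_output.tar.gz", "mantle_model_output.tar.gz")
```

Streaming the file object keeps memory use flat regardless of file size, which matters for the multi-gigabyte outputs the project shares.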