Big science data needs a big network
31 October 2017

Moving big science data and supporting international science collaborations were recurring themes in the network development talks given by the AARNet team at the eResearch Australasia 2017 conference, held in Brisbane on 16-20 October.

What’s AARNetX?

AARNet eResearch Data Program Manager Brett Rosolen unpacked AARNetX, an experimental network for interconnecting large data sources and computing facilities.

Science data transfer applications have network requirements that increasingly cannot be met by networks optimised for everyday operations such as web browsing and enterprise business systems.

Rosolen outlined the constraints of campus network architecture, which is not typically designed for large research data flows, and made the case that software-defined networking is a timely and cost-efficient way to expedite big data flows, given the growth trends in big data and high-performance computing.

AARNet4 can support single data flows of 10 gigabits per second (Gbps) to 50Gbps when a network termination unit is connected to a border router, whereas AARNetX is designed to support single data flows of 100Gbps when a Science DMZ (demilitarised zone) access technique is used.
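
To put those single-flow rates in perspective, here is a back-of-the-envelope calculation (ours, not from the talk) of the ideal time to move a 100TB dataset at each rate, ignoring protocol overhead, retransmissions and contention, all of which slow real transfers:

    # Illustrative only: ideal transfer times at the single-flow rates above.

    def transfer_hours(size_tb: float, rate_gbps: float) -> float:
        """Ideal hours to move size_tb terabytes at rate_gbps gigabits/s."""
        bits = size_tb * 1e12 * 8              # decimal terabytes -> bits
        return bits / (rate_gbps * 1e9) / 3600

    for rate_gbps in (10, 50, 100):
        print(f"100 TB at {rate_gbps:>3} Gbps: ~{transfer_hours(100, rate_gbps):.1f} h")

    # 100 TB at  10 Gbps: ~22.2 h
    # 100 TB at  50 Gbps: ~4.4 h
    # 100 TB at 100 Gbps: ~2.2 h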

Developed by ESnet, the US Department of Energy research network, the Science DMZ architecture separates science traffic from general-purpose traffic and allows for domain-specific risk mitigation and security policy enforcement.
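
As a rough illustration of that separation, consider the following toy sketch (not ESnet code): flows from registered data transfer nodes bypass the general-purpose stateful firewall and face only a narrow, per-service policy. The subnet and port numbers are hypothetical examples.

    # Toy model of the Science DMZ idea: science flows get domain-specific
    # policy enforcement; everything else takes the enterprise path.
    import ipaddress

    DTN_SUBNET = ipaddress.ip_network("192.0.2.0/24")  # example DTN prefix
    DTN_PORTS = {2811, 443}                            # e.g. GridFTP control, HTTPS

    def security_path(src_ip: str, dst_port: int) -> str:
        """Return which policy path a flow takes in this simplified model."""
        if ipaddress.ip_address(src_ip) in DTN_SUBNET:
            # Science traffic: narrow, per-service risk mitigation only.
            return "science-dmz" if dst_port in DTN_PORTS else "dropped"
        return "enterprise-firewall"                   # all other traffic

    print(security_path("192.0.2.10", 2811))   # science-dmz
    print(security_path("203.0.113.5", 443))   # enterprise-firewall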

With AARNetX as a community pathfinder, the data-intensive sciences driving the challenge to scale up networking, storage and computing in research infrastructure will be able to develop faster workflows on a friction-free network, and in turn achieve faster science outcomes.

SKA & Indigo Project

Continuing with the theme of “big science”, two lightning talks by AARNet eResearch Director Peter Elford touched on the very physical nature of delivering national research and education networks (NRENs).

The first talk covered the networking technologies being developed to support the vast astronomical data collection for the Square Kilometre Array (SKA), the world’s largest radio telescope, co-located in outback Western Australia and South Africa.

Elford outlined the data transfer rates needed to move astronomical data from the SKA, and contrasted the transfer rates and times achievable across the NREN with those of commodity internet services. An international speed test of a 100TB data transfer, conducted in partnership with GÉANT in May 2017, clocked 9.27Gbps and highlighted the difference in network capacity.
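
That figure lines up with simple arithmetic: at a sustained 9.27Gbps, a 100TB (decimal) transfer completes in roughly a day, versus more than a week on a 1Gbps commodity link (a rate we assume here purely for contrast):

    # Sanity check of the quoted speed test (our calculation, not AARNet's).
    size_bits = 100e12 * 8                      # 100 TB in bits
    print(size_bits / (9.27e9 * 3600))          # ~24.0 hours at 9.27 Gbps
    print(size_bits / (1e9 * 3600 * 24))        # ~9.3 days at an assumed 1 Gbps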

The second talk covered the Indigo Project, a consortium involving AARNet, Google, Indosat Ooredoo, Singtel, SubPartners and Telstra that is building a new subsea cable system connecting Australia to Indonesia and Singapore. The new system is expected to be ready for service in Q1 2019, with a minimum capacity of 18Tbps per fibre pair.

Elford highlighted that, through this consortium investment, the new cable system will give international traffic greater flexibility, lower latency and increased redundancy. These network qualities are critical for data-intensive “big sciences” such as astronomy, biomedicine, and climate and earth sciences, and for the increasingly international and collaborative nature of both scientific research and the sharing of research infrastructure.

Author: Ingrid Mason, AARNet eResearch Deployment Strategist