EXCLUSIVE - Smart Nation Innovations at A*STAR (Part 2) – Supercomputing and Next-generation Networks
Furthering its work in support of the Smart Nation initiative (read the first part of the OpenGov report on Smart Nation Innovations at A*STAR here), A*STAR is driving innovations to facilitate greater scientific outcomes.
The agency is working on several projects which empower researchers, position A*STAR as a world-class research agency and help Singapore attract the best research talent from around the world.
OpenGov spoke to Dr. John Kan, Chief Information Officer (CIO), Information Technology Shared Services at A*STAR, to learn about the agency’s key role in Singapore’s Smart Nation Program. Mr Max Tsang, Deputy Director, Information Technology and Services, and Dr. Govindasamy Rajasekaran, Head, IT Planning & Communications, also provided input.
A*STAR commissioned ExaNet in January 2015 as a high-performance, 100 Gigabit campus-wide network. The high-speed network enables researchers to share large scientific datasets such as high-resolution imaging, molecular models, 3D models and genome sequencing data.
ExaNet is deployed via the SingAREN Lightwave Internet Exchange (SLIX). SingAREN stands for Singapore Advanced Research and Education Network. It facilitates efficient exchange of local traffic from Singapore's Research and Education community and provides international connectivity with overseas Research and Education Networks (RENs).
ExaNet segregates research traffic from corporate traffic, providing a dedicated super-highway for researchers to collaborate within A*STAR and with external partners, such as NTU (Nanyang Technological University) and NUS (National University of Singapore), as well as overseas research institutions. ExaNet has already been deployed across A*STAR’s research institutes.
High performance computing (HPC)
The A*STAR Computational Resource Centre (A*CRC) provides HPC resources to the A*STAR research community. Currently, A*CRC supports the HPC needs of a user community over seven hundred strong.
A*STAR also has an Institute of High Performance Computing (IHPC). The institute provides leadership in high performance computing as a strategic resource for creating scientific breakthroughs and industry development. IHPC’s R&D projects in sectors such as aerospace, infocomm, marine, maritime and offshore, urban planning, personal care, and medical technology have benefited industry partners.
Earlier this year, OpenGov interviewed Dr. Kan about plans for the National Supercomputing Centre (NSCC) and reported on A*STAR working towards the launch. Its 1 Petaflop (PF) system comprises approximately 1,288 nodes with 128 Gigabytes of memory per node, connected by high-speed interconnects. The nodes are connected to 13 Petabytes of high-performance storage with a very high input/output burst rate of up to 500 Gigabytes/s.
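To put those figures in perspective, the headline numbers above can be combined with simple arithmetic. The node count, per-node memory, storage size and burst I/O rate are taken from the article; everything derived below is a rough calculation, not an official specification.

```python
# Rough aggregate figures for the NSCC system described above.
# Inputs (nodes, memory per node, storage, burst I/O) are from the
# article; the derived values are back-of-envelope arithmetic only.
nodes = 1288
mem_per_node_gb = 128
storage_pb = 13
burst_io_gb_s = 500

# Aggregate RAM across all nodes, in terabytes (~161 TB).
total_mem_tb = nodes * mem_per_node_gb / 1024

# Hours needed to stream the entire 13 PB store at the peak burst rate.
fill_time_h = storage_pb * 1024 * 1024 / burst_io_gb_s / 3600

print(f"aggregate memory: ~{total_mem_tb:.0f} TB")
print(f"time to stream 13 PB at 500 GB/s: ~{fill_time_h:.1f} h")
```

Even at the peak burst rate, moving the full store takes the better part of a working day, which is why fast interconnects matter as much as raw compute.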
NSCC is operational now and it provides HPC resources, multi-petabyte data storage and multi-Gigabit high-speed global connectivity. It is a shared facility for supporting scientific and industrial research in the public sector, educational and research institutes, as well as the private sector. Using InfiniCortex, NSCC can collaborate with supercomputing facilities around the world to create a Galaxy of Supercomputers.
InfiniCortex – A galaxy of Supercomputers
InfiniCortex network map (Image courtesy of A*STAR)
InfiniCortex enables concurrent supercomputing across the globe, utilising trans-continental InfiniBand (a computer-networking communications standard with very high throughput and very low latency, widely used in HPC) to link supercomputers into a Galaxy of Supercomputers.
Effectively, it creates a Galaxy of Supercomputers, with the supercomputers at its seven nodes leveraging the InfiniBand network to act as one and tackle the biggest computational challenges. It was the first project in the world in which a 100 Gbps connection was established between supercomputers separated by a geographical distance of more than 26,000 km.
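The distance itself is a large part of the challenge: even light in fibre imposes an unavoidable delay over 26,000 km. A back-of-envelope sketch, using the distance from the article and the standard assumption that signals propagate through fibre at roughly two-thirds the speed of light (real routes add switching and routing delays on top):

```python
# Minimum one-way propagation latency over the ~26,000 km
# InfiniCortex path. Distance is from the article; fibre
# propagation at ~2/3 the speed of light is a standard assumption.
SPEED_OF_LIGHT_KM_S = 300_000                      # km/s in vacuum
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3     # ~200,000 km/s in glass

distance_km = 26_000
one_way_ms = distance_km / FIBER_SPEED_KM_S * 1000

print(f"one-way latency floor: ~{one_way_ms:.0f} ms")
```

A latency floor on the order of a tenth of a second is enormous by data-centre standards, which is why extending a low-latency protocol like InfiniBand to trans-continental distances was a notable engineering feat.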
This distributed and collaborative approach has the added benefits of addressing huge power requirements, data replication and disaster recovery. It also opens up a possible avenue towards Exascale computing: systems capable of at least one exaFLOPS, or a billion billion (10^18) calculations per second.
Traditionally, InfiniBand was meant for use over short distances, say 50 metres within a data centre. A*STAR then started work on extending that range, creating a global high-speed (100 Gbps) network.
Initial trials were conducted across institutes of higher learning which were 20-30 kilometres apart. Subsequently, InfiniCortex was demonstrated in November 2014 on a show floor in New Orleans, United States, running a connection from Singapore.
Now, large data sets are shared with the US, Europe, Japan, Australia and Canada.
Explaining the change wrought by InfiniCortex in data transfer, Dr. Kan said, “Transferring a 1 terabyte file over an Ethernet network using TCP/IP might take 2-3 days. Using InfiniBand trial networks, the transfer is completed in 20-25 minutes. In addition, the error rate in the old networks is as high as 60-80%. With InfiniBand, the error rate drops to a range between 10 and 20%. So, we can transfer faster and more accurately. We will keep on working to make data faster and cheaper.”
This is critical for data-driven research areas such as genomics. Large data sets can be transferred and accessed significantly faster, enabling improved collaboration between universities in Singapore and the rest of the world.
Making Singapore a regional and global research hub is an integral part of A*STAR’s strategy to improve the quality of life of Singapore’s citizens.
Read the first part of the report here.