Fujitsu building massive supercomputer for the University of Tokyo

18 Feb 2019

Fujitsu today announced that it has received an order for a large-scale ‘massively parallel’ supercomputer system for the Information Technology Center of The University of Tokyo.

According to the tech specialist, the new system - named ‘Oakbridge-CX’ - will achieve a theoretical peak performance of 6.6 petaflops by clustering 1,368 next-generation Fujitsu Server PRIMERGY x86 servers that feature Intel Xeon Scalable Processors.

To put that in perspective, 'petaflops' is short for peta floating point operations per second. 'Peta' is the SI prefix for one quadrillion, or 10 to the power of 15, so the new system will be capable of 6.6 quadrillion floating point operations per second.
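As a rough sanity check (the per-server breakdown below is our own back-of-envelope arithmetic, not a figure from Fujitsu's announcement), the quoted numbers imply a peak of just under five teraflops per server:

```python
# Back-of-envelope check of the quoted figures; the exact per-node
# configuration was not disclosed in the announcement.
PETA = 10**15                  # SI prefix 'peta' = one quadrillion

peak_flops = 6.6 * PETA        # theoretical peak: 6.6 petaflops
nodes = 1368                   # PRIMERGY x86 servers in the cluster

per_node = peak_flops / nodes  # implied peak per server
print(f"Peak per node: {per_node / 10**12:.1f} teraflops")
# -> roughly 4.8 teraflops per server
```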

Fujitsu has deep expertise in this space, with extensive experience building and operating supercomputer systems for customers both inside and outside Japan. The company is already in the process of building the new system and expects to have it operational by July this year.

The Computer Center of The University of Tokyo was established in 1965 to provide shared access to large-scale computing for researchers across Japan. Its Information Technology Center has since been using supercomputers to advance cutting-edge research in science and engineering.

The Center has also played a key role in the Joint Usage/Research Center for Interdisciplinary Large-scale Information Infrastructures (JHPCN) and has contributed to R&D in a wide range of fields, including the natural sciences, engineering, and the social sciences, by providing computational resources to the High Performance Computing Infrastructure (HPCI).

According to The University of Tokyo, the new system is also expected to be used for field trials designed to create data utilisation platforms with a vision of achieving Society 5.0 - a human-centric society that delivers both economic development and resolutions to societal issues through systems that fuse cyberspace (virtual spaces) and physical space (the real world) at a high level.

Fujitsu is confident that by developing and offering high-performance supercomputers, it will continue to contribute to the advancement and application of fields such as computer science, simulation, data utilisation, and AI.
