Nvidia goes (even more) green with new liquid-cooled GPUs

Team Green is on a sustainability drive


Nvidia has lifted the lid on a new line of liquid-cooled A100 and H100 GPUs that promise to bring greater energy efficiency to the data center.

Announced at Computex 2022, the new configurations are billed as “the next step in accelerated computing for Nvidia GPUs”, marking the first time the company has offered direct-to-chip liquid cooling.

Liquid-cooled A100 GPUs will be available in a few months’ time in a PCIe card format and will feature inside the HGX A100 server. The new H100 card, meanwhile, will be available in the HGX H100 server from early next year.

The cooling conundrum

Traditionally, data center operators - from large enterprises to cloud vendors - have relied on air conditioning to keep servers and other equipment from overheating.

However, chilling the air inside a data center is both inefficient and expensive, and that is especially true for facilities in tropical climates such as Hong Kong or Singapore, which are locked in a never-ending battle with their environment.

With organizations placing greater emphasis than ever on sustainability, attention has turned to identifying methods of cooling data centers more effectively, without compromising on performance.

“Data center operators aim to eliminate chillers that evaporate millions of gallons of water a year to cool the air inside data centers. Liquid cooling promises systems that recycle small amounts of fluids in closed systems focused on key hot spots,” Nvidia explained.



“We plan to support liquid cooling in our high-performance data center GPUs and our NVIDIA HGX platforms for the foreseeable future.”

In testing, the new liquid-cooled A100 cards were able to execute identical workloads using 30% less energy. By Nvidia’s calculations, switching out CPU-only servers running AI and HPC workloads for GPU-accelerated systems worldwide could save up to 11 trillion watt-hours of energy.
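To put that 30% figure in perspective, here is a rough back-of-the-envelope sketch (not from Nvidia) of what such a reduction could mean for a hypothetical GPU fleet. The fleet size, per-card power draw and electricity price below are illustrative assumptions; only the 30% saving comes from the article.

```python
# Back-of-the-envelope estimate of what a 30% energy reduction could mean.
# All inputs below are illustrative assumptions, not Nvidia's figures,
# except ENERGY_SAVING, which reflects the 30% reduction reported in testing.

NUM_GPUS = 1_000            # hypothetical fleet of A100 cards
AVG_POWER_KW = 0.3          # assumed average draw per air-cooled card, in kW
HOURS_PER_YEAR = 24 * 365
ENERGY_SAVING = 0.30        # the 30% reduction Nvidia reported in testing
PRICE_PER_KWH = 0.12        # assumed electricity price, in USD

baseline_kwh = NUM_GPUS * AVG_POWER_KW * HOURS_PER_YEAR
saved_kwh = baseline_kwh * ENERGY_SAVING
saved_usd = saved_kwh * PRICE_PER_KWH

print(f"Baseline consumption: {baseline_kwh:,.0f} kWh/year")
print(f"Estimated saving:     {saved_kwh:,.0f} kWh/year (~${saved_usd:,.0f})")
```

Under those assumptions, a 1,000-card fleet would save on the order of 800,000 kWh a year; the point of the sketch is simply that the percentage compounds quickly at data center scale.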

Another benefit is that the liquid-cooled GPUs fill just one PCIe slot, whereas their air-cooled counterparts fill two, allowing operators to pack more compute into the same space.

Nvidia says that upwards of a dozen server manufacturers - including ASUS, Gigabyte and Supermicro - intend to integrate the new cards into their products, with the first systems set to hit the market in Q3 of this year.

Joel Khalili is the News and Features Editor at TechRadar Pro, covering cybersecurity, data privacy, cloud, AI, blockchain, internet infrastructure, 5G, data storage and computing. He’s responsible for curating our news content, as well as commissioning and producing features on the technologies that are transforming the way the world does business.
