2024 Compute Performance Considerations and Expectations

Large language model (LLM) compute workloads capture a great deal of attention. Following an "AI on top of HPC" architecture, LLMs became feasible and are now, arguably, the world's major technology and business arena.

Typical HPC clusters used for LLM training have tens to thousands of compute nodes, which makes long training jobs feasible. Most of the spotlight in this arena falls on the processing units, primarily GPUs, which deliver on the order of 60 TFLOPS of FP64 each. Such clusters are usually configured with dual-socket CPUs providing around 8 TFLOPS per node, high-speed RAM, a high-speed network (100-400 Gbps) with multiple NICs per node, and high-performance storage (usually NAS).
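
To put these figures in perspective, here is a minimal back-of-the-envelope sketch in Python; the GPU count per node and the exact numbers are illustrative assumptions based on the ranges above, not vendor specifications:

    # Compare a node's peak compute against the rate at which the network can feed it data.
    # All figures are illustrative assumptions based on the ranges mentioned above.
    GPU_PEAK_TFLOPS = 60.0    # assumed FP64 peak per GPU
    GPUS_PER_NODE = 4         # assumed accelerator count per node
    CPU_PEAK_TFLOPS = 8.0     # assumed FP64 peak of the dual-socket CPUs
    NET_GBPS = 400.0          # upper end of the 100-400 Gbps range

    node_peak_tflops = GPU_PEAK_TFLOPS * GPUS_PER_NODE + CPU_PEAK_TFLOPS
    net_bytes_per_sec = NET_GBPS * 1e9 / 8          # Gbps -> bytes/s

    # Bytes the network can deliver for every floating-point operation the node can execute
    bytes_per_flop = net_bytes_per_sec / (node_peak_tflops * 1e12)

    print(f"Node peak: {node_peak_tflops:.0f} TFLOPS FP64")
    print(f"Network feed rate: {net_bytes_per_sec / 1e9:.0f} GB/s")
    print(f"Bytes delivered per peak FLOP: {bytes_per_flop:.6f}")

With these assumed numbers, the network can deliver only a tiny fraction of a byte for each peak FLOP the node can execute, which already hints at the efficiency question discussed next.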

The inevitable question of how efficiently these modern "super" systems are actually used reveals some surprises, highlighted in the following lines.

A study by Google suggests that BERT¹ achieves only around 10-20% of the GPU's peak FLOPS capability.

Another study by Facebook AI suggests that RoBERTa² achieves around 20-30% of the GPU's peak FLOPS capability.
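
The metric behind these figures is simply achieved FLOPS divided by peak FLOPS. The minimal sketch below shows the calculation; the throughput and per-sample cost are hypothetical placeholders, not measurements from the cited studies:

    # Utilization = achieved FLOPS / peak FLOPS.
    # All numbers below are illustrative assumptions, not figures from the cited studies.
    peak_flops = 60e12            # assumed FP64 peak of one GPU, in FLOPS

    flops_per_sample = 4e11       # hypothetical compute cost of one training sample
    samples_per_second = 30       # hypothetical sustained training throughput

    achieved_flops = flops_per_sample * samples_per_second
    utilization = achieved_flops / peak_flops

    # With these assumed numbers: 12 TFLOPS achieved, i.e. 20% of peak
    print(f"Achieved: {achieved_flops / 1e12:.1f} TFLOPS ({utilization:.0%} of peak)")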

A closer look at LLM training and inference indicates that these are data-centric workloads. The poor GPU utilization of these models clearly points to a need for faster data access and transfer technologies rather than additional compute capability. In other words, a compute facility running data-centric workloads achieves a higher ROI by investing in data access and transfer capabilities, specifically memory and the computer architecture, instead of investing in additional "unusable" compute capability.
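
One way to formalize this argument is the well-known roofline model: attainable performance is capped either by peak compute or by memory bandwidth multiplied by arithmetic intensity (FLOPs performed per byte moved). The sketch below uses assumed peak-compute and bandwidth figures for illustration only:

    # Roofline model: attainable FLOPS = min(peak compute, memory bandwidth * arithmetic intensity).
    # Both hardware figures are illustrative assumptions, not a specific product's specifications.
    PEAK_FLOPS = 60e12   # assumed peak compute, FLOPS
    MEM_BW = 2e12        # assumed memory bandwidth, bytes/s

    def attainable_flops(arithmetic_intensity):
        """Attainable FLOPS for a kernel performing `arithmetic_intensity` FLOPs per byte moved."""
        return min(PEAK_FLOPS, MEM_BW * arithmetic_intensity)

    # Low arithmetic intensity (typical of data-centric kernels) leaves most of the compute idle.
    for ai in (2, 10, 50):
        perf = attainable_flops(ai)
        print(f"AI = {ai:>2} FLOP/byte -> {perf / 1e12:5.1f} TFLOPS ({perf / PEAK_FLOPS:.0%} of peak)")

Below the ridge point (30 FLOP/byte with these assumed figures), adding more compute buys nothing; only faster data movement raises achieved performance, which is exactly the investment argument made above.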

Surprisingly, this means:

  1. The industry should expect the development of memory systems an order of magnitude faster in the near future.
  2. The modified Harvard architecture (a slight variation on the von Neumann architecture), which is currently the most widely adopted computer architecture, does not suit modern data-centric workloads; modern compute requirements clearly call for a shift towards data-flow / in-memory computing as an alternative architecture for such workloads.
  3. Development and innovation in new hardware processing platforms, along with the software and programming models that enable them, will accelerate as newer workloads evolve.
  4. The chipmaker market landscape is about to change; the dominance of GPUs as AI processing units will not last long, even though they are currently the best fit.
  5. In the short term, the compute capability differences between the GPU platforms currently on the market, specifically Nvidia, AMD, and Intel datacenter GPUs, may soon prove insignificant.

Although the GPU market expectations above may seem bold, and even suggest a bubble, they are worth preparing for.

———————————————————————————-

1: BERT (Bidirectional Encoder Representations from Transformers) is a popular language model that has been shown to achieve state-of-the-art results on a wide range of NLP tasks.

2: RoBERTa (Robustly Optimized BERT Pretraining Approach) is a variant of BERT with an improved pretraining procedure.

Author:
Tamer Assad Hassan Mahmoud
HPC & Media Streaming Consultant
CEO of PHOTON COMPUTING LLC
LinkedIn: https://www.linkedin.com/in/tamerassad
https://www.photon-computing.com
