
What Is Exascale Computing And Why Does It Matter?

In 2008, the so-called petascale generation of supercomputers set a new benchmark in high-performance computing (HPC). But no matter how quickly technology evolves, our demand for even greater performance continues to exceed capability.

Today we are at the threshold of the next giant leap forward – exascale computing.

What is exascale computing?

Like all the great computing advances, exascale marks a significant step-change. These new computers will be capable of executing 10^18 floating-point operations per second (FLOPS) – a quintillion, or one billion billion (1,000,000,000,000,000,000).

But what does a quintillion look like? Imagine every single person on earth performing one maths calculation every second, 24 hours per day. It would take us four years working around the clock to complete one quintillion calculations – the same amount as an exascale computer does in one second.
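The "four years" figure above can be checked with a quick back-of-envelope calculation. This sketch assumes a world population of roughly 8 billion, each performing one calculation per second:

```python
# Back-of-envelope check: how long would all of humanity take to
# match one second of exascale computing?
EXAFLOPS = 10**18             # operations per second at exascale
POPULATION = 8_000_000_000    # assumed world population
CALCS_PER_PERSON_PER_SEC = 1  # assumed human calculation rate

seconds_needed = EXAFLOPS / (POPULATION * CALCS_PER_PERSON_PER_SEC)
years_needed = seconds_needed / (365 * 24 * 3600)

print(f"{years_needed:.1f} years")  # roughly four years
```

With these assumptions the answer comes out just under four years, in line with the comparison above.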

What’s so great about exascale computing?

With the ability to perform so many calculations simultaneously, an exascale computer can solve complex problems far more quickly than its petascale predecessors. Some estimates suggest that exascale computers will complete these calculations up to 1,000 times faster, allowing data scientists to improve productivity and output.

But exascale is about more than just raw power. By processing data 1000 times faster, an exascale computer can also crunch 1000 times more data in the same timeframes. This opens new possibilities for scientists, providing them with the capacity they need to solve the most complex problems. What once took days and weeks to compute will now be achievable in mere minutes and hours.
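The 1,000× figure follows directly from the metric prefixes themselves: peta denotes 10^15 and exa denotes 10^18, so the peak-throughput ratio is a straightforward division:

```python
# Peak-throughput ratio between the two generations of supercomputer.
PETAFLOPS = 10**15  # petascale: 10^15 operations per second
EXAFLOPS = 10**18   # exascale: 10^18 operations per second

speedup = EXAFLOPS // PETAFLOPS
print(speedup)  # 1000
```

Real-world gains on a given workload depend on how well it parallelises, which is why the prose above hedges with "up to".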

Exascale computing will also play a pivotal role in the future development of artificial intelligence (AI). Using current HPC systems, training an AI model can take weeks as the system learns from training data. With the increased throughput offered by exascale, that training period can be dramatically reduced, allowing businesses to deploy reliable AI models faster.

At the same time, AI models themselves will become increasingly capable, making real-time decisions and taking automated action faster than ever before.

Looking to the future

As these capabilities come online, expect to see significant advances in scientific research. Identifying potential vaccines for the next pandemic will become faster than ever, as researchers can process millions of molecular combinations and protein folds in a matter of hours. Automating the early stages of detection will prevent resources from being wasted on testing candidate treatments that have little or no potential benefit.

Exascale also opens opportunities to monitor and predict the most complex of scenarios. Weather prediction, for instance, will become more accurate as meteorologists process more historical data points to add depth to their analysis. These calculations will allow governments to better forecast the paths of hurricanes and tornadoes, and to issue life-saving guidance to citizens who may be affected.

As the volume of data collected by businesses grows, we need a way to process it – otherwise, its true value will never be unlocked. Exascale offers just the capabilities required to process the entire data estate and to put it to work for the business – or for the good of mankind in general.

To learn more about exascale – or more affordable HPC technologies – and how they will help your business achieve its strategic goals, please drop us a line.
