AMD has launched its second generation of Epyc processors, with a spectacular display of seriously large numbers – including the claim that it’s broken 80 world records. Which is all well and good, but will the new chips improve our lives?
To find out, I’ve been speaking to people at the launch, including industry analysts, AMD engineers and the company’s many partners.
Faster, denser supercomputers
To understand what difference Epyc 2 could make, you need to understand two things: first, these are the processors that sit inside the servers that power the cloud. Second, they’re also powering the next generation of supercomputers, such as Frontier.
For supercomputers, FLOPS are particularly important. FLOPS stands for floating point operations per second, and as always the higher the figure the better. AMD claims that the 2nd Gen Epyc chips deliver four times more FLOPS than their predecessors.
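To make the FLOPS figure concrete, a chip's theoretical peak is just cores × clock speed × floating-point operations per cycle. The sketch below uses illustrative numbers of my own choosing, not AMD's published specifications:

```python
# Back-of-the-envelope peak FLOPS for a hypothetical 64-core server CPU.
# All figures here are illustrative assumptions, not AMD's specs.
cores = 64               # cores per socket
clock_ghz = 2.25         # base clock in GHz
flops_per_cycle = 16     # double-precision FLOPs per core per cycle

peak_gflops = cores * clock_ghz * flops_per_cycle
print(f"Theoretical peak: {peak_gflops:,.0f} GFLOPS")  # 2,304 GFLOPS ≈ 2.3 TFLOPS
```

Doubling the per-cycle throughput or the core count doubles that peak, which is how generational FLOPS jumps like AMD's claimed 4x come about.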
Density is also vital. Much medical and academic research goes on in server rooms within universities, and the greater the processing power they can include per square foot the better. The new generation of Epyc enables universities to boost this hugely, although exact improvements will depend on what went before.
AMD already has a white paper on its website that demonstrates the difference the first generation of Epyc made when it partnered with Oregon State University.
Or consider weather analysis. Via its partner Cray, AMD announced that the US Army was set to become a customer because it wanted localised weather information on demand.
If this second generation delivers on its promises then we should see:
- Improved cancer research, as programs run more quickly
- More accurate and more localised weather prediction
- More scientific breakthroughs in less time (thanks to better modelling of complex scenarios)
Improved AI and machine learning
Rather than manually take notes from the talks, I used Otter.ai to live transcribe what the speakers were saying. It already does a cracking job of understanding non-technical sentences, correcting itself on the fly, but what if Otter had access to even faster hardware? Is near-100% accuracy on the horizon?
Otter is just one example of a service that benefits from machine learning, and the second generation of Epyc processors will improve both sides of that workload: inference and training.
Inference is where the software analyses your data on the fly, while training is where the software churns through terabytes of historic data to work out patterns.
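The training/inference split above can be sketched with a toy one-variable linear model. This is a minimal illustration in pure standard-library Python; the function names and data are mine, not Otter's or AMD's:

```python
# Toy illustration of training vs inference (names and data are illustrative).

def train(history):
    """Training: churn through historic (x, y) pairs to learn a pattern
    (here, an ordinary least-squares line)."""
    n = len(history)
    mean_x = sum(x for x, _ in history) / n
    mean_y = sum(y for _, y in history) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in history)
             / sum((x - mean_x) ** 2 for x, _ in history))
    intercept = mean_y - slope * mean_x
    return slope, intercept

def infer(model, x):
    """Inference: apply the learned pattern to a new data point on the fly."""
    slope, intercept = model
    return slope * x + intercept

model = train([(1, 2.1), (2, 3.9), (3, 6.2), (4, 7.8)])  # slow, batch, historic
print(infer(model, 5))                                   # fast, per-request
```

Training is the heavyweight batch job that benefits from raw FLOPS and core counts; inference is the latency-sensitive part that runs every time you use the service.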
The jump in hardware speeds (and hopefully a drop in prices, because the new AMD chips translate into significantly more computational power per dollar) may even lead to a new generation of “AI” services that we can’t imagine right now. But that, I admit, is entering star-gazing territory.
Faster cloud services
Currently, Google uses Intel processors exclusively in its server farms. That makes sense, because Intel was the obvious choice for years.
Now that AMD has delivered two generations of server-level hardware, and smashed 80 world records with this new release, Google’s buyers must surely be looking in its direction for their next server farm.
Google appeared on stage at the launch to announce that Epyc servers would now be available on Google Cloud and that they would power some Google services. That’s a sign of things to come, and a worrying one for Intel.
Nor was Google alone among cloud service providers appearing on stage, with Amazon and Microsoft both taking a bow at various points.
The knock-on effect of AMD Epyc powering a larger chunk of the world’s servers is that energy consumption will go down. While precise figures are hard to nail down, AMD promises half the power for the same performance – if that translates into real life then it could have a big impact.
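The "half the power for the same performance" claim scales linearly, so a quick calculation shows what it could mean per rack. The power draw and electricity price below are my own assumed figures, purely for illustration:

```python
# Illustrative energy saving if power halves at equal performance.
# The rack power and electricity price are assumptions, not measured data.
old_kw = 10.0            # hypothetical rack power draw, kW
new_kw = old_kw / 2      # AMD's claim: half the power, same performance
hours_per_year = 24 * 365
price_per_kwh = 0.12     # assumed electricity price, $/kWh

saved_kwh = (old_kw - new_kw) * hours_per_year
saved_dollars = saved_kwh * price_per_kwh
print(f"Saved per rack: {saved_kwh:,.0f} kWh/year, about ${saved_dollars:,.0f}/year")
```

Multiply that across the thousands of racks in a hyperscale data centre (and the cooling load that scales with it) and the claim, if borne out, is significant.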
And, on a final personal note as someone who uses Google Drive and occasionally tears his hair out that a file hasn’t synchronised across systems, if Epyc means that Drive will synchronise faster, I will be a very happy man.