News

AMD Instinct MI300A and MI300X - Available at SabrePC

January 19, 2024 • 3 min read

[Image: AMD Instinct MI300A and MI300X data center APU]

AMD Data Center Innovations

In late 2023, AMD launched a highly anticipated data center solution that changes the landscape for HPC and AI. With 3D V-Cache technology, AMD demonstrated its ability to stack cache on a die, hinting at the possibility of stacking components beyond cache memory. AMD has also embraced chiplet-based designs for both its CPUs and its GPU accelerators with Zen 4 and CDNA 3.

The release of the MI300A and MI300X is the culmination of that innovation, and below we dive deeper into what makes these new data center accelerators special for enterprise and research.

What is Special About the AMD Instinct MI300A?

The AMD Instinct MI300A introduces a data center focused accelerated processing unit (APU), fusing AMD Instinct GPU compute with AMD EPYC™ CPU cores on a single package with unified HBM3 memory shared between them, enabling enhanced efficiency, flexibility, and programmability.

APUs, CPUs with built-in graphics, have existed in the consumer and laptop market for years. But many dismiss the idea of an all-in-one processor and opt instead for dedicated graphics.

[Image: AMD Instinct Solutions - MI300A]

The AMD Instinct MI300A comes equipped with 24 AMD EPYC Zen 4 CPU cores, 228 CDNA 3 GPU Compute Units, and 128GB of shared HBM3 memory. Placing the CPU and GPU on the same package, with the shared memory in close proximity, reduces bottlenecks and increases bandwidth, perfect for powering complex HPC workloads.

AMD Instinct MI300X - A Rival to NVIDIA’s DGX H100

Alongside the data center APU, AMD released its newest generation Instinct GPU: the Instinct MI300X baseboard platform features 8 MI300X GPUs connected via AMD Infinity Fabric for 42.4TB/s of peak theoretical aggregate memory bandwidth. Each MI300X OAM GPU is equipped with 304 CDNA 3 Compute Units and 192GB of HBM3 memory.
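To put those per-GPU figures in platform terms, the arithmetic below (a quick sketch, using only the numbers quoted above) shows the total HBM3 pool and the implied per-GPU bandwidth of the 8-GPU baseboard:

```python
# Derive 8-GPU platform totals from the per-GPU specs quoted above.
NUM_GPUS = 8
HBM3_PER_GPU_GB = 192      # per MI300X OAM module
PLATFORM_BW_TBS = 42.4     # peak theoretical aggregate memory bandwidth

total_hbm3_gb = NUM_GPUS * HBM3_PER_GPU_GB       # total pooled GPU memory
per_gpu_bw_tbs = PLATFORM_BW_TBS / NUM_GPUS      # implied per-GPU bandwidth

print(f"{total_hbm3_gb} GB HBM3 total")          # 1536 GB HBM3 total
print(f"{per_gpu_bw_tbs} TB/s per GPU")          # 5.3 TB/s per GPU
```

That 1.5TB of pooled HBM3 is what lets the platform hold today’s largest models on a single baseboard.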

[Image: AMD Instinct Solutions - MI300X]

The larger GPU memory and high bandwidth deliver competitive speeds for AI training and inference on the most complex, demanding models. With higher FP64 and FP32 performance per GPU, there is no question why this new accelerator is making waves in the HPC landscape.

Where to Purchase an AMD Instinct System?

At SabrePC, we offer turnkey solutions featuring both platforms: AMD Instinct MI300A APU and AMD Instinct MI300X GPU servers. We currently list three systems: two MI300A APU servers and one AMD Instinct MI300X server. Take a look at our AMD Instinct Solutions page for more info, to get a quote, and to talk to one of our representatives today.


Tags

amd

data center

hpc

ai
