This roadmap for AI chips in the data center shows that NVIDIA will dominate well into 2027 and beyond

In a recent data center AI chip roadmap published on X, we get a good overview of what the companies already have on the market and what's in the AI chip pipeline through 2027. Check it out:

The list includes chipmakers NVIDIA, AMD, Intel, Google, Amazon, Microsoft, Meta, ByteDance, and Huawei. You can see that the list of NVIDIA AI GPUs runs from the Ampere A100 through the Hopper H100, GH200, and H200, to the Blackwell B200A, B200 Ultra, GB200 Ultra, and GB200A. But after that – and we all know it's coming – come Rubin and Rubin Ultra, both of which feature next-generation HBM4 memory.

We also have AMD's growing line of Instinct MI series AI accelerators, from the MI250X to the new MI350 and the upcoming MI400, listed for 2026 and beyond.

Google is close behind in third place with its TPU processor series: the roadmap lists everything from the TPU v5e up to the next-generation TPU v7p, due in 2025. Intel has its Gaudi 2 and Gaudi 3 on the list, while its next Falcon Shores AI processor is not expected until the second half of 2025.

We expect NVIDIA's next-generation Rubin R100 AI GPUs to use a 4x reticle design (compared to Blackwell's 3.3x reticle design), built with TSMC's cutting-edge CoWoS-L packaging technology on the new N3 process node. TSMC recently talked about chips with up to 5.5x reticle size coming in 2026, featuring a 100x100mm substrate that would handle 12 HBM sites, compared to 8 HBM sites on current-generation 80x80mm packages.

TSMC will also be moving to a new SoIC design that will allow for over 8x the reticle size on a larger 120x120mm package configuration, but as Wccftech notes, these are still in the planning stages, so we can probably expect around 4x the reticle size for the Rubin R100 AI GPUs.
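To put those reticle multiples and package dimensions into perspective, here is a minimal back-of-the-envelope sketch. The reticle multiples, package edges, and HBM site counts come from the roadmap details above; the roughly 858mm² single-reticle exposure limit is a commonly cited lithography figure rather than something stated in the roadmap, and the entries marked as not stated are simply gaps in the source.

```python
# Rough comparison of the reticle multiples and package sizes mentioned above.
# Assumption: one lithography reticle (exposure field) is about 26 x 33 mm,
# or ~858 mm^2 -- a commonly cited limit that the roadmap itself does not state.
RETICLE_MM2 = 26 * 33  # ~858 mm^2

# (reticle multiple, package edge in mm, HBM sites) -- None where the roadmap
# details above give no figure.
designs = {
    "Current generation (Blackwell-class, 3.3x)": (3.3, 80, 8),
    "Rubin R100 (expected, 4x)": (4.0, None, None),
    "TSMC 5.5x-reticle chips (2026)": (5.5, 100, 12),
    "TSMC SoIC design (over 8x)": (8.0, 120, None),
}

for name, (multiple, edge_mm, hbm_sites) in designs.items():
    silicon_mm2 = multiple * RETICLE_MM2
    package = (f"{edge_mm}x{edge_mm}mm ({edge_mm * edge_mm} mm^2) package"
               if edge_mm else "package size not stated")
    hbm = f"{hbm_sites} HBM sites" if hbm_sites else "HBM site count not stated"
    print(f"{name}: ~{silicon_mm2:,.0f} mm^2 of silicon, {package}, {hbm}")
```

Going from a 3.3x to a 4x reticle multiple works out to roughly 21% more silicon area, while moving from an 80x80mm to a 100x100mm substrate adds about 56% more package area, which is where the extra HBM sites come from.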
