Samsung Readies AI-Friendly HBM4 Chips for Nvidia


Samsung and SK Hynix appear to be on the cusp of producing HBM4 chips, ushering in a new generation of high-bandwidth memory (HBM) for AI data centers. According to a source cited by Reuters, Samsung will begin producing HBM4 in February. SK Hynix is also expected to start HBM4 production soon.

Because chip manufacturers are fairly secretive about production, it’s not entirely clear which company will reach production first. And because both companies appear to be getting up to speed at about the same time, being first might not matter much: There are plenty of customers waiting to get hold of the new memory. In fact, Reuters noted that SK Hynix had wrapped up talks with customers (for 2026 orders) by October of last year.

Nvidia will be one of Samsung’s customers for the HBM4 chips. Neither company has indicated how many of the chips are going to Nvidia. Still, it’s a good sign for Samsung, which has seen stiff competition from SK Hynix in the lucrative HBM market. Near the end of 2024, Samsung’s chairman publicly apologized for the company’s financial struggles and promised to return it to its leadership position in the tech sector. Now, with its HBM4 chips nearing production and demand for AI hardware through the roof, Samsung looks to have built some real momentum.


As AI data centers gobble up hardware, supplies of common chips (especially memory) are running low. The result has been elevated prices for hardware well beyond typical data center server parts. Prices are jumping for a number of components used in consumer PCs, including graphics cards, RAM, SSDs, and even old-school hard drives.

With SSD supplies dwindling and prices spiking, some data centers are finding that a mix of SSDs and HDDs makes for a significantly less expensive setup. Memory makers, meanwhile, are devoting more resources to building parts specifically for AI data centers. Micron (which is also preparing to produce HBM4 chips) even went so far as to discontinue its long-time Crucial brand of memory. The brand had been a staple for many gamers and DIY PC builders. Micron is still producing memory for major PC builders, like Dell, but dropping the Crucial brand remains a stark reminder of just how much AI is changing the tech landscape.

HBM4 memory isn’t likely to appear in consumer GPUs, but it offers fast, power-efficient memory for GPUs built for AI tasks. Both AMD and Nvidia use current-generation HBM3 and HBM3E chips in their data center products.


