Monday, October 7, 2024

Powering AI: Samsung’s game-changer

SEOUL (ANN/THE KOREA HERALD) – Samsung aims to lead in AI technology with its cutting-edge memory chip boasting the industry's highest capacity.

“The industry’s AI service providers are increasingly requiring HBM with higher capacity, and our new HBM3E 12H product has been designed to answer that need,” said Bae Yong-cheol, Executive Vice President of Memory Product Planning at Samsung Electronics.

“This new memory solution forms part of our drive toward developing core technologies for high-stack HBM and providing technological leadership for the high-capacity HBM market in the AI era,” he said.

High Bandwidth Memory (HBM) is a memory chip with low power consumption and ultra-wide communication lanes; it vertically stacks memory dies to break the processing bottlenecks caused by conventional memory chips.
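As a rough illustration of where that bandwidth comes from, peak throughput is simply per-pin data rate multiplied by interface width. The 1,024-bit interface and roughly 10 Gb/s per-pin rate below are commonly published HBM3E figures, used here as assumptions rather than numbers taken from the article:

```python
# Back-of-the-envelope HBM peak bandwidth: per-pin rate x interface width.
# The 1,024-bit interface and ~10 Gb/s per-pin rate are commonly published
# HBM3E figures (assumptions, not stated in this article).

def hbm_bandwidth_gbps(interface_bits: int, pin_rate_gbit: float) -> float:
    """Peak bandwidth in gigabytes per second."""
    return interface_bits * pin_rate_gbit / 8  # divide by 8: bits -> bytes

# A 1,024-bit stack at 10 Gb/s per pin:
print(hbm_bandwidth_gbps(1024, 10.0))  # -> 1280.0 GB/s
```

That 1,280 GB/s result lines up with the figure Samsung quotes for the HBM3E 12H later in the article.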

In light of the exponential growth of the AI market, the advanced, high-capacity memory chip would be an optimal solution for future systems that require more memory, Samsung said.

When used in AI applications, it is estimated that, in comparison to adopting HBM3 8H, the average speed for AI training can be increased by 34 per cent while the number of simultaneous users of inference services can be expanded more than 11.5 times.

“Its higher performance and capacity will especially allow customers to manage their resources more flexibly and reduce the total cost of ownership for data centres,” Samsung said.

Samsung’s HBM3E 12H provides an all-time high bandwidth of up to 1,280 gigabytes per second (GB/s) and an industry-leading capacity of 36 GB, the company said. In comparison to the 8-stack HBM3 8H, both aspects have improved by more than 50 per cent.
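The "more than 50 per cent" claim can be sanity-checked against commonly cited HBM3 8H figures (819.2 GB/s peak bandwidth and 24 GB capacity). Those baseline numbers are assumptions for this sketch, not values stated in the article:

```python
# Compare the HBM3E 12H figures from the article against commonly cited
# HBM3  8H figures (819.2 GB/s, 24 GB -- assumed baseline, not from the
# article itself).

hbm3e_12h = {"bandwidth_gbs": 1280.0, "capacity_gb": 36}
hbm3_8h = {"bandwidth_gbs": 819.2, "capacity_gb": 24}  # assumed baseline

for key in hbm3e_12h:
    gain = (hbm3e_12h[key] - hbm3_8h[key]) / hbm3_8h[key] * 100
    print(f"{key}: +{gain:.0f}%")
# bandwidth_gbs: +56%
# capacity_gb: +50%
```

Under this assumed baseline, bandwidth improves by about 56 per cent and capacity by 50 per cent, broadly consistent with the comparison Samsung draws.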

The world’s largest memory chipmaker said it has begun providing samples of HBM3E 12H to customers, and mass production is slated for the first half of this year.

The HBM3E 12H applies advanced thermal compression non-conductive film (TC NCF), allowing the 12-layer products to have the same height specification as 8-layer ones to meet current HBM package requirements.

As the industry seeks to mitigate the chip die warping that comes with thinner dies, this technology is anticipated to have added benefits, especially for higher stacks, Samsung explained.

With its continued effort to lower the thickness of its NCF material, it achieved the industry’s smallest gap between chips at seven micrometres, and also eliminated voids between layers, Samsung said. This has led to enhanced vertical density by over 20 per cent when compared to the HBM3 8H, the chipmaker added.

Global chipmakers are competing to gain the upper hand in the HBM market, where demand is growing rapidly amid the increasing use of generative AI.

Micron Technology, a US-based chipmaker, announced it has started mass production of HBM3E, ahead of rivals Samsung and SK hynix. With a capacity of 24 GB, Micron’s latest chip would be part of Nvidia’s H200 Tensor Core GPUs, the company said.

The HBM market, which took about one per cent of the total memory chip market in terms of volume last year, is anticipated to more than double this year.

Samsung Electronics and SK hynix are competing head-on in the new chip market, each holding some 45 per cent of the market share. Micron comes in third, supplying about 10 per cent of all orders.
