Samsung’s rapid AI-optimised DRAM


SEOUL (ANN/THE KOREA HERALD) – Samsung Electronics, the global memory chip market leader by revenue, announced on Wednesday that it has developed the fastest LPDDR5X memory chip, tailored for on-device artificial intelligence applications.

The new Low Power Double Data Rate 5X (LPDDR5X) chip reaches speeds of up to 10.7 gigabits per second while maintaining the smallest form factor among products built on the 12-nanometer-class process technology. Samsung compares that speed to transmitting roughly 20 full HD movie files of 4 gigabytes each in a single second.
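For readers curious where the movie comparison comes from, the back-of-the-envelope arithmetic below is a rough sketch. It assumes the 10.7 gigabits per second figure is a per-pin data rate and that the package is used on a 64-bit-wide bus, a typical flagship-smartphone LPDDR configuration; neither assumption is stated in the article.

```python
# Rough sanity check of the "about 20 full HD movies per second" comparison.
# Assumptions (not from the article): 10.7 Gbps is a per-pin data rate,
# and the memory sits on a 64-bit bus, as is common in flagship phones.

per_pin_gbps = 10.7      # data rate per pin, gigabits per second
bus_width_bits = 64      # assumed total bus width
movie_size_gb = 4        # one full HD movie file, gigabytes

bandwidth_gb_per_s = per_pin_gbps * bus_width_bits / 8  # convert bits to bytes
movies_per_second = bandwidth_gb_per_s / movie_size_gb

print(f"Peak bandwidth: {bandwidth_gb_per_s:.1f} GB/s")    # ~85.6 GB/s
print(f"4 GB movies per second: {movies_per_second:.1f}")  # ~21, i.e. roughly 20
```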

“As demand for low-power, high-performance memory increases, LPDDR DRAM is expected to expand its applications from mainly mobile to other areas that traditionally require higher performance and reliability such as PCs, accelerators, servers and automobiles,” said Bae Yong-cheol, executive vice president of product planning at Samsung’s memory chip business.

“Samsung will continue to innovate and deliver optimised products for the upcoming on-device AI era through close collaboration with customers.”

Samsung will start mass production of the 10.7Gbps LPDDR5X by the second half of the year, following verification with mobile application processors and mobile device providers, the company said.

Compared with the previous-generation model, the new product delivers more than 25 per cent higher performance and more than 30 per cent greater capacity.

Samsung Electronics’ 10.7Gbps LPDDR5X. PHOTO: ANN/THE KOREA HERALD

It also expands the single-package capacity of mobile DRAM to up to 32 gigabytes, making it the most efficient solution for on-device AI applications, the company said.

Low-power, high-performance LPDDR memory chips have taken on a more important role with the spread of on-device AI, which processes AI workloads directly on the device and is increasingly crucial in the burgeoning tech industry.

Samsung said the new LPDDR5X incorporates specialised power-saving technologies, such as optimised power variation, which adjusts power according to workload, and expanded low-power mode intervals, which lengthen the energy-saving periods.

These improvements enhance power efficiency by 25 per cent over the previous generation, enabling mobile devices to provide longer battery life and allowing servers to minimise the total cost of ownership by lowering energy usage when processing data, the tech giant added.

The application of on-device AI is expected to expand into a wide range of tech sectors, including smartphones, wearable devices, robots and autonomous vehicles, leading to higher demand for high-performance, high-capacity memory chips.

According to Omdia, a market tracker, the global demand for mobile DRAM chip capacity is expected to grow from 67.6 billion gigabytes in 2023 to 125.9 billion gigabytes by 2028, showing a compound annual growth rate of 11 per cent during the period.

The market tracker also forecasts global mobile DRAM revenue to more than double, from USD12.3 billion in 2023 to USD26.3 billion in 2028.