Samsung Electronics on Tuesday said it has developed a new high-bandwidth memory chip that has the “highest-capacity to date” in the industry.
The South Korean chip giant claimed the HBM3E 12H “raises both performance and capacity by more than 50%.”
“The industry’s AI service providers are increasingly requiring HBM with higher capacity, and our new HBM3E 12H product has been designed to answer that need,” said Yongcheol Bae, executive vice president of memory product planning at Samsung Electronics.
“This new memory solution forms part of our drive toward developing core technologies for high-stack HBM and providing technological leadership for the high-capacity HBM market in the AI era,” said Bae.
Samsung Electronics is the world’s largest maker of dynamic random-access memory chips, which are used in consumer devices such as smartphones and computers.
Generative AI models such as OpenAI’s ChatGPT require large numbers of high-performance memory chips. Such chips enable generative AI models to remember details from past conversations and user preferences in order to generate humanlike responses.
The AI boom continues to fuel chipmakers. U.S. chip designer Nvidia posted a 265% jump in fourth fiscal quarter revenue thanks to skyrocketing demand for its graphics processing units, thousands of which are used to run and train ChatGPT.
During a call with analysts, Nvidia CEO Jensen Huang said the company may not be able to maintain this level of growth or sales for the whole year.
“As AI applications grow exponentially, the HBM3E 12H is expected to be an optimal solution for future systems that require more memory. Its higher performance and capacity will especially allow customers to manage their resources more flexibly and reduce total cost of ownership for datacenters,” said Samsung Electronics.
Samsung said it has started sampling the chip to customers and mass production of the HBM3E 12H is planned for the first half of 2024.
“I assume the news will be positive for Samsung’s share price,” SK Kim, executive director of Daiwa Securities, told CNBC.
“Samsung was behind SK Hynix in HBM3 for Nvidia last year. Also, Micron announced mass production of 24GB 8L HBM3E yesterday. I assume it will secure leadership in higher layer (12L) based higher density (36GB) HBM3E product for Nvidia,” said Kim.
In September, Samsung secured a deal to supply Nvidia with its high-bandwidth memory 3 chips, according to a Korea Economic Daily report, which cited anonymous industry sources.
The report also said that SK Hynix, South Korea’s second-biggest memory chipmaker, was leading the high-performance memory chip market. SK Hynix was previously known as the sole mass producer of HBM3 chips supplied to Nvidia, the report said.
Samsung said the HBM3E 12H has a 12-layer stack but applies an advanced thermal compression non-conductive film (NCF), which allows the 12-layer products to have the same height specification as 8-layer ones to meet current HBM package requirements. The result is a chip that packs more processing power without increasing its physical footprint.
“Samsung has continued to lower the thickness of its NCF material and achieved the industry’s smallest gap between chips at seven micrometers (µm), while also eliminating voids between layers,” said Samsung. “These efforts result in enhanced vertical density by over 20% compared to its HBM3 8H product.”
Source: https://www.cnbc.com/2024/02/27/samsung-unveils-new-memory-chip-with-highest-capacity-to-date-for-ai.html