SK hynix: we'll invest $1bn in HBM this year

South Korea's SK hynix will spend big on the development of its high-bandwidth memory (HBM) technology as competition heats up in the next-gen AI GPU space.

In an interview with Bloomberg, Dr Lee Kang-Wook – who heads up packaging development at SK hynix – said the firm will invest more than USD 1 billion in 2024 to improve the final steps of its chip manufacturing.

HBM is critical to the development of AI GPUs. High-bandwidth memory stacks chips on top of one another and connects them with through-silicon vias (TSVs), delivering the fast, energy-efficient data processing that AI applications require.

Lee said the first 50 years of the semiconductor industry "has been about the front end", or the design and fabrication of the chips themselves. "But the next 50 years is going to be all about the back end", or packaging.

Business is booming for SK hynix at the moment, thanks largely to huge demand from AI applications. Market-leading AI chip designer Nvidia currently uses SK hynix's HBM3 and HBM3E memory modules in its AI GPUs, including the Hopper H100, the forthcoming Hopper H200 and the Blackwell B100. AMD's new Instinct MI300X AI GPU also incorporates HBM3.

SK hynix recently said that this year's supply of HBM is already sold out. The company posted an operating profit of 346 billion won (USD 259 million) in Q4 2023 after four consecutive quarters of losses.
April 15 2024