Don’t Count Samsung Out in the AI Memory Stakes

Daniel Nenni

Admin
Staff member
Don’t Count Samsung Out in the AI Memory Stakes

© Provided by The Wall Street Journal

South Korean technology giant Samsung Electronics has fallen behind in the artificial-intelligence race—at least in the first heat.

It would be foolish to count it out, though. Recent signs indicate it might be narrowing the technological gap with rivals SK Hynix and Micron in high-performance AI memory chips. Even if it takes longer than expected to catch up, a tighter overall memory market thanks to the AI boom could still be a significant tailwind for Samsung.

Nvidia’s AI chips have been selling like hot cakes since the rise of generative AI apps such as ChatGPT. Memory-chip makers have, in turn, sold out their high-performance products to Nvidia and others. High-bandwidth memory, or HBM, offers enhanced data-processing speed, which is crucial for AI number crunching.
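
To put the speed point in concrete terms, here is a rough, illustrative sketch of why memory bandwidth, HBM's main selling point, caps how fast an AI accelerator can generate output. All of the bandwidth and model-size numbers below are assumptions chosen for illustration, not the specifications of any particular Nvidia or Samsung product.

# Roofline-style sketch: when generating text with a large language model,
# each output token requires streaming roughly all model weights from memory,
# so throughput is capped by memory bandwidth. Numbers are illustrative only.

def max_tokens_per_second(params_billion: float,
                          bytes_per_param: float,
                          bandwidth_gb_per_s: float) -> float:
    """Upper bound on decode throughput when generation is bandwidth-bound."""
    bytes_per_token = params_billion * 1e9 * bytes_per_param
    return bandwidth_gb_per_s * 1e9 / bytes_per_token

# Assumed example: a 70-billion-parameter model stored as 16-bit (2-byte) weights.
for label, bandwidth in [("conventional DRAM system, ~400 GB/s", 400),
                         ("HBM-class accelerator, ~3,000 GB/s", 3000)]:
    print(f"{label}: ~{max_tokens_per_second(70, 2, bandwidth):.0f} tokens/s upper bound")

Under these assumed figures, the HBM-backed system's ceiling is roughly seven to eight times higher, which is the basic reason HBM is in such demand for AI number crunching.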

Korea’s SK Hynix has taken an early lead in HBM. It is virtually the only supplier to Nvidia for the latest-generation memory chip, called HBM3. Samsung only started mass-producing HBM3 in the second half of last year. It does produce earlier generations of HBM, but those chips go into slower AI processors rather than the most cutting-edge ones made by Nvidia.

And now SK Hynix has started mass-producing its next-generation chips, called HBM3E. SK’s smaller rival Micron, which essentially skipped the previous generation, is doing the same. Both companies said they have already sold out their full HBM production volume for this year and are filling orders for next year.


Even so, Samsung is working hard to catch up. The company expects to mass-produce next-generation HBM chips in the first half of this year. That would leave it about a fiscal quarter rather than a full year—as with the previous generation of HBM chips—behind the competition.

Moreover, on March 19, Nvidia Chief Executive Jensen Huang said the company is in the process of testing Samsung’s next-generation HBM chips, according to Japan’s Nikkei. An interesting aside: Huang wrote “Jensen Approved” along with his signature on Samsung’s HBM3E product display at Nvidia’s AI conference this March.

Samsung will have to ensure its products are up to standard while simultaneously ramping up capacity. But given the extremely tight supply situation, Nvidia has every reason to want an additional supplier.

If Samsung does manage to catch up, it could tap into a fast-rising segment of the memory market.

Bernstein Research estimates HBM sales will expand this year to 16% of total industry revenue from DRAM, the type of chip used as working memory. Goldman Sachs, in a report dated March 22, raised its estimate of the HBM market to $23 billion by 2026; that would represent a 10-fold increase from $2.3 billion in 2022.
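
As a quick arithmetic check on those figures, the implied growth rate works out as follows. This is a simple derivation from the numbers quoted above, not a figure taken from either report.

# Implied compound annual growth rate from the HBM market figures cited above.
hbm_2022 = 2.3   # market size in $ billion, 2022 (per the article)
hbm_2026 = 23.0  # market size in $ billion, 2026 estimate (per the article)

growth_multiple = hbm_2026 / hbm_2022                 # the 10-fold increase
implied_cagr = (hbm_2026 / hbm_2022) ** (1 / 4) - 1   # compounded over four years

print(f"Growth multiple 2022->2026: {growth_multiple:.0f}x")
print(f"Implied compound annual growth rate: {implied_cagr:.0%}")

That works out to roughly 78% compound annual growth over four years, which underlines how quickly this segment is expected to expand.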

But the jump in demand for HBM chips will also help keep the overall memory market tighter as more capacity is used to make these high-margin chips. That shift will benefit Samsung, which has a cost advantage over its peers in conventional memory products. The rising use of AI applications might also require more powerful devices with higher memory capacity in general.

Samsung’s shares have sharply lagged behind rivals SK Hynix and Micron, whose values have more than doubled since the start of last year. That is partly because Samsung isn’t a pure memory-chip company. But it also reflects slower progress in HBM.

Samsung, which is the market leader for the overall memory market, now finds itself in the uncomfortable position of catching up. Sustaining that sprint will be expensive, but a tighter overall memory market—and a potential assist from Nvidia—would be a big help.

 
Back to a basic question: Did anyone count Samsung out of the HBM competition?
 