
US semiconductor company AMD’s AI data center chip sales reached $12.6 billion last year, up 94% from the previous year. Even so, that annual figure amounts to only about 40% of what “AI giant” Nvidia earns in a single quarter, so AMD’s market presence remains modest.
Memory manufacturers supplying high-bandwidth memory (HBM) for AI accelerators can therefore capture the AI boom only by winning Nvidia, with its commanding market share, as a customer.
According to AMD’s latest earnings report released Tuesday, its data center division revenue for Q4 of fiscal year 2024 (October to December) reached $3.86 billion, a 69% year-over-year increase.
The company’s annual data center revenue surged 94% to $12.6 billion, with $5 billion attributed to its Instinct series of AI GPUs. The remainder came from server CPU products such as the EPYC series.
During the earnings call, AMD Chair and CEO Lisa Su stated, “The year 2024 was AMD’s inflection point, and annual sales in the data center division nearly doubled, recording the largest annual sales and net profit ever.”
Su expressed optimism about AMD’s future, predicting that the rapid expansion of its data center franchise will generate tens of billions in annual revenue over the coming years, enabling the company to enter a steep long-term growth trajectory.
Despite this impressive growth, the data center division still fell short of market expectations: analysts had projected Q4 data center revenue of $4.14 billion. The shortfall is widely read as a reflection of Nvidia’s dominance in the AI semiconductor market.
While AMD and Nvidia have long been rivals in the gaming GPU market, the gap widened as Nvidia moved first to develop AI-specific GPUs and to build out an AI ecosystem around its CUDA software development platform.
To put this in perspective, AMD’s annual data center division revenue of $12.6 billion represents just 40% of Nvidia’s projected data center revenue for Q3 of fiscal year 2025 (August to October 2024), estimated at $30.8 billion.
This stark contrast reaffirms Nvidia’s dominance in the AI market, and it explains why suppliers of essential components such as HBM prioritize Nvidia as a customer.
SK Hynix, a key supplier of cutting-edge fifth-generation HBM (HBM3E) to Nvidia, posted a record operating profit of $16.2 billion last year. Samsung Electronics, whose HBM3E is still undergoing quality verification, recorded an operating profit of around $10.4 billion.
Samsung Electronics is redoubling its efforts to secure Nvidia as a customer. During its January 31 earnings call, the company said it plans to begin supplying improved HBM3E products to major clients by the end of Q1, with full-scale production slated for Q2.