
The global server market achieved a record-breaking quarterly growth rate of 134.1% in the first quarter of this year, dispelling concerns about a potential bubble. Both traditional x86-based servers and ARM-based servers, known for their superior power efficiency, are experiencing rapid growth. This surge is fueling industry expectations for increased demand in high-performance memory.
AI Inference Drives Server Market to Record Growth in Q1
According to market research firm IDC, the global server market reached an unprecedented 95.2 billion USD in the first quarter of this year. This represents a staggering 134.1% increase compared to the same period last year, marking the highest quarterly growth rate ever recorded.
Based on these impressive results, IDC has revised its forecast for the annual server market size this year to 366 billion USD, a 44.6% increase from the previous year.
Artificial intelligence (AI) servers are undoubtedly spearheading the growth in the overall server market. The emergence and rapid adoption of generative AI, exemplified by ChatGPT, have led to a surge in demand for high-performance AI servers needed to train and run inference on large AI models.
Some skeptics had raised concerns about an AI bubble, questioning whether big tech companies could generate real profits from their massive AI investments. However, the accelerating growth of AI servers is effectively silencing these doubts.
High-performance AI servers equipped with graphics processing units (GPUs), which are designed for parallel computing, are essential for training AI models. However, some predicted that investment might slow as AI inference, which requires less powerful hardware than training, becomes more widespread.
Contrary to these expectations, the explosive growth in AI inference demand is driving up investments in AI infrastructure. As of June, ChatGPT’s weekly active users have soared to approximately 800 million, doubling from 400 million in February. This surge is attributed to the increasing sophistication of AI inference and its widespread adoption across various industries. Analysts project that OpenAI’s revenue will skyrocket from 1 billion USD in 2023 to 11 billion USD this year, representing an elevenfold increase in just two years.
ARM Servers Gain Traction, Boosting High-Performance Memory Demand
The booming AI server market is not only fueling demand for traditional x86-based servers but also accelerating the growth of ARM-based servers, which are optimized for AI inference. To cater to the exploding number of AI inference users, companies must process enormous workloads simultaneously while maintaining operational efficiency.
This is where ARM-based servers shine, leveraging simpler instruction sets to reduce power consumption and cut data center operating costs. IDC predicts that the non-x86 server market will reach 82 billion USD, a 63.7% year-over-year increase. ARM-based servers are expected to show a remarkable 70% growth rate, capturing 21.1% of total shipments this year. Meanwhile, the x86-based server market, dominated by Intel and AMD processors, is projected to grow by 39.9% to 283.9 billion USD.
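As a quick sanity check, the IDC figures cited above hang together: the two segment projections roughly sum to the 366 billion USD full-year forecast, and the 63.7% growth rate implies a prior-year non-x86 market of about 50 billion USD. A minimal sketch of that arithmetic (figures taken from the article, not independently verified):

```python
# All values in billions of USD, taken from the IDC projections cited above.
total_2025 = 366.0      # projected full-year server market
non_x86_2025 = 82.0     # projected non-x86 segment
x86_2025 = 283.9        # projected x86 segment

# The two segments should roughly sum to the total market projection.
assert abs((non_x86_2025 + x86_2025) - total_2025) < 1.0

# A 63.7% year-over-year increase implies a prior-year non-x86 market
# of roughly 50 billion USD.
non_x86_prior = non_x86_2025 / 1.637
print(round(non_x86_prior, 1))  # ≈ 50.1
```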
The expansion of the AI server market is driving up demand for high-performance memory, boosting the performance of related companies. SK Hynix, a pioneer in developing high-bandwidth memory (HBM) and maintaining technological leadership, continues to set new performance records.
U.S.-based Micron is also aggressively entering the HBM market, achieving a record quarterly revenue of 9.3 billion USD in the third quarter of fiscal 2025 (February to May). Its HBM revenue surged by 50% compared to the previous quarter.
Samsung Electronics, which has traditionally focused more on general-purpose dynamic random access memory (DRAM), is gearing up for a comeback. In the second quarter of this year, Samsung Electronics introduced its fifth-generation HBM (HBM3E) with a 12-layer stacked design and successfully integrated it into AMD's next-generation AI accelerator. If the company passes quality verification from Nvidia, the largest HBM customer, in the second half of the year, it is expected to join other memory manufacturers in reaping the benefits of the booming AI server market.