
Memory-Centric Computing: Why HBM is the Game Changer for AI and Data Centers

Visitors look around an exhibition booth at the ‘SK AI Summit 2025’ held at COEX in Gangnam-gu, Seoul, on Nov. 3, 2025. © News1

On November 3rd, Kim Ho-sik, Vice President of SK Hynix’s Memory System Research Center, emphasized that the computing paradigm is shifting toward memory-centric approaches. He stressed that memory innovation is essential to advancing artificial intelligence (AI) systems.

During a panel discussion at the ‘SK AI Summit 2025’ in Seoul’s COEX convention center, Kim urged all ecosystem players to view memory companies as technology partners and solution providers rather than mere suppliers. He advocated for collaborative innovation in the field.

The ‘New Semiconductor Solutions’ session featured a keynote by David A. Patterson, a Google engineer and UC Berkeley professor emeritus, on “The Reality and Future of Memory-Centric Computing: Addressing Memory Bottlenecks.” A panel discussion followed, including Vice President Kim, TSMC engineer and Stanford professor Philip Wong, and Meta engineer Kim Chang-kyu.

With the recent surge in AI demand, the industry faces a critical challenge: computational devices like GPUs and CPUs are outpacing memory development, creating significant bottlenecks.

This imbalance is driving demand for next-generation memory solutions, particularly high-bandwidth memory (HBM), as the industry transitions from processor-centric to memory-centric architectures.
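
To make the bottleneck concrete, here is a minimal back-of-the-envelope “roofline” sketch in Python. It assumes purely hypothetical peak-compute and memory-bandwidth figures (not the specifications of any real accelerator or HBM product) and shows how kernels with low arithmetic intensity end up limited by memory bandwidth long before the compute units are saturated.

```python
# Back-of-the-envelope "roofline" check: is a kernel compute-bound or
# memory-bound? All figures are illustrative placeholders, not the
# specifications of any real accelerator or HBM stack.

PEAK_FLOPS = 1.0e15      # hypothetical peak arithmetic throughput, FLOP/s
MEM_BANDWIDTH = 3.0e12   # hypothetical memory bandwidth, bytes/s

def attainable_flops(arithmetic_intensity: float) -> float:
    """Attainable FLOP/s for a kernel performing `arithmetic_intensity`
    floating-point operations per byte moved to or from memory."""
    return min(PEAK_FLOPS, MEM_BANDWIDTH * arithmetic_intensity)

# Low-intensity kernels (e.g. the matrix-vector work that dominates many
# AI inference workloads) are starved by memory well before compute saturates.
for intensity in (1, 2, 8, 64, 512):
    fraction = attainable_flops(intensity) / PEAK_FLOPS
    print(f"{intensity:4d} FLOP/byte -> {fraction:.1%} of peak compute usable")
```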

Addressing the crucial role of memory in data centers, Kim stated, “HBM has been and will continue to be a game-changer.” Referencing NVIDIA CEO Jensen Huang’s quip about one day reaching HBM97, he added, “While that’s likely an exaggeration, HBM will remain essential.”

During a recent press conference in South Korea, coinciding with the APEC summit, Huang praised Samsung Electronics for its diversity and SK Hynix for its focus. He expressed complete confidence in their joint development of future HBM generations, including HBM4, HBM5, and even the speculative HBM97.

Kim highlighted energy efficiency as a primary concern, pointing to ‘compute-near-memory’ solutions. This approach positions computational devices adjacent to, or directly above, memory components.

Traditional processor-centric designs require data to travel long distances between memory and processors. By integrating processors within or near memory chips, latency can be significantly reduced, alleviating bandwidth bottlenecks and dramatically cutting energy consumption in data transfer.
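
As a rough illustration of why shortening that path matters, the toy Python model below compares the energy bill of a processor-centric design, which hauls every byte across an off-chip link, with a near-memory design that computes where the data lives. The per-byte energy figures are illustrative assumptions chosen to reflect the often-cited gap between on-chip compute and off-chip data movement; they are not measurements of any SK Hynix or NVIDIA product.

```python
# Toy energy accounting for processor-centric vs. near-memory execution.
# The per-byte energy figures are assumptions for illustration only; they
# roughly mirror the widely cited gap between on-chip compute and off-chip
# data movement, not measurements of any specific product.

BYTES_MOVED = 1_000_000_000              # 1 GB of operand traffic
OFF_CHIP_TRANSFER_PJ_PER_BYTE = 100.0    # long hop across the board (assumed)
NEAR_MEMORY_TRANSFER_PJ_PER_BYTE = 10.0  # short hop next to the stack (assumed)
COMPUTE_PJ_PER_BYTE = 1.0                # arithmetic performed on the data (assumed)

def total_energy_joules(transfer_pj_per_byte: float) -> float:
    """Energy to move BYTES_MOVED once and operate on them, in joules."""
    picojoules = BYTES_MOVED * (transfer_pj_per_byte + COMPUTE_PJ_PER_BYTE)
    return picojoules * 1e-12  # picojoules -> joules

processor_centric = total_energy_joules(OFF_CHIP_TRANSFER_PJ_PER_BYTE)
near_memory = total_energy_joules(NEAR_MEMORY_TRANSFER_PJ_PER_BYTE)
print(f"processor-centric: {processor_centric:.3f} J, "
      f"near-memory: {near_memory:.3f} J "
      f"({processor_centric / near_memory:.1f}x less energy under these assumptions)")
```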

“Near-memory processing is on the horizon,” Kim concluded, “with numerous products set to debut in the near future.”
