OpenAI Unveils New AI Model: Are Higher Subscription Fees on the Horizon?

OpenAI is set to launch a new computationally intensive artificial intelligence (AI) model.

Given the substantial computational resources invested in enhancing this model’s performance, the company has signaled a premium service strategy that will include additional fees on top of its already high-priced plans.

Sam Altman, CEO of OpenAI (image captured from his X, formerly Twitter, account)

Sam Altman, Chief Executive Officer (CEO) of OpenAI, announced on his X (formerly Twitter) account on September 21 that the company plans to unveil a new computationally intensive service in the coming weeks. He noted that, due to cost considerations, some features will be exclusive to Pro subscribers, while others may incur extra charges.

The current ChatGPT Pro subscription is priced at 200 USD per month.

Altman emphasized that OpenAI's ultimate goal remains to drive down costs and make ChatGPT widely accessible, and that he is confident the company will achieve this over time. For the moment, however, he added, the company is excited to explore the potential of ideas that leverage significant computational resources.

Recently, much of the AI industry has focused on developing low-cost, high-efficiency models.

Analysts view Altman’s announcement as a strategic response to user feedback suggesting that the recently released GPT-5 didn’t deliver the dramatic performance improvements some had anticipated.

Some experts interpret this move as an attempt to overcome the limitations of the scaling laws in AI development.

Scaling laws posit that model performance improves predictably as model size, data volume, and computing power increase. Critics argue, however, that the gains from this scaling approach have begun to plateau.
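
For reference, one widely used parametric form of these laws, from Chinchilla-style scaling studies and shown here purely for illustration (the article itself gives no formula), expresses test loss as a power law in parameter count N and dataset size D:

$$ L(N, D) \approx E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}} $$

Here E is the irreducible loss and A, B, α, and β are empirically fitted constants; lower loss means better performance, and each doubling of N or D buys a predictable but shrinking improvement, which is the diminishing return critics point to.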

While techniques such as test-time compute, which spend additional computation at inference time to boost performance, are gaining traction, they still fall short in areas where the model lacks training data. A sketch of one such technique follows below.
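
As a rough illustration of the idea (not OpenAI's actual method), a simple test-time-compute technique is best-of-n sampling: spend extra inference compute by drawing several candidate answers and keeping the one a scoring function prefers. The generate_candidate and score functions below are hypothetical placeholders standing in for a model call and a verifier.

```python
import random

def generate_candidate(prompt: str) -> str:
    """Hypothetical stand-in for one stochastic model generation."""
    return f"candidate-{random.randint(0, 9)} for: {prompt}"

def score(candidate: str) -> float:
    """Hypothetical stand-in for a verifier or reward model."""
    return random.random()

def best_of_n(prompt: str, n: int = 8) -> str:
    # Spend extra inference-time compute: sample n candidates
    # and return the highest-scoring one.
    candidates = [generate_candidate(prompt) for _ in range(n)]
    return max(candidates, key=score)

if __name__ == "__main__":
    print(best_of_n("What is 17 * 24?", n=8))
```

The limitation noted above shows up directly in this scheme: if the model has never seen relevant training data, none of the n candidates is likely to be correct, so the extra sampling buys little.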
