OpenAI Unveils New AI Model: Are Higher Subscription Fees on the Horizon?

OpenAI is set to launch a new computationally intensive artificial intelligence (AI) model.

Given the substantial computational resources invested in enhancing this model’s performance, the company has signaled a premium service strategy that will include additional fees on top of its already high-priced plans.

Sam Altman, CEO of OpenAI. Image captured from his X (formerly Twitter) account.

Sam Altman, Chief Executive Officer (CEO) of OpenAI, announced on his X (formerly Twitter) account on September 21 that the company plans to unveil a new computationally intensive service in the coming weeks. He noted that, due to cost considerations, some features will be exclusive to Pro subscribers, while others may incur extra charges.

The current ChatGPT Pro subscription is priced at 200 USD per month.

Altman emphasized that the company's ultimate goal remains to drive down costs and make ChatGPT widely accessible, and that he is confident it will achieve this over time. For now, however, he added that the team is excited to explore what ideas that leverage significant computational resources can unlock.

The announcement comes at a time when much of the AI industry has been focusing on developing low-cost, high-efficiency models.

Analysts view Altman’s announcement as a strategic response to user feedback suggesting that the recently released GPT-5 didn’t deliver the dramatic performance improvements some had anticipated.

Some experts interpret this move as an attempt to overcome the limitations of the scaling laws in AI development.

The scaling laws posit that model performance improves predictably, following a power law, as model size, data volume, and computing power increase. Critics argue, however, that the gains from this scaling approach have plateaued.
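
To make the idea concrete, here is a purely illustrative Python sketch of a power-law scaling curve in the spirit of published scaling-law research (such as Kaplan et al., 2020). The constants are rough values drawn from that literature, not figures from this article or from OpenAI, and serve only to show the diminishing returns critics describe.

```python
# Illustrative only: a toy power-law scaling curve in the spirit of published
# scaling-law papers (e.g., Kaplan et al., 2020). The constants are rough
# literature-inspired placeholders, not figures from this article or OpenAI.

def toy_loss(params: float, n_c: float = 8.8e13, alpha: float = 0.076) -> float:
    """Toy loss as a power law in parameter count: L(N) = (N_c / N) ** alpha."""
    return (n_c / params) ** alpha

# Each tenfold increase in model size lowers the toy loss by a smaller amount.
for n in [1e9, 1e10, 1e11, 1e12]:
    print(f"{n:.0e} params -> toy loss {toy_loss(n):.3f}")
```

The printed losses shrink by less with each tenfold increase in model size, which is the plateau effect critics point to.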

While techniques like test-time compute, which spend more computation at inference time to boost performance, are gaining traction, they still show limitations in areas where the model lacks training data.
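
As a rough illustration of what test-time compute can look like in practice, the sketch below samples a model several times and takes a majority vote over the answers. The `generate` function is a hypothetical placeholder for any language-model call; this is a generic technique for context, not a description of OpenAI's new model.

```python
# Minimal sketch of one test-time-compute idea: best-of-n sampling with a
# majority vote. `generate` is a hypothetical placeholder, not a real API.
from collections import Counter

def generate(prompt: str) -> str:
    # Placeholder for a single call to any language model; swap in a real API.
    raise NotImplementedError

def answer_with_majority_vote(prompt: str, n_samples: int = 8) -> str:
    """Spend extra inference compute by sampling several answers
    and returning the most common one."""
    answers = [generate(prompt) for _ in range(n_samples)]
    return Counter(answers).most_common(1)[0][0]
```

Approaches like this trade additional inference compute for accuracy, but as noted above, sampling more answers cannot compensate when the underlying model simply lacks training data for the domain.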
