Since the launch of ChatGPT by OpenAI at the end of 2022, the world has witnessed a seismic shift marked by the rise of generative AI technologies. This phenomenon has led to the emergence of numerous large-scale AI models, with over 200 registered in China alone. Despite the rapid growth and proliferation of generative AI, many of the companies building these models have struggled financially. Most notably, OpenAI itself has found it difficult to turn a profit, highlighting a paradox in the AI landscape: while the technology is advancing rapidly, the accompanying business models have yet to deliver sustainable financial success.
The high costs of training AI models and maintaining server infrastructure have significantly exacerbated these financial challenges. OpenAI, for instance, has projected an operating loss of $5 billion for 2024. Training a massive model with hundreds of billions or even trillions of parameters can cost anywhere from millions to well over a hundred million dollars. As a result, companies developing large models are contending with prohibitive operating costs that hinder their ability to translate technological prowess into profitability.
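To give a sense of where such figures come from, here is a minimal back-of-the-envelope sketch, assuming the commonly cited approximation of roughly 6 FLOPs per parameter per training token. The GPU throughput, price per GPU-hour, model size, and token count below are illustrative assumptions, not figures reported in this article.

```python
# Rough back-of-the-envelope estimate of LLM training cost.
# Assumptions (illustrative, not from the article): ~6 FLOPs per parameter
# per training token, a sustained GPU throughput, and a cloud GPU-hour price.

def estimate_training_cost_usd(
    params: float,              # number of model parameters, e.g. 175e9
    tokens: float,              # number of training tokens, e.g. 2e12
    gpu_tflops: float = 400.0,  # assumed sustained TFLOP/s per GPU
    gpu_hour_usd: float = 2.5,  # assumed price per GPU-hour
) -> float:
    total_flops = 6 * params * tokens                # ~6 * N * D rule of thumb
    gpu_seconds = total_flops / (gpu_tflops * 1e12)  # total GPU-seconds of compute
    gpu_hours = gpu_seconds / 3600
    return gpu_hours * gpu_hour_usd

# Example: a 175B-parameter model trained on 2 trillion tokens lands in the
# low millions of dollars under these assumptions; trillion-parameter runs
# with more tokens scale the cost up by one to two orders of magnitude.
print(f"${estimate_training_cost_usd(175e9, 2e12):,.0f}")  # roughly $3.6 million
```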
In stark contrast to these struggles, companies like NVIDIA, which provide the essential high-performance chips for AI development, are thriving. The monopolistic position held by NVIDIA in this niche reinforces the idea that in the current AI landscape, the true competition may lie not in the models themselves but in the number of NVIDIA GPUs at a company’s disposal. This "who-has-the-most-chips" contest highlights a significant aspect of the AI ecosystem: the reliance on powerful hardware drives much of the competition and innovation.
"The AI era is accelerating, pushing the world toward a NVIDIA-centric computing approach," noted Jensen Huang, CEO and founder of NVIDIA. He emphasized that as developers of foundational models continue to expand their pre-training, fine-tuning, and inference operations, the demand for their latest chip architecture, Hopper, remains robust, with high expectations surrounding the forthcoming Blackwell series. This sentiment encapsulates the prevailing demand dynamics in the tech industry.
Large models' parameter counts and training datasets are enormous, requiring substantial computational resources and driving a surge in demand for semiconductors. Notably, AI-related investments by major tech players such as Google, Meta, Amazon, and Microsoft are projected to exceed $200 billion, with billions of that allocated solely to purchasing NVIDIA chips to expand their AI capabilities.
As big tech companies engage in a frenzied buying spree for high-performance chips, NVIDIA's products have been met with overwhelming demand, driving record revenue growth. For its fiscal third quarter of 2025, which ended October 27, 2024, NVIDIA reported a staggering $35.08 billion in revenue, up 94% year-over-year. The data center segment alone generated $30.8 billion, up 112% from a year earlier, and net profit soared 109% to $19.31 billion. While much of the AI sector is grappling with financial challenges, NVIDIA has established itself as the most lucrative corner of the AI business, primarily by selling computational power.
However, concerns about the sustainability of NVIDIA's rapid growth abound. The company forecasts that annual revenue could reach $140 billion, but analysts are skeptical about how long such explosive growth can last. The surging figures are driven mainly by key clients such as Meta and Google, which have poured enormous sums into NVIDIA chips, and it is unlikely that their spending on AI infrastructure can keep rising at such an unprecedented rate.
This reality raises important questions regarding NVIDIA's future financial trajectory, as evidence suggests that growth rates are beginning to slow. Following the release of its robust earnings report, NVIDIA's stock still suffered a 2.5% drop in after-hours trading, signaling investor caution.
Yet another compelling factor in NVIDIA's growth is the anticipated Blackwell chip. This next-generation AI flagship is entering mass production, with roughly 450,000 units expected in the fourth quarter, potentially contributing more than $10 billion in revenue. Major clients, including Microsoft, Oracle, and OpenAI, have already received early deliveries of the new chip, and further increases in output are expected to generate considerable revenue in the coming quarters.
In summary, generative AI represents a powerful driver of technological progress, with NVIDIA positioned at its core. The competitive landscape is characterized by urgent demand for limited chip supplies, making NVIDIA’s role especially critical in advancing AI applications across various domains. Thanks to its unassailable dominance in the GPU market, NVIDIA has seized a rare opportunity in this burgeoning sector.
Clearly, within the landscape of generative AI, NVIDIA is emerging as the major victor: its revenues have skyrocketed and its stock has surged, roughly tripling in 2023 and more than doubling again in 2024. This has made NVIDIA one of the most valuable companies in the world, with a market capitalization of $3.58 trillion, a valuation that reflects robust investor confidence in the future of AI-driven commercial applications. Nonetheless, the surge in its market value raises questions about a potential speculative bubble.
Ultimately, with the escalating demand for chips driven by AI large models, it is worth contemplating whether NVIDIA could be the first company to surpass a market valuation of $4 trillion. What are your thoughts on this potential milestone?