- DeepSeek is seen as AI's Sputnik moment: just as the USSR's 1957 Sputnik launch caught the U.S. off guard, China's AI startup DeepSeek appears to have shaken U.S. AI dominance.
- DeepSeek has disrupted the U.S. AI sector by achieving comparable performance at a fraction of the cost of its American counterparts.
Does DeepSeek’s R1 Really Cost So Little?
- DeepSeek claims that training its R1 model required only $5.6 million and 2,048 Nvidia H800 GPUs, roughly 3% to 5% of OpenAI's estimated training cost for its GPT models (a back-of-the-envelope sketch follows this list).
- While DeepSeek attributes its cost efficiency to new techniques in reinforcement learning, some speculate it may have also leveraged unauthorized model distillation.
- DeepSeek's actual costs are likely much higher than it claims. Scale AI CEO Alexandr Wang stated in a CNBC interview that DeepSeek has access to around 50,000 Nvidia H100 AI GPUs, each priced between $27,000 and $40,000.
- SemiAnalysis estimates that DeepSeek's total server investment is approximately $1.6 billion.
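For context, here is a back-of-the-envelope sketch of how a headline number in this range can arise. The run length and GPU rental rate below are illustrative assumptions, not figures stated by DeepSeek or in this article, and the total deliberately covers only the final training run, excluding research, failed experiments, data, staff, and the broader server fleet that SemiAnalysis's roughly $1.6 billion estimate tries to capture.

```python
# Back-of-the-envelope check of a "$5.6M on 2,048 H800s" style claim.
# Training duration and hourly rental rate are illustrative assumptions.
num_gpus = 2_048              # H800 GPUs cited for the training run
training_days = 57            # assumed run length (not stated in the article)
rate_per_gpu_hour = 2.00      # assumed rental cost, USD per GPU-hour

gpu_hours = num_gpus * training_days * 24
estimated_cost = gpu_hours * rate_per_gpu_hour

print(f"GPU-hours: {gpu_hours:,}")                 # ~2.8 million GPU-hours
print(f"Estimated cost: ${estimated_cost:,.0f}")   # ~$5.6 million
```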
The Mass Adoption of DeepSeek Could Be in Doubt Due to:
1) Privacy and Security Concerns
- Several countries, including South Korea, the Netherlands, Australia, and Taiwan, have banned DeepSeek from government devices over security concerns, with more nations potentially following suit.
- LatticeFlow AI ranks DeepSeek's models among the weakest in cybersecurity compared to other leading systems. Meanwhile, NowSecure has urged organizations to remove the DeepSeek iOS mobile app due to security risks.
2) Model Distillation and the Risk of Model Collapse
- OpenAI has accused DeepSeek of using model distillation to train its AI on outputs from OpenAI's models.
- Distillation compresses the rich, complex knowledge of a large "teacher" model into a smaller "student" model, inevitably losing some crucial detail along the way (a minimal sketch of the generic technique follows this list).
- Training AI on data generated by other models can lead to model collapse, where outputs become increasingly similar, reducing knowledge diversity and weakening adaptability.
- If DeepSeek relies heavily on distillation, its models may lag behind the latest advancements, because distillation takes time and can only start once a newer source model already exists.
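To make the distillation idea concrete, below is a minimal sketch of generic knowledge distillation in PyTorch: a small student model is trained to match a large teacher's softened output distribution alongside the ground-truth labels. This illustrates the general technique only; it is not DeepSeek's or OpenAI's actual pipeline, and the temperature, loss weighting, and toy tensors are placeholders.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Weighted mix of (1) KL divergence to the teacher's softened
    distribution and (2) ordinary cross-entropy on the true labels."""
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # The softened targets carry the teacher's knowledge about how the
    # wrong answers relate to each other; temperature**2 rescales gradients.
    kd = F.kl_div(log_soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce

# Toy usage: random tensors stand in for teacher and student outputs.
batch, num_classes = 8, 10
teacher_logits = torch.randn(batch, num_classes)            # frozen teacher
student_logits = torch.randn(batch, num_classes, requires_grad=True)
labels = torch.randint(0, num_classes, (batch,))

loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()  # gradients flow only into the student
print(f"distillation loss: {loss.item():.4f}")
```

The compression described in the bullets comes from the student having far fewer parameters than the teacher: whatever nuance the softened targets cannot express is simply lost.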
3) China’s Semiconductor Limitations
DeepSeek's growth could be constrained if it has to depend on China's domestic semiconductor industry, as hardware limitations and geopolitical risks may restrict its ability to compete with global AI leaders.
The Implications of DeepSeek
1) AI Software Companies Could Emerge as Winners
- DeepSeek's success may benefit AI software stocks, as lower AI computing costs open new opportunities.
- Potential beneficiaries include the iShares Expanded Tech-Software Sector ETF (ticker: IGV), Snowflake, Salesforce, Palantir, and CrowdStrike.
2) Big Tech Stocks Mostly Fell
- Hyperscalers took a hit as they continued announcing higher capital expenditures (capex), but investors shouldn't overreact: the hyperscalers maintain that the increased spending is necessary to meet rising cloud and AI demand.
- Despite the sharp tech selloff following DeepSeek's developments, the Bloomberg Magnificent 7 Total Return Index is up 1.61% year-to-date, indicating that investor interest in major AI stocks remains strong.
- U.S. hardware and semiconductor stocks, including Nvidia, Broadcom, ASML, Marvell, and TSMC, are still recovering from the steep selloff triggered by DeepSeek's emergence.
3) DeepSeek's Success Could Boost Nvidia's GPU Demand Despite Nvidia Being Among the Hardest Hit
DeepSeek's success could create more demand for Nvidia's GPUs for several reasons, despite initial concerns that lower-cost AI models might reduce the need for high-end chips.
A) Jevons Paradox – Increased Efficiency Drives Greater Demand
Jevons Paradox holds that when a resource becomes cheaper to use, total consumption of it often rises rather than falls. Applied to AI: if DeepSeek R1 makes AI more affordable, AI adoption will accelerate across industries, leading to greater demand for computing power and, consequently, higher demand for GPUs.
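As a purely hypothetical illustration (every number below is an assumption made for this sketch, not a forecast), total compute spending rises whenever usage grows faster than per-unit cost falls:

```python
# Hypothetical Jevons-paradox illustration; all numbers are assumptions.
cost_per_m_tokens_before = 10.00   # USD per million tokens, pre-efficiency gain
cost_per_m_tokens_after = 1.00     # USD per million tokens, after a 10x gain
usage_before = 100                 # million tokens consumed per month
usage_after = 2_000                # million tokens, if cheaper AI unlocks 20x more use

spend_before = cost_per_m_tokens_before * usage_before
spend_after = cost_per_m_tokens_after * usage_after

print(f"Monthly compute spend before: ${spend_before:,.0f}")  # $1,000
print(f"Monthly compute spend after:  ${spend_after:,.0f}")   # $2,000
```

Whether real-world demand actually grows faster than unit costs fall is the open question, but that is exactly the bet the Jevons argument makes.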
B) Hyperscaler AI Capex Not Slowing Down Yet
- Nvidia generated approximately half of its revenue in Q2 2025 from its four largest customers: Microsoft, Meta, Amazon, and Google.
- Despite the rise of cost-effective AI models, major hyperscalers are still committed to investing more heavily in AI infrastructure in 2025.
C) DeepSeek is Open-Source, Encouraging More AI Development
- DeepSeek's open-source nature allows developers worldwide to build upon its foundation, fostering faster AI innovation.
- More AI projects will lead to increased demand for high-performance GPUs to support both research and deployment.
D) Potential Loss of the China Market, but a Wake-Up Call for All Countries
- The success of DeepSeek may prompt the U.S. to impose tighter GPU export restrictions on China, leading to long-term revenue losses for Nvidia in the Chinese market.
- However, more countries may attempt to replicate DeepSeek's success, as those at the forefront of AI development gain strategic advantages in productivity, innovation, and military strength. This would further increase global demand for Nvidia's GPUs.
Conclusion:
- The advancement of Artificial Intelligence (AI) relies on three critical factors: computing power, data, and algorithms.
- While DeepSeek may offer a low-cost, high-performance model by introducing an alternative algorithm, I believe that major tech companies' race toward AGI will still require substantial computing power. Investors should therefore not dismiss Nvidia too soon, as its GPUs remain pivotal to this evolution.
- DeepSeek's algorithm, which prioritizes cost reduction and efficiency, is not a universal solution for all AI models.
- Nonetheless, the emergence of the DeepSeek model has catalyzed AI development, and I reckon it will likely contribute to significant productivity growth in the U.S., similar to the transformative impact of the internet in the late 1990s.
- I remain positive on AI-related stocks and ETFs, including the Magnificent Seven (Meta, Nvidia, Apple, Tesla, Alphabet, Amazon, and Microsoft), the VanEck Semiconductor ETF (SMH), TSMC (TSM), Broadcom (AVGO), ARM Holdings (ARM), and Oracle (ORCL).