TMTPOST — AI chips have emerged as a decisive factor in the escalating technological rivalry between China and the United States.
Over the past two weeks alone, Nvidia—now valued at nearly $4.5 trillion—announced plans to invest up to $100 billion in OpenAI over the next decade. As part of the partnership, OpenAI will purchase and deploy between 4 million and 5 million Nvidia GPU chips. Meanwhile, on October 7, AMD revealed a four-year, multibillion-dollar chip supply agreement with OpenAI, under which OpenAI is expected to acquire up to a 10% equity stake in AMD. Oracle has also entered into a trillion-dollar strategic partnership with OpenAI.
Following the AMD-OpenAI announcement, AMD shares surged—recording their biggest single-day gain in nearly ten years. The deal positions AMD, long regarded as the “runner-up” in the data-center AI chip market, to mount its first direct challenge to Nvidia’s dominance, while enabling OpenAI to establish what some analysts have called a trillion-dollar “circular transaction.”
In early October, DeepSeek launched its latest AI model, DeepSeek-V3.2-Exp. Shortly afterward, domestic chipmakers including Cambricon and Huawei announced that their chips are fully compatible with the model. Huawei also disclosed mass production plans for its Ascend 910 series and a roadmap for its successors: the Ascend 950PR, featuring Huawei’s self-developed HBM memory, is scheduled for release in the first quarter of 2026, and the Ascend 970 is planned for the fourth quarter of 2028.
Cambricon has seen an impressive surge in market value, with its stock rising 124% between July and September. Its market capitalization now stands at 521 billion yuan and at one point surpassed that of Tokyo Electron, Japan’s largest chip equipment maker, making Cambricon one of the highest-valued semiconductor design firms on China’s A-share market.
Nvidia CEO Jensen Huang recently commented that China is “only a few nanoseconds behind the US” in chip technology and highlighted the country’s strong potential in research, development, and manufacturing. He also urged the US government to allow American tech companies to compete in markets such as China to “enhance America’s influence.”
U.S. export restrictions, rather than stalling China’s progress, have inadvertently accelerated domestic AI chip innovation. The H20 chip, for example, received only a lukewarm response in China, even though Nvidia products accounted for over a third of AI chip sales in the country in 2024. Huang and his team remain deeply engaged in the ongoing U.S.-China AI chip competition.
The U.S. continues to tighten export controls on AI chips to China. The recently introduced “GAIN AI” Act would require Nvidia to prioritize supplying AI chips to American customers before exporting advanced chips abroad, potentially shutting the company out of China’s roughly $50 billion AI computing market.
Meanwhile, competition is heating up globally. Companies like AMD, Google, Microsoft, and Broadcom are racing to develop more cost-effective AI computing chips. At the same time, domestic players such as Huawei, Cambricon, and Moore Threads are increasingly securing deployment orders. Major Chinese internet companies—Alibaba, Tencent, Baidu, and ByteDance—are also stepping up investment in chip R&D and design, aiming to strengthen self-sufficiency and control over the AI supply chain.
According to Epoch AI, OpenAI spent $7 billion on computing power over the past year, with $5 billion allocated to AI model training. Morgan Stanley projects that global investment in AI infrastructure could reach $3 trillion (around 21 trillion yuan) over the next three years. Deloitte predicts that global semiconductor sales will reach a record $697 billion in 2025, potentially surpassing $1 trillion by 2030 as AI, 5G, and other technologies continue to expand.
Morningstar analyst Brian Colello cautioned, “If an AI bubble forms and eventually bursts, Nvidia’s large investment in OpenAI could be an early warning sign.” When asked about Chinese chipmakers’ progress, an Nvidia spokesperson simply stated, “There is no doubt that competition has arrived.”
Wang Bo, CEO of Tsing Micro Intelligence—a Tsinghua University-affiliated AI chip company—pointed out that reconfigurable AI chips provide a development path for China that doesn’t rely on Nvidia GPUs. He emphasized that to gain market share, domestic AI chip products need to offer at least five times the cost-performance ratio of competitors. “In this industry, dominant players like Nvidia or Intel set the standard,” he said. “Following their path blindly will only lead to being crushed.”
The DeepSeek Boom Accelerates Domestic AI Chip Advancement
Since October 2022, the United States has implemented multiple rounds of export controls targeting China’s semiconductor industry, aiming to prevent the country from manufacturing advanced AI chips or using American chips to train state-of-the-art AI models.
In December 2024, during the final year of the Biden administration, the U.S. tightened restrictions further, limiting exports of HBM (high-bandwidth memory) essential for advanced AI chips and lowering the threshold for computing power density—directly challenging China’s ability to develop large-scale AI models. In response, Chinese internet and cloud companies that previously relied heavily on Nvidia chips began exploring the deployment of domestic AI chips.
The DeepSeek boom in 2025 has further accelerated this shift. When DeepSeek released version V3.1 in August, the announcement noted that “UE8M0 FP8 is designed for the next generation of soon-to-be-released domestic chips,” drawing widespread market attention and even contributing to a drop in Nvidia’s stock price.
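For context on the format named in that announcement: UE8M0 is generally described as an unsigned, exponent-only 8-bit scale (8 exponent bits, 0 mantissa bits), so that each block of FP8 values shares a power-of-two scaling factor that is cheap for hardware to apply. The Python sketch below is only an illustration of that idea under those assumptions; the bias value, block size, and function names are hypothetical and do not reflect DeepSeek’s or any chipmaker’s actual implementation.

```python
import numpy as np

# Assumptions for illustration only: UE8M0 is treated as an unsigned 8-bit,
# exponent-only scale (a power of two), paired with FP8 E4M3 values.
UE8M0_BIAS = 127          # assumed exponent bias
FP8_E4M3_MAX = 448.0      # largest finite FP8 E4M3 magnitude

def encode_ue8m0_scale(block_max: float) -> int:
    """Pick a power-of-two scale code so the block's max fits the FP8 range."""
    exp = int(np.ceil(np.log2(block_max / FP8_E4M3_MAX)))
    return int(np.clip(exp + UE8M0_BIAS, 0, 255))

def quantize_block(block: np.ndarray) -> tuple[int, np.ndarray]:
    """Scale one block of weights by a shared UE8M0 factor into FP8 range."""
    block_max = float(np.abs(block).max()) or 1.0
    scale_code = encode_ue8m0_scale(block_max)
    scale = 2.0 ** (scale_code - UE8M0_BIAS)
    # Real hardware would additionally round block / scale to FP8 values;
    # here we only clip to the representable magnitude for illustration.
    return scale_code, np.clip(block / scale, -FP8_E4M3_MAX, FP8_E4M3_MAX)

if __name__ == "__main__":
    weights = np.random.randn(32).astype(np.float32) * 1000.0
    code, q = quantize_block(weights)
    print(f"UE8M0 scale code: {code}, max scaled magnitude: {np.abs(q).max():.1f}")
```

Because the shared scale is a pure power of two, applying it on chip amounts to an exponent shift rather than a full multiply, which is commonly cited as the appeal of exponent-only scale formats for AI accelerators.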
DeepSeek’s training costs are currently far lower than those of leading U.S. AI models. A paper published in Nature on September 18, with Liang Wenfeng as the corresponding author, revealed that training the DeepSeek-R1 model cost just $294,000. Even factoring in the roughly $6 million foundational model cost, this remains far below the expenditures required for comparable models at OpenAI or Google.
In July, Nvidia CEO Jensen Huang described DeepSeek-R1 as a revolutionary, open-source inference model. He emphasized its flexibility, noting that Chinese AI models can be efficiently adapted to diverse applications, allowing companies to build products or even entire businesses on these platforms.
Four years ago, Nvidia held 95% of the AI chip market in China; today, its share has dropped to just 50%. Huang warned, “If we don’t actively compete in China and instead allow domestic platforms and ecosystems to flourish independently, Chinese AI technology and leadership will inevitably spread globally as the technology itself proliferates.”
As of late 2024, Nvidia controlled over 90% of the global AI accelerator chip market. Its data center revenue surged to $41.1 billion in the quarter ending July 2025, a 56% year-on-year increase that solidified its position as the company’s largest revenue driver.
In August 2025, Jensen Huang, Nvidia’s CEO, highlighted the strong demand for its Blackwell Ultra architecture chips, noting their central role in the growing AI race. He stated that the annual capital expenditure related to data center infrastructure is projected to hit $600 billion, with Nvidia aiming to capture a $3–4 trillion opportunity in AI infrastructure development through its Blackwell and Rubin architectures. Huang emphasized that as AI continues to develop, it will be a key driver of global GDP growth.
However, China’s AI chip sector is beginning to challenge Nvidia’s dominance. As U.S. restrictions on chip exports to China intensify, domestic companies like Alibaba, Tencent, and ByteDance’s Volcano Engine are stockpiling Nvidia GPUs while also exploring homegrown alternatives.
Nvidia’s China business has also taken a financial hit: the company booked a $4.5 billion charge for unsold H20 chips in the first quarter of fiscal 2026 after the U.S. export restrictions took effect, and sales fell by a further $4 billion in the second quarter, reflecting the growing shift toward domestic solutions.
Meanwhile, China’s AI chip market is grappling with a supply shortage, but domestic chipmakers are rising to the challenge. Companies like Alibaba, Cambricon, Iluvatar CoreX, Moore Threads, and Biren Technology are positioning themselves as core players in the AI chip race.
A recent CCTV report highlighted the China Unicom Sanjiangyuan Green Power Intelligent Computing Center project, where Alibaba’s Pingtouge introduced a new PPU chip designed for AI data centers. According to the report, the chip outperforms Nvidia’s A800 and matches the H20 on key specifications while consuming less energy.
Cambricon has become a key player, with ByteDance as its largest client, and has pre-sold over 200,000 chips. Alibaba and Baidu have also begun mass production of their self-developed chips, while Tencent is gradually deploying previously stockpiled chips and purchasing products from Enflame.