As the United States considers imposing stricter trade restrictions to prevent advanced chip technology from reaching China, Nvidia, a U.S.-based chip manufacturer, is reportedly developing a version of its new artificial intelligence chips that aligns with those regulations.
According to reports from Reuters, Nvidia is creating a version of its Blackwell AI chips specifically for the Chinese market, collaborating with the local distribution partner, Inspur, to introduce and market the chip, provisionally named the “B20,” in China.
The B20 is expected to begin shipping in the second quarter of 2025, according to an unnamed source. Nvidia has not commented on the report.
Currently, the company has three chips designed to comply with U.S. export controls, including the H20, which Nvidia recently cut prices on amid sluggish sales in order to stay competitive with Huawei's domestic chips. Sales of the H20 have reportedly since picked up. Nvidia is projected to sell more than one million H20 chips in China this year, worth approximately $12 billion, despite existing U.S. trade restrictions. That figure is nearly double Huawei's sales projections for its Ascend 910B chip, according to data from SemiAnalysis.
On the other hand, analysts at Jefferies noted that Nvidia's H20 chips may face challenges under future U.S. trade regulations. They indicated that during the annual review of U.S. semiconductor export controls scheduled for October, there is a "high likelihood" that the H20 will be banned for sale to China. Such a ban could be enacted in several ways, including a ban on the specific product, tighter limits on computing power, or restrictions on memory capacity.
Additionally, the U.S. could expand export controls to cover chips sold to nearby countries such as Malaysia, Indonesia, and Thailand, or broaden restrictions to cover Chinese companies' overseas operations, although the latter would be more complex to implement.