Top China silicon figure calls on country to stop using Nvidia GPUs for AI — says current AI development model could become 'lethal' if not addressed

Nvidia H200 Hopper (Image credit: Nvidia)

Wei Shaojun, vice president of the China Semiconductor Industry Association and a senior Chinese academic and government adviser, has called on China and other Asian countries to stop using Nvidia GPUs for AI training and inference. Speaking at a forum in Singapore, he warned that reliance on U.S.-origin hardware poses long-term risks for China and its regional peers, Bloomberg reports.

Wei criticized the current AI development model across Asia, which closely mirrors the American approach of training large language models, such as those behind ChatGPT and DeepSeek, on compute GPUs from Nvidia or AMD. He argued that this imitation limits regional autonomy and could become 'lethal' if not addressed. According to Wei, Asia's strategy must diverge from the U.S. template, particularly in foundational areas such as algorithm design and computing infrastructure.


Over time, Nvidia reinforced its lead with specialized hardware (Tensor Cores, mixed-precision formats), tight software integration, and widespread cloud and OEM support, making its GPUs the default compute backbone for AI training and inference. Nvidia's modern data center architectures, such as Blackwell, are heavily optimized for AI training and inference and have almost nothing to do with graphics. By contrast, the special-purpose ASICs that Wei Shaojun advocates have yet to gain traction for either training or inference.
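To illustrate the kind of software lock-in the article describes, here is a minimal sketch (not from the article) of mixed-precision training in PyTorch. The automatic mixed-precision path shown below routes matrix multiplications to Tensor Cores when a CUDA GPU is present, which is one reason Nvidia hardware remains the default target; the model, data, and hyperparameters are placeholders for illustration only.

```python
# Minimal mixed-precision training sketch (illustrative only).
# On a CUDA GPU, autocast runs matmuls in FP16 on Tensor Cores;
# on CPU it falls back to BF16 and gradient scaling is disabled.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

model = nn.Sequential(nn.Linear(512, 1024), nn.ReLU(), nn.Linear(1024, 10)).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

inputs = torch.randn(64, 512, device=device)          # dummy batch
targets = torch.randint(0, 10, (64,), device=device)  # dummy labels

for step in range(10):
    optimizer.zero_grad(set_to_none=True)
    # Low-precision forward pass where the hardware supports it
    with torch.autocast(device_type=device,
                        dtype=torch.float16 if device == "cuda" else torch.bfloat16):
        loss = loss_fn(model(inputs), targets)
    scaler.scale(loss).backward()  # scale loss to avoid FP16 gradient underflow
    scaler.step(optimizer)
    scaler.update()
```

Equivalent pipelines exist for non-Nvidia accelerators, but they typically require different backends and tooling, which is the switching cost Wei's argument runs up against.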

