Nvidia’s challengers are seizing a new opportunity to crack its dominance of artificial intelligence chips after Chinese start-up DeepSeek accelerated a shift in AI’s computing requirements.
DeepSeek’s R1 and other so-called “reasoning” models, such as OpenAI’s o3 and Anthropic’s Claude 3.7, consume more computing resources than previous AI systems when responding to a user’s request, a process known as “inference”.
That has flipped the focus of demand for AI computing, which until recently was centred on training, the process of creating a model. Inference is expected to account for a growing share of the technology’s computing needs as demand grows among individuals and businesses for applications that go beyond today’s popular chatbots, such as ChatGPT or xAI’s Grok.