Stefanie Keenan/Getty Images for Village Global; Tsai Hsin-Han/Reuters; Kimberly White/Getty Images for Wired
- Although Nvidia is miles ahead, the field of AI hardware around it is growing fast.
- Cloud giants and startups are designing rival AI chips, especially for inference.
- China and legacy chip giants are dialing up the pressure.
Nvidia's technical dominance and surging revenues show no signs of slowing down.
At the same time, as capital expenditures explode and technical shifts reshape the field, the company's concentration of power is drawing fresh scrutiny.
Nvidia's graphics processing units (GPUs) aren't cheap. And as customers look to reduce reliance on them, some companies are emerging as rivals.
AI's focus is also evolving. While GPUs dominate training, inference—or running AI models and having them perform tasks—is continuous and cost-sensitive. A wave of startups is building inference chips that they're positioning as cheaper and more efficient than GPUs.
Companies in the AI hardware chain are often both competitors and partners. Silicon giant Broadcom, for instance, designs chips that compete with Nvidia's while also furnishing the networking tech that connects Nvidia's GPUs.
The result isn't head-to-head rivalry so much as a rapidly widening and increasingly tangled field, even as Nvidia remains miles ahead.
These are the biggest challengers to Nvidia's dominance.
1. Customers-turned-competitors
Ludovic MARIN / AFP via Getty Images
Google has become one of the most formidable rivals, having worked on Tensor Processing Units (TPUs) for roughly a decade.
TPUs have been mostly confined to Google's cloud and internal workloads. In February, the search giant struck a deal to rent them to Meta. Google is also teaming up with cloud company Fluidstack to lease TPUs, another shift positioning it more squarely against Nvidia.
Amazon is also designing chips as lower-cost alternatives to Nvidia: Trainium for training and Inferentia for inference.
Microsoft and Meta are earlier in their chip efforts. Meta said Wednesday it's pressing ahead with four new silicon generations over the next two years, and Microsoft recently announced an AI inference chip called Maia 200.
2. Chip startups are seizing the inference wave
Stefanie Keenan/Getty Images for Village Global
Investors are pouring billions into chip startups that are seizing the inference wave.
Nvidia wants in, too, paying $20 billion to license technology and hire top talent from Groq, which was founded by a former TPU engineer and is often considered one of the biggest inference challengers.
The field has produced several unicorns. While many predate ChatGPT, they're flourishing now as infrastructure spend booms and demand materializes.
Cerebras, founded in 2015 and valued at $23 billion, builds dinner-plate-sized "wafer-scale" chips for training and inference, and struck a $10 billion deal with OpenAI in January.
SambaNova, which raised $350 million after acquisition talks with Intel fell through, builds AI hardware and software systems for business customers. (Intel told Business Insider it is planning a multiyear collaboration with SambaNova and invested in its Series E.)
And Tenstorrent, last valued at $2 billion, also offers a GPU alternative.
3. The China factor
Jiang Qiming/China News Service/VCG via Getty Images
China remains Nvidia's biggest geopolitical headache. The US has tightened export controls on AI chips, and US regulators have alleged that some Chinese labs are training their models on restricted hardware anyway, Reuters reported.
Nvidia CEO Jensen Huang has repeatedly warned that blocking sales to China will only accelerate local progress.
Huawei stands at the center of China's domestic chip push. The nearly 40-year-old telecom giant is viewed as Nvidia's closest equivalent—building chips, servers, and networking gear, and running its own cloud.
Chinese chip startups like Cambricon have also emerged as domestic alternatives to Nvidia.
Other competitors include Alibaba and Baidu — China's equivalents of Amazon and Google — which design chips for their respective cloud businesses.
4. The old guard
Kimberly White/Getty Images for Wired
Deep-pocketed chip incumbents like AMD, Intel, and Broadcom are vying for a piece of the AI market Nvidia dominates, even as Nvidia pushes into their turf.
AMD, which builds GPU competitors and whose CEO Lisa Su is Huang's distant cousin, has secured deals with major cloud and business customers, including Meta.
Intel, meanwhile, has a strong footprint among large business customers, while Broadcom specializes in networking and custom chips — meaning it stands to benefit even if Nvidia continues to lead in GPUs.
Have a tip? Contact this reporter via email at gweiss@businessinsider.com or Signal at @geoffweiss.25. Use a personal email address, a nonwork WiFi network, and a nonwork device; here's our guide to sharing information securely.
from Business Insider https://ift.tt/y3fzLSI