- Musk’s Grok 3 to use 100,000 GPUs.
- Meta plans to own 600,000 GPUs by the end of 2024.
- GPU access becomes crucial for attracting top AI talent.
Chip champ or overkill?
Elon Musk has announced plans to train the next iteration of his AI chatbot, Grok 3, on a staggering 100,000 Nvidia H100 GPUs.
This hardware investment, potentially worth $3 billion to $4 billion, represents a fivefold leap from the 20,000 GPUs used to train Grok 2.
Musk teases that the result should be "really something special," with a release scheduled by year-end.
GPU arms race heats up
Musk’s ambitious GPU plans, however, pale in comparison to Meta’s. Mark Zuckerberg’s company aims to own about 600,000 GPUs by the end of 2024, including 350,000 Nvidia H100s.
This arms race highlights the intense competition in AI development, driving both technological advancements and aggressive talent acquisition strategies.
The GPU stockpiling trend is reshaping the AI talent landscape. Aravind Srinivas, CEO of AI startup Perplexity, shared an anecdote about a Meta AI researcher declining a job offer, citing the need for at least 10,000 H100 GPUs.
The anecdote illustrates how access to computing power has become a decisive factor in attracting top AI talent.