Elon Musk’s artificial intelligence company, xAI, has unveiled Colossus, a massive AI training system boasting 100,000 Nvidia H100 GPUs. Built in just 122 days, Colossus is being hailed as the world’s most powerful AI training cluster. It is designed to advance xAI’s large language model Grok and challenge industry leaders like OpenAI and Google in the race for AI supremacy.

Colossus represents a significant leap forward for xAI in the competitive AI landscape. The system is designed to train Grok-3, which Musk hopes will become “the most powerful AI in the world” by December 2024, underscoring xAI’s ambition to compete directly with those established rivals. The development of Colossus also has potential implications for Tesla: some analysts speculate that Grok could eventually power the AI behind Tesla’s humanoid robot, Optimus.

Plans are underway to double Colossus’s capacity to 200,000 GPUs in the coming months. The upgrade will add 50,000 of Nvidia’s more advanced H200-series chips, which Nvidia positions as roughly twice as capable as the current H100s for large-model workloads. The expansion is expected to cement Colossus’s standing as the world’s largest GPU supercomputer and significantly boost xAI’s training capacity, potentially accelerating the development of future Grok versions.

Credit: Perplexity.ai