Elon Musk is escalating the artificial intelligence arms race against his former colleagues at OpenAI by amassing money for his own AI startup.
The Tesla mogul’s firm xAI, which powers the snarky Grok chatbot, announced on Sunday that it raised $6 billion in a fresh round of fundraising, drawing investments from venture capital giants such as Andreessen Horowitz and Sequoia Capital, as well as Saudi Prince al-Waleed bin Talal.
The firm had a pre-money valuation of $18 billion, Musk said on X, meaning the latest round will push the startup’s valuation to $24 billion.
Musk, who was one of the co-founders of OpenAI when it was a nonprofit but left after losing a power struggle with management, launched xAI less than a year ago.
Earlier this month, The Post reported that xAI’s valuation was expected to eclipse $20 billion due to surging demand from venture capitalists.
The money will be used to take xAI’s first products to market, build advanced infrastructure and accelerate research and development of future technologies, xAI said.
“There will be more to announce in the coming weeks,” Musk said in another X post following the announcement of the funding.
Microsoft-backed OpenAI saw its valuation rise to some $80 billion after the introduction of its AI-powered bot ChatGPT in late 2022.
In March of last year, Musk was one of hundreds of tech luminaries who signed onto a letter urging a pause in AI research due to potential risks to humanity.
Musk has also criticized OpenAI for abandoning its nonprofit status. Last July, he launched xAI to challenge OpenAI.
The startup rolled out Grok, an AI bot that was trained on and integrated into X, the social media platform formerly known as Twitter.
Musk has touted Grok as a “non-woke” alternative to rival OpenAI’s ChatGPT and other large language models.
In the blog post announcing the funding, xAI said it was “primarily focused on the development of advanced AI systems that are truthful, competent, and maximally beneficial for all of humanity.”
Musk recently told investors xAI is planning to build a supercomputer to power the next version of Grok, The Information reported on Saturday, citing a presentation to investors.
Musk said he wants to get the proposed supercomputer running by the fall of 2025, per the report, adding that xAI could partner with Oracle to develop the massive computer.
The Post has sought comment from xAI and Oracle.
When completed, the connected groups of chips, Nvidia’s flagship H100 graphics processing units (GPUs), would be at least four times the size of the largest GPU clusters that exist today, The Information reported, quoting Musk from a presentation made to investors in May.
Nvidia’s H100 family of powerful GPUs dominates the data center chip market for AI but can be hard to obtain due to high demand.
Earlier this year, Musk said training the Grok 2 model took about 20,000 Nvidia H100 GPUs, adding that the Grok 3 model and beyond would require 100,000 Nvidia H100 chips.
With Post Wires