Elon Musk's xAI has open sourced the base model of its Grok AI, but not the training code. The company describes it on GitHub as a "314 billion parameter Mixture-of-Experts model."
xAI said in a blog post that the model is not fine-tuned for any specific application, such as conversation. The company noted that Grok-1 was trained on a "custom" stack, but did not provide further details. The model is released under the Apache License 2.0, which permits commercial use.
Last week, Musk said on X that xAI intended to open source the Grok model this week. Last year, the company launched Grok as a chatbot available to Premium+ subscribers of the X social network. Notably, the chatbot can access some X data, but the open-sourced model does not include those social network connections.
Many prominent companies have open sourced some of their AI models, including Meta with its LLaMa models, Mistral, Falcon, and AI2. In February, Google also released two new open models, Gemma 2B and Gemma 7B.
Some makers of AI-powered tools are already talking about using Grok in their products. Perplexity CEO Aravind Srinivas posted on X that the company is fine-tuning Grok for conversational search and will make it available to Pro users.
Yes, and thanks to @elonmusk and the xAI team for open-sourcing Grok's base model. We will fine-tune it for conversational search, optimize the inference, and bring it to all Pro users. https://t.co/CGn6cIoivT
— Aravind Srinivas (@AravSrinivas) March 17, 2024
Musk has been in a legal battle with OpenAI, suing the company earlier this month for "betraying" its nonprofit AI mission. He has since criticized OpenAI and Sam Altman multiple times on X.