Magic, an AI startup that develops models to generate code and automate a range of software development tasks, has announced that it raised a significant round of funding from a group of investors that includes former Google CEO Eric Schmidt.
In a blog post on Thursday, Magic said it had closed a $320 million funding round with contributions from Schmidt, Jane Street, Sequoia, Atlassian, Nat Friedman & Daniel Gross, Elad Gil, CapitalG and others. The round brings Magic's total raised to nearly half a billion dollars ($465 million) and puts it in the company of other well-funded AI coding startups, including Codeium, Anysphere and Augment. (Interestingly, Schmidt is also an investor in Augment.)
Magic also announced a partnership with Google Cloud to build two “supercomputers” on Google Cloud Platform: the first, Magic-G4, will be powered by Nvidia H100 GPUs, while the second, Magic-G5, will be built on Nvidia's next-generation Blackwell chips. (GPUs are often used to train and run generative AI models because they can perform many calculations in parallel.)
Magic said it aims to scale the latter cluster to “tens of thousands” of GPUs in the future.
“We're excited to partner with Google and Nvidia to build our next-generation AI supercomputer on Google Cloud,” Magic co-founder and CEO Eric Steinberger said in a statement. “Nvidia's [Blackwell] system enables much more efficient model training and inference, and Google Cloud offers the fastest scaling timelines and a rich ecosystem of cloud services.”
Steinberger and Sebastian De Ro co-founded Magic in 2022. Steinberger says he was drawn to the potential of AI from a young age; in high school, he and some friends wired up the school's computers to train machine learning algorithms. That experience planted the seed for his computer science degree and his later work as an AI researcher at Meta.
Magic provides AI-driven tools designed to help software engineers write, review, debug and plan code changes. The tools work like automated pair programmers, understanding the context of coding projects and seeking to continuously learn.
Many platforms do the same thing, GitHub Copilot among them. But one of Magic's differentiators is the very long context window of its models.
A model's context, or context window, refers to the input data (e.g., text) that the model considers before generating an output (e.g., additional text). A simple question like “Who won the 2020 US Presidential election?” can act as context, as can a movie script, a show, or an audio clip. The larger the context window, the larger the document (or codebase) that can fit inside it.
Magic claims that its latest model, LTM-2-mini, has a 100-million-token context window. (Tokens are subdivided bits of raw data, like the syllables “fan,” “tas” and “tic” in the word “fantastic.”) One hundred million tokens is equivalent to about 10 million lines of code, or 750 novels. That is by far the largest context window of any commercially available model; the next largest, in Google's flagship Gemini models, tops out at 2 million tokens.
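To make the token math concrete, here is a minimal sketch using OpenAI's open-source tiktoken tokenizer; Magic has not published its own tokenizer, so the exact counts are only illustrative, and the sample line of code is invented for the example.

# Illustrative sketch only: Magic's tokenizer isn't public, so this uses
# OpenAI's open-source tiktoken library to show how text is split into tokens.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # a common GPT-style encoding

line_of_code = "    if len(password) >= 12 and any(c.isdigit() for c in password):"
tokens = enc.encode(line_of_code)

print(f"{len(line_of_code)} characters -> {len(tokens)} tokens")
# At roughly 10 tokens per line of code, a 100-million-token window holds
# on the order of 10 million lines, which is the figure Magic cites.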
Magic says that thanks to its long context, the LTM-2-mini was able to implement a password strength meter for an open source project and create a calculator using a custom UI framework.
The company is currently training a larger version of the model.