Today's numbers confirmed a long-standing rumor: Figure has raised an enormous pile of money. The Bay Area-based robotics company announced a $675 million Series B round, valuing the startup at $2.6 billion post-money.
The lineup of investors is equally impressive: Microsoft, OpenAI Startup Fund, Nvidia, Amazon Industrial Innovation Fund, Jeff Bezos (via Bezos Expeditions), Parkway Venture Capital, Intel Capital, Align Ventures, and ARK Invest. That's a staggering sum for a young startup with 80 employees, though that last number will almost certainly change after this round.
Figure already had plenty to work with. Founder Brett Adcock, a serial entrepreneur, initially invested $100 million to launch the company. Last May, it added $70 million in Series A funding. I previously assumed the name "Figure" was a reference to the robot's humanoid form. It now appears it could just as easily refer to the astronomical figures the company has raised to date.
When Figure launched in 2022, it set an ambitious goal: getting a bipedal robot walking within a year. The company told TechCrunch it hit that deadline. Video of the walk wasn't available at the time, but footage has since surfaced.
This startup is very much a product of its moment. Humanoid robots are having one, too. Examples include Tesla (though temper your expectations a little), Apptronik, and 1X. Amazon recently began small-scale trials of Agility's Digit robots, apparently seeing the benefit of supplementing human labor in brownfield warehouses and fulfillment centers.
Most of these efforts, Figure included, share the same goal: building robots for industrial use. Upfront cost is just one reason it makes far more sense to focus on the workplace rather than the home. It's also one of many reasons it's important to properly calibrate expectations about what such systems can and cannot do.
Some companies (Tesla again) have likely set unrealistic expectations for current state-of-the-art technology. I'm mainly talking about artificial general intelligence, which many roboticists peg at around five years away, though even that may prove optimistic.
The term "general purpose" gets thrown around a lot when discussing these robots. Essentially, it refers to a system that can quickly learn to perform a variety of tasks, much as humans do. Traditional robotic systems are single-purpose, doing one thing very well, over and over. Multi-purpose systems do exist, and offerings like the API Boston Dynamics provides for Spot go some way toward extending that functionality.
The ultimate goal of generalized intelligence is, in fact, a major driver of the humanoid form factor. Robots built for a single function are difficult to adapt, but robots built to move and think like us could, in theory, do anything we can do.
When I visited Figure's headquarters last year, the company had recently built a demo area in the center of the office.
The main use of the space was showing off the robots to potential customers and investors. It was clearly set up to resemble a warehouse or factory.
Most believe that warehouse work is the first step toward broader adoption, and perhaps the eventual arrival of domestic robots. After all, companies will be willing to invest large sums in products they believe will save them money in the long run. It's also far easier to fill a robot's workday with one or two highly repetitive tasks. Consumers, on the other hand, will almost certainly demand something close to general purpose before paying the equivalent of a new car for a robot.
It's also worth noting that today's news includes a partnership between Figure and OpenAI, a pioneer in generative AI.
According to Figure, the goal of the agreement is to "develop next-generation AI models for humanoid robots." A near-term application of large language models will be creating more natural ways for robots and their human colleagues to communicate. "This partnership aims to accelerate Figure's commercial timeline by enhancing the humanoid robot's ability to process language and reason," the company said.
Natural language lets humans issue commands to these systems and gives humans deeper insight into what the robot is doing (hence the ability to "reason" through language). These are, after all, far more complex systems than, say, a human-operated forklift. When they operate autonomously, more direct methods of communication are required, especially on crowded warehouse and factory floors. Language processing also lets humans step in to correct mistakes.
"We have always planned to return to robotics, and we see a path with Figure to explore what humanoid robots can accomplish with powerful multimodal models," OpenAI Vice President Peter Welinder said in a statement. "We are amazed by the progress Figure has made so far and look forward to working together to open up new possibilities for how robots can help in our daily lives."
Another thing that makes this deal interesting is OpenAI's existing investment in 1X, a direct competitor. Deals like this may leave some wondering whether OpenAI is rethinking that investment, or whether the company is simply playing the field. My guess at this point is the latter. In OpenAI's position, it makes sense to collaborate with as many promising companies as possible. And Figure has certainly shown great progress in the eight months since its robot took its first steps.
Check out the video below, posted about a week ago. According to Figure, the robot moves at roughly 16.7% of the speed at which a human can perform the same task. In other words, it works slowly, methodically, deliberately, and that's clear from the video. However well made a robot is, it's always good to see it operating at real speed in a demo video. I've been told in hushed tones that some companies disguise sped-up footage by declining to disclose the acceleration. That's the kind of thing that fuels consumers' already unrealistic expectations of what robots can do.
Microsoft's investment will see Figure leveraging Azure for storage, training, and "AI infrastructure."
"We are excited to work with Figure to accelerate breakthrough advances in AI," Microsoft corporate VP Jon Tinter said in a statement. "Through our collaboration, Figure will have access to Microsoft's AI infrastructure and services to support the deployment of humanoid robots that assist people with real-world applications."
Somewhat interestingly, Bill Gates' recent list of exciting robotics startups did not include Figure, even though it featured two other humanoid companies (Agility and Apptronik).
The Amazon Industrial Innovation Fund's participation in this round is also particularly noteworthy, as the fund often acts as a pipeline to actual deployment in fulfillment centers. Agility is the key example here.
The autonomy part is equally important, given the industry's tendency to pass off remote operation as autonomy. One of the reasons autonomy is so difficult in cases like this is all the unaccounted-for variation. Warehouses tend to be fairly structured environments, but in the real world any number of things can happen to derail a task. And the less structured the task, the greater the potential for error. Many questions remain, such as how many takes it took to get this right. One thing that definitely counts in the video's favor is that the action is captured in a single continuous shot, meaning the company isn't stitching together a sequence of actions through creative editing.
Mechatronics is easier to judge from a short video than AI or autonomy, and from that perspective the Figure 01 robot looks quite dexterous. In fact, if you look at the angle and position of its arms, it's performing the carry in a way that would be very uncomfortable for most people. It's important to note that just because a robot looks like a human doesn't mean it has to behave exactly like one. My educated guess is that the posture has something to do with the robot's center of gravity, and perhaps the fact that it looks quite top-heavy.
The money will be used to accelerate time-to-market, according to Figure. The company already has a deal with BMW to deploy its robots.