On September 16 (local time), humanoid robot firm Figure AI announced it has raised over $1 billion in its Series C funding round. The capital will go to three key areas: scaling up robot production capacity, building NVIDIA GPU computing infrastructure to accelerate training and simulation, and expanding data collection across scenarios relevant to human work and daily life.
Core Goals & Technological Confidence
Figure’s ultimate objectives are threefold: mass-producing deliverable robot hardware, developing an AI engine to make robots intelligent, and gathering the training data required for this engine.
“Figure aims to solve the challenge of general-purpose robots,” CEO Brett Adcock said in a YouTube video the same day. “For the first time in history, we have the technological conditions to achieve this goal.”
The company has set ambitious targets. Its Figure 02 robot became the world’s second humanoid robot to secure paid work late last year. Earlier this year, Adcock announced plans to deliver 100,000 humanoid robots over the next four years, noting that Figure’s client list already includes “one of the largest enterprises in the U.S.”
Ambitions Beyond Industrial Scenes
Figure’s aspirations extend beyond warehouses and factories. It regularly releases videos of robots performing household tasks—working in kitchens, serving drinks, loading dishes into dishwashers, folding laundry, and handling other daily chores.
Powering these capabilities is Helix AI, the intelligent system integrated into each robot. With Helix AI, Figure’s robots can adapt to real-world environments: they recognize and reason about objects they have never seen before and take sensible actions in response. This is critical for practical humanoid robots, because it removes the need to train specifically for every niche or edge-case scenario in homes and factories.
“This is an extremely challenging problem,” Adcock acknowledged, “but our team is in place, the robots are built, and the path forward is clear.”

Funding Backing & Industry Landscape
The round of more than $1 billion gives those ambitions substantial backing. Intel, NVIDIA, LG, Salesforce, Qualcomm, and T-Mobile participated via their investment arms, with Parkway Venture Capital as the lead investor.
Figure is not alone in securing large-scale funding. China-based UBtech Robotics reportedly completed a $1 billion funding round earlier this month, though the company has not confirmed it. UBtech also signed the largest publicly disclosed humanoid robot contract to date—valued at 905.115 million yuan (about $127 million)—with Miy Yi (Shanghai) Automotive Technology Co., Ltd.
Market Potential & Industry Doubts
Humanoid robots address a potentially enormous market: manual labor worldwide generates an estimated $40 trillion in value each year, a substantial share of global GDP.
Yet skepticism exists. Bren Pierce, a senior robotics expert and CEO of Kinisi Robots, questioned on the TechFirst podcast: “What is the purpose of giving robots legs? Bipedal robots will exist in the future… but are we overlooking the key to their usefulness—artificial intelligence?”
Pierce argued that while bipedal designs are impressive, wheeled robots suffice for factory and warehouse tasks and offer better payload capacity and battery life. The real technical challenge, he contended, lies in the robot’s “hands.”
Nevertheless, dozens of manufacturers are pushing ahead with humanoid robot R&D on the back of continued capital support. In a market that could eventually hold dozens or even hundreds of competitors, Figure positions itself as a leader.
Founder’s Take on the Milestone
In a statement, Figure founder Brett Adcock said: “This milestone is crucial for launching the next phase of humanoid robot development. It will help us expand our AI platform Helix and refine our BotQ manufacturing system.” He added, “New partners and continued support from existing investors not only prove Figure’s leadership but also reflect our shared belief that humanoid robots will naturally integrate into daily life.”
With continuous capital inflows, that future may arrive sooner than expected.