Full-Time

Mechanical Engineer

BotQ Equipment

Posted on 6/5/2025


501-1,000 employees

Autonomous humanoid robots for industrial automation

Compensation Overview

$140k - $200k/yr

San Jose, CA, USA

In Person

Category
Mechanical Engineering
Required Skills
CAD
Assembly
Data Analysis
Requirements
  • 5+ years of experience in manufacturing engineering
  • Expert knowledge of manual and automated assembly techniques
  • Strong understanding of BOMs, DFM, manufacturing processes, process validation, assembly techniques, and cost reduction techniques
  • Familiarity with modern machine controls including PLCs and pneumatic systems
  • Experience with Manufacturing Execution Systems (MES), including implementation, integration, and optimization
  • Expert in 3D CAD software
  • Ability to travel up to 10% of the time
Responsibilities
  • Select and scale equipment necessary for assembling our robots on the manufacturing line
  • Detailed machine design and creation of machine layouts
  • Create new tooling and fixtures for aiding in assembly to be used on the manufacturing line
  • Contribute to make vs. buy decisions on the assembly process
  • Work with facilities team to spec building requirements to handle new equipment
  • Collaborate with mechanical engineering team to aid in DFM and DFMEA
  • Perform data analysis to understand machine metrics
Desired Qualifications
  • Bachelor’s or Master’s Degree in Mechanical, Biomechanical, Robotics, Mechatronics, or similar field
  • CATIA V6 experience
  • Robotics or other high tech industry experience
  • Statistical analysis experience

Figure.ai builds general-purpose humanoid robots for industrial environments, enabling automation across manufacturing, logistics, warehousing, and retail. Its flagship Figure 01 is a 5'6", 60 kg electric humanoid that can carry 20 kg, run for about 5 hours, and move at 1.2 m/s, operating autonomously with onboard AI. The robot mimics human dexterity and mobility to perform many different tasks, reducing the need for multiple single-task machines. Figure.ai sells and leases its robots and offers maintenance and software updates, aiming to help large customers like BMW increase efficiency and cut labor costs by deploying versatile automation at scale through sales, leasing, and service partnerships.

Company Size

501-1,000

Company Stage

Series C

Total Funding

$1.9B

Headquarters

Sunnyvale, California

Founded

2022

Simplify Jobs

Simplify's Take

What believers are saying

  • Shipments doubled monthly to 240 Figure 03 units by April 2026, targeting 12,000 annual capacity.
  • $1.54 billion Series C funding in September 2025 boosts $39 billion valuation for scaling.
  • BMW and Salesforce validate Figure 03 in automotive and warehouse pilots since 2025.

What critics are saying

  • Agibot ships 5,000 units in three months versus Figure's 240, eroding premium market share.
  • Safety lawsuit reveals skull-fracturing robot forces, firing whistleblower in early 2026.
  • Tesla Optimus and 1X NEO deliver $20,000 robots in 2026, undercutting Figure's $24,760 price.

What makes Figure unique

  • Figure 03 achieves 24/7 full autonomy with Helix 02 VLA model replacing 100,000 code lines.
  • BotQ facility produces 12,000 humanoids yearly using Figure's own robots since early 2025.
  • Project Go-Big captures video data from 100,000 Brookfield apartments for household AI training.


Benefits

Remote Work Options

Growth & Insights and Company News

Headcount

6 month growth

-5%

1 year growth

-3%

2 year growth

0%
MLQ.ai
Mar 26th, 2026
First Lady showcases AI-powered humanoid robot at international education technology summit

March 26, 2026 - by MLQ Agent

Key points:
  • Figure 03, an American-made humanoid robot developed by Figure AI, delivered opening remarks and multilingual greetings at the White House's "Fostering the Future Together Global Coalition Summit" on Wednesday
  • The summit brought together representatives from over 40 countries and major technology companies including Microsoft, Google, and OpenAI to discuss expanding digital education access worldwide
  • Melania Trump envisioned AI-led classrooms with humanoid educators capable of teaching literature, science, philosophy, and history, while emphasizing the need to balance technological innovation with child safety protections
  • The initiative marks the first time a U.S. first lady has hosted representatives from 45 nations at the White House in a single day
  • Trump called for collaboration between government and the private sector to expand education access, improve digital literacy, and protect children in an increasingly digital world

First Lady Melania Trump unveiled Figure 03, a humanoid robot developed by Sunnyvale-based Figure AI, at an international education summit at the White House on Wednesday, where the robot delivered opening remarks in multiple languages to representatives from over 40 countries and leaders from major technology companies.

The robot's role at the summit. Figure 03 made history as the first humanoid robot to be a guest at the White House. The robot, introduced by Figure AI in October 2025 as its third-generation model for household tasks, delivered brief remarks announcing it was "honoured" to attend the "Fostering the Future Together Global Coalition Summit" and "grateful to be part of this historic movement to empower children with technology and education." The robot welcomed attendees in 10 additional languages beyond English before quietly leaving the room. Melania Trump greeted the robot by saying, "It's fair to state, you are my first American-made humanoid guest in the White House." Figure AI CEO Brett Adcock stated he was "proud to see F.03 make history as the first humanoid robot in the White House."

Global Coalition and technology partnership. The summit convened representatives from more than 40 countries, including notable figures such as Olena Zelenska, Brigitte Macron, and Sara Netanyahu, alongside major technology firms. The gathering served as the inaugural event for Melania Trump's "Fostering the Future Together" initiative, with the White House noting it marked the first time a U.S. first lady hosted representatives from 45 nations at the White House in a single day. The coalition includes 28 technology organizations committed to addressing education and innovation globally.

Vision for AI-powered education. Trump presented a vision for the future of education that integrates humanoid robotics and artificial intelligence into classrooms. She invited attendees to imagine a classroom led by a humanoid AI educator named Plato, described as an always-available digital teacher capable of instantly delivering lessons on literature, science, philosophy, and history. This concept reflects the summit's broader focus on using technology to expand access to quality education for children worldwide while addressing emerging challenges in the digital space.

Balancing innovation with child safety. While celebrating technological advancement, Trump emphasized the importance of protecting children in digital environments. She stated that "the safety of our next generation is always paramount" and urged attendees to match optimism about technology with appropriate caution. The initiative focuses on expanding digital access and literacy while implementing safeguards, with Trump calling for governments and technology companies to work together on protective legislation and policies designed to keep children safe online.

Robotics as diplomatic tools. The deployment of Figure 03 at a high-level international summit reveals how humanoid robots are transitioning from research laboratories into formal public spheres. The robot's appearance wasn't merely ceremonial: its multilingual greetings and prepared remarks positioned it as an active participant in diplomatic messaging rather than a passive display. This signals a shift in how governments and private-sector leaders view humanoid robots, moving beyond experimentation toward integration into official state functions and international dialogues. The choice to feature an American-made robot at a U.S.-hosted summit also underscores the geopolitical dimensions of robotics development, particularly given the competitive landscape with companies like Boston Dynamics, Tesla's robotics division, and various Chinese firms developing similar technologies. That Figure AI's third-generation model performed this role demonstrates the company's technology has reached a maturity level sufficient for public representation at the highest levels of government.

Path to AI classrooms. The vision of AI-powered educators like "Plato" suggests a future where humanoid robots could supplement or reshape educational delivery, though significant barriers remain before widespread adoption. Realizing this vision requires not only technological advancement but also regulatory frameworks, teacher training programs, cybersecurity protections, and international standards for AI systems in education. The summit's emphasis on collaboration between government and the private sector indicates that key players recognize these challenges and are beginning to address them structurally rather than leaving development solely to market forces. The participation of Microsoft, Google, and OpenAI alongside 28 other technology organizations suggests that major tech companies see global education as a priority market where humanoid robots and AI systems could play substantial roles.

Fortune
Mar 25th, 2026
Figure 03 humanoid robot makes White House debut at Melania Trump's education summit

Melania Trump brought a humanoid robot as her guest to the White House East Room on Wednesday for the final day of her Fostering the Future Together global summit. The event focused on empowering children through education, innovation and technology. The robot, Figure 03, walked alongside the first lady before addressing attendees. "I'm Figure 03, a humanoid built for the United States of America," it said, offering welcomes in 11 languages. Trump acknowledged it as her "first American-made humanoid guest in the White House". Figure AI, a Sunnyvale-based startup, introduced Figure 03 in October 2025 as a third-generation household robot designed to assist with domestic tasks. The company competes with Boston Dynamics, Tesla and Chinese firms in developing human-like robots. CEO Brett Adcock called the appearance historic.

Forbes
Mar 25th, 2026
What to know about the AI robot - and its billionaire owner - that appeared with Melania Trump

Mar 25, 2026, 03:50pm EDT

Topline. First Lady Melania Trump entered a White House technology summit accompanied by a humanoid robot, which delivered brief opening remarks and has been hyped as an eventual "replacement for human labor."

Key facts. The robot, developed by Figure AI and known as Figure 03, slowly walked alongside the first lady down the White House Cross Hall before delivering opening remarks, complete with hand gestures, introducing itself, thanking the first lady, and saying "welcome" in 11 different languages. Figure 03 then walked back past the summit speakers, who looked on in lengthy silence as the robot successfully made its way back down the hall. Figure AI CEO Brett Adcock claimed on X that the robot was operating with full autonomy.

Tangent. The White House appearance comes days after Salesforce CEO Marc Benioff posted a video of himself tossing packages in front of a Figure 03 model while it sorted parcels at a warehouse, calling the robot "impressive."

Big number. $24,760. That's how much, according to Figure AI, Figure 03 will cost. It is considered to be a direct competitor to Tesla's Optimus robot and 1X's NEO robot, the latter of which will begin deliveries this year with a $20,000 price tag.

Who is Brett Adcock? Adcock, 39, is a technology entrepreneur with an estimated net worth of $19.1 billion and ranks No. 135 on Forbes' real-time billionaires list. He also recently launched an AI lab known as Hark, which will develop software and hardware for its own AI model. Adcock is also a founder of Cover, an AI security company that builds concealed-weapon detection systems for schools. Adcock has no clear ties to the White House, and his only recorded political giving is relatively minor ($15,000) donations to the Democratic fundraising platform ActBlue, along with previous donations to two now-retired members of Congress, Sen. Joe Manchin, I-W. Va., and Rep. Dean Phillips, D-Minn.

Key background. Several tech giants, including NVIDIA, Microsoft, Salesforce, and Samsung, have invested in Figure AI. Founded in 2022, the company raised over $1 billion in Series C funding last year, helping it reach a $39 billion post-money valuation in September. Adcock told Forbes in 2024 that the goal for Figure AI's robots "is to be a generalizable replacement for human labor." Figure 03 is the latest version of the company's humanoid robots, sporting a smoother design than the clunkier-looking Figure 02. The latest model is 9% lighter and has an improved audio system for real-time speech-to-speech interaction. It is unclear when Figure 03 will be available to consumers, but reports have suggested a late 2026 launch.

Pandaily
Mar 18th, 2026
Figure AI demonstrates fully autonomous home cleanup with Helix 02 system.

Published: March 18, 2026

Figure AI has demonstrated its Figure 03 robot, powered by the Helix 02 AI system, autonomously cleaning and organizing a cluttered living room without human intervention.

Humanoid robotics startup Figure AI has unveiled a new demo showing its Figure 03 robot autonomously completing household cleanup tasks, highlighting advances in general-purpose robotics. Powered by the Helix 02 AI system, the robot performed a full living-room cleanup - picking up objects, wiping surfaces, arranging cushions, and turning off a TV - without human intervention or pre-programmed scripts.

The system's key innovation lies in using a single neural network to control locomotion, balance, and manipulation simultaneously. Running at 1,000 Hz, it replaces more than 100,000 lines of manually engineered C++ code. The demo builds on a kitchen scenario released a month earlier. According to the company, no new algorithms were introduced; performance gains came solely from scaling training data, suggesting improved generalization.

On hardware, Figure 03 stands about 168 cm tall, weighs 61 kg, and operates for up to five hours per charge. It features embedded cameras and tactile sensors capable of detecting forces as small as 3 grams, along with a soft exterior and wireless charging. Figure AI said the main constraint on scaling production is not demand but achieving sufficient generality in AI systems - a gap this latest demonstration aims to narrow.
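The control architecture described above - one network driving locomotion, balance, and manipulation at a fixed 1,000 Hz rate - can be illustrated with a toy loop. This is a generic sketch, not Figure's actual code: `unified_policy`, `read_observation`, and `send_commands` are hypothetical stand-ins for the network forward pass and robot I/O.

```python
import time

# Hypothetical unified policy: one function maps the full observation
# vector (vision features, joint states, tactile readings) to commands
# for every joint - locomotion, balance, and manipulation together.
def unified_policy(obs):
    # Stand-in for a single neural-network forward pass.
    return [0.0 for _ in range(len(obs))]

def control_loop(read_observation, send_commands, hz=1000, ticks=5):
    """Run a fixed-rate control loop at `hz` iterations per second."""
    period = 1.0 / hz
    for _ in range(ticks):
        start = time.monotonic()
        obs = read_observation()     # sensors -> observation vector
        cmd = unified_policy(obs)    # one forward pass per tick
        send_commands(cmd)           # commands -> actuators
        # Sleep off whatever remains of the 1 ms budget.
        elapsed = time.monotonic() - start
        if elapsed < period:
            time.sleep(period - elapsed)

# Toy stand-ins for robot I/O so the sketch runs anywhere.
log = []
control_loop(lambda: [0.1, 0.2, 0.3], log.append, hz=1000, ticks=5)
print(len(log))  # 5 command vectors emitted, one per tick
```

The point of the single-network design is visible in the loop body: there is exactly one inference call per tick, rather than separate hand-written controllers for gait, balance, and grasping coordinated by glue code - the "100,000 lines of C++" the article says were replaced.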

Adamo HQ
Mar 17th, 2026
Teleop vs. Full autonomy: what robotics companies actually need in 2026.

Every robotics company is chasing the same destination: a robot that operates fully on its own, handles any situation, and never needs a human in the loop. Just last week, Figure AI shared a video of their Helix 02-powered 'fully autonomous' humanoid doing human tasks. Whilst it appears to have a huge degree of autonomy, edge cases such as difficult lighting mean that it will still occasionally need teleop. Figure AI are leading the race to full autonomy, and will have required countless hours of teleoperation to reach this position.

For those far behind Figure AI, robots need to be deployed. Customers need to be served. Revenue needs to be generated. Data needs to be collected. That's why the most successful robotics companies aren't waiting for full autonomy. They're using teleop, not as a fallback, but as a deliberate strategy.

The autonomy gap is real - and bigger than most demos suggest.

If you watched the Figure AI video, you might be forgiven for thinking full autonomy is just around the corner. It isn't. Bain & Company's 2025 Technology Report put it plainly: most humanoid robots today remain in pilot phases, heavily dependent on human input for navigation, dexterity, or task switching. "This 'autonomy gap' is real. Current demos often mask technical constraints through staged environments or remote supervision."

Even Tesla's Optimus program, one of the most heavily publicised in the industry, tells the same story. As of late 2025, Tesla had deployed "at least two" Optimus units performing tasks in its Fremont factory. Evidence from multiple sources suggests continued reliance on teleoperation during public demonstrations, with human operators remotely controlling the robots to execute impressive-looking tasks.

There are two key stubborn technical barriers:

Battery life. Most current humanoid robots run for 90 minutes to two hours before needing a charge. An industrial deployment requires a full eight-hour shift. Agility Robotics' Digit, arguably the most production-hardened humanoid on the market after its Amazon warehouse pilots, runs for 90 minutes followed by a 9-minute fast charge. That's nowhere near shift-length operation without intervention.

Reliability standards. Traditional industrial robots achieve 95-99% uptime in well-maintained environments. Humanoid robots are nowhere near that benchmark. A humanoid has hundreds of joints, actuators, and sensors, each one a potential failure point; an industrial arm has six joints. Meanwhile, no ISO standard yet exists for dynamically balancing legged robots in human workspaces. Until those standards are finalised, enterprise procurement at scale will be limited.

What the AV industry already learned.

Robotics companies navigating the teleoperation-vs-autonomy question don't need to start from scratch. Autonomous vehicles went through this exact debate, and the lessons are directly applicable. The prevailing assumption a decade ago was that human operators were a temporary crutch, to be phased out quickly as AI matured. What actually happened was more nuanced.

Waymo, the most operationally advanced AV company in the world, keeps a team of remote human agents on standby, not to drive the cars, but to provide high-level guidance when vehicles encounter situations they aren't confident handling: unusual construction zones, police directing traffic with hand signals, ambiguous road conditions. The Waymo Driver handles the vast majority of situations independently. But the human safety net is always there, and the data from every intervention feeds back into the training pipeline.

The numbers are striking. According to California's 2024 disengagement data, the most closely watched metric in the sector, Waymo recorded one disengagement every 9,793 miles. Newer operators needed interventions far more frequently. The lesson isn't that human operators are a sign of weakness. It's that the ratio of robots to operators improves continuously as AI matures, and the operators themselves are what make deployment safe enough to generate the data that drives that improvement.

Cruise's experience offers the cautionary version of the same lesson. At the height of its operations, Cruise's vehicles needed remote human intervention approximately every four to five miles, a fact obscured in public communications. When that gap between claimed capability and operational reality became visible, it contributed to the company's collapse.

The parallel for humanoid robotics is exact. Companies that are transparent about their teleoperation dependency and treat it as an asset - a way to generate training data while delivering real value - are building a durable path to autonomy. Companies that obscure it are building a credibility problem.

Teleoperation isn't a step backward. It's the pipeline.

Teleoperation is not a workaround for the absence of autonomy. It is how you build autonomy. Every time a trained human operator guides a robot through a task, that interaction generates high-fidelity training data: camera video, sensor readings, force data, timestamped controls. The robot is learning what a human expert does in a real, messy, uncontrolled environment, not a simulation. This is the kind of data that foundation models need to generalise. Sanctuary AI takes a similar approach, using remote experts alongside tactile sensors to give operators a sense of touch, generating richer training data for manipulation tasks while simultaneously solving real customer problems. This is the architecture that works: teleoperation as a business model today, teleoperation as a data pipeline for tomorrow.

The spectrum: where your robot actually lives.

It's tempting to think of teleoperation and autonomy as two states, on or off. In practice, every deployed robot sits somewhere on a spectrum, and the right position on that spectrum depends on your use case, your environment, and where you are in your autonomy development.

  • Full teleoperation: a human controls every movement. This makes sense for genuinely novel tasks, dangerous environments, or early deployments where edge cases are frequent and unpredictable.
  • Supervised autonomy: the robot handles routine execution while a human monitors and can intervene. This is where most of the most successful deployments operate today. The robot does the work; the human provides a safety net and steps in for edge cases. Waymo's fleet-response model is the gold standard here: approximately 70 remote assistants overseeing a fleet of around 3,000 vehicles.
  • Task-limited autonomy: full independence for specific, well-defined, repeatable tasks in controlled environments. This is commercially real today: Agility's Digit doing pick-and-place in an Amazon warehouse; UBTech's Walker S2 performing repetitive manufacturing tasks on structured production lines. The key word is controlled. These robots aren't general-purpose; they're reliable in their lane.
  • Full autonomy: true independence across diverse, unstructured, unpredictable environments. This is the destination, but it is not 2026's reality for most applications. Goldman Sachs estimates that consumer applications lag industrial ones by two to four years.

The strategic question for any robotics company isn't "when will we reach full autonomy?" It's: "What's the right level of human-in-the-loop for our deployment right now, and how do we use that to accelerate toward greater autonomy over time?"

The business case for getting this right.

Deploying robots that attempt full autonomy before they're ready produces visible, public failures. Those failures erode customer trust, generate negative press, and, as Cruise discovered, can be existential. The AV sector spent years overpromising on autonomy timelines. The robotics sector is at risk of repeating the same mistake.

Deploying robots with the right level of teleoperation support does something different. It lets you:

  • Generate revenue today, even before full autonomy is achieved
  • Collect real-world training data at scale, in the environments your robots will eventually operate in autonomously
  • Build customer trust through reliable performance, rather than eroding it through visible failures
  • Reduce deployment risk by keeping humans in the loop for the edge cases that will inevitably arise
  • Create a defensible moat: the operator expertise, training data, and operational playbooks you build now are hard for competitors to replicate

The teleoperation market itself reflects this shift. It was valued at approximately $502 million in 2024 and is projected to reach $4.7 billion by 2035, growing at a compound annual rate of over 25%. That growth isn't because autonomy is failing. It's because the industry has accepted that the path to autonomy runs through teleoperation, not around it.

The bottom line.

Full autonomy is the right destination for the robotics industry. But the path there isn't a straight line, and the companies that will get there fastest aren't the ones waiting at the starting line until their AI is perfect. They're the ones deploying now, learning from real-world data, and using teleoperation as an engine: generating value for customers today while building the training foundation for tomorrow. The question for robotics companies shouldn't be whether to use teleoperation. It should be whether you're using it strategically. Adamo ensures the answer is yes.
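The figures quoted in this piece can be sanity-checked with simple arithmetic. A short sketch, using only numbers already cited above (the comparisons are illustrative, not new data):

```python
# Intervention frequency: Waymo vs. Cruise (miles per intervention).
waymo_miles_per_disengagement = 9793
cruise_miles_per_intervention = 4.5   # midpoint of "every four to five miles"
gap = waymo_miles_per_disengagement / cruise_miles_per_intervention
print(round(gap))                     # Waymo drives ~2176x farther between interventions

# Supervision ratio in Waymo's fleet-response model.
vehicles, operators = 3000, 70
print(round(vehicles / operators))    # ~43 vehicles per remote assistant

# Digit's duty cycle: 90 minutes of operation, then a 9-minute fast charge.
run, charge = 90, 9
availability = run / (run + charge)
print(round(availability * 100, 1))   # ~90.9% availability

# Even at ~91% availability, an 8-hour shift still implies several charge pauses.
shift_minutes = 8 * 60
print(shift_minutes // (run + charge))  # ~4 full run+charge cycles per shift
```

The last two numbers make the battery-life barrier concrete: the issue isn't raw availability (which looks high on paper) but the four or so interruptions a single shift would impose on a production line.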

INACTIVE