Full-Time

Reinforcement Learning Engineer - Spot Behavior

Posted on 3/19/2026

Boston Dynamics

1,001-5,000 employees

Develops legged robots for industrial use

No salary listed

Waltham, MA, USA

In Person

Category
AI & Machine Learning
Required Skills
Python
Machine Learning
C/C++
Reinforcement Learning
Requirements
  • Master’s degree or higher in Robotics, Mechanical Engineering, Computer Science, or a related field.
  • 3+ years of experience with a proven track record of deploying models on hardware.
  • Proficiency in both Python and C++ programming languages.
  • Strong analytical and debugging skills.
  • Familiarity with modern deep reinforcement learning toolkits and architectures.
Responsibilities
  • Design and deploy reinforcement learning systems to improve Spot’s mobility and robustness.
  • Integrate learning-based solutions into Spot’s existing planning and control systems in collaboration with experts across controls, perception, and planning.
  • Build and maintain systems that support reliable, scalable, and reproducible reinforcement learning training.
  • Test and debug your work using our in-house fleet of Spot robots.
  • Write high-quality, maintainable code in both Python and C++.
  • Provide mentorship and technical guidance on ML best practices.
Desired Qualifications
  • Experience with legged robotics.
  • PhD in Robotics, Mechanical Engineering, Computer Science, or a related field.

Boston Dynamics designs and sells advanced legged robots to improve safety and efficiency in industrial and research settings. Its products, such as Spot and Pick, use onboard AI to perceive the environment, balance and navigate complex terrain, and autonomously avoid obstacles, enabling tasks that are dangerous or physically demanding for people. Spot is a mobile, 65-pound robot that can traverse stairs, uneven surfaces, and rough terrain, while Pick is a vision-based solution for warehouse box-handling workflows. The company differentiates itself through its focus on mobility, dexterity, and safety; its US-made production; and its controlled sales model with ongoing maintenance, training, and support for commercial, industrial, and academic clients rather than consumer buyers. The goal is to augment human workers by handling risky or monotonous tasks, increasing safety and productivity across industries.

Company Size

1,001-5,000

Company Stage

Acquired

Total Funding

$917M

Headquarters

Waltham, Massachusetts

Founded

1992

Simplify Jobs

Simplify's Take

What believers are saying

  • Hyundai ownership since June 2021 is funding a new facility aimed at scaling Atlas production to thousands of units per year.
  • Gemini Robotics-ER 1.6 integration boosted Spot's gauge-reading accuracy to 98% in April 2026.
  • A FieldAI partnership enables Spot fleets to cut construction inspections by 90% via FFMs.

What critics are saying

  • A C-suite exodus, including CEO Playter in February 2026 and CTO Saunders' departure to DeepMind, erodes expertise.
  • Figure AI's Figure 02 has been deployed in BMW plants since 2025, undercutting Atlas pricing by 10x.
  • Tesla Optimus Gen 3 hits 50,000-unit production by Q2 2026, flooding the market at $20k each.

What makes Boston Dynamics unique

  • Boston Dynamics has pioneered dynamic legged robots like Spot and Atlas since its 1992 MIT spin-off.
  • The Spot quadruped excels in unstructured environments with advanced mobility and dexterity.
  • The Atlas humanoid robot debuted an all-electric version in 2024 for commercial manipulation.


Benefits

Remote Work Options

Flexible Work Hours

Growth & Insights and Company News

Headcount
  • 6 month growth: 0%
  • 1 year growth: -1%
  • 2 year growth: 0%

Ars Technica
Apr 15th, 2026
Robot dogs read gauges with 98% accuracy using Google's Gemini AI

Boston Dynamics' Spot robot can now accurately read analog thermometers and pressure gauges in industrial facilities using Google DeepMind's new Gemini Robotics-ER 1.6 model, announced on 14 April. The AI model improves instrument reading accuracy from 23% in the previous version to 98%. The model features "agentic vision", combining visual reasoning with code execution to create a visual scratchpad for inspecting images. It also delivers improved multi-view reasoning, allowing robots to use multiple camera streams to better understand their environment. Without agentic vision, the baseline model still achieves 86% accuracy. Google describes Gemini Robotics-ER 1.6 as its safest robotics model yet, with improved capacity to follow physical safety constraints and perceive injury risks to humans.

Infeeds
Apr 2nd, 2026
Why robots at a phone conference? Because 6G changes everything.

6G isn't just about faster downloads. It's about turning robots into intelligent fleets that think, learn, and coordinate in real time.

I walked the halls of Mobile World Congress last month expecting the usual suspects: new phone announcements, network upgrades, maybe some predictable demos. What I didn't expect was humanoid robots everywhere. The question kept nagging at me as I moved from booth to booth: why are there so many robots at a conference obsessed with mobile phones? Sure, a dancing robot draws a crowd. It's good theatre. But to dismiss them as mere publicity stunts would be to miss something genuinely significant happening underneath the spectacle. There's a real conversation unfolding about what happens when robotics meets next-generation connectivity, and it's worth paying attention to.

The missing link between two worlds. On the surface, 6G and robotics seem like unrelated futures. One is a network. One is a machine. But that's where things get interesting. According to experts at the conference, 6G won't just make your phone's download speeds faster. It'll transform robots from standalone mechanical tools into coordinated fleets that share information, learn from each other, and operate as part of a larger ecosystem. This isn't speculation about something distant, either. Companies like Boston Dynamics and Honor have already shown off humanoid robots designed for real work in factories and homes. But there's another level to unlock, and it requires the connectivity that 6G promises to deliver around 2030. The shift sounds subtle in theory. In practice, it's the difference between a robot that can only do what it was programmed to do and a robot that can adapt, learn, and collaborate in real time.

How 6G becomes a robot's nervous system. Let's break down what makes 6G special for robotics. First, it acts as a sensor network. Qualcomm's executive vice president of robotics, Nakul Duggal, explained to attendees how sensors embedded in robots and environments would allow the 6G radio to function like radar, constantly scanning and mapping surroundings in real time. A crowded room stops being a navigation nightmare when you have that kind of environmental awareness. But sensing is just part of it. The real power comes from pure speed. Today's 5G networks aren't designed to handle the continuous AI requests that smart robots demand. Frank Long, associate director of intelligent services at Cambridge Consultants, noted that "with 6G you can pretty much have that quality of service guarantee" that robotics needs. The demo his team brought showed a humanoid robot that could pick up and place a box based on where you pointed, adjusting its grip in real time as it reacted. That required enormous compute power, and it only worked because they used a private 5G network. Public networks today can't reliably handle that workload.

This matters because robots won't be working alone. They'll need to communicate constantly with infrastructure and with each other. Imagine a retail scenario that Anshuman Saxena, general manager of robotics at Qualcomm, described: one robot unloading soda cans from a truck while another restocks shelves. For them to work efficiently, they need to share what they see. The unloading robot needs to tell the shelving robot how many cans are coming. The shelving robot needs to know what space is available. This kind of coordination, sometimes called long-horizon planning, is where 6G becomes essential.

The home scenario you're not thinking about yet. Here's where it gets weird in a good way. You might imagine a single humanoid robot in your home as being totally different from multiple robots in a warehouse or store. It's not, really. Your phone already coordinates with your security cameras, your smart lights, and your other devices. A home robot would just be another device in that mesh. Or maybe you'll have one humanoid and several smaller robots designed for specific tasks. The point is that even in your private space, there's a "fleet aspect" happening. You don't feel it, but it's how modern devices work. What's more interesting is the learning part. Your phone collects data about how you use it, and that data gets fed back to make the software better. A 6G-equipped robot would work the same way. It might learn to serve coffee in a hotel for months, then come to your home with that training already embedded. But it'll still need to learn your specific layout, your quirks, your preferences. The robot that serves coffee in a thousand hotels will bring all that collective knowledge to your kitchen.

The heat problem and why it matters. Here's a detail that stuck with me. A robot might serve you a cup of coffee without understanding that the cup is scalding hot. Humans react instantly to heat, pulling a hand back without thinking. A robot needs to be taught this through data. Lots of data. From real situations. Saxena pointed this out as an example of why continuous learning matters so much. Right now, gathering that kind of real-world training data and feeding it back to improve the system is incredibly slow. Networks bottleneck. The process becomes expensive and time-consuming. 6G removes that bottleneck. The speed and reliability it offers mean robots can actually gather and process the information they need to become genuinely safe and useful in human environments. But here's the honest part: this is going to be hard. Frank Long put it perfectly: "Put it this way, members of my immediate family still struggle with opening the baby gate in my stairs, even after extensive training. So a robot, I think, might be a few years away from opening that baby gate." It's funny, but it's also a reality check. Even with 6G on the horizon, there's a lot of work to do.

What about right now? Companies aren't waiting around for 6G to show up in 2030 and beyond. Qualcomm is working with robotics firms like Neura Robotics, pushing the boundaries of what current technology can do. The robots being deployed now are learning, improving, and getting better at dexterity and problem-solving. They're priming themselves to take full advantage of better connectivity when it arrives. The convergence of robotics and 6G feels inevitable now. But there's also something worth noting in all this: the gap between what's possible in a controlled demo and what's actually safe and useful in a home is still enormous. The robots coming to warehouses and hotels in the next few years will be genuinely useful. The humanoids heading to homes? That story might be best written in the 2030s, if not later. Which raises a question worth sitting with: are we ready for that kind of automated assistance in our most private spaces, or are we just moving fast because we can?

Trending Topics
Mar 24th, 2026
Agile Robots teams up with Google DeepMind to power next-gen industrial AI.

Munich-based robotics company Agile Robots has entered into a strategic research partnership with Google DeepMind. The collaboration brings the AI research lab's Gemini Robotics foundation models to Agile Robots' hardware platforms. This places the company, founded in 2018, among a growing list of robotics firms betting on Google DeepMind's AI expertise. Agile Robots says it has already installed more than 20,000 robotics solutions worldwide and raised over $270 million from investors including SoftBank Vision Fund, Xiaomi, and Midas Group.

Adaptable robots. The partnership aims to develop adaptable robots for industrial applications, specifically in the electronics industry, automotive manufacturing, data centers, and logistics. The underlying concept: data from real-world robot operations feeds back into the Gemini models, continuously improving them. Improved models, in turn, expand the robots' capabilities. "Agile Robots has already installed over 20,000 robotics solutions worldwide, proving intelligent automation at scale," says CEO and founder Zhaopeng Chen. "The huge opportunity ahead lies in autonomous, intelligent production systems that can transform entire industries. Integrating Google DeepMind's Gemini Robotics models into our robotic solutions positions us at the cutting edge of this rapidly growing market."

Agile Robots was founded by former researchers from the German Aerospace Center (DLR), namely Dr. Zhaopeng Chen and Peter Meusel. The company now employs more than 2,300 people worldwide and operates production sites in Europe, China, and India. Carolina Parada, Senior Director and Head of Robotics at Google DeepMind, emphasizes: "We are excited to partner with Agile Robots as we develop more advanced AI models for the next generation of robots and to scale their impact across sectors. This research partnership is an important step in bringing the impact of AI to the real world."

Robotics partnerships on the rise. Agile Robots is not the only robotics company to secure Google DeepMind as a partner. Earlier this year, Boston Dynamics announced it would use the Gemini foundation models for the development of its humanoid robot Atlas. The Hyundai-owned firm was itself owned by Google between 2013 and 2017. German startup Neura Robotics also announced a partnership in early March, though with semiconductor company Qualcomm, whose new IQ10 processor series for mobile robots and humanoids is set to serve as a reference design. The rationale behind such collaborations: robots are extremely complex on both the hardware and software sides. Companies with specific strengths, whether in hardware, mobility, or software, benefit from working with partners who bring different expertise to the table.

From DLR research to industrial AI. Agile Robots focuses on Physical AI, a concept in which robots not only think digitally but also respond intelligently to their physical environment. The company's solutions combine force and sensor technology, visual perception, AI models, and its proprietary robotics software AgileCore. The product portfolio includes industrial robot arms such as the Diana 7 and the Thor series, sensitive gripping modules, autonomous mobile robots (AMRs), and AGVs for in-plant logistics. The centerpiece is the humanoid assistance and industrial robot Agile ONE, set to enter series production in 2026 and capable of working closely alongside humans. By integrating the Gemini Robotics models, Agile Robots aims to further expand its position in sectors such as automotive manufacturing, the electronics industry, and healthcare, and to benefit from a scalable AI feedback loop in which every deployment improves the models and every improvement opens up new use cases.

Dealroom.co
Mar 23rd, 2026
Boston Dynamics company information, funding & investors

Boston Dynamics builds dynamic robots and simulation software for real-world tasks. Here you'll find information about the company's funding, investors, and team.

Tech in Asia
Mar 17th, 2026
Hyundai Mobis to expand chips, robotics push

Hyundai Mobis said it plans to move early into automotive semiconductors and robotics components while expanding joint development with global customers. The company reported sales of 61.1 trillion won (US$40.93 billion) and operating profit of 3.4 trillion won (US$2.27 billion) on a consolidated basis for last year. Hyundai Mobis said it plans to raise the share of global customer companies in parts manufacturing to 40% by 2033.

Food for thought: implications, context, and why it matters. This robotics push is a clear shift, not a side effort.
  • Hyundai Mobis plans to supply actuators (motor-driven parts that create movement) for Boston Dynamics' next-generation Atlas humanoid robot.
  • Hyundai Mobis is shedding older lines. It has put its bumper business up for sale and named France's OPmobility as the preferred bidder for its automotive lighting business while moving toward software-driven vehicle technologies.
  • Hyundai Motor Group wants a production system that can build 30,000 robot units a year by 2028. That goal ties to its push into "Physical AI" (AI systems designed to operate in and interact with the real world through machines like robots).

Industrial giants use factory scale to build AI supply lines.
  • Many conglomerates are building vertically integrated supply chains (one group controls more steps from R&D to manufacturing) to compete in the AI era.
  • Hyundai Motor Group will have its parts unit supply a high-cost component to its robotics affiliate Boston Dynamics, which tightens a value chain from development through mass production.
  • Automotive manufacturing know-how can lower per-unit costs, which specialized robotics firms may struggle to match. That shift could reshape competition for robotics components.

INACTIVE