Full-Time

Machine Learning Engineer

Posted on 1/23/2026

Helm.ai


51-200 employees

Licenses unsupervised-learning AI for autonomous driving

Compensation Overview

$150k - $250k/yr

+ Equity + Bonus/Commission

Remote in USA


Category
AI & Machine Learning
Required Skills
Python
Neural Networks
Machine Learning
Requirements
  • Proficiency in Python
  • Proven ability to thrive in a fast-paced environment
  • Ability to communicate complex technical concepts to colleagues and a variety of audiences
  • A sense of practical optimism
  • Introspection, thoughtfulness, and detail-orientation
Responsibilities
  • Characterize neural network quality, failure modes, and edge cases based on research data
  • Maintain awareness of current trends in relevant areas of research and technology
  • Coordinate with researchers and accurately convey status of experiments
  • Manage a large number of concurrent experiments and make accurate time estimates with respect to deadlines
  • Review experimental results and suggest theoretical or process improvements for future iterations
  • Write technical reports indicating qualitative and quantitative results to external parties
Desired Qualifications
  • Master’s or Ph.D. in a related field and/or 5+ years of experience in a directly related field
  • Experience reading research papers and implementing the techniques therein
  • Computer vision and deep learning experience

Helm.ai provides AI software for autonomous driving by licensing its unsupervised learning-based training technology to other companies, including carmakers and teams in aviation, robotics, manufacturing, and retail. Its product works by using unsupervised learning to train driving AI on unlabeled data, offering a cheaper and scalable alternative to traditional data-labeling approaches. Customers license the technology and may pay upfront licensing fees plus ongoing royalties based on usage. The company differentiates itself by focusing on unsupervised learning to reduce data labeling costs and speed up development, and by targeting multiple industries beyond just cars. Helm.ai’s goal is to become a major provider of scalable, cost-efficient AI training for autonomous systems and related fields, helping speed the adoption of autonomous technology in a trillion-dollar market.

Company Size

51-200

Company Stage

Late Stage VC

Total Funding

$145M

Headquarters

Menlo Park, California

Founded

2016

Simplify Jobs

Simplify's Take

What believers are saying

  • Honda partnership targets 2027 mass production of ADAS in EVs.
  • KPIT's $10M investment expands into SDV tier-1 suppliers.
  • Zero-shot deployment in Torrance accelerates global OEM adoption.

What critics are saying

  • Tesla FSD 12.5 erodes data-efficient licensing by Q1 2026.
  • Waymo's 2026 robotaxi expansion diverts Honda investments.
  • NHTSA mandates LiDAR post-15 crashes, invalidating vision-only by June 2026.

What makes Helm.ai unique

  • Helm.ai's Factored Embodied AI splits perception and policy for interpretability.
  • Deep Teaching achieves urban autonomy with only 1,000 hours of data.
  • Vision-only stack eliminates LiDAR and HD maps for cost reduction.


Benefits

Health Insurance

401(k) Retirement Plan

401(k) Company Match

Flexible Work Hours

Growth & Insights and Company News

Headcount

6 month growth: -3%

1 year growth: -3%

2 year growth: -2%

Business Wire
Feb 25th, 2026
Helm.ai achieves vision-only urban autonomy, scaling from Level 2+ to Level 4 with 1,000 hours of data

Helm.ai has announced a major expansion of its Helm.ai Driver software, a vision-only autonomous driving stack that scales from Level 2+ systems through Level 4 urban autonomy without requiring lidar sensors or high-definition maps. The company released a demonstration video showing the system navigating Redwood City, California, handling turns, traffic lights and dynamic interactions. The system uses Helm.ai's proprietary Factored Embodied AI architecture, which splits autonomy into separate Perception and Policy layers for improved interpretability and safety certification. Using its Deep Teaching technology and semantic simulation, the company achieved urban capability with only 1,000 hours of real-world driving data, compared to millions of miles typically required. Helm.ai recently demonstrated zero-shot deployment in Torrance, California, enabling the system to operate in new environments without prior training or manual tuning.

The Robot Report
Dec 11th, 2025
Helm.ai releases new architectural framework for autonomous vehicles

Typically, developers in the autonomous driving industry create massive black-box, end-to-end models for autonomy that require petabytes of data to learn driving physics from scratch. Helm.ai today unveiled its Factored Embodied AI architectural framework, which it says offers a different approach. Alongside the framework, the company released a benchmark demonstration of its vision-only AI Driver steering the streets of Torrance, CA, with zero-shot success, without ever having seen those specific streets before. This included handling lane keeping, lane changes, and turns at urban intersections. Helm.ai said it achieved this autonomous steering capability by training the AI using simulation and only 1,000 hours of real-world driving data.

"The autonomous driving industry is hitting a point of diminishing returns. As models get better, the data required to improve them becomes exponentially rarer and more expensive to collect," said Vladislav Voroninski, CEO and Founder of Helm.ai. "We are breaking this 'Data Wall' by factoring the driving task. Instead of trying to learn physics from raw, noisy pixels, our Geometric Reasoning Engine extracts the clean 3D structure of the world first. This allows us to train the vehicle's decision-making logic in simulation with unprecedented efficiency, mimicking how a human teenager learns to drive in weeks rather than years."

Helm.ai said the architecture enables automakers to deploy ADAS through L4 capabilities using their existing development fleets, bypassing the prohibitive data barrier to entry. "We are moving from the era of brute force data collection to the era of Data Efficiency," added Voroninski. "Whether on a highway in LA or a haul road in a mine, the laws of geometry remain constant. Our architecture solves this universal geometry once, allowing us to deploy autonomy everywhere."

The company said its new architecture offers several key technological advancements:
  • It bridges the simulator gap. Helm.ai's architecture trains in "semantic space," a simplified view of the world that focuses on geometry and logic rather than graphics. By simulating the structure of the road rather than just the pixels, Helm.ai can train on unlimited simulated data that works immediately in the real world.
  • Leveraging this geometric simulation, Helm.ai's planner achieved robust, zero-shot urban autonomous steering using only 1,000 hours of real-world fine-tuning data, offering a capital-efficient path to fully autonomous driving.
  • To tackle acceleration, braking, and complex interactions, Helm.ai is leveraging its world model capabilities to predict the intent of pedestrians and other vehicles.
  • To validate the robustness of its perception layer, Helm.ai deployed its automotive software into an open-pit mine, where the system correctly identified drivable surfaces and obstacles with extreme data efficiency. This, Helm.ai said, proves the architecture can adapt to any robotics environment, not just roads.

Helm.ai is working with Honda on mass-producing consumer AVs. Founded in 2016, Helm.ai develops AI software for L2/L3 ADAS, L4 autonomous driving, and robotics automation. In August, the company partnered with Honda Motor Co., Ltd.; the companies plan to work together to develop Honda's self-driving capabilities, including its Navigate on Autopilot (NOA) platform. The partnership centers on ADAS for production consumer cars, using Helm.ai's full-stack real-time AI software and large-scale autolabeling and generative simulation foundation models for development and validation. In October, Honda made an additional investment in Helm.ai.

Honda isn't the only major automaker trying to put autonomous driving capabilities into consumer vehicles. In October, General Motors Co. announced plans to bring "eyes-off" driving to market, using technology originally developed at Cruise, a now-shut-down robotaxi developer. Tesla has long been a frontrunner in personal vehicle autonomy: its "Full Self-Driving" (FSD) software first came to the streets in 2020, and while the technology has matured since then, it still requires a human driver to pay attention to the road and be ready to take over at all times.

Surperformance
Oct 15th, 2025
Helm.AI secures funding from Honda

Helm.AI Inc. announced it received funding on October 15, 2025, with participation from returning investor Honda Motor Co., Ltd. The company issued convertible preferred stock as part of the transaction.

Undercode News
Oct 15th, 2025
Honda Invests in Helm AI for Autonomy

Honda has announced an additional investment in U.S.-based Helm AI to advance its autonomous driving technology. This marks Honda's fourth funding round for Helm AI, aiming to integrate AI-driven systems into its EV and hybrid models by 2027. The partnership, which began in 2019, focuses on developing end-to-end autonomous systems. Honda's strategy is to transition from traditional engineering to intelligent mobility, positioning itself as a leader in AI-driven automotive innovation.

AInvest
Aug 25th, 2025
Honda and Helm.ai's Strategic Alliance: A Catalyst for Scalable, Cost-Effective Autonomous Driving Innovation

- Honda partners with Helm.ai to advance AI-driven autonomous driving via Deep Teaching technology, targeting 2027 mass production.

INACTIVE