Full-Time
Posted on 5/8/2025
Licenses unsupervised-learning AI for autonomous driving
No salary listed
Remote in USA
Helm.ai provides AI software for autonomous driving, licensing its unsupervised-learning-based training technology to carmakers as well as teams in aviation, robotics, manufacturing, and retail. Its technology trains driving AI on unlabeled data, offering a cheaper, more scalable alternative to traditional data-labeling approaches. Customers license the technology and may pay upfront licensing fees plus ongoing royalties based on usage. The company differentiates itself by focusing on unsupervised learning to cut data-labeling costs and speed development, and by targeting multiple industries beyond automotive. Helm.ai's goal is to become a major provider of scalable, cost-efficient AI training for autonomous systems and related fields, helping accelerate the adoption of autonomous technology in a trillion-dollar market.
Company Size
51-200
Company Stage
Late Stage VC
Total Funding
$145M
Headquarters
Menlo Park, California
Founded
2016
Health Insurance
401(k) Retirement Plan
401(k) Company Match
Flexible Work Hours
Helm.ai has announced a major expansion of its Helm.ai Driver software, a vision-only autonomous driving stack that scales from Level 2+ systems through Level 4 urban autonomy without requiring lidar sensors or high-definition maps. The company released a demonstration video showing the system navigating Redwood City, California, handling turns, traffic lights and dynamic interactions. The system uses Helm.ai's proprietary Factored Embodied AI architecture, which splits autonomy into separate Perception and Policy layers for improved interpretability and safety certification. Using its Deep Teaching technology and semantic simulation, the company achieved urban capability with only 1,000 hours of real-world driving data, compared to millions of miles typically required. Helm.ai recently demonstrated zero-shot deployment in Torrance, California, enabling the system to operate in new environments without prior training or manual tuning.
Helm.ai releases new architectural framework for autonomous vehicles.

Typically, developers in the autonomous driving industry build massive black-box, end-to-end models that require petabytes of data to learn driving physics from scratch. Helm.ai today unveiled its Factored Embodied AI architectural framework, which it says offers a different approach.

Alongside the framework, the company released a benchmark demonstration of its vision-only AI Driver navigating the streets of Torrance, CA, zero-shot, without ever having seen those specific streets before. This included lane keeping, lane changes, and turns at urban intersections. Helm.ai said it achieved this autonomous steering capability by training the AI in simulation plus only 1,000 hours of real-world driving data.

"The autonomous driving industry is hitting a point of diminishing returns. As models get better, the data required to improve them becomes exponentially rarer and more expensive to collect," said Vladislav Voroninski, CEO and Founder of Helm.ai. "We are breaking this 'Data Wall' by factoring the driving task. Instead of trying to learn physics from raw, noisy pixels, our Geometric Reasoning Engine extracts the clean 3D structure of the world first. This allows us to train the vehicle's decision-making logic in simulation with unprecedented efficiency, mimicking how a human teenager learns to drive in weeks rather than years."

Helm.ai said the architecture enables automakers to deploy ADAS through L4 capabilities using their existing development fleets, bypassing the prohibitive data barrier to entry.

"We are moving from the era of brute force data collection to the era of Data Efficiency," added Voroninski. "Whether on a highway in LA or a haul road in a mine, the laws of geometry remain constant. Our architecture solves this universal geometry once, allowing us to deploy autonomy everywhere."

Helm.ai said its new architecture can handle roads and more.
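The factoring described above, separate Perception and Policy layers, with the policy operating on a structured scene description rather than raw pixels, can be illustrated with a minimal sketch. All class names, fields, and control logic here are hypothetical stand-ins for illustration, not Helm.ai's actual interfaces:

```python
from dataclasses import dataclass
from typing import List

# Hypothetical scene representation: the Perception layer outputs a compact
# geometric/semantic description of the world instead of raw pixels, so the
# Policy layer can be trained and tested in simulation without rendering.
@dataclass
class SceneState:
    lane_offset_m: float      # lateral offset from lane center (meters)
    lead_distance_m: float    # distance to nearest obstacle ahead (meters)
    traffic_light_red: bool

@dataclass
class DrivingCommand:
    steering: float   # normalized [-1, 1]
    throttle: float   # normalized [0, 1]
    brake: float      # normalized [0, 1]

class PerceptionLayer:
    """Stand-in for a geometric reasoning stage: sensor frames in,
    structured scene state out. A real system would run vision models
    here; this stub assumes the frame is already a dict of measurements."""
    def perceive(self, frame: dict) -> SceneState:
        return SceneState(**frame)

class PolicyLayer:
    """Toy decision logic that consumes only SceneState, illustrating why
    a factored policy can be exercised against simulated geometry."""
    def act(self, state: SceneState) -> DrivingCommand:
        # Steer back toward lane center, proportionally to the offset.
        steering = max(-1.0, min(1.0, -0.5 * state.lane_offset_m))
        if state.traffic_light_red or state.lead_distance_m < 10.0:
            return DrivingCommand(steering, throttle=0.0, brake=1.0)
        return DrivingCommand(steering, throttle=0.3, brake=0.0)

def drive(frames: List[dict]) -> List[DrivingCommand]:
    """Run the factored pipeline over a sequence of frames."""
    perception, policy = PerceptionLayer(), PolicyLayer()
    return [policy.act(perception.perceive(f)) for f in frames]
```

Because the interface between the two layers is the structured `SceneState`, the policy can be trained on simulated states and swapped onto real perception output unchanged, which is the gist of the "train the decision-making logic in simulation" claim.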
The company said its new architecture offers several key technological advancements.

First, it bridges the simulator gap. Helm.ai's architecture trains in "semantic space," a simplified view of the world that focuses on geometry and logic rather than graphics. By simulating the structure of the road rather than its pixels, Helm.ai can train on effectively unlimited simulated data that works immediately in the real world.

Next, leveraging this geometric simulation, Helm.ai's planner achieved robust, zero-shot urban autonomous steering using only 1,000 hours of real-world fine-tuning data, offering a capital-efficient path to fully autonomous driving.

Additionally, to handle acceleration, braking, and complex interactions, Helm.ai is leveraging its world-model capabilities to predict the intent of pedestrians and other vehicles.

Finally, to validate the robustness of its perception layer, Helm.ai deployed its automotive software in an open-pit mine. With extreme data efficiency, the system correctly identified drivable surfaces and obstacles, which Helm.ai said proves the architecture can adapt to any robotics environment, not just roads.

Helm.ai is working with Honda on mass-producing consumer AVs.

Founded in 2016, Helm.ai develops AI software for L2/L3 ADAS, L4 autonomous driving, and robotics automation. In August, the company partnered with Honda Motor Co., Ltd. The companies plan to work together to develop Honda's self-driving capabilities, including its Navigate on Autopilot (NOA) platform. The partnership centers on ADAS for production consumer cars, using Helm.ai's full-stack real-time AI software and large-scale autolabeling and generative-simulation foundation models for development and validation. In October, Honda made an additional investment in Helm.ai.

Honda isn't the only major automaker trying to put autonomous driving capabilities into consumer vehicles. In October, General Motors Co. announced plans to bring "eyes-off" driving to market.
The company will be using technology originally developed at Cruise, a now-shuttered robotaxi developer. Tesla has long been a frontrunner in driver-assistance technology for personal vehicles. Its "Full Self-Driving" (FSD) software first reached public streets in 2020. While the technology has matured since then, it still requires a human driver to pay attention to the road and be ready to take over at all times.
Helm.AI Inc. announced it received funding on October 15, 2025, with participation from returning investor Honda Motor Co., Ltd. The company issued convertible preferred stock as part of the transaction.
Honda has announced an additional investment in U.S.-based Helm.ai to advance its autonomous driving technology. This marks Honda's fourth funding round for Helm.ai, aimed at integrating AI-driven systems into Honda's EV and hybrid models by 2027. The partnership, which began in 2019, focuses on developing end-to-end autonomous systems. Honda's strategy is to transition from traditional engineering to intelligent mobility, positioning itself as a leader in AI-driven automotive innovation.
- Honda partners with Helm.ai to advance AI-driven autonomous driving via Deep Teaching technology, targeting 2027 mass production.