Full-Time

Lead Architect

Runtime

Posted on 10/31/2025

SambaNova Systems

201-500 employees

Enterprise AI platform with hardware stack

Compensation Overview

$200k - $250k/yr

Palo Alto, CA, USA

In Person

Category
Software Engineering
Required Skills
CUDA
Git
JIRA
Assembly
Jenkins
C/C++
Linux/Unix
Requirements
  • Strong Software Engineering Background: Proven experience building, testing, and tuning software for distributed, high-performance systems. In-depth knowledge of operating systems and runtime stacks.
  • Real-Time Operating Systems (RTOS): Hands-on experience with RTOS and system-level software that directly interfaces with hardware.
  • High-Performance Computing (HPC): Expertise in designing and optimizing systems that handle massive parallel workloads, including machine learning training and inference tasks that involve billions of operations per second.
  • Low-Level System Understanding: Deep understanding of hardware-software interaction, including registers, device memory management, and the intricacies of accelerator design. Experience working with ASIC accelerators is highly desirable.
  • Distributed Systems Expertise: Familiarity with distributed systems architecture, including networking, communication protocols, and the challenges of scaling compute resources efficiently.
  • Toolchain Expertise: Hands-on experience with software development tools such as Git, Jenkins, and Jira, with an ability to drive automation and continuous integration efforts.
  • Cross-Disciplinary Knowledge: Ability to work at the intersection of hardware and software, designing systems that optimize both performance and reliability.
Responsibilities
  • Architectural Leadership: Lead the design, development, and performance optimization of the software runtime stack, ensuring it meets the high-performance and scalability requirements of ML, AI, and HPC applications.
  • Runtime System Design: Architect embedded software infrastructure to enable smooth integration of high-level applications with the underlying hardware, including OS interface/integration, partitioned workload orchestration, fault management, and inter-node communication.
  • Hardware Interaction: Oversee and guide the low-level integration between software and hardware components, ensuring efficient chipset initialization, monitoring, and fault management.
  • Technical Strategy: Drive the technical direction for the Runtime Engineering team, ensuring the design and implementation of software that delivers performance and scales efficiently with our next-generation AI hardware and platforms based on our Reconfigurable Dataflow Architecture.
  • Tooling and Profiling: Lead the design and development of tools and performance profilers, empowering customers to configure, deploy, and optimize their workloads on SambaNova’s DataScale systems.
  • Mentorship and Team Development: Inspire and guide the team to continuously improve development processes, coding standards, and collaboration practices. Foster a culture of excellence, accountability, and technical growth.
  • Cross-functional Leadership: Collaborate with hardware, software, and product teams to define requirements and ensure seamless integration between hardware and system software components.
Desired Qualifications
  • ASIC/FPGA Expertise: Experience designing or working closely with custom hardware accelerators (ASICs, FPGAs, etc.) and understanding low-level interactions.
  • Cloud and Data Center Experience: Familiarity with deploying high-performance systems in distributed, cloud, or data center environments.

Company Size

201-500

Company Stage

Series E

Total Funding

$1.5B

Headquarters

Palo Alto, California

Founded

2017

Simplify Jobs

Simplify's Take

What believers are saying

  • Intel's $50M investment and multi-year collaboration expand go-to-market channels.
  • SoftBank deploys SN50 in Japan AI data centers for sovereign inference.
  • TEPCO Systems distributes infrastructure across Japan's power sector from April 2026.

What critics are saying

  • Nvidia B300 GPUs could undercut SN50 claims with 2.5x the throughput and CUDA lock-in by H2 2026.
  • Intel could commoditize RDUs in hybrid stacks, eroding SambaNova margins within 12 months.
  • SN50 production delays following the $350M raise could strand SoftBank clusters past H2 2026.

What makes SambaNova Systems unique

  • SambaNova's RDU chips use three-tier memory architecture for faster inference than GPUs.
  • SambaRack SN50 runs 10-trillion parameter models with 5x Nvidia Blackwell speed.
  • Full-stack platform integrates hardware, software, and cloud for agentic AI.

Benefits

Flexible PTO in US

Parental Leave

Benefits (medical, dental and vision)

Flexible Spending Accounts

401k/Pensions

Gym Access

Flexible Working Hours

Growth & Insights and Company News

Headcount

6 month growth: -2%

1 year growth: 0%

2 year growth: 1%

Northland News Radio
May 1st, 2026
Intel's $35M SambaNova investment clears US antitrust review

US antitrust authorities have completed their review of Intel's investment in SambaNova, a chip startup chaired by Intel CEO Lip-Bu Tan, according to a regulatory notice issued on Friday. Intel invested $35 million in SambaNova in February, increasing its stake to 8.2% from 6.8%. The company plans to invest an additional $15 million in the startup, Reuters previously reported. The clearance removes regulatory obstacles to Intel's continued backing of the AI chip company.

SambaNova
Apr 21st, 2026
SambaNova and TEPCO Systems partner to deliver energy-efficient AI infrastructure to Japan's power sector.

SAN JOSE, Calif. - April 21, 2026 - SambaNova, a leader in next-generation AI infrastructure, announces that TEPCO Systems Corporation ("TEPCO Systems"), the digital transformation arm of Tokyo Electric Power Company Holdings, Incorporated ("TEPCO Group"), has signed a distributor agreement to bring SambaNova's energy-efficient, high-performance AI infrastructure to enterprises across Japan.

Under the agreement, TEPCO Systems will also deploy SambaNova's AI infrastructure as the foundation for the TEPCO Group's next-generation AI system platform, powering mission-critical applications that demand performance, security, and efficiency at scale. TEPCO Systems is adopting SambaNova's AI infrastructure - which delivers outstanding power efficiency and inference performance - to build new AI data center capabilities, while also offering these services to customers beyond the TEPCO Group. This collaboration will help accelerate Japan's AI-driven digital transformation by enabling organizations to run advanced AI workloads with reduced energy consumption and a lower total cost of ownership.

SambaNova's systems have already been selected for deployment in projects such as NEDO's (New Energy and Industrial Technology Development Organization) "Post-5G Information and Communication System Infrastructure Enhancement R&D Project (Advanced Computing Resources) / Development of Post-5G Information and Communication Systems / R&D on the Utilization of Diverse AI Semiconductors and High-Efficiency Computing Resources (JPNP2501)," further underscoring the platform's suitability for large-scale, compute-intensive environments.

TEPCO Systems: building next-generation AI data centers

"SambaNova's AI infrastructure enables accurate, high-speed inference using highly confidential internal data in a secure environment, while also offering excellent power efficiency," said Haruki Mino, President at TEPCO Systems Corporation. "For this reason, we are evaluating SambaNova as the platform for our next-generation AI data centers."

"Working together with SambaNova, TEPCO Systems will provide energy-efficient, high-performance modular AI systems and services centered on SambaNova's technology," added Mino. "This will help advance the sophistication of the electric power business through AI and accelerate our digital transformation, while also supporting green transformation initiatives and expanding our external AI data center business for customers outside the TEPCO Group."

Meeting Japan's demand for secure, efficient AI at scale

"As agentic AI moves from proof-of-concept into full-scale deployment, customers are seeking dramatically higher inference performance without compromising power efficiency or security," said Toshinori Kujiraoka, Vice President, Asia Pacific, SambaNova. "Through this distributor agreement, TEPCO Systems can now deliver SambaNova's next-generation AI infrastructure to more enterprises across Japan and support the construction of highly reliable AI systems that meet the stringent requirements of mission-critical environments."

"Together, we will build sustainable AI data centers for the TEPCO Group that combine high performance with low power consumption, accelerating DX initiatives while helping organizations reduce their energy footprint," Kujiraoka continued.

A new blueprint for large-scale AI in energy

"TEPCO Systems is at the forefront of AI transformation in the energy sector, and it is a great honor to deepen our collaboration through this agreement," said Rodrigo Liang, Co-founder and CEO, SambaNova. "By combining TEPCO Systems' expertise in operating large-scale, mission-critical infrastructure with SambaNova's AI platform, we've created the blueprint for how utilities and critical infrastructure operators can deploy AI responsibly and sustainably."

About TEPCO Systems Corporation

TEPCO Systems Corporation is a core member of the TEPCO Group responsible for driving digital transformation across the utility's operations, from power generation and grid management to customer services. Headquartered in Koto-ku, Tokyo, TEPCO Systems develops and operates large-scale IT and OT systems that support reliable, safe, and efficient energy delivery in Japan.

About SambaNova

SambaNova is a leader in next-generation AI infrastructure, providing a full-stack platform that delivers the fastest and most efficient AI inference for enterprises, NeoClouds, AI research labs, service providers, and sovereign AI initiatives worldwide. Founded in 2017 and headquartered in San Jose, California, SambaNova offers chips, systems, and cloud services that enable customers to deploy state-of-the-art models with superior performance, lower total cost of ownership, and faster time to value. For more information, visit sambanova.ai or contact [email protected].

Contacts

Virginia Jamieson, Head of Communications, SambaNova
[email protected]

The Associated Press
Apr 8th, 2026
SambaNova and Intel unveil hybrid AI inference blueprint combining GPUs, RDUs, and Xeon 6 CPUs for agentic workloads

SambaNova and Intel have announced a heterogeneous AI inference system combining GPUs for prefill, SambaNova RDUs for decode, and Intel Xeon 6 processors for agentic tools. The solution targets enterprise agentic AI applications and will be available in the second half of 2026. The architecture addresses production agentic AI workloads by using GPUs for parallel processing, SambaNova's SN50 RDU for high-throughput token generation, and Xeon 6 processors for task coordination and code execution. According to SambaNova, Xeon 6 delivers over 50% faster compilation times compared to Arm-based server CPUs. The system is designed for standard air-cooled data centres, enabling enterprises and sovereign AI programmes to run production-scale AI whilst maintaining data residency and security requirements without building new facilities.

NiftyGPT
Apr 2nd, 2026
Intel invests additional $15M in AI startup SambaNova, raising stake to 9%

Intel plans to invest an additional $15 million in SambaNova, an AI hardware startup chaired by Intel CEO Lip-Bu Tan, increasing its stake to 9%, according to documents. The investment follows a $35 million commitment in February, bringing Intel's total backing to $50 million. SambaNova develops neural-network accelerator platforms competing with Nvidia and Graphcore. Intel's increased stake aims to integrate its CPU and FPGA technologies with SambaNova's AI inference engines, strengthening its position in the high-performance computing market. The deepening partnership reflects Intel's strategy to diversify its AI portfolio beyond traditional silicon and offer end-to-end AI solutions from data centres to edge deployment. The move could influence supply-chain dynamics and pricing strategies as competition in AI hardware intensifies.

Business Wire
Mar 2nd, 2026
SambaNova survey: 75% of consumers fear AI data centres will raise household energy bills

SambaNova has released research showing mounting public concern over AI data centres' energy consumption. A survey of 2,525 US and UK adults found 75% fear AI data centres could increase household energy bills, whilst 83% believe AI companies should prioritise energy efficiency over rapid capability rollouts. The findings reveal 71% worry AI data centres will strain national power grids, and 91% say sovereign AI systems are important. This follows SambaNova's 2024 survey showing 49.8% of business leaders were concerned about AI energy challenges, yet only 13% monitored power consumption. SambaNova CEO Rodrigo Liang said the company's new SN50 chip delivers up to five times more compute per accelerator and three times better inference efficiency than GPU-based systems, whilst operating within standard 20kW per rack power envelopes.

INACTIVE