Full-Time
Posted on 10/31/2025
Enterprise AI platform with hardware stack
$200k - $235k/yr
Palo Alto, CA, USA
In Person
Company Size: 201-500
Company Stage: Series E
Total Funding: $1.5B
Headquarters: Palo Alto, California
Founded: 2017
Flexible PTO in US
Parental Leave
Benefits (medical, dental and vision)
Flexible Spending Accounts
401k/Pensions
Gym Access
Flexible Working Hours
US antitrust authorities have completed their review of Intel's investment in SambaNova, a chip startup chaired by Intel CEO Lip-Bu Tan, according to a regulatory notice issued on Friday. Intel invested $35 million in SambaNova in February, increasing its stake to 8.2% from 6.8%. The company plans to invest an additional $15 million in the startup, Reuters previously reported. The clearance removes regulatory obstacles to Intel's continued backing of the AI chip company.
SambaNova and TEPCO Systems partner to deliver energy-efficient AI infrastructure to Japan's power sector.

SAN JOSE, Calif. - April 21, 2026 - SambaNova, a leader in next-generation AI infrastructure, announces that TEPCO Systems Corporation ("TEPCO Systems"), the digital transformation arm of Tokyo Electric Power Company Holdings, Incorporated ("TEPCO Group"), has signed a distributor agreement to bring SambaNova's energy-efficient, high-performance AI infrastructure to enterprises across Japan.

Under the agreement, TEPCO Systems will also deploy SambaNova's AI infrastructure as the foundation for the TEPCO Group's next-generation AI system platform, powering mission-critical applications that demand performance, security, and efficiency at scale. TEPCO Systems is adopting SambaNova's AI infrastructure - which delivers outstanding power efficiency and inference performance - to build new AI data center capabilities, while also offering these services to customers beyond the TEPCO Group. This collaboration will help accelerate Japan's AI-driven digital transformation by enabling organizations to run advanced AI workloads with reduced energy consumption and a lower total cost of ownership.

SambaNova's systems have already been selected for deployment in projects such as NEDO's (New Energy and Industrial Technology Development Organization) "Post-5G Information and Communication System Infrastructure Enhancement R&D Project (Advanced Computing Resources) / Development of Post-5G Information and Communication Systems / R&D on the Utilization of Diverse AI Semiconductors and High-Efficiency Computing Resources (JPNP2501)," further underscoring the platform's suitability for large-scale, compute-intensive environments.
TEPCO Systems: building next-generation AI data centers

"SambaNova's AI infrastructure enables accurate, high-speed inference using highly confidential internal data in a secure environment, while also offering excellent power efficiency," said Haruki Mino, President at TEPCO Systems Corporation. "For this reason, we are evaluating SambaNova as the platform for our next-generation AI data centers."

"Working together with SambaNova, TEPCO Systems will provide energy-efficient, high-performance modular AI systems and services centered on SambaNova's technology," added Mino. "This will help advance the sophistication of the electric power business through AI and accelerate our digital transformation, while also supporting green transformation initiatives and expanding our external AI data center business for customers outside the TEPCO Group."

Meeting Japan's demand for secure, efficient AI at scale

"As agentic AI moves from proof-of-concept into full-scale deployment, customers are seeking dramatically higher inference performance without compromising power efficiency or security," said Toshinori Kujiraoka, Vice President, Asia Pacific, SambaNova. "Through this distributor agreement, TEPCO Systems can now deliver SambaNova's next-generation AI infrastructure to more enterprises across Japan and support the construction of highly reliable AI systems that meet the stringent requirements of mission-critical environments."

"Together, we will build sustainable AI data centers for the TEPCO Group that combine high performance with low power consumption, accelerating DX initiatives while helping organizations reduce their energy footprint," Kujiraoka continued.

A new blueprint for large-scale AI in energy

"TEPCO Systems is at the forefront of AI transformation in the energy sector, and it is a great honor to deepen our collaboration through this agreement," said Rodrigo Liang, Co-founder and CEO, SambaNova.
"By combining TEPCO Systems' expertise in operating large-scale, mission-critical infrastructure with SambaNova's AI platform, we've created the blueprint for how utilities and critical infrastructure operators can deploy AI responsibly and sustainably."

About TEPCO Systems Corporation

TEPCO Systems Corporation is a core member of the TEPCO Group responsible for driving digital transformation across the utility's operations, from power generation and grid management to customer services. Headquartered in Koto-ku, Tokyo, TEPCO Systems develops and operates large-scale IT and OT systems that support reliable, safe, and efficient energy delivery in Japan.

About SambaNova

SambaNova is a leader in next-generation AI infrastructure, providing a full-stack platform that delivers the fastest and most efficient AI inference for enterprises, NeoClouds, AI research labs, service providers, and sovereign AI initiatives worldwide. Founded in 2017 and headquartered in San Jose, California, SambaNova offers chips, systems, and cloud services that enable customers to deploy state-of-the-art models with superior performance, lower total cost of ownership, and faster time to value. For more information, visit sambanova.ai or contact [email protected].

Contacts

Virginia Jamieson, Head of Communications, SambaNova
[email protected]
SambaNova and Intel have announced a heterogeneous AI inference system combining GPUs for prefill, SambaNova RDUs for decode, and Intel Xeon 6 processors for agentic tools. The solution targets enterprise agentic AI applications and will be available in the second half of 2026. The architecture addresses production agentic AI workloads by using GPUs for parallel processing, SambaNova's SN50 RDU for high-throughput token generation, and Xeon 6 processors for task coordination and code execution. According to SambaNova, Xeon 6 delivers over 50% faster compilation times compared to Arm-based server CPUs. The system is designed for standard air-cooled data centres, enabling enterprises and sovereign AI programmes to run production-scale AI whilst maintaining data residency and security requirements without building new facilities.
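The split described above is a disaggregated-serving pattern: prompt prefill is compute-bound and suits massively parallel accelerators, token-by-token decode is bandwidth-bound and suits high-throughput inference chips, and agentic tool steps (compilation, code execution, orchestration) run on general-purpose CPUs. A minimal sketch of that routing idea, with device-pool names and the routing policy as illustrative assumptions rather than SambaNova's actual scheduler:

```python
# Illustrative sketch of heterogeneous inference routing: each request phase
# is sent to the hardware pool suited to its bottleneck. Pool names are
# hypothetical placeholders for the GPU, RDU, and Xeon tiers in the article.
from enum import Enum, auto

class Phase(Enum):
    PREFILL = auto()  # compute-bound: process the whole prompt in parallel
    DECODE = auto()   # bandwidth-bound: generate output tokens sequentially
    TOOL = auto()     # agentic step: task coordination and code execution

# Assumed routing table mapping each phase to a device pool.
ROUTE = {
    Phase.PREFILL: "gpu-pool",
    Phase.DECODE: "rdu-pool",
    Phase.TOOL: "cpu-pool",
}

def route(phase: Phase) -> str:
    """Return the device pool that should handle this phase of a request."""
    return ROUTE[phase]
```

The point of the pattern is that no single device class is best at all three phases, so a scheduler like this lets each pool run near its strength instead of idling on mismatched work.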
Intel plans to invest an additional $15 million in SambaNova, an AI hardware startup chaired by Intel CEO Lip-Bu Tan, increasing its stake to 9%, according to documents. The investment follows a $35 million commitment in February, bringing Intel's total backing to $50 million. SambaNova develops neural-network accelerator platforms competing with Nvidia and Graphcore. Intel's increased stake aims to integrate its CPU and FPGA technologies with SambaNova's AI inference engines, strengthening its position in the high-performance computing market. The deepening partnership reflects Intel's strategy to diversify its AI portfolio beyond traditional silicon and offer end-to-end AI solutions from data centres to edge deployment. The move could influence supply-chain dynamics and pricing strategies as competition in AI hardware intensifies.
SambaNova has released research showing mounting public concern over AI data centres' energy consumption. A survey of 2,525 US and UK adults found 75% fear AI data centres could increase household energy bills, whilst 83% believe AI companies should prioritise energy efficiency over rapid capability rollouts. The findings reveal 71% worry AI data centres will strain national power grids, and 91% say sovereign AI systems are important. This follows SambaNova's 2024 survey showing 49.8% of business leaders were concerned about AI energy challenges, yet only 13% monitored power consumption. SambaNova CEO Rodrigo Liang said the company's new SN50 chip delivers up to five times more compute per accelerator and three times better inference efficiency than GPU-based systems, whilst operating within standard 20kW per rack power envelopes.