Full-Time

Senior Embedded Software Engineer

Firmware

Posted on 2/2/2026

Mythic

51-200 employees

Power-efficient edge AI processors for inference

Compensation Overview

$120k - $225k/yr

Austin, TX, USA

Hybrid

Category
Software Engineering
Required Skills
Python
Git
Assembly
C/C++
Requirements
  • Strong background in bare-metal and RTOS firmware development
  • 5+ years in embedded firmware, systems, or applications engineering on SoCs (experience with microcontrollers, DMA, secure boot, and serial peripherals)
  • 5+ years programming in C; 1+ year in C++
  • Experience with ARM or RISC-V assembly
  • 5+ years working with build tools (make, CMake, Bazel)
  • 1+ year scripting in Python
  • Familiarity with git or other version control systems
  • 1+ year of hands-on silicon bring-up experience
Responsibilities
  • Co-design next-generation compute accelerators with the hardware team
  • Develop real-time firmware enabling neural networks with the compiler team
  • Create firmware for boot, debug, and profiling support
  • Build Linux kernel modules for low-latency, high-throughput data transfer
  • Develop Linux libraries supporting inference frameworks
  • Implement Linux utilities for secure boot management
  • Optimize inference software for embedded platforms
  • Develop test firmware and utilities for manufacturing and customer deployments
Desired Qualifications
  • PCIe protocol or driver development experience
  • MIPI-CSI2 protocol or driver development experience
  • Experience writing customer-facing documentation
  • Background in SDK or BSP development (e.g., Yocto)
  • Understanding of parallel computing and/or compiler development
  • Experience with code quality tools, RTL simulators, or FPGA emulation
  • Knowledge of neural networks and machine learning
  • Experience with tile-based architectures
  • Experience working for semiconductor companies

Mythic designs power-efficient AI hardware for edge deployments. Its main products are the M1076 Analog Matrix Processor, a single-chip accelerator that delivers up to 25 TOPS, and the MP10304 Quad AMP PCIe card, which scales that performance for AI inference on edge devices and servers. The hardware uses analog matrix processing to perform AI computations at low energy usage, enabling high-performance inference in real-world environments without relying on cloud servers. Mythic targets customers that need strong AI performance within tight power budgets, such as tech companies and other businesses with heavy data-processing needs. Compared with typical AI accelerators, Mythic emphasizes analog computation and edge efficiency, offering a compact single-chip solution plus a PCIe expansion card to scale performance.

Company Size

51-200

Company Stage

Late Stage VC

Total Funding

$290.4M

Headquarters

Austin, Texas

Founded

2012

Simplify Jobs

Simplify's Take

What believers are saying

  • Data center power constraints drive adoption as AI electricity consumption approaches 10% of U.S. grid.
  • Strategic investors Honda Motor and Lockheed Martin validate APUs for automotive and defense applications.
  • Former Nvidia executive Taner Ozcelik leads commercialization as CEO, with $125M funding from DCVC and SoftBank.

What critics are saying

  • No named paying customers despite production silicon and $125M raise signals commercialization stall.
  • Nvidia Jetson Orin and Qualcomm Snapdragon X Elite capture edge OEM designs with mature software ecosystems.
  • Absence of MLPerf benchmarks enables competitors to dismiss 750x token efficiency claims as unvalidated.

What makes Mythic unique

  • Analog compute-in-memory architecture delivers 120 TOPS/watt, 100x more efficient than GPUs.
  • Integrated memBrain technology from Microchip enables ultra-low-power inference at edge and data center.
  • M1076 executes DNNs entirely on-chip with 80M weights, eliminating external DRAM dependency.

Benefits

Every day is casual Friday!

Commuter: Caltrain & MetroRail passes. Scooter, Bike, Public Transit and Parking Stipend

Dog-Friendly in Austin office

$500 annual education benefit for conferences, classes or purchase of books

HSA/FSA: Mythic contributes $750 for single and $2000 for family per year if you enroll in high deductible plan

We pay 100% of employee premiums and 70% of dependent premiums

Flexible PTO and WFH

$150 quarterly wellness benefit

Growth & Insights and Company News

Headcount

6 month growth

-1%

1 year growth

-5%

2 year growth

-2%
BISinfotech
Mar 18th, 2026
Mythic selects memBrain technology from Silicon Storage Technology for its next generation of ultra-low-power Analog Processing Units.

Mythic has chosen memBrain neuromorphic hardware intellectual property (IP) from Microchip Technology's Silicon Storage Technology (SST) subsidiary for its next-generation edge-to-enterprise Analog Processing Units (APUs). Mythic will utilize SST's SuperFlash embedded non-volatile memory (eNVM) bitcells to deliver high levels of analog compute-in-memory (aCIM) performance per watt. The partnership enables Mythic to achieve 120 TOPS/watt inference processing for power-efficient AI acceleration at the edge and in the data center: Mythic's APUs are targeted to be up to 100 times more energy-efficient than conventional digital graphics processing units (GPUs).

To date, 150 billion units of the SST SuperFlash technology that Mythic is licensing have been shipped. SuperFlash technology is the de facto eNVM solution for a broad spectrum of industries, including industrial, automotive, consumer, and computing, for critical data and code storage, and is licensed by all of the top ten semiconductor foundries worldwide.

"Mythic is pioneering innovative solutions in AI inference processing and AI sensor fusion for industrial, automotive and data center applications, effectively overcoming current AI power limitations," said Mark Reiten, vice president of Microchip's Edge AI business unit. "As the core memory technology for Mythic's next-generation products, memBrain delivers significant power efficiency and high performance for both edge and data center applications."
The memBrain cell features:
  • Up to 8 data bits per bitcell (8 bpc) storage
  • Single-digit nanoamp (nA) bitcell read current
  • 10-year data retention at operating temperature
  • 100,000 endurance cycles
  • Full state machine control of the 8 bpc multi-state write operation
  • Single-cycle multiply-and-accumulate operations for aCIM

"Mythic selected SST after an industry-wide search of eNVM technologies and determined the memBrain cell technology best enabled us to achieve the ultra-low-power and high performance required by our customers," said Dr. Taner Ozcelik, Mythic's chief executive officer. "Additionally, the wide foundry availability of its industry-proven SuperFlash technology, coupled with the outstanding support of the SST engineering team, has been invaluable during our product development cycle."

SST's memBrain technology has been developed and deployed in 40 nm and 28 nm foundry processes using production-ready SuperFlash memory; 22 nm memBrain development is planned to extend the technology roadmap. Designed to provide reliable, high-performance, low-power non-volatile storage directly on the chip, SuperFlash memory is widely used in applications that require fast access times, high endurance, and data retention without the need for external memory components.

Pricing and Availability

Customers interested in SST's memBrain solutions and SuperFlash technology should access the SST website or contact a regional SST sales executive for details. Those interested in Mythic's products should visit the Mythic website or contact Taner Ozcelik at [email protected].

Tech in Asia
Dec 18th, 2025
SoftBank-backed AI chip firm Mythic raises $125m to rival Nvidia

Mythic has raised US$125 million in a funding round led by DCVC to advance its AI chip technology and compete with Nvidia. Investors in the round include New Enterprise Associates, Atreides Management, SoftBank Group, Honda Motor, and Lockheed Martin. Mythic develops analog computers designed to process large AI data sets with lower power usage than traditional digital CPUs. The company appointed Taner Ozcelik, a former Nvidia executive, as CEO in 2024. Mythic plans to use the new capital to commercialize its products and expand its customer base.

Food for thought:
  • The company says its analog processing units (APUs) reach 120 trillion operations per second per watt, about 100x the energy efficiency of top GPUs, and can process up to 750x more tokens (the chunks of text these models read and generate) per second per watt than Nvidia's highest-end GPUs on large language models. The 750x figure comes from internal tests on production silicon validated by customers, with no public MLPerf or other third-party results for independent checks against Nvidia.
  • Software supports ONNX (Open Neural Network Exchange, an open model format) and NVIDIA TensorRT (an inference optimizer for Nvidia GPUs), plus mainstream frameworks.
  • The company cites validation by the U.S. Department of Defense, auto original equipment manufacturers (OEMs), and defense partners. Honda Motor and Lockheed Martin are strategic investors. The silicon is in production, and the company raised $125 million, yet it has not named paying customers or design wins.
  • Data centers face rising power limits as AI could use 10% of U.S. electricity by decade's end. That pressure prompts teams to look past GPU-heavy inference infrastructure (running trained models to generate outputs).
  • System integrators (IT services firms that design and deploy hardware/software stacks) with MLOps teams can offer model porting, optimization, and benchmarking on non-Nvidia accelerators such as Mythic's APUs.
  • Service teams can test whether analog compute or other inference chips cut operating costs and power use for select edge workloads (on-device or near where data is generated). Early skill in quantization (reducing numerical precision to improve efficiency) and retraining workflows (fine-tuning models after conversion) can set these firms apart.

Mythic
Dec 17th, 2025
Mythic to Challenge AI’s GPU Pantheon with 100x Energy Advantage and Oversubscribed $125M Raise

100x more energy-efficient than industry standard GPUs, Mythic’s analog processing units (APUs) promise a new era of accelerated computing across the AI hardware stack, at the data center and the edge. “Mythic will win based on this fundamental insight: energy efficiency will define the future of AI computing everywhere.”  — CEO Taner Ozcelik PALO ALTO…


Semiconductor Engineering
Aug 13th, 2021
Mythic AI launches M1108 AMP

Mythic launched the first generation, the M1108 AMP, in November 2020, and its M1076 Mythic AMP in June 2021.

INACTIVE