Full-Time

FPGA Design Engineer

Staff

Confirmed live in the last 24 hours

d-Matrix

51-200 employees

AI compute platform for datacenters

Hardware
AI & Machine Learning

Senior

Santa Clara, CA, USA

Hybrid position requiring onsite presence in Santa Clara, CA for 3 days per week.

Category
Embedded Systems Engineering
Electrical Engineering
Hardware Engineering
Required Skills
Verilog
Python
VHDL
FPGA
Requirements
  • Bachelor's degree in Electrical Engineering, Computer Engineering, or a related field (Master's degree preferred) and a minimum of 5 years of experience in FPGA design and verification
  • Expertise in hardware design using Hardware Description Languages (HDLs) like Verilog or VHDL
  • Familiarity with RISC-V architecture and embedded systems development
  • Understanding of hardware-software integration concepts
  • Experience with scripting languages like Python for test automation
  • Strong analytical and problem-solving skills
  • Excellent communication, collaboration, and teamwork abilities
  • Thrive in dynamic environments where innovative problem-solving is key
  • Experience with industry-standard management protocols (MCTP, PLDM, SPDM)
  • Experience with platform BMC (Baseboard Management Controller)
  • Knowledge of power management techniques (PMBus)
  • Knowledge of hardware security and secure boot concepts.
  • Experience with cloud server architectures and concepts
Responsibilities
  • Design and verify FPGA-based solutions for d-Matrix AI inference accelerator management
  • Define FPGA microarchitecture specifications and collaborate with stakeholders to ensure alignment with project requirements.
  • Develop a resilient dual-boot architecture for multi-core, multi-chiplet booting
  • Design and implement hardware and software modules for platform power management, health monitoring, and telemetry data acquisition.
  • Interface with the host server BMC through an SMBus mailbox with management protocol overlays such as MCTP, PLDM, and SPDM
  • Integrate RISC-V CPU cores and related firmware into FPGA designs.
  • Develop an eFuse controller within the FPGA
  • Design and integrate a secure boot solution adhering to NIST standards within the FPGA to enable secure booting of d-Matrix accelerator chiplets
  • Collaborate with cross-functional teams to ensure seamless hardware-software integration and support inference accelerator hardware bring-up and troubleshooting.
  • Author Python scripts for hardware testing and automation
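
As a point of reference for the Python test-automation responsibility above, here is a minimal sketch of the kind of telemetry check such a script might run: reading PMBus values from a power controller over SMBus and flagging an over-temperature condition. It assumes a Linux host with the smbus2 package installed, and the bus number, device address, and temperature limit are hypothetical placeholders rather than actual d-Matrix values.

```python
"""Illustrative telemetry check over SMBus/PMBus.

Bus number, device address, and the temperature limit are hypothetical
placeholders; they do not describe actual d-Matrix hardware.
"""
from smbus2 import SMBus

# Standard PMBus command codes (all return LINEAR11-encoded words).
READ_VIN = 0x88
READ_IOUT = 0x8C
READ_TEMPERATURE_1 = 0x8D

I2C_BUS = 1          # hypothetical host I2C/SMBus bus number
PMBUS_ADDR = 0x40    # hypothetical 7-bit address of a rail controller


def decode_linear11(raw: int) -> float:
    """Decode a PMBus LINEAR11 word: 5-bit exponent, 11-bit mantissa."""
    exponent = (raw >> 11) & 0x1F
    mantissa = raw & 0x7FF
    if exponent > 0x0F:      # sign-extend the 5-bit two's-complement exponent
        exponent -= 0x20
    if mantissa > 0x3FF:     # sign-extend the 11-bit two's-complement mantissa
        mantissa -= 0x800
    return mantissa * 2.0 ** exponent


def read_linear11(bus: SMBus, command: int) -> float:
    """Read one PMBus data word (SMBus word read, LSB first) and decode it."""
    return decode_linear11(bus.read_word_data(PMBUS_ADDR, command))


def check_telemetry(temp_limit_c: float = 85.0) -> bool:
    """Poll input voltage, output current, and temperature; flag over-temp."""
    with SMBus(I2C_BUS) as bus:
        vin = read_linear11(bus, READ_VIN)
        iout = read_linear11(bus, READ_IOUT)
        temp = read_linear11(bus, READ_TEMPERATURE_1)
    print(f"VIN={vin:.3f} V  IOUT={iout:.3f} A  TEMP={temp:.1f} C")
    return temp <= temp_limit_c


if __name__ == "__main__":
    raise SystemExit(0 if check_telemetry() else 1)
```

In practice, a check like this would typically be wrapped in a test framework (for example, pytest) and driven against the platform's actual register and telemetry map.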

d-Matrix focuses on improving the efficiency of AI computing for large datacenter customers. Its main product is the digital in-memory compute (DIMC) engine, which combines computing capabilities directly with programmable memory. This design helps reduce power consumption and enhances data processing speed while ensuring accuracy. d-Matrix differentiates itself from competitors by offering a modular and scalable approach, utilizing low-power chiplets that can be tailored for different applications. The company's goal is to provide high-performance, energy-efficient AI inference solutions to large-scale datacenter operators.

Company Stage

Series B

Total Funding

$161.5M

Headquarters

Santa Clara, California

Founded

2019

Growth & Insights

Headcount
  • 6 month growth: -11%
  • 1 year growth: 3%
  • 2 year growth: 243%

Simplify's Take

What believers are saying

  • Securing $110 million in Series B funding positions d-Matrix for rapid growth and technological advancements.
  • Their Jayhawk II silicon aims to solve critical issues in AI inference, such as cost, latency, and throughput, making generative AI more commercially viable.
  • The company's focus on efficient AI inference could attract significant interest from data centers and enterprises looking to deploy large language models.

What critics are saying

  • Competing against industry giants like Nvidia poses a significant challenge in terms of market penetration and customer acquisition.
  • The high dependency on continuous innovation and technological advancements could strain resources and lead to potential setbacks.

What makes d-Matrix unique

  • d-Matrix focuses on developing AI hardware specifically optimized for Transformer models, unlike general-purpose AI chip providers like Nvidia.
  • Their digital in-memory compute (DIMC) architecture with chiplet interconnect is a first-of-its-kind innovation, setting them apart in the AI hardware market.
  • Backed by major investors like Microsoft, d-Matrix has the financial support to challenge established players like Nvidia.
