Full-Time

SOC Architect – Principal

Posted on 6/27/2024

d-Matrix

201-500 employees

AI compute platform for datacenters

Enterprise Software
AI & Machine Learning

Expert

Santa Clara, CA, USA

Hybrid position requiring onsite presence in Santa Clara, CA for 3 days per week.

Category
Applied Machine Learning, Deep Learning, AI & Machine Learning

Required Skills
Python, C/C++

Requirements
  • MS in Computer Science, Electrical and Computer Engineering, or a related scientific discipline with 10+ years of relevant experience in compute architecture and system design.
  • A problem solver and self-starter, able to multi-task effectively and simplify complex problems to arrive at elegant, efficient solutions (no compromise here!).
  • 10 or more years of demonstrated experience leading architecture or design development of SoCs.
  • Expert knowledge of AI SoC Architecture, requirements, and definition.
  • Able to identify potential architectural issues and design flaws early in the design phase and provide effective solutions and recommendations.
  • Experience with CPU, GPU, ASIC and ML accelerator architecture exploration and architectural modeling.
  • High proficiency in performance modeling, ranging from simple analytical models to complex cycle-accurate performance models and correlation.
  • Deep, broad, and current knowledge of machine learning and modern deep learning is required.
  • Hands-on experience with CNN, RNN, and Transformer neural network architectures.
  • Experience with specialized HW accelerator systems for deep neural networks is preferred.
  • PhD in Computer Science, Electrical and Computer Engineering, or a related scientific discipline with 10+ years of relevant experience in compute architecture and system design.
  • High proficiency in algorithm analysis, data structures, and Python programming is required. Proficiency with C/C++ programming is preferred.
  • Passionate about AI and thriving in a fast-paced and dynamic startup culture.
Responsibilities
  • Lead the design and delivery of SoC requirements and architecture that delivers exceptional customer experiences.
  • Analyze AI workloads such as transformers and CNNs across existing and emerging product categories to identify bottlenecks and opportunities for improvement.
  • Identify areas for improvement and drive the architecture to deliver the product. Design, implement, and evaluate efficient deep neural network architectures and algorithms for d-Matrix's AI compute engine.
  • Conduct research to guide hardware/software co-design of the SIMD engine, control plane, data reshape engine, data converter blocks, NoC substrate, and many other blocks.
  • Work on various phases of our flagship product (Corsair I and II), ranging from performance projection and performance modeling to performance correlation and performance debug (a first-order sketch of this kind of projection appears after this list).
  • Develop and maintain tools for high-level simulation and research to enable architectural exploration of current and future generation products.
  • Engage and collaborate with SW team to meet stack development milestones.
  • Port customer workloads, optimize them for deployment, generate reference implementations and evaluate performance.
  • Report and present progress in a timely and effective manner.
  • Contribute to paper publications and intellectual property.
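
At the simplest end of the performance-modeling spectrum named in the requirements, a first-order projection can be a roofline-style estimate. The Python sketch below is purely illustrative, not a d-Matrix tool or methodology: the hardware numbers are placeholder assumptions, and the model takes the larger of a GEMM's compute-bound and bandwidth-bound times as a lower-bound latency.

    from dataclasses import dataclass

    @dataclass
    class Accelerator:
        peak_tops: float     # peak compute throughput, tera-ops/second (placeholder)
        mem_bw_gbps: float   # sustained memory bandwidth, GB/s (placeholder)

    def gemm_latency_us(m: int, n: int, k: int, bytes_per_elem: int, hw: Accelerator) -> float:
        """Roofline-style lower bound, in microseconds, for an MxK @ KxN GEMM."""
        ops = 2.0 * m * n * k                                # one multiply-accumulate = 2 ops
        traffic = (m * k + k * n + m * n) * bytes_per_elem   # read A and B, write C (no reuse modeled)
        compute_s = ops / (hw.peak_tops * 1e12)              # time if purely compute-bound
        memory_s = traffic / (hw.mem_bw_gbps * 1e9)          # time if purely bandwidth-bound
        return max(compute_s, memory_s) * 1e6                # roofline: the slower bound dominates

    # Hypothetical projection for one transformer projection layer at INT8 (1 byte/element).
    hw = Accelerator(peak_tops=400.0, mem_bw_gbps=1000.0)
    print(f"estimated latency: {gemm_latency_us(4096, 4096, 4096, 1, hw):.1f} us")

A real projection would layer in on-chip data reuse and tiling and eventually move to cycle-accurate simulation, which is where the correlation and debug work called out above comes in.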

d-Matrix focuses on improving the efficiency of AI computing for large datacenter customers. Its main product is the digital in-memory compute (DIMC) engine, which integrates compute capabilities directly into programmable memory. This design helps reduce power consumption and improves data-processing speed while preserving accuracy. d-Matrix differentiates itself from competitors with a modular, scalable approach built on low-power chiplets that can be tailored to different applications. The company's goal is to provide high-performance, energy-efficient AI inference solutions to large-scale datacenter operators.

Company Stage

Series B

Total Funding

$149.8M

Headquarters

Santa Clara, California

Founded

2019

Growth & Insights

Headcount

6 month growth: 2%
1 year growth: 0%
2 year growth: 9%

Simplify's Take

What believers are saying

  • Growing demand for energy-efficient AI solutions boosts d-Matrix's low-power chiplets appeal.
  • Partnerships with companies like Microsoft could lead to strategic alliances.
  • Increasing adoption of modular AI hardware in data centers benefits d-Matrix's offerings.

What critics are saying

  • Competition from Nvidia, AMD, and Intel may pressure d-Matrix's market share.
  • Complex AI chip design could lead to delays or increased production costs.
  • Rapid AI innovation may render d-Matrix's technology obsolete if not updated.

What makes d-Matrix unique

  • d-Matrix's DIMC engine integrates compute into memory, enhancing efficiency and accuracy.
  • The company offers scalable AI solutions through modular, low-power chiplets.
  • d-Matrix focuses on brain-inspired AI compute engines for diverse inferencing workloads.

Benefits

Hybrid Work Options
