HW/SW Co-design Engineer
Updated on 12/2/2023
OpenAI

501-1,000 employees

AI research and deployment company
Company Overview
OpenAI's mission is to ensure that artificial general intelligence (AGI), by which they mean highly autonomous systems that outperform humans at most economically valuable work, benefits all of humanity.
AI & Machine Learning
Social Impact
Venture Capital

Company Stage
Later Stage VC
Total Funding
$11.3B
Founded
2015
Headquarters
San Francisco, California

Growth & Insights
Headcount
6 month growth: -22%
1 year growth: 85%
2 year growth: 271%
Locations
San Francisco, CA, USA
Experience Level
Entry
Junior
Mid
Senior
Expert
Desired Skills
CUDA
Python
Categories
AI & Machine Learning
Hardware Engineering
Requirements
  • Deep understanding of GPU and/or other AI accelerators
  • Experience with CUDA or a related accelerator programming language
  • Experience driving ML accuracy with low precision formats (see the sketch after this list)
  • Ability to collaborate with ML engineers, kernel writers, and compiler developers
  • 3+ years of relevant industry experience
  • Familiarity with well-established HPC infrastructure
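The low-precision bullet above can be made concrete with a small sketch. The Python/NumPy snippet below is purely illustrative and not part of the posting; the update size and count are arbitrary. It shows why accumulation precision matters: an fp16 accumulator silently drops small updates once it grows, while an fp32 accumulator preserves them, which is the intuition behind mixed-precision recipes that keep a higher-precision master copy.

```python
import numpy as np

# Illustrative only: why accumulation precision matters with low-precision formats.
# Add 20,000 small updates of ~1e-3 into an accumulator.
updates = np.full(20_000, 1e-3, dtype=np.float16)

acc16 = np.float16(0.0)
for u in updates:
    # fp16 accumulation: once the accumulator grows past a few units,
    # each ~1e-3 update is smaller than half an ulp and rounds away.
    acc16 = np.float16(acc16 + u)

acc32 = np.float32(0.0)
for u in updates:
    # fp32 accumulation keeps every contribution.
    acc32 += np.float32(u)

print(f"fp16 accumulator: {float(acc16):.3f}")  # stalls well below 20
print(f"fp32 accumulator: {float(acc32):.3f}")  # close to the exact sum of ~20
```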
Responsibilities
  • Work with hardware vendors to co-design future hardware for programmability and performance
  • Assist hardware vendors in developing optimal kernels and add support for them in our compiler
  • Develop performance estimates for critical kernels for different hardware configurations (see the sketch after this list)
  • Work with ML engineers, kernel engineers, and compiler developers to understand their vision and needs from high-performance accelerators
  • Manage communication and coordination across internal and external engagements
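The performance-estimation responsibility above is, to a first order, roofline-style analysis. The sketch below is illustrative only; the hardware configuration names, peak numbers, and GEMM shape are made-up placeholders, not any vendor's actual specifications.

```python
from dataclasses import dataclass

@dataclass
class HardwareConfig:
    # Hypothetical accelerator description; the numbers used below are placeholders.
    name: str
    peak_tflops: float   # peak throughput at the chosen precision, in TFLOP/s
    mem_bw_gbs: float    # peak memory bandwidth, in GB/s

def gemm_roofline_ms(m: int, n: int, k: int, bytes_per_elem: int, hw: HardwareConfig) -> float:
    """First-order GEMM time estimate: the slower of the compute-bound and bandwidth-bound times."""
    flops = 2.0 * m * n * k                              # one multiply and one add per (i, j, kk)
    traffic = bytes_per_elem * (m * k + k * n + m * n)   # read A and B once, write C once (ideal reuse)
    t_compute = flops / (hw.peak_tflops * 1e12)
    t_memory = traffic / (hw.mem_bw_gbs * 1e9)
    return 1e3 * max(t_compute, t_memory)

configs = [
    HardwareConfig("config-A", peak_tflops=300.0, mem_bw_gbs=2000.0),
    HardwareConfig("config-B", peak_tflops=600.0, mem_bw_gbs=3000.0),
]
for hw in configs:
    est = gemm_roofline_ms(m=4096, n=4096, k=4096, bytes_per_elem=2, hw=hw)
    print(f"{hw.name}: ~{est:.2f} ms for a 4096^3 half-precision GEMM")
```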
Desired Qualifications
  • PhD in Computer Science and Engineering with a specialization in Computer Architecture, Parallel Computing, Compilers, or related systems areas
  • Strong coding skills in C/C++ and Python
  • Experience working with hardware developers
  • Experience building compilers
  • Good understanding of LLMs and challenges related to their training and inference