Full-Time

Research/Compiler Engineer

Confirmed live in the last 24 hours

Lightning AI

51-200 employees

AI development platform for coding and deployment

AI & Machine Learning

Mid, Senior

London, UK

Category
Backend Engineering
FinTech Engineering
Software Engineering
Required Skills
TensorFlow
CUDA
PyTorch
Requirements
  • Strong experience with deep learning frameworks like PyTorch, JAX, or TensorFlow.
  • Expertise in compiler development or in optimizations for distributed training and inference workflows is highly valued.
  • Proven track record contributing to open-source projects, especially in machine learning or high-performance computing. Experience collaborating with external partners is a plus.
  • Hands-on experience in model optimization, with a focus on maximizing performance, efficiency, and scalability in large-scale or distributed training setups.
  • Passion for engaging with open-source communities, including experience supporting users and advocating for project adoption.
  • Strong communication and collaboration skills for working within a close-knit, high-impact team environment.
  • Bachelor's degree in Computer Science, Engineering, or a related field; a Master's or PhD in machine learning or a related area is preferred.
Responsibilities
  • Develop the Thunder compiler, an open-source project built in collaboration with NVIDIA, drawing on your deep experience with PyTorch, JAX, or other deep learning frameworks (see the usage sketch after this list).
  • Carry out performance-oriented model optimizations for both distributed training and inference.
  • Develop optimized kernels in CUDA or Triton to target specific use cases (see the Triton sketch after this list).
  • Integrate Thunder throughout the PyTorch Lightning ecosystem.
  • Engage with the community and champion its growth.
  • Support the adoption of Thunder across the industry.
  • Work closely within the Lightning team as a strategic partner.
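A minimal, illustrative sketch of how Thunder is typically applied to a PyTorch module (this assumes the open-source lightning-thunder package and its thunder.jit entry point; the toy model and shapes are placeholders, not taken from this posting):

import torch
import torch.nn as nn
import thunder  # provided by the lightning-thunder package

# Toy module standing in for a real workload.
model = nn.Sequential(nn.Linear(1024, 4096), nn.GELU(), nn.Linear(4096, 1024))

# thunder.jit traces the module and hands the trace to optimizing executors
# (for example NVIDIA's nvFuser) before it runs.
compiled_model = thunder.jit(model)

x = torch.randn(8, 1024)
y = compiled_model(x)  # executes the compiled trace
print(y.shape)         # torch.Size([8, 1024])

Because the compiled module is called exactly like the original, it can in principle be dropped into existing training and inference code, which is what makes the ecosystem-wide integration described above practical.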
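For the kernel-level work, here is a generic, minimal Triton kernel (element-wise addition), included only to illustrate the style of code such optimizations build on; it is not code from the Thunder project:

import torch
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    # Each program instance handles one BLOCK_SIZE-wide slice of the inputs.
    pid = tl.program_id(axis=0)
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements  # guard against the ragged last block
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)

def add(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    out = torch.empty_like(x)
    grid = (triton.cdiv(out.numel(), 1024),)
    add_kernel[grid](x, y, out, out.numel(), BLOCK_SIZE=1024)
    return out

# Usage (requires a CUDA-capable GPU):
# a = torch.randn(1 << 20, device="cuda")
# b = torch.randn(1 << 20, device="cuda")
# assert torch.allclose(add(a, b), a + b)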

Lightning AI offers a platform for developing artificial intelligence applications, covering the entire process from ideation to deployment. It provides tools for coding, prototyping, and training AI models on GPUs, all accessible through a web browser without setup. The platform operates on a subscription model, featuring a cloud-based AI Studio that allows users to code on CPUs, debug on GPUs, and scale their projects. Key features include PyTorch Lightning and Lit-GPT, aimed at optimizing and scaling AI models for developers and enterprises.
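For context on the PyTorch Lightning feature mentioned above, a minimal, illustrative training script (assuming the lightning package, version 2.x; the toy model, data, and hyperparameters are placeholders, not details from the posting):

import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import lightning as L

class TinyRegressor(L.LightningModule):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 1))

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self.net(x), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

# Synthetic data; the Trainer abstracts device placement, precision, and scaling.
dataset = TensorDataset(torch.randn(256, 16), torch.randn(256, 1))
trainer = L.Trainer(max_epochs=1, accelerator="auto")
trainer.fit(TinyRegressor(), DataLoader(dataset, batch_size=32))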

Company Stage

Seed

Total Funding

$57M

Headquarters

New York City, New York

Founded

2015

Growth & Insights

Headcount growth
  • 6 month: 11%
  • 1 year: 23%
  • 2 year: 15%

Simplify's Take

What believers are saying

  • The availability of Lightning AI Studio in AWS Marketplace can lead to greater productivity and faster development times for AI applications.
  • Thunder's ability to significantly speed up training and reduce costs can attract more developers and enterprises to Lightning AI's platform.
  • The strategic collaboration with AWS can provide optimized performance and first-class support, making Lightning AI a preferred choice for building and deploying AI products.

What critics are saying

  • The competitive landscape in AI development platforms is intense, with major players like Google and Microsoft posing significant threats.
  • Dependence on AWS for cloud services could limit flexibility and expose Lightning AI to risks associated with changes in AWS policies or pricing.

What makes Lightning AI unique

  • Lightning AI's integration with AWS Marketplace provides a seamless procurement process and flexible billing options, setting it apart from competitors.
  • The launch of Thunder, a source-to-source compiler for PyTorch, offers up to 40% speed-up in training large language models, a significant advantage over unoptimized code.
  • Strategic collaboration with AWS and support for Amazon EC2 Trn1 instances powered by AWS Trainium accelerators enhances Lightning AI's enterprise-grade cloud-based platform.
