Full-Time

Senior Mixed Signal Design Validation Engineer

Posted on 5/9/2026

NVIDIA

10,001+ employees

Designs GPUs and AI HPC platforms

Compensation Overview

$168k - $310.5k/yr

+ Equity

Company Historically Provides H1B Sponsorship

Santa Clara, CA, USA

In Person

Category
QA & Testing
Required Skills
Python
Git
Oscilloscope
C/C++
Requirements
  • A BS degree (or equivalent experience) in Electrical Engineering, Computer Engineering, or a related field, with 8+ years of experience.
  • Solid experience in silicon bring-up and debugging, particularly for high-speed memory interfaces.
  • Strong understanding of memory systems and specifications.
  • In-depth technical knowledge of mixed-signal circuits and analog and digital systems, with a specific focus on high-speed memory technologies.
  • Strong knowledge of signal integrity and power integrity.
  • Experience with lab instrumentation such as oscilloscopes, logic analyzers, spectrum analyzers, TDRs, phase noise analyzers, signal generators, and BER testers.
  • Strong scripting skills in Python, C, or similar languages, with experience using Git for version control and JMP for data analysis.
  • Ability to work in a fast-paced, dynamic environment that involves multiple chip bring-ups and system validations.
  • Ability to collaborate closely and work effectively with cross-functional teams.
  • Strong problem-solving skills, with a proactive and forward-thinking approach.
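The scripting and data-analysis skills above can be illustrated with a minimal sketch. This is a hypothetical example, not part of the job description: the corner names, eye-width values, and spec limit below are all invented for illustration of the kind of characterization post-processing the role involves.

```python
# Hypothetical sketch: summarizing a PVT characterization sweep into
# per-corner eye-width margins. All data values are invented.

def eye_width_margin(sweep, spec_ui=0.35):
    """Return {corner: margin}, where margin = measured eye width (UI) - spec."""
    return {corner: round(width - spec_ui, 3) for corner, width in sweep.items()}

def failing_corners(sweep, spec_ui=0.35):
    """Corners whose measured eye width falls below the spec limit."""
    return sorted(c for c, m in eye_width_margin(sweep, spec_ui).items() if m < 0)

# Example sweep: eye width in unit intervals (UI) at three PVT corners.
sweep = {"SS_0.9V_125C": 0.31, "TT_1.0V_25C": 0.42, "FF_1.1V_-40C": 0.45}
```

In practice such a script would ingest bench or ATE logs rather than a hand-written dictionary, and the results would feed a tool like JMP for further analysis.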
Responsibilities
  • Lead the creation of validation test plans and test scripts for DDR I/O associated with LPDDR, GDDR, and HBM memory interfaces, ensuring all required analog and digital circuits are covered.
  • Conduct thorough characterization of analog, digital, and mixed-signal circuit blocks under varying process, voltage, and temperature (PVT) conditions, and correlate results with simulations.
  • Drive system-level validation for memory interfaces, ensuring high-speed performance and quality metrics meet the demanding standards required by our GPU and AI systems.
  • Investigate and validate memory I/O circuits to understand circuit weaknesses, paving the way for circuit improvements and architecture enhancements in future products.
  • Lead debugging efforts and design creative solutions to resolve unexpected bugs.
  • Validate silicon performance, quality, and margins to refine memory interface designs.
  • Develop and implement test scripts to optimize DDR I/O circuits, sub-circuits, and overall system performance.
  • Work closely with cross-functional teams, including mixed-signal design, PI/SI, hardware, firmware, and memory qualification engineers, to resolve issues and improve memory interface performance.
  • Develop and implement tools and scripts to enhance electrical characterization and chip bring-up processes.
  • Mentor and train new engineers on validation practices, encouraging a culture of excellence and continuous improvement.
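A common form for the test scripts described above is a shmoo sweep: stepping supply voltage and data rate across a grid and recording pass/fail at each point. The skeleton below is a hypothetical sketch only; the `measure` callback stands in for real bench or ATE access, and the pass/fail model is an invented margin equation so the example runs anywhere.

```python
# Hypothetical skeleton of a 2-D voltage x frequency shmoo sweep.
# measure(v, f) -> bool stands in for real instrument/DUT access.

def run_shmoo(voltages, frequencies, measure):
    """Run the sweep; return {(voltage, frequency): passed} for every point."""
    return {(v, f): measure(v, f) for v in voltages for f in frequencies}

def passing_region(results):
    """Sorted list of passing points, e.g. for plotting the shmoo eye."""
    return sorted(pt for pt, ok in results.items() if ok)

# Toy pass/fail model (invented): margin shrinks at high data rate and low voltage.
def model(v_mv, f_gbps):
    return f_gbps <= 6 + (v_mv - 1000) / 50

results = run_shmoo([950, 1000, 1050], [4, 6, 8], model)
```

Swapping `model` for a function that programs the DUT and reads back a BER tester would turn the same skeleton into a bench script; keeping the sweep driver separate from the measurement callback makes that substitution trivial.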
Desired Qualifications
  • Strong expertise in measurement theory and the use of test instruments, especially in the context of memory interface validation. Knowledge of the performance of other high-speed transceivers is a must.

NVIDIA designs and manufactures graphics processing units (GPUs) and computing platforms used for gaming, data centers, and artificial intelligence. These products work by using parallel processing to handle complex mathematical calculations much faster than standard computer processors, supported by a software ecosystem that allows developers to build and run AI models. Unlike competitors that may focus solely on hardware, NVIDIA integrates its chips with specialized software and cloud services to create a complete environment for high-performance tasks. The company’s goal is to provide the underlying technology necessary to power advanced computing, from realistic video game graphics to autonomous vehicles and large-scale data analysis.

Company Size

10,001+

Company Stage

IPO

Headquarters

Santa Clara, California

Founded

1993

Simplify Jobs

Simplify's Take

What believers are saying

  • Toyota adopts NVIDIA DRIVE AGX Orin, boosting automotive revenue 103% in Q4 FY2025.
  • SoftBank plans NVIDIA AI servers in Japan by 2030; IREN deploys 5GW infrastructure.
  • NVIDIA reaches $5.5T market cap with $216B FY revenue and $400B projected FCF.

What critics are saying

  • Broadcom supplies custom chips to Google through 2031, Anthropic from 2027, OpenAI.
  • China revenue hits zero from $17B due to US restrictions, $4.5B Q1 2026 charge.
  • B200 GPU rentals drop 30% as sentiment flips bearish, cooling FY2027 $78B guidance.

What makes NVIDIA unique

  • NVIDIA invented the GPU in 1999, pioneering accelerated computing.
  • CUDA platform from 2006 enables GPUs for AI and parallel computing.
  • Full-stack AI infrastructure powers 80% of AI training GPUs in 2025.


Benefits

Company Equity

401(k) Company Match

Growth & Insights and Company News

Headcount

6 month growth

-1%

1 year growth

-3%

2 year growth

-2%
The Associated Press
Apr 15th, 2026
Matlantis integrates NVIDIA ALCHEMI Toolkit for 10x faster materials simulation

Matlantis has integrated NVIDIA's ALCHEMI Toolkit into its materials simulation platform to accelerate industrial materials discovery. The company previously incorporated NVIDIA Warp-optimised kernels, achieving up to 10x speed improvements in atomistic calculations. The integration includes LightPFP, Matlantis' lightweight potential for large-scale simulations, which uses a server-based architecture with NVIDIA ALCHEMI Toolkit-Ops to reduce communication bottlenecks. Matlantis plans to integrate its flagship Universal Machine-Learning Interatomic Potential with the toolkit to further enhance GPU efficiency. Launched in 2021, Matlantis is a cloud-based atomistic simulator jointly developed by PFN and ENEOS. The platform uses deep learning to increase simulation speeds by tens of thousands of times and serves over 150 companies discovering materials including catalysts, batteries and semiconductors.

CNBC
Apr 14th, 2026
Nvidia stock surges 18% on 10-day winning streak fuelled by $1T GPU orders through 2027

Nvidia shares have climbed 18% over a ten-day winning streak, the longest since 2023. The stock is trading about 8% below its October all-time high of $212.19. CEO Jensen Huang revealed at last month's GTC conference that Nvidia has over $1 trillion in GPU orders through 2027, including Blackwell and next-generation Vera Rubin chips. Data centre revenue surged 75% year-over-year and now comprises 88% of the business, a dramatic shift from five years ago when gaming dominated. The rally follows major deals including Meta's February commitment to deploy millions of Nvidia chips across its global data centres. On Monday, Nvidia denied rumours it was pursuing acquisitions of PC makers Dell or HP. The company also unveiled Ising, a new family of open-source models for quantum computing.

Yahoo Finance
Apr 14th, 2026
D-Wave CEO claims quantum computers could challenge Nvidia's AI dominance with superior power efficiency

D-Wave Quantum CEO Alan Baratz claims quantum computing poses a threat to Nvidia, citing superior energy efficiency. Speaking at the Semafor World Economy Summit, Baratz said D-Wave's quantum computer uses just 10 kilowatts of power—equivalent to five or 10 GPUs—whilst solving problems that would take GPU systems nearly a million years. D-Wave shares rose nearly 16% on Tuesday, part of a 140% gain over the past year. The company reported $2.75 million in Q4 revenue, missing analyst estimates, but bookings surged 471% to $13.4 million. The $5.3 billion company recently secured a $20 million agreement with Florida Atlantic University and acquired Quantum Circuits for $550 million. However, quantum machines remain specialised tools, unable to run large language models that drive Nvidia's dominance.

Yahoo Finance
Apr 14th, 2026
Vertiv partners with Nvidia on AI data centre infrastructure as analysts raise price target to $300

Vertiv Holdings has been reaffirmed with a Buy rating by Evercore ISI, setting a price target of $280, whilst Barclays raised its target from $281 to $300 with an Overweight rating. The electrical equipment company is partnering with Nvidia on AI infrastructure development. On 16th March, Nvidia introduced its Vera Rubin DSX AI Factory reference design, with Vertiv providing critical power and cooling solutions for AI data centres. The partnership integrates Vertiv's infrastructure expertise with Nvidia's AI systems to enhance energy efficiency and performance. Vertiv is developing Vertiv OneCore Rubin DSX, a prefabricated system designed to accelerate AI factory deployment. The Brussels-headquartered company specialises in critical digital infrastructure technologies for data centres and communication networks.

Yahoo Finance
Apr 14th, 2026
Nvidia and Dell: AI infrastructure stocks to buy ahead of May earnings

Nvidia and Dell Technologies are positioned as attractive AI infrastructure investments ahead of their May earnings reports, according to recent analysis. Both companies supply critical hardware for AI computing, with demand for AI capacity continuing to outpace available resources across major cloud services. Nvidia shares have remained flat for six months despite strong fundamentals. Last quarter, its data centre business generated $62 billion in revenue, up 75% year over year, with a 75% gross margin. The company expects over $1 trillion in cumulative orders for its Blackwell and upcoming Rubin chips through 2027. Trading at 17 times next year's expected earnings, Nvidia's valuation appears discounted relative to its 66% revenue growth in fiscal year 2026. Dell Technologies similarly stands to benefit from the AI infrastructure build-out. Both companies report earnings in May.