NVIDIA

Deep Learning Solutions Architect – Large Scale Inference Optimization

United Kingdom

Compensation: Not Specified
Experience Level: Senior (5 to 8 years), Expert & Leadership (9+ years)
Job Type: Full Time
Visa: Unknown
Industries: Artificial Intelligence, Computer Hardware, Semiconductors

Requirements

Candidates must possess a Master's or PhD in a relevant technical field, or equivalent experience, with over 5 years of work or research experience in Python/C++ software development. A strong understanding of modern NLP, including transformer, state-space, diffusion, and MoE (mixture-of-experts) model architectures, is required, along with knowledge of key NLP/LLM libraries for training or deployment. Excellent English communication and presentation skills are essential, as is the ability to collaborate effectively across multiple teams in a fast-paced environment. Experience with large-scale distributed DL training/inference, HPC systems, and data center design is advantageous.

Responsibilities

The Deep Learning Solutions Architect will work directly with key customers to understand their technology needs and provide optimal AI solutions, focusing on large-scale inference optimization. Responsibilities include performing in-depth analysis and optimization of GPU-based systems, particularly Grace/ARM-based systems, and supporting the optimization of large-scale inference pipelines. The role also involves partnering with Engineering, Product, and Sales teams to develop customer solutions, enabling product feature development through customer feedback, and conducting proof-of-concept evaluations.

Skills

Deep Learning
Neural Network Inference
TRT-LLM
vLLM
SGLang
Systems Knowledge
GPU Architecture
Grace/ARM Systems
Large Scale Inference Pipelines
Artificial Intelligence
NVIDIA Grace CPUs
Grace-Hopper
Grace-Blackwell Systems
Chip-to-Chip NVLink
KV Cache Offloading
Hybrid Models
Diffusion Models


About NVIDIA

NVIDIA designs and manufactures graphics processing units (GPUs) and system on a chip units (SoCs) for various markets, including gaming, professional visualization, data centers, and automotive. Their products include GPUs tailored for gaming and professional use, as well as platforms for artificial intelligence (AI) and high-performance computing (HPC) that cater to developers, data scientists, and IT administrators. NVIDIA generates revenue through the sale of hardware, software solutions, and cloud-based services, such as NVIDIA CloudXR and NGC, which enhance experiences in AI, machine learning, and computer vision. What sets NVIDIA apart from competitors is its strong focus on research and development, allowing it to maintain a leadership position in a competitive market. The company's goal is to drive innovation and provide advanced solutions that meet the needs of a diverse clientele, including gamers, researchers, and enterprises.

Headquarters: Santa Clara, California
Year Founded: 1993
Total Funding: $19.5M
Company Stage: IPO
Industries: Automotive & Transportation, Enterprise Software, AI & Machine Learning, Gaming
Employees: 10,001+

Benefits

Company Equity
401(k) Company Match

Risks

Increased competition from AI startups like xAI could challenge NVIDIA's market position.
Serve Robotics' expansion may divert resources from NVIDIA's core GPU and AI businesses.
Integration of VinBrain may pose challenges and distract from NVIDIA's primary operations.

Differentiation

NVIDIA leads in AI and HPC solutions with cutting-edge GPU technology.
The company excels in diverse markets, including gaming, data centers, and autonomous vehicles.
NVIDIA's cloud services, like CloudXR, offer scalable solutions for AI and machine learning.

Upsides

Acquisition of VinBrain enhances NVIDIA's AI capabilities in the healthcare sector.
Investment in Nebius Group boosts NVIDIA's AI infrastructure and cloud platform offerings.
Serve Robotics' expansion, backed by NVIDIA, highlights growth in autonomous delivery services.
