AI / ML System Software Engineer, Senior Staff at d-Matrix

Santa Clara, California, United States

Compensation: $180,000 – $280,000
Experience Level: Senior (5 to 8 years), Expert & Leadership (9+ years)
Job Type: Full Time
Visa: Unknown
Industries: Artificial Intelligence, Technology

Requirements

  • BS in Computer Science, Engineering, Math, Physics, or a related field with 8+ years of industry software development experience, or an MS (preferred) in one of these fields with 5+ years
  • Strong grasp of computer architecture, data structures, system software, and machine learning fundamentals
  • Proficient in C/C++/Python development in a Linux environment using standard development tools
  • Experience with distributed, high-performance software design and implementation
  • Self-motivated team player with a strong sense of ownership and leadership

Preferred

  • MS or PhD in Computer Science, Electrical Engineering, or related fields
  • Prior startup, small team or incubation experience
  • Work experience at a cloud provider or AI compute / sub-system company
  • Experience implementing SIMD algorithms on vector processors
  • Experience with open source ML compiler frameworks such as MLIR
  • Experience with deep learning frameworks (such as PyTorch, TensorFlow)
  • Experience with deep learning runtimes (such as ONNX Runtime, TensorRT)
  • Experience with inference servers/model serving frameworks (such as Triton, TF Serving, Kubeflow)
  • Experience with collective communication libraries for distributed systems, such as NCCL and OpenMPI
  • Experience deploying ML workloads on distributed systems in a multi-tenant environment
  • Experience with MLOps from model definition through deployment, including training, quantization, sparsity, and model preprocessing
  • Experience training, tuning and deploying ML models for CV (ResNet), NLP (BERT, GPT), and/or Recommendation Systems (DLRM)

Responsibilities

  • Be part of the team that helps productize the software stack for the AI compute engine
  • Develop, enhance, and maintain the next-generation AI Deployment software
  • Work across all aspects of the full-stack toolchain, understanding the nuances of hardware-software co-design optimization and trade-offs
  • Build and scale software deliverables in a tight development window
  • Work with a team of compiler experts to build out the compiler infrastructure
  • Collaborate closely with other software (ML, Systems) and hardware (mixed signal, DSP, CPU) experts in the company

Skills

C++
C
Python
Linux
Computer Architecture
Data Structures
System Software
Machine Learning
Compilers
Hardware-Software Co-Design
AI Deployment

d-Matrix

AI compute platform for datacenters

About d-Matrix

d-Matrix focuses on improving the efficiency of AI computing for large datacenter customers. Its main product is the digital in-memory compute (DIMC) engine, which integrates compute directly into programmable memory. This design reduces power consumption and increases data-processing speed while preserving accuracy. d-Matrix differentiates itself from competitors with a modular, scalable approach built on low-power chiplets that can be tailored to different applications. The company's goal is to provide high-performance, energy-efficient AI inference solutions to large-scale datacenter operators.

Headquarters: Santa Clara, California
Year Founded: 2019
Total Funding: $149.8M
Company Stage: Series B
Industries: Enterprise Software, AI & Machine Learning
Employees: 201-500

Benefits

Hybrid Work Options

Risks

Competition from Nvidia, AMD, and Intel may pressure d-Matrix's market share.
Complex AI chip design could lead to delays or increased production costs.
Rapid AI innovation may render d-Matrix's technology obsolete if not updated.

Differentiation

d-Matrix's DIMC engine integrates compute into memory, enhancing efficiency and accuracy.
The company offers scalable AI solutions through modular, low-power chiplets.
d-Matrix focuses on brain-inspired AI compute engines for diverse inferencing workloads.

Upsides

Growing demand for energy-efficient AI solutions boosts the appeal of d-Matrix's low-power chiplets.
Existing ties with companies like Microsoft could develop into broader strategic alliances.
Increasing adoption of modular AI hardware in data centers benefits d-Matrix's offerings.
