Systems Engineer, Customer Platforms - Tech Lead/Principal at d-Matrix

Santa Clara, California, United States

Compensation: $175,000 – $270,000
Experience Level: Senior (5 to 8 years), Expert & Leadership (9+ years)
Job Type: Full Time
Visa: Unknown
Industries: AI, Semiconductors, Data Centers

Requirements

  • Bachelor’s or Master’s in EE, CE, or related field with 5+ years of hands-on experience with GPU server and rack-scale solution architecture, design, and bring-up
  • Strong debugging skills across hardware, BIOS/firmware, BMC, and OS levels
  • Experience with datacenter fleet monitoring, issue debugging, and fix rollout

Preferred Qualifications

  • Experience working with ODM and OEM vendors for GPU servers and rack-scale solutions
  • Hands-on experience with main board designs using multiple PCIe PEX switches
  • Hands-on experience with Mechanical Design, Thermal Design (Fan Control and Liquid cooling ideal), Power Design, and Compliance
  • Hands-on experience with BMC integration

Responsibilities

  • Own end-to-end platform definition and integration of d-Matrix PCIe accelerators into server and rack-scale systems
  • Debug and resolve hardware/software issues across BIOS, firmware, PCIe, and Linux OS layers
  • Collaborate with OEMs on platform design requirements (power, thermals, layout)
  • Support system validation, including power-on, thermal, and stress testing
  • Partner with software, silicon, and customer support teams to deliver deployable systems
  • Document platform configurations, integration steps, and bring-up procedures

Skills

PCIe
BIOS
Firmware
Linux OS
GPU Servers
Rack-Scale Systems
Platform Integration
Hardware Debugging
System Validation
Thermal Testing
Power Management
ODM Collaboration
OEM Design

d-Matrix

AI compute platform for datacenters

About d-Matrix

d-Matrix focuses on improving the efficiency of AI computing for large datacenter customers. Its main product is the digital in-memory compute (DIMC) engine, which integrates compute directly into programmable memory. This design reduces power consumption and increases data processing speed while maintaining accuracy. d-Matrix differentiates itself from competitors by offering a modular, scalable approach built on low-power chiplets that can be tailored to different applications. The company's goal is to provide high-performance, energy-efficient AI inference solutions to large-scale datacenter operators.

Headquarters: Santa Clara, California
Year Founded: 2019
Total Funding: $149.8M
Company Stage: Series B
Industries: Enterprise Software, AI & Machine Learning
Employees: 201-500

Benefits

Hybrid Work Options

Risks

Competition from Nvidia, AMD, and Intel may pressure d-Matrix's market share.
Complex AI chip design could lead to delays or increased production costs.
Rapid AI innovation may render d-Matrix's technology obsolete if not updated.

Differentiation

d-Matrix's DIMC engine integrates compute into memory, enhancing efficiency and accuracy.
The company offers scalable AI solutions through modular, low-power chiplets.
d-Matrix focuses on brain-inspired AI compute engines for diverse inferencing workloads.

Upsides

Growing demand for energy-efficient AI solutions boosts d-Matrix's low-power chiplets appeal.
Partnerships with companies like Microsoft could lead to strategic alliances.
Increasing adoption of modular AI hardware in data centers benefits d-Matrix's offerings.
