d-Matrix

AI Hardware Systems Engineer, Principal

Santa Clara, California, United States

Compensation: $180,000 – $280,000
Experience Level: Mid-level (3 to 4 years), Senior (5 to 8 years)
Job Type: Full Time
Visa: Unknown
Industries: AI & Machine Learning, Hardware, Enterprise Software

Requirements

Candidates should have a BS in Electrical Engineering or Computer Engineering, with a Master's degree preferred. A minimum of 10 years of experience in hardware development is required, along with experience in deploying products to volume production. Hands-on experience in the design, bring-up, and debugging of PCBAs and chassis is essential, as well as proficiency in using schematic capture and PCB layout tools. Knowledge of thermal and mechanical designs, along with familiarity with signal and power integrity concepts, is also expected.

Responsibilities

The AI Hardware Systems Engineer will design, develop, and deploy scalable GenAI inference solutions using d-Matrix accelerator silicon. They will collaborate with cross-functional teams to specify, design, and integrate custom accelerators, processors, and memory modules for AI workloads. The role involves leading the bring-up of new AI systems, conducting prototype testing and validation, and debugging hardware-software integration issues. Additionally, the engineer will prepare comprehensive documentation and stay updated on advancements in GenAI hardware and software technologies.

Skills

PCBAs
PCB Layout
Schematic Capture
Thermal Design
Mechanical Design
Signal Integrity
Power Integrity
Hardware Development
AI Inference
GenAI

d-Matrix

AI compute platform for datacenters

About d-Matrix

d-Matrix focuses on improving the efficiency of AI computing for large datacenter customers. Its main product is the digital in-memory compute (DIMC) engine, which integrates compute directly into programmable memory. This design reduces power consumption and increases data processing speed while preserving accuracy. d-Matrix differentiates itself from competitors through a modular, scalable approach built on low-power chiplets that can be tailored to different applications. The company's goal is to provide high-performance, energy-efficient AI inference solutions to large-scale datacenter operators.

Key Metrics

Headquarters: Santa Clara, California
Year Founded: 2019
Total Funding: $149.8M
Company Stage: Series B
Industries: Enterprise Software, AI & Machine Learning
Employees: 201-500

Benefits

Hybrid Work Options

Risks

Competition from Nvidia, AMD, and Intel may pressure d-Matrix's market share.
Complex AI chip design could lead to delays or increased production costs.
Rapid AI innovation may render d-Matrix's technology obsolete if not updated.

Differentiation

d-Matrix's DIMC engine integrates compute into memory, enhancing efficiency and accuracy.
The company offers scalable AI solutions through modular, low-power chiplets.
d-Matrix focuses on brain-inspired AI compute engines for diverse inferencing workloads.

Upsides

Growing demand for energy-efficient AI solutions boosts d-Matrix's low-power chiplets appeal.
Relationships with companies like Microsoft could develop into strategic alliances.
Increasing adoption of modular AI hardware in data centers benefits d-Matrix's offerings.
