AI Systems Solutions Architect at d-Matrix

Santa Clara, California, United States

Compensation: $180,000 – $300,000
Experience Level: Expert & Leadership (9+ years)
Job Type: Full Time
Visa: Unknown
Industries: AI & Machine Learning, Hardware, Robotics & Automation

Skills

Key technologies and capabilities for this role

AI, System Architecture, GenAI, Inference, Electrical Engineering, Computer Engineering, Computer Science, Memory, I/O, Power Delivery, Firmware, BMC, Hardware Management, Customer Facing, OEM, ODM, CSPs

Questions & Answers

Common questions about this position

What is the salary range for the AI Systems Solutions Architect position?

The salary range is $180K - $300K.

Is this role remote or hybrid, and what are the location requirements?

This is a hybrid role requiring onsite presence at the Santa Clara, CA headquarters 3 days per week.

What are the key required skills and experience for this role?

Candidates need 15+ years of industry experience with an engineering degree in EE, CE, or CS, plus 5+ years in AI server systems (architecture, memory, I/O, power, firmware, BMC) and 5+ years in customer-facing roles with OEMs, ODMs, and CSPs.

What is the company culture like at d-Matrix?

The culture values respect, collaboration, humility, and direct communication, with an inclusive and diverse team.

What makes a strong candidate for this AI Systems Solutions Architect role?

A strong candidate has 15+ years of industry experience including 5+ years in AI server systems and customer-facing roles with OEMs/ODMs/CSPs, plus the ability to collaborate across teams and engage early with customers on system design.

d-Matrix

AI compute platform for datacenters

About d-Matrix

d-Matrix focuses on improving the efficiency of AI computing for large datacenter customers. Its main product is the digital in-memory compute (DIMC) engine, which integrates compute directly into programmable memory. This design reduces power consumption and increases data-processing speed while preserving accuracy. d-Matrix differentiates itself from competitors through a modular, scalable approach built on low-power chiplets that can be tailored to different applications. The company's goal is to provide high-performance, energy-efficient AI inference solutions to large-scale datacenter operators.

Headquarters: Santa Clara, California
Year Founded: 2019
Total Funding: $149.8M
Company Stage: Series B
Industries: Enterprise Software, AI & Machine Learning
Employees: 201-500

Benefits

Hybrid Work Options

Risks

Competition from Nvidia, AMD, and Intel may pressure d-Matrix's market share.
Complex AI chip design could lead to delays or increased production costs.
Rapid AI innovation may render d-Matrix's technology obsolete if not updated.

Differentiation

d-Matrix's DIMC engine integrates compute into memory, enhancing efficiency and accuracy.
The company offers scalable AI solutions through modular, low-power chiplets.
d-Matrix focuses on brain-inspired AI compute engines for diverse inferencing workloads.

Upsides

Growing demand for energy-efficient AI solutions boosts the appeal of d-Matrix's low-power chiplets.
Partnerships with companies like Microsoft could lead to strategic alliances.
Increasing adoption of modular AI hardware in data centers benefits d-Matrix's offerings.
