d-Matrix

AI Systems Solutions Architect

Santa Clara, California, United States

Compensation: $180,000 – $300,000
Experience Level: Expert & Leadership (9+ years)
Job Type: Full Time
Visa: Unknown
Industries: AI & Machine Learning, Hardware, Robotics & Automation

Requirements

Candidates must have over 15 years of industry experience and an engineering degree in Electrical Engineering, Computer Engineering, or Computer Science. A minimum of 5 years of experience in AI Server Systems is required, including work on architecture, development, and design projects. Additionally, candidates should have at least 5 years of experience in a customer-facing role, interfacing with OEMs, ODMs, and CSPs.

Responsibilities

The AI Systems Solutions Architect will develop world-class products built around d-Matrix inference accelerators. The role involves engaging with key customers and internal architects to drive overall system solutions, analyzing and defining use cases, and drawing on a broad spectrum of technologies. Responsibilities include designing, developing, and deploying scalable GenAI inference solutions; optimizing system solutions for performance and cost; and collaborating with datacenter, OEM, and ODM customers during the product concept phase. The architect will also influence future product generations and stay current on advancements in GenAI technologies.

Skills

AI
System Architecture
GenAI
Inference
Electrical Engineering
Computer Engineering
Computer Science
Memory
I/O
Power Delivery
Firmware
BMC
Hardware Management
Customer Facing
OEM
ODM
CSPs

d-Matrix

AI compute platform for datacenters

About d-Matrix

d-Matrix focuses on improving the efficiency of AI computing for large datacenter customers. Its core product is the digital in-memory compute (DIMC) engine, which embeds compute capability directly within programmable memory. This design reduces power consumption and increases data processing speed while preserving accuracy. d-Matrix differentiates itself from competitors through a modular, scalable approach built on low-power chiplets that can be tailored to different applications. The company's goal is to provide high-performance, energy-efficient AI inference solutions to large-scale datacenter operators.

Key Metrics

Headquarters: Santa Clara, California
Year Founded: 2019
Total Funding: $149.8M
Company Stage: Series B
Industries: Enterprise Software, AI & Machine Learning
Employees: 201-500

Benefits

Hybrid Work Options

Risks

Competition from Nvidia, AMD, and Intel may pressure d-Matrix's market share.
Complex AI chip design could lead to delays or increased production costs.
Rapid AI innovation may render d-Matrix's technology obsolete if not updated.

Differentiation

d-Matrix's DIMC engine integrates compute into memory, enhancing efficiency and accuracy.
The company offers scalable AI solutions through modular, low-power chiplets.
d-Matrix focuses on brain-inspired AI compute engines for diverse inferencing workloads.

Upsides

Growing demand for energy-efficient AI solutions boosts d-Matrix's low-power chiplets appeal.
Partnerships with companies like Microsoft could lead to strategic alliances.
Increasing adoption of modular AI hardware in data centers benefits d-Matrix's offerings.
