d-Matrix

AI Hardware Architect

Santa Clara, California, United States

Compensation: Not Specified
Experience Level: Expert & Leadership (9+ years)
Job Type: Full Time
Visa: Unknown
Industries: Semiconductors, Hardware Design, Hardware

Requirements

Candidates must hold at minimum a PhD in Computer Science, Engineering, or a related field, along with at least 15 years of industry experience. Strong knowledge of computer architecture, hardware/software co-design, digital design, and machine learning fundamentals is required, as is experience with data parallel architectures and SIMD/Vector extensions. Proficiency in C/C++ or Python development in a Linux environment, using standard development tools, is also necessary.

Responsibilities

The AI Hardware Architect will be responsible for design space exploration, workload characterization and mapping, and ISA design spanning the data and control planes within the SoC. They will design, model, and drive new architectural features to support next-generation hardware, evaluate the performance of cutting-edge AI workloads, and collaborate closely with a team of hardware architects and with software (ML, systems, compiler) and hardware (mixed-signal, DSP, CPU) experts to take architectural solutions from strawman to detailed specification.

Skills

Computer architecture
HW/SW co-design
Digital design
Machine learning fundamentals
Data parallel architectures
SIMD/Vector extensions
Workload characterization
Performance modeling
RTL analysis
GPU/AI accelerator architectures
C/C++
Python
Linux development tools

d-Matrix

AI compute platform for datacenters

About d-Matrix

d-Matrix focuses on improving the efficiency of AI computing for large datacenter customers. Its main product is the digital in-memory compute (DIMC) engine, which integrates compute directly into programmable memory. This design reduces power consumption and improves data processing speed while preserving accuracy. d-Matrix differentiates itself from competitors through a modular and scalable approach, using low-power chiplets that can be tailored to different applications. The company's goal is to provide high-performance, energy-efficient AI inference solutions to large-scale datacenter operators.

Headquarters: Santa Clara, California
Year Founded: 2019
Total Funding: $149.8M
Company Stage: Series B
Industries: Enterprise Software, AI & Machine Learning
Employees: 201-500

Benefits

Hybrid Work Options

Risks

Competition from Nvidia, AMD, and Intel may pressure d-Matrix's market share.
Complex AI chip design could lead to delays or increased production costs.
Rapid AI innovation may render d-Matrix's technology obsolete if not updated.

Differentiation

d-Matrix's DIMC engine integrates compute into memory, enhancing efficiency and accuracy.
The company offers scalable AI solutions through modular, low-power chiplets.
d-Matrix focuses on brain-inspired AI compute engines for diverse inferencing workloads.

Upsides

Growing demand for energy-efficient AI solutions boosts d-Matrix's low-power chiplets appeal.
Partnerships with companies like Microsoft could lead to strategic alliances.
Increasing adoption of modular AI hardware in data centers benefits d-Matrix's offerings.
