d-Matrix

Software Engineer, Principal - AI/ML Workloads

Santa Clara, California, United States

Compensation: $180,000 – $300,000
Experience Level: Senior (5 to 8 years), Expert & Leadership (9+ years)
Job Type: Full Time
Visa: Unknown
Industries: AI & Machine Learning, Hardware

Requirements

Candidates should possess an MS or PhD in Computer Science, Electrical Engineering, Math, Physics, or a related field, along with 10-12+ years of industry experience. A strong grasp of computer architecture, data structures, system software, and machine learning fundamentals is required. Proficiency in Python, C, or C++ development in a Linux environment is essential, as is experience with deep learning frameworks such as PyTorch or TensorFlow. Candidates should also have experience mapping NLP models to accelerators; a research background with a publication record in top-tier ML or computer architecture conferences is preferred.

Responsibilities

The Principal Software Engineer will develop, enhance, and maintain the development and testing infrastructure for next-generation AI hardware. They will help productize the software stack for the AI compute engine and leverage the d-Matrix ISA and dataflow architecture to build optimized implementations of state-of-the-art large language models. The role involves collaborating with compiler experts, hardware architects, and machine learning model researchers, as well as contributing to research on novel techniques for the machine learning software stack, models, and architecture.

Skills

Python
C++
C
PyTorch
TensorFlow
Linux
Computer Architecture
Data Structures
Machine Learning
NLP
Transformers
State-Space Models
Deep Learning
ISA
Dataflow Architecture
Compiler
ML Op Kernels

d-Matrix

AI compute platform for datacenters

About d-Matrix

d-Matrix focuses on improving the efficiency of AI computing for large datacenter customers. Its main product is the digital in-memory compute (DIMC) engine, which integrates compute capabilities directly into programmable memory. This design reduces power consumption and increases data processing speed while preserving accuracy. d-Matrix differentiates itself from competitors through a modular, scalable approach built on low-power chiplets that can be tailored for different applications. The company's goal is to provide high-performance, energy-efficient AI inference solutions to large-scale datacenter operators.

Key Metrics

Headquarters: Santa Clara, California
Year Founded: 2019
Total Funding: $149.8M
Company Stage: Series B
Industries: Enterprise Software, AI & Machine Learning
Employees: 201-500

Benefits

Hybrid Work Options

Risks

Competition from Nvidia, AMD, and Intel may pressure d-Matrix's market share.
Complex AI chip design could lead to delays or increased production costs.
Rapid AI innovation may render d-Matrix's technology obsolete if it is not continually updated.

Differentiation

d-Matrix's DIMC engine integrates compute into memory, enhancing efficiency and accuracy.
The company offers scalable AI solutions through modular, low-power chiplets.
d-Matrix focuses on brain-inspired AI compute engines for diverse inferencing workloads.

Upsides

Growing demand for energy-efficient AI solutions boosts d-Matrix's low-power chiplets appeal.
Partnerships with companies like Microsoft could lead to strategic alliances.
Increasing adoption of modular AI hardware in data centers benefits d-Matrix's offerings.
