Software Engineering, Senior Director - Kernels at d-Matrix

Santa Clara, California, United States

Compensation: $180,000 – $300,000
Experience Level: Senior (5 to 8 years), Expert & Leadership (9+ years)
Job Type: Full Time
Visa: Unknown
Industries: Artificial Intelligence, Semiconductors, Technology

Requirements

  • MS or PhD in Computer Engineering, Math, Physics or related degree with 12+ years of industry experience
  • Strong grasp of computer architecture, data structures, system software, and machine learning fundamentals
  • Proficient in C/C++ and Python development in a Linux environment using standard development tools
  • Experience implementing algorithms in high-level languages such as C/C++ and Python
  • Experience implementing algorithms for specialized hardware such as FPGAs, DSPs, GPUs, and AI accelerators using libraries such as CUDA
  • Experience implementing operators commonly used in ML workloads: GEMMs, convolutions, BLAS routines, and SIMD operators for operations such as softmax, layer normalization, and pooling
  • Experience developing for embedded SIMD vector processors such as Tensilica
  • Self-motivated team player with a strong sense of ownership and leadership
  • Experience building software kernels for hardware architectures
  • Very strong understanding of various hardware architectures and how to map algorithms onto them
  • Understanding of how to map computational graphs generated by AI frameworks to the underlying architecture
  • Past experience working across all aspects of the full-stack toolchain, with an understanding of the nuances of optimizing and trading off various aspects of hardware-software co-design
  • Able to build and scale software deliverables in a tight development window
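To ground the operator work the requirements above describe (softmax, layer normalization, GEMM), here is an illustrative, hypothetical sketch — not part of the posting and not d-Matrix code — of the scalar reference for a numerically stable softmax, the baseline an optimized SIMD or accelerator kernel would be validated against:

```python
import math

def softmax(logits):
    """Scalar reference softmax.

    Subtracting the max before exponentiating keeps exp() in range even
    for large logits -- the same stability trick a hand-tuned SIMD or
    accelerator kernel must preserve when it fuses these passes.
    """
    m = max(logits)                            # reduction pass 1: max
    exps = [math.exp(v - m) for v in logits]   # elementwise exp
    total = sum(exps)                          # reduction pass 2: sum
    return [e / total for e in exps]           # normalize
```

An optimized kernel typically vectorizes the elementwise pass and fuses the two reductions, but must match this reference's numerical behavior.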

Responsibilities

  • Be part of the team that productizes the software stack for our AI compute engine
  • Responsible for the development, enhancement, and maintenance of software kernels for next-generation AI hardware
  • Work with a team of compiler experts to build out the compiler infrastructure, collaborating closely with other software (ML, systems) and hardware (mixed-signal, DSP, CPU) experts in the company

Skills

Key technologies and capabilities for this role

Software Kernels · Hardware Architectures · Computational Graphs · AI Frameworks · Compiler Infrastructure · Hardware-Software Co-Design · ML Software · Systems Software · DSP · CPU · Mixed Signal Hardware

Questions & Answers

Common questions about this position

What is the salary range for this position?

The salary range is $180K - $300K.

Is this role remote or hybrid, and what are the location requirements?

This is a hybrid role requiring onsite work at the Santa Clara, CA headquarters 3 days per week.

What key skills and experience are required for this role?

Required skills include a strong grasp of computer architecture, data structures, system software, and machine learning fundamentals; proficiency in C/C++ and Python in Linux; experience implementing ML algorithms like GEMMs and convolutions on specialized hardware such as GPUs and AI accelerators; plus 12+ years of industry experience with an MS or PhD.

What is the company culture like at d-Matrix?

The culture emphasizes respect, collaboration, humility, direct communication, inclusivity, and diverse perspectives for better solutions, while seeking passionate individuals driven by execution.

What makes a strong candidate for this Senior Director role?

Strong candidates have 12+ years of experience with an MS or PhD, expertise in hardware-software co-design for AI, proficiency in C/C++ and Python, and experience implementing ML operators on specialized hardware, and are self-motivated team players with leadership skills; prior startup experience and familiarity with ML frameworks like PyTorch are preferred.

d-Matrix

AI compute platform for datacenters

About d-Matrix

d-Matrix focuses on improving the efficiency of AI computing for large datacenter customers. Its main product is the digital in-memory compute (DIMC) engine, which combines computing capabilities directly within programmable memory. This design helps reduce power consumption and enhances data processing speed while ensuring accuracy. d-Matrix differentiates itself from competitors by offering a modular and scalable approach, utilizing low-power chiplets that can be tailored for different applications. The company's goal is to provide high-performance, energy-efficient AI inference solutions to large-scale datacenter operators.

Headquarters: Santa Clara, California
Year Founded: 2019
Total Funding: $149.8M
Company Stage: Series B
Industries: Enterprise Software, AI & Machine Learning
Employees: 201-500

Benefits

Hybrid Work Options

Risks

Competition from Nvidia, AMD, and Intel may pressure d-Matrix's market share.
Complex AI chip design could lead to delays or increased production costs.
Rapid AI innovation may render d-Matrix's technology obsolete if not updated.

Differentiation

d-Matrix's DIMC engine integrates compute into memory, enhancing efficiency and accuracy.
The company offers scalable AI solutions through modular, low-power chiplets.
d-Matrix focuses on brain-inspired AI compute engines for diverse inferencing workloads.

Upsides

Growing demand for energy-efficient AI solutions boosts d-Matrix's low-power chiplets appeal.
Partnerships with companies like Microsoft could lead to strategic alliances.
Increasing adoption of modular AI hardware in data centers benefits d-Matrix's offerings.
