Software Engineer, Senior Staff - AI/ML Kernels at d-Matrix

Santa Clara, California, United States

Compensation: $180,000 – $300,000
Experience Level: Senior (5 to 8 years)
Job Type: Full Time
Visa: Unknown
Industries: AI, Technology

Requirements

  • MS in computer engineering, math, physics, or a related field with 7+ years of industry experience, or a PhD in the same fields with 1+ years of industry experience
  • Strong grasp of computer architecture, data structures, system software, and machine learning fundamentals
  • Proficient in C/C++ and Python development in Linux environments using standard development tools
  • Experience implementing algorithms in high-level languages such as C/C++ and Python
  • Experience implementing algorithms for specialized hardware such as FPGAs, DSPs, GPUs, and AI accelerators using libraries such as CUDA
  • Experience implementing operators commonly used in ML workloads (e.g., GEMMs, Convolutions, BLAS, SIMD operators for softmax, layer normalization, pooling)
  • Experience with development for embedded SIMD vector processors such as Tensilica
  • Self-motivated team player with strong sense of ownership and leadership
  • Experience building software kernels for hardware architectures
  • Very strong understanding of various hardware architectures and how to map algorithms onto them
  • Understanding of mapping computational graphs from AI frameworks to underlying architecture
  • Experience working across all aspects of a full-stack toolchain, making hardware-software co-design optimizations and trade-offs
  • Ability to build and scale software deliverables in tight development windows
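To give a concrete sense of the operators listed above (softmax, layer normalization, pooling, and the like), here is a minimal scalar reference implementation of a numerically stable softmax in C++. This is an illustrative sketch only, not d-Matrix code: the function name and data layout are assumptions, and a production kernel for a SIMD DSP or AI accelerator would vectorize and fuse these loops.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Numerically stable softmax over a 1-D row: subtract the row maximum
// before exponentiating (so exp never overflows), then normalize so the
// outputs sum to 1. Scalar reference; a real kernel would be vectorized.
std::vector<float> softmax(const std::vector<float>& x) {
    float mx = *std::max_element(x.begin(), x.end());
    std::vector<float> out(x.size());
    float sum = 0.0f;
    for (std::size_t i = 0; i < x.size(); ++i) {
        out[i] = std::exp(x[i] - mx);  // shifted exponent, max term becomes exp(0)=1
        sum += out[i];
    }
    for (float& v : out) {
        v /= sum;  // normalize: outputs form a probability distribution
    }
    return out;
}
```

A hardware-targeted version of this operator would typically compute the max, exponentiation, and sum passes with SIMD intrinsics and keep the row resident in local memory, which is the kind of mapping work the requirements above describe.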

Responsibilities

  • Develop, enhance, and maintain software kernels for next-generation AI hardware
  • Work as part of the team to productize the SW stack for the AI compute engine
  • Work with a team of compiler experts to build out compiler infrastructure
  • Collaborate closely with other software (ML, systems) and hardware (mixed signal, DSP, CPU) experts

Skills

Software Kernels
AI Hardware
Hardware Architectures
Computational Graphs
AI Frameworks
Compiler Infrastructure
Hardware-Software Co-Design
DSP
CPU
Mixed Signal

d-Matrix

AI compute platform for datacenters

About d-Matrix

d-Matrix focuses on improving the efficiency of AI computing for large datacenter customers. Its main product is the digital in-memory compute (DIMC) engine, which integrates compute capabilities directly into programmable memory. This design reduces power consumption and increases data-processing speed while preserving accuracy. d-Matrix differentiates itself from competitors through a modular, scalable approach built on low-power chiplets that can be tailored to different applications. The company's goal is to provide high-performance, energy-efficient AI inference solutions to large-scale datacenter operators.

Headquarters: Santa Clara, California
Year Founded: 2019
Total Funding: $149.8M
Company Stage: Series B
Industries: Enterprise Software, AI & Machine Learning
Employees: 201-500

Benefits

Hybrid Work Options

Risks

Competition from Nvidia, AMD, and Intel may pressure d-Matrix's market share.
Complex AI chip design could lead to delays or increased production costs.
Rapid AI innovation may render d-Matrix's technology obsolete if not updated.

Differentiation

d-Matrix's DIMC engine integrates compute into memory, enhancing efficiency and accuracy.
The company offers scalable AI solutions through modular, low-power chiplets.
d-Matrix focuses on brain-inspired AI compute engines for diverse inferencing workloads.

Upsides

Growing demand for energy-efficient AI solutions boosts d-Matrix's low-power chiplets appeal.
Partnerships with companies like Microsoft could lead to strategic alliances.
Increasing adoption of modular AI hardware in data centers benefits d-Matrix's offerings.
