Principal Software Engineer, Distributed Systems at d-Matrix

Santa Clara, California, United States

Compensation: $196,000 – $327,000
Experience Level: Senior (5 to 8 years), Expert & Leadership (9+ years)
Job Type: Full Time
Visa: Unknown
Industries: AI, Technology

Requirements

  • BS in Computer Science, Engineering, Math, Physics, or a related field; MS preferred
  • Strong grasp of computer architecture, data structures, system software, and machine learning fundamentals
  • Leadership experience at manager or senior manager level with software for AI accelerator HW
  • Proficient in C/C++ and Python development in a Linux environment using standard development tools
  • Experience designing and implementing algorithms in high level languages such as C/C++, Python
  • Experience with host bring-up of specialized hardware such as NICs, network FPGAs, and smart NICs
  • Experience with distributed systems software such as message passing (e.g., MPI)
  • Experience with designing and integrating systems for reliability, high availability, fault tolerance, failover
  • Experience with cluster orchestration including defining containers, integration with Kubernetes
  • Experience with performance benchmarking and tuning of large scale distributed systems
  • Self-motivated team player with a strong sense of ownership and leadership
  • Experience building large scale systems for novel HW architectures
  • Very strong understanding of scaleout and data communication collectives
  • Experience working across all aspects of a full stack tool chain for an accelerator and understanding hardware-software co-design nuances
  • Ability to build and scale software deliverables in a tight development window

Responsibilities

  • Lead the development, enhancement, and maintenance of the distributed systems software stack for scaleout of next generation AI hardware
  • Help productize the SW stack for the AI compute engine as part of the software team
  • Build software to enable scaleout, including support for data plane operations such as protocol translation and hardware NICs
  • Build software for control and management plane operations such as telemetry, monitoring, micro-services, container orchestration, and datacenter network tooling
  • Collaborate with the HW and SW architecture team, the compiler team, the data science numerics team, SW test group, the benchmark group, and the teams developing simulator and emulation platforms

Skills

Distributed Systems
Runtime Software
Inference Engines
Embedded Chip Software
Scaleout Systems
Protocol Translation
Hardware NICs
Telemetry
Monitoring
Microservices
Container Orchestration
Datacenter Networking
Data Communication Collectives

d-Matrix

AI compute platform for datacenters

About d-Matrix

d-Matrix focuses on improving the efficiency of AI computing for large datacenter customers. Its main product is the digital in-memory compute (DIMC) engine, which embeds compute capabilities directly within programmable memory. This design reduces power consumption and increases data processing speed while preserving accuracy. d-Matrix differentiates itself from competitors by offering a modular and scalable approach, utilizing low-power chiplets that can be tailored for different applications. The company's goal is to provide high-performance, energy-efficient AI inference solutions to large-scale datacenter operators.

Headquarters: Santa Clara, California
Year Founded: 2019
Total Funding: $149.8M
Company Stage: Series B
Industries: Enterprise Software, AI & Machine Learning
Employees: 201-500

Benefits

Hybrid Work Options

Risks

Competition from Nvidia, AMD, and Intel may pressure d-Matrix's market share.
Complex AI chip design could lead to delays or increased production costs.
Rapid AI innovation may render d-Matrix's technology obsolete if not updated.

Differentiation

d-Matrix's DIMC engine integrates compute into memory, enhancing efficiency and accuracy.
The company offers scalable AI solutions through modular, low-power chiplets.
d-Matrix focuses on brain-inspired AI compute engines for diverse inferencing workloads.

Upsides

Growing demand for energy-efficient AI solutions boosts d-Matrix's low-power chiplets appeal.
Partnerships with companies like Microsoft could lead to strategic alliances.
Increasing adoption of modular AI hardware in data centers benefits d-Matrix's offerings.
