d-Matrix

AI Software Application Engineer, Technical Lead / Principal

Santa Clara, California, United States

Compensation: $180,000 – $300,000
Experience Level: Expert & Leadership (9+ years)
Job Type: Full Time
Visa: Unknown
Industries: Artificial Intelligence, Hardware & Software Innovation

Requirements

Candidates should possess 10+ years of experience in customer engineering and field support for enterprise-level AI and datacenter products, with a focus on AI/ML software and generative AI inference. The role requires in-depth knowledge of, and hands-on experience with, generative AI inference at scale, including the integration and deployment of AI models in production environments. Strong experience with automation tools and scripting is also necessary.

Responsibilities

The AI Software Application Engineer – Technical Lead / Principal will provide expert guidance and support to customers deploying generative AI inference models, assisting with integration, troubleshooting, and optimizing AI/ML software stacks. They will work directly with customers to understand their needs and deliver solutions that maximize performance across their AI workloads, collaborating on technical collateral and leading the installation, configuration, and bring-up of d-Matrix’s AI software stack. Additionally, they will perform functional and performance validation testing and partner with internal engineering and product teams to produce developer guides and technical notes.

Skills

AI/ML software
generative AI inference
AI model deployment
automation tools
scripting
troubleshooting
performance optimization
software stack installation

d-Matrix

AI compute platform for datacenters

About d-Matrix

d-Matrix focuses on improving the efficiency of AI computing for large datacenter customers. Its main product is the digital in-memory compute (DIMC) engine, which combines computing capabilities directly within programmable memory. This design helps reduce power consumption and enhances data processing speed while ensuring accuracy. d-Matrix differentiates itself from competitors by offering a modular and scalable approach, utilizing low-power chiplets that can be tailored for different applications. The company's goal is to provide high-performance, energy-efficient AI inference solutions to large-scale datacenter operators.

Headquarters: Santa Clara, California
Year Founded: 2019
Total Funding: $149.8M
Company Stage: Series B
Industries: Enterprise Software, AI & Machine Learning
Employees: 201-500

Benefits

Hybrid Work Options

Risks

Competition from Nvidia, AMD, and Intel may pressure d-Matrix's market share.
Complex AI chip design could lead to delays or increased production costs.
Rapid AI innovation may render d-Matrix's technology obsolete if not updated.

Differentiation

d-Matrix's DIMC engine integrates compute into memory, enhancing efficiency and accuracy.
The company offers scalable AI solutions through modular, low-power chiplets.
d-Matrix focuses on brain-inspired AI compute engines for diverse inferencing workloads.

Upsides

Growing demand for energy-efficient AI solutions boosts d-Matrix's low-power chiplets appeal.
Partnerships with companies like Microsoft could lead to strategic alliances.
Increasing adoption of modular AI hardware in data centers benefits d-Matrix's offerings.
