Common questions about this position
The salary range is $180K - $300K.
This is a hybrid role requiring onsite work at the Santa Clara, CA headquarters 3 days per week.
Required skills include a strong grasp of computer architecture, data structures, system software, and machine learning fundamentals; proficiency in C/C++ and Python in a Linux environment; experience implementing ML algorithms such as GEMMs and convolutions on specialized hardware like GPUs and AI accelerators; and 12+ years of industry experience with an MS or PhD.
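For context on the GEMM requirement above, the sketch below shows the operation in its simplest form: a naive, single-threaded matrix multiply in plain C++. It is an illustration only, not d-Matrix code; the function name, row-major layout, and float data type are assumptions here, and production kernels for GPUs and AI accelerators would instead rely on tiling, vectorization, and the target hardware's memory hierarchy.

    #include <cstddef>
    #include <iostream>
    #include <vector>

    // Naive reference GEMM: C = A * B, with row-major A (MxK), B (KxN), C (MxN).
    // Illustrative only; accelerator kernels tile and vectorize this loop nest.
    void gemm_naive(const std::vector<float>& A, const std::vector<float>& B,
                    std::vector<float>& C,
                    std::size_t M, std::size_t K, std::size_t N) {
        for (std::size_t i = 0; i < M; ++i) {
            for (std::size_t j = 0; j < N; ++j) {
                float acc = 0.0f;
                for (std::size_t k = 0; k < K; ++k) {
                    acc += A[i * K + k] * B[k * N + j];
                }
                C[i * N + j] = acc;
            }
        }
    }

    int main() {
        const std::size_t M = 2, K = 3, N = 2;
        std::vector<float> A = {1, 2, 3, 4, 5, 6};     // 2x3
        std::vector<float> B = {7, 8, 9, 10, 11, 12};  // 3x2
        std::vector<float> C(M * N, 0.0f);
        gemm_naive(A, B, C, M, K, N);
        for (std::size_t i = 0; i < M; ++i) {
            for (std::size_t j = 0; j < N; ++j) std::cout << C[i * N + j] << ' ';
            std::cout << '\n';                         // prints: 58 64 / 139 154
        }
        return 0;
    }

The convolutions mentioned alongside GEMMs are often lowered to this same matrix-multiply primitive (for example via im2col), which is why efficient GEMM on the target hardware is central to this kind of work.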
The culture emphasizes respect, collaboration, humility, direct communication, and inclusivity, valuing diverse perspectives as a path to better solutions, and the company seeks passionate individuals with a strong drive to execute.
Strong candidates have 12+ years of industry experience with an MS or PhD, expertise in hardware-software co-design for AI, proficiency in C/C++ and Python, and experience implementing ML operators on specialized hardware; they are self-motivated team players with leadership skills. Prior startup experience and familiarity with ML frameworks such as PyTorch are preferred.
AI compute platform for datacenters
d-Matrix focuses on improving the efficiency of AI computing for large datacenter customers. Its main product is the digital in-memory compute (DIMC) engine, which embeds compute directly within programmable memory. This design helps reduce power consumption and speeds up data processing while preserving accuracy. d-Matrix differentiates itself from competitors through a modular and scalable approach, using low-power chiplets that can be tailored to different applications. The company's goal is to provide high-performance, energy-efficient AI inference solutions to large-scale datacenter operators.