Micro-Architect / RTL Design - CPU, Principal at d-Matrix

Santa Clara, California, United States

Compensation: $196,000 – $300,000
Experience Level: Senior (5 to 8 years)
Job Type: Full Time
Visa: Unknown
Industries: Artificial Intelligence, Semiconductors

Requirements

  • Master’s degree in Electrical Engineering, Computer Engineering, or Computer Science with 5 years of relevant work experience
  • Experience in micro-architecture and RTL development (Verilog/SystemVerilog), with a focus on processor and sub-system design and digital signal processing blocks
  • Exposure to computer architecture and arithmetic
  • Experience with RISC-V/Tensilica/ARM/MIPS processors
  • Exposure to interconnects and bus interfaces
  • Experience with floating-point and integer arithmetic and numerics (a plus)
  • Good understanding of the ASIC design flow, including RTL design, verification, logic synthesis, and timing analysis
  • Strong interpersonal skills; an excellent teammate

Responsibilities

  • Responsible for the micro-architecture and design of the AI control sub-system modules, including hardware execution engines
  • Own the design: document, execute, and deliver fully verified, high-performance, area- and power-efficient RTL that meets the design targets and specifications
  • Perform micro-architecture and RTL design, synthesis, and logic and timing verification using leading-edge CAD tools and semiconductor process technologies
  • Design and implement logic functions that enable efficient test and debug
  • Participate in silicon bring-up and validation for blocks owned

Skills

RTL Design
Verilog
SystemVerilog
Micro-architecture
RISC-V
ARM
MIPS
Tensilica
Computer Architecture
Synthesis
Timing Verification
Digital Signal Processing
Floating Point Arithmetic

d-Matrix

AI compute platform for datacenters

About d-Matrix

d-Matrix focuses on improving the efficiency of AI computing for large datacenter customers. Its main product is the digital in-memory compute (DIMC) engine, which combines computing capabilities directly within programmable memory. This design helps reduce power consumption and enhances data processing speed while ensuring accuracy. d-Matrix differentiates itself from competitors by offering a modular and scalable approach, utilizing low-power chiplets that can be tailored for different applications. The company's goal is to provide high-performance, energy-efficient AI inference solutions to large-scale datacenter operators.

Headquarters: Santa Clara, California
Year Founded: 2019
Total Funding: $149.8M
Company Stage: Series B
Industries: Enterprise Software, AI & Machine Learning
Employees: 201-500

Benefits

Hybrid Work Options

Risks

Competition from Nvidia, AMD, and Intel may pressure d-Matrix's market share.
Complex AI chip design could lead to delays or increased production costs.
Rapid AI innovation may render d-Matrix's technology obsolete if not updated.

Differentiation

d-Matrix's DIMC engine integrates compute into memory, enhancing efficiency and accuracy.
The company offers scalable AI solutions through modular, low-power chiplets.
d-Matrix focuses on brain-inspired AI compute engines for diverse inferencing workloads.

Upsides

Growing demand for energy-efficient AI solutions boosts d-Matrix's low-power chiplets appeal.
Partnerships with companies like Microsoft could lead to strategic alliances.
Increasing adoption of modular AI hardware in data centers benefits d-Matrix's offerings.
