Senior Runtime Software Engineer at d-Matrix

Sydney, New South Wales, Australia

Compensation: Not specified
Experience Level: Senior (5 to 8 years)
Job Type: Full Time
Visa: Unknown
Industries: AI, Semiconductor, Hardware

Requirements

  • BS/MS degree in computer science, computer engineering, or a similar field preferred
  • Experience with multi-threaded C programming on multi-core CPUs running an RTOS in both AMP and SMP configurations
  • Understanding of methods used to synchronize many-core and many-CPU architectures
  • Experience managing static resources without an MMU
  • Zephyr OS experience is an advantage (a short Zephyr-based sketch follows this list)
  • Experience with PIC programming and developing interrupt service routines
  • Knowledge of bootloaders and Linux device drivers is an advantage
  • Ability to interpret HW-centric data sheets and register definitions to determine how to best program the architecture
  • Ability to work with HW design teams at both the early definition phase and during bring-up
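To make the SMP and no-MMU requirements above concrete, here is a minimal illustrative sketch, assuming a recent Zephyr release on an SMP-capable target. The names (cmd_slab, submit_command) and sizes are hypothetical, not taken from any d-Matrix code; the point is the pattern of a statically defined memory pool (no MMU, no heap) with shared bookkeeping serialized across cores by a spinlock.

    /*
     * Minimal sketch only, assuming a recent Zephyr release on an SMP-capable
     * target. Names (cmd_slab, submit_command) and sizes are hypothetical.
     */
    #include <zephyr/kernel.h>
    #include <zephyr/sys/util.h>
    #include <errno.h>
    #include <string.h>

    /* Statically reserve 16 x 64-byte command buffers at build time: no MMU, no heap. */
    K_MEM_SLAB_DEFINE(cmd_slab, 64, 16, 8);

    /* Spinlock serializes shared bookkeeping across cores under SMP. */
    static struct k_spinlock submit_lock;
    static uint32_t submitted;

    int submit_command(const void *payload, size_t len)
    {
        void *buf;

        /* Allocation from the fixed pool is thread- and ISR-safe; fail fast if exhausted. */
        if (k_mem_slab_alloc(&cmd_slab, &buf, K_NO_WAIT) != 0) {
            return -ENOMEM;
        }
        memcpy(buf, payload, MIN(len, 64));

        k_spinlock_key_t key = k_spin_lock(&submit_lock);
        submitted++;                  /* shared counter, serialized across CPUs */
        k_spin_unlock(&submit_lock, key);

        /* Hand buf to the hardware command queue here; release it with
         * k_mem_slab_free() once the hardware has consumed it. */
        return 0;
    }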

Responsibilities

  • Architect, document, and develop the runtime firmware that executes in the various on-chip multi-core CPU subsystems to control all aspects of the AI subsystems in the chip and maximize hardware utilization
  • Achieve measures of success including high overall AI hardware utilization, minimal communication bottlenecks, and high on-chip memory utilization
  • Bring the software up on FPGA platforms containing images of the embedded CPU subsystems and debug it using a JTAG-connected IDE (a minimal register-access sketch follows this list)
  • Develop a firmware solution that can be built and tested ahead of the availability of the AI subsystem hardware
  • Determine the delivery schedule and ensure the software meets d-Matrix coding and methodology guidelines
  • Collaborate with hardware teams to interpret hardware specifications and suggest changes that improve utilization, throughput, and/or reduce power
  • Collaborate with other members of the SW team (AU and US), SW quality & test team (US and India), and HW verification team to assist with SoC-level DV simulations and emulation
  • Develop and debug code on FPGA-based systems containing CPU subsystems and SystemC models of the AI subsystems and SoC
  • Port the software to a “big iron” emulation system (e.g., Veloce, Palladium) containing the final RTL
  • Be closely involved in bringing up the software on the AI subsystem hardware and validating silicon and software performance
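Programming hardware from data-sheet register definitions, as the bring-up responsibilities above describe, typically reduces to volatile MMIO accessors and bitfield masks. The sketch below is purely illustrative: the base address, offsets, and field names are placeholders, not taken from any d-Matrix documentation.

    #include <stdint.h>

    /* Hypothetical register map: base, offsets, and fields are placeholders. */
    #define AI_SUBSYS_BASE      0x40000000UL
    #define AI_CTRL_OFFSET      0x00u
    #define AI_STATUS_OFFSET    0x04u

    #define AI_CTRL_ENABLE      (1u << 0)   /* start the compute block */
    #define AI_STATUS_DONE      (1u << 0)   /* set by hardware when a job finishes */

    static inline void reg_write32(uintptr_t addr, uint32_t val)
    {
        *(volatile uint32_t *)addr = val;   /* volatile: the access is never elided or cached */
    }

    static inline uint32_t reg_read32(uintptr_t addr)
    {
        return *(volatile uint32_t *)addr;
    }

    /* Kick off one job and poll for completion, as one might during early FPGA
     * bring-up before interrupts and the full runtime are in place. */
    void ai_subsys_run_blocking(void)
    {
        reg_write32(AI_SUBSYS_BASE + AI_CTRL_OFFSET, AI_CTRL_ENABLE);

        while ((reg_read32(AI_SUBSYS_BASE + AI_STATUS_OFFSET) & AI_STATUS_DONE) == 0) {
            /* spin; a production runtime would bound this with a timeout */
        }
    }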

Skills

PyTorch
Firmware
Runtime Software
Multi-core CPU
Low-level Drivers
System-on-Chip
AI Inference
Sparse Matrix Processing
Dense Matrix Processing
Fixed-point Arithmetic
Floating-point Arithmetic

d-Matrix

AI compute platform for datacenters

About d-Matrix

d-Matrix focuses on improving the efficiency of AI computing for large datacenter customers. Its main product is its digital in-memory compute (DIMC) engine, which integrates compute directly into programmable memory. This design reduces power consumption and speeds up data processing while preserving accuracy. d-Matrix differentiates itself from competitors with a modular, scalable approach built on low-power chiplets that can be tailored to different applications. The company's goal is to provide high-performance, energy-efficient AI inference solutions to large-scale datacenter operators.

Headquarters: Santa Clara, California
Year Founded: 2019
Total Funding: $149.8M
Company Stage: Series B
Industries: Enterprise Software, AI & Machine Learning
Employees: 201-500

Benefits

Hybrid Work Options

Risks

Competition from Nvidia, AMD, and Intel may pressure d-Matrix's market share.
Complex AI chip design could lead to delays or increased production costs.
Rapid AI innovation may render d-Matrix's technology obsolete if not updated.

Differentiation

d-Matrix's DIMC engine integrates compute into memory, enhancing efficiency and accuracy.
The company offers scalable AI solutions through modular, low-power chiplets.
d-Matrix focuses on brain-inspired AI compute engines for diverse inferencing workloads.

Upsides

Growing demand for energy-efficient AI solutions boosts d-Matrix's low-power chiplets appeal.
Existing partnerships with companies like Microsoft could develop into broader strategic alliances.
Increasing adoption of modular AI hardware in data centers benefits d-Matrix's offerings.
