Machine Learning Test Engineer, Transformers team - US Remote at Hugging Face

New York, New York, United States

Compensation: Not Specified
Experience Level: Senior (5 to 8 years)
Job Type: Full Time
Visa: Unknown
Industries: AI, Machine Learning, Technology

Requirements

  • Proven experience working with modern machine learning frameworks (e.g., Transformers, PyTorch) and large-scale software projects
  • Mindset that treats testing as a strategic priority, not a secondary task, demonstrating how robust testing directly enables project scalability
  • Strong belief in architecting for growth: prioritizing scalable, modular testing systems over isolated, atomic tests
  • Experience managing high-volume test suites (100K+ tests) and cross-platform testing challenges (see the backend-gating sketch below)
  • Familiarity with CI/CD tools and automated testing frameworks

Preferred Qualifications

  • Background in open-source development or contributions to major ML libraries
  • Experience in re-architecting testing systems for rapidly growing projects
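
To make the cross-platform requirement concrete, here is a minimal, hypothetical pytest sketch (not Hugging Face's actual test code) of one common way to keep backend-specific tests modular: each test declares the backend it needs and is skipped automatically when that dependency (e.g., PyTorch or vLLM) is not installed. The `require_backend` helper is an illustrative name, not an existing API.

```python
# Illustrative sketch only: modular, backend-gated tests with pytest marks.
import importlib.util

import pytest


def require_backend(name):
    """Skip the decorated test when the named backend is not installed."""
    installed = importlib.util.find_spec(name) is not None
    return pytest.mark.skipif(not installed, reason=f"{name} is not installed")


@require_backend("torch")
def test_forward_pass_shape():
    import torch

    x = torch.ones(2, 4)
    assert x.sum().item() == 8.0


@require_backend("vllm")
def test_vllm_is_importable():
    import vllm  # noqa: F401  # smoke test: the backend can be imported
```

Gating tests on their dependencies this way keeps the suite runnable on any machine or CI runner, which matters once thousands of tests target different backends.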

Responsibilities

  • Lead the reinvention and scaling of testing infrastructure to support 100K+ tests across multiple platforms (e.g., Transformers, PyTorch, vLLM); a sharding sketch follows this list
  • Design growth-oriented testing architectures that prioritize scalability, reliability, and integration with CI/CD pipelines, avoiding fragmented or isolated test workflows
  • Partner with the existing testing lead to establish best practices and drive continuous improvement of the testing ecosystem
  • Collaborate with cross-functional teams to embed testing as a first-class project, ensuring it receives the focus and resources required to sustain rapid growth
  • Anticipate future testing needs and proactively adapt systems to accommodate evolving project scope, complexity, and platform diversity
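
The 100K+ test figure above implies that no single CI job can run the whole suite, so it is typically split across many parallel jobs. Below is a minimal, hedged sketch of deterministic sharding (an assumption about one possible design, not the team's actual pipeline): each job hashes test file paths and runs only its slice, so throughput scales by adding jobs without coordination between them. SHARD_INDEX and SHARD_COUNT are hypothetical environment variables.

```python
# Illustrative sketch only: deterministic sharding of a large pytest suite across CI jobs.
import hashlib
import os
import pathlib
import subprocess
import sys


def shard_of(path: str, shard_count: int) -> int:
    """Map a test file to a shard using a stable hash of its path."""
    digest = hashlib.sha256(path.encode()).hexdigest()
    return int(digest, 16) % shard_count


def main() -> int:
    shard_index = int(os.environ.get("SHARD_INDEX", "0"))
    shard_count = int(os.environ.get("SHARD_COUNT", "1"))

    all_tests = sorted(str(p) for p in pathlib.Path("tests").rglob("test_*.py"))
    mine = [t for t in all_tests if shard_of(t, shard_count) == shard_index]
    if not mine:
        return 0  # nothing assigned to this shard

    # Run pytest on this shard's files only.
    return subprocess.call([sys.executable, "-m", "pytest", *mine])


if __name__ == "__main__":
    sys.exit(main())
```

Hash-based sharding keeps shard assignment stable as files are added, though real pipelines often balance shards by historical test duration rather than by file count.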

Skills

Key technologies and capabilities for this role

Testing, Machine Learning, Transformers, PyTorch, vLLM, CI/CD, Test Infrastructure, Scalability, Python

Questions & Answers

Common questions about this position

Is this position remote?

Yes, this is a US remote position with no office requirement mentioned.

What skills are required for this Machine Learning Test Engineer role?

Required skills include proven experience with modern machine learning frameworks like Transformers and PyTorch, managing high-volume test suites (100K+ tests), cross-platform testing, and familiarity with CI/CD tools and automated testing frameworks. A strategic mindset for scalable testing architectures is essential.

What is the salary or compensation for this role?

This information is not specified in the job description.

What is the company culture like at Hugging Face?

Hugging Face emphasizes democratizing good AI through an open-source platform with over 5 million users, fostering collaboration where every engineer contributes to testing, and building diverse teams with complementary skills.

What makes a strong candidate for this position?

Strong candidates bring experience with ML frameworks like Transformers and PyTorch, a track record of managing large test suites, and a strategic approach to scalable testing. Open-source contributions and experience re-architecting testing systems are preferred, but candidates who don't tick every box are still encouraged to apply.

Hugging Face

Develops advanced NLP models for text tasks

About Hugging Face

Hugging Face develops machine learning models and tooling focused on understanding and generating human-like text. Its main products include the open-source Transformers library and a model hub, which provide access to advanced natural language processing (NLP) models such as GPT-2 and XLNet for tasks like text completion, translation, and summarization. Users can access these models through a web application and a repository, making it easy to integrate AI into various applications. Unlike many competitors, Hugging Face offers a freemium model, allowing users to access basic features for free while providing subscription plans for advanced functionality. The company also tailors solutions for large organizations, including custom model training. Hugging Face aims to empower researchers, developers, and enterprises to use machine learning for text-related tasks.

Headquarters: New York City, New York
Year Founded: 2016
Total Funding: $384.9M
Company Stage: Series D
Industries: Enterprise Software, AI & Machine Learning
Employees: 201-500

Benefits

Flexible Work Environment
Health Insurance
Unlimited PTO
Equity
Growth, Training, & Conferences
Generous Parental Leave

Risks

Salesforce's exclusive partnerships may limit Hugging Face's access to training datasets.
Microsoft's rStar-Math technique could outperform Hugging Face's NLP models in specific tasks.
DeepSeek-V3's release may intensify competition in the ultra-large model space.

Differentiation

Hugging Face offers a massive library of over one million AI models.
The company provides a thriving online community for open-source AI collaboration.
Hugging Face's freemium model allows easy access to advanced NLP tools.

Upsides

Collaboration with Microsoft could enhance small model performance in mathematical reasoning.
Partnership with DeepSeek may boost Hugging Face's ultra-large model capabilities.
Integration with Salesforce's ProVision could improve multimodal AI capabilities.
