Position Overview
- Location Type: Remote
- Job Type: Full-time
- Salary: $119K - $160K
Serve Robotics is reimagining urban mobility with its personable sidewalk robot. We’re building the future of delivery, aiming to alleviate street congestion, expand accessibility, and benefit local businesses. We’re seeking talented engineers to help scale our robotic delivery vision from a novelty to an efficient, ubiquitous solution.
Company Information
- About Serve Robotics: We are a team of tech industry veterans specializing in software, hardware, and design, dedicated to building the future we envision. We leverage robotics, machine learning, and computer vision to solve real-world problems with a focus on the end-user experience. Our team is agile, diverse, and collaborative.
Responsibilities
- Develop and maintain ML infrastructure, including sensor data ETL pipelines.
- Build and improve continuous training pipelines for ML models.
- Develop MLOps systems for managing the lifecycle of ML cloud training and inference pipelines.
- Continuously improve ML model development, management, and deployment processes.
- Work with ML engineers to define metrics for ML tasks and data mining.
- Design and implement algorithms, such as collaborative filtering and active learning, for ranking annotation candidates (one such approach is sketched after this list).
- Collaborate with annotation providers to establish and maintain annotation processes and quality control.
- Make sensor data and its derivatives widely discoverable and accessible to Robotics Engineers.
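One of the responsibilities above mentions active learning for ranking annotation candidates. The snippet below is a minimal, hypothetical sketch of one common approach, uncertainty sampling by predictive entropy; the function names, array shapes, and use of NumPy are illustrative assumptions, not Serve Robotics' actual tooling.

```python
# Hypothetical sketch: rank unlabeled samples for annotation by predictive
# entropy (an uncertainty-sampling flavor of active learning). All names and
# shapes are illustrative, not production code.
import numpy as np

def entropy_scores(class_probs: np.ndarray) -> np.ndarray:
    """Per-sample predictive entropy; higher means the model is less certain."""
    eps = 1e-12
    return -np.sum(class_probs * np.log(class_probs + eps), axis=1)

def rank_annotation_candidates(class_probs: np.ndarray, top_k: int) -> np.ndarray:
    """Return indices of the top_k most uncertain samples to send for labeling."""
    scores = entropy_scores(class_probs)
    return np.argsort(scores)[::-1][:top_k]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    logits = rng.normal(size=(1000, 5))  # 1,000 unlabeled frames, 5 classes (made up)
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    print(rank_annotation_candidates(probs, top_k=10))
```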
Requirements
- Education: BS or MS in Computer Science with a focus on data engineering and large-scale ML systems.
- Experience: 2+ years of industry experience building, running, and improving large-volume ML training and validation pipelines.
- Cloud Experience: Experience building cloud-native applications.
- Data Processing: Proficient in building large-scale data processing pipelines in production.
- Programming Languages: Proficient in at least one of the following languages: C++, Python, or Go.
- ML Knowledge: Hands-on experience with and strong knowledge of Computer Vision and Deep Learning.
Technology Stack
- Dataflow (Apache Beam): For data processing pipelines.
- Bazel: As the build system.
- BigQuery via dbt: For data warehousing.
- MongoDB & GCS: For data storage.
- Kubernetes: For service deployment.
- Airflow: For workflow management.
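To give a sense of how these components typically fit together, here is a minimal, hypothetical sketch (not Serve Robotics' actual code) of an Apache Beam pipeline that reads raw sensor logs from GCS and writes per-robot summary rows to BigQuery. The bucket, project, table, schema, and field names are assumptions made for illustration; in production a job like this would usually run on Dataflow and be scheduled from an Airflow DAG.

```python
# Hypothetical sketch: a small Apache Beam pipeline from GCS to BigQuery.
# Bucket, project, table, and field names below are placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def to_summary_row(line: str) -> dict:
    """Parse one JSON sensor record and keep only the fields we warehouse."""
    record = json.loads(line)
    return {
        "robot_id": record["robot_id"],
        "timestamp": record["timestamp"],
        "num_detections": len(record.get("detections", [])),
    }

def run() -> None:
    # Pass --runner=DataflowRunner (plus project/region/temp_location flags)
    # on the command line to execute on Dataflow instead of locally.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadSensorLogs" >> beam.io.ReadFromText("gs://example-bucket/sensor-logs/*.jsonl")
            | "ToSummaryRows" >> beam.Map(to_summary_row)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:sensor_data.frame_summaries",
                schema="robot_id:STRING,timestamp:TIMESTAMP,num_detections:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()
```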