Job Description: Data Pipeline Engineer - Market Data
Position Overview
Data is central to Aladdin, BlackRock's leading financial technology platform. The DataOps Engineering (DOE) team is responsible for the data ecosystem within BlackRock. The Market Data team, a part of DOE, builds and maintains cutting-edge data platforms that deliver highly available, consistent, and high-quality market data to users such as investors, operations teams, and data scientists. This role focuses on evolving the platform to achieve exponential scale, supporting the future growth of Aladdin.
Data Pipeline Engineers at BlackRock contribute to next-generation technologies and solutions within a renowned financial company. Our engineers design and build large-scale data storage, computation, and distribution systems. Ideal candidates have experience in data acquisition, ETL workflow orchestration, data modeling, data pipeline engineering, REST APIs, relational databases, and data distribution.
Employment Type
Full-time
Responsibilities
Data Pipeline Engineers are involved from project inception: understanding requirements, then architecting, developing, deploying, and maintaining data pipelines on our modern data platform. Continuous improvement of platform performance and scalability is a key focus. Deployment and maintenance require close collaboration with various teams within the Aladdin developer community. Creative, inventive problem-solving that reduces turnaround times is highly valued.
- Design and build scalable data pipelines in the market data space to process large datasets efficiently.
- Develop and maintain ETL processes to integrate data from multiple sources into centralized data warehouses.
- Monitor and maintain data systems, ensuring data accuracy, consistency, and reliability.
- Collaborate with program managers, product teams, and business analysts throughout the Software Development Life Cycle (SDLC).
- Provide L2 and L3 support for technical and operational issues.
- Conduct end-to-end User Acceptance Testing (UAT) to ensure successful production operations.
- Prepare user documentation to maintain both development and operations continuity.
- Lead process engineering for a data pipeline engineering area.
- Contribute to design decisions for complex systems.
- Take ownership of work beyond assigned tasks and demonstrate emotional investment in projects.
Essential Skills
- Bachelor's degree in Computer Science or equivalent practical experience.
- At least 3 years of experience as a software engineer.
- Experience in Python, including unit, integration, and functional testing.
- Experience in SQL, PL/SQL programming, Stored procedures, User-Defined Functions (UDFs).
- Experience/Familiarity with Database Modeling and Normalization techniques.
- Experience/Familiarity with object-oriented design patterns.
- Experience with DevOps tools for Continuous Integration/Continuous Deployment (CI/CD) such as Git, Maven, Jenkins, GitLab CI, Azure DevOps.
- Experience with Agile development concepts and related tools (e.g., Azure DevOps, Jira, etc.).
- Ability to troubleshoot and fix performance issues across the codebase and database queries.
- Excellent written and verbal communication skills.
- Passion for learning and implementing new technologies.
- Ability to operate in a fast-paced environment.
Desired Skills
- Experience in Cloud Native Services (AWS/Azure).
- ETL background in any language or tool.
- Experience with Snowflake or other Cloud Data Warehousing products.
- Exposure to workflow management tools such as Airflow.
Company Information
BlackRock operates a pay-for-performance compensation philosophy. Your total compensation may vary based on role and location, as well as firm, department, and individual performance. Employees are eligible for an annual discretionary bonus and benefits including healthcare, leave benefits, and retirement benefits.
Location
- Location Type: [Information not provided]
Salary
- For Atlanta, GA Only: The salary range for this position is USD $99,750.00 - USD $120,000.00 annually.