Bachelor’s degree in Computer Science, Data Science, or Statistics
7+ years of experience in the analysis, design, development, and delivery of data solutions
7+ years of experience with, and proficiency in, SQL, ETL/ELT, leading cloud data warehouse technologies, data transformation, and data management tools
Understanding of data engineering best practices and data integration patterns
2+ years of experience with DevOps and CI/CD
1+ years of hands-on experience (beyond proof-of-concept work) with Git and Python
Experience working in an Agile Scrum environment
Effective communication and facilitation skills, both verbal and written
Team-Oriented: Collaborates effectively with team members and stakeholders
Analytical Skills: Strong problem-solving skills, with the ability to break down complex data solutions
Works independently with minimal guidance
Demonstrates adaptability, initiative, and attention to detail through deliverables and ways of working
What makes you stand out
Investments or fintech domain knowledge (preferred)
Strong data analysis skills and/or data mining experience
Experience with one or more data integration tools (Matillion, Informatica, SQL Server Integration Services, dbt)
Experience with Snowflake and dbt
Responsibilities
Partner with data architects, analysts, engineers, and stakeholders to understand data requirements and deliver solutions
Help build scalable products with robust security, quality, and governance protocols
Create low-level design artifacts, including mapping specifications
Build scalable and reliable data pipelines to support data ingestion (batch and/or streaming) and transformation from multiple data sources using SQL, AWS, Snowflake, and data integration technologies
Create unit/integration tests and implement automated build and deployment
Participate in code reviews to ensure adherence to standards and best practices
Deploy, monitor, and maintain production systems
Use the Agile framework to organize, manage, and execute work