10+ years in backend, data, or platform engineering (including 3–5 years dedicated to data infrastructure and cloud-native architecture)
Proven track record of building internal data platforms or developer tools at scale (highly desirable)
Bachelor’s degree or equivalent experience in Computer Science, Engineering, or related discipline
Proficiency in data architecture practices and prior experience with financial or investment data systems (beneficial)
Distributed systems expertise: Deep understanding of distributed data systems and the technologies that support large-scale data processing (e.g., Spark, Kafka, Airflow)
DevOps & Cloud proficiency: Expertise in implementing CI/CD pipelines, cloud orchestration, and infrastructure-as-code (Terraform, CDK) on modern cloud platforms (Azure preferred; AWS/GCP also welcome); hands-on experience with containerization (Docker, Kubernetes) and automated cloud infrastructure provisioning
Data pipeline development: Proficiency in designing and optimizing scalable data pipelines for ingestion, transformation, and delivery; experience with modern ETL/ELT frameworks (e.g., Apache Airflow, dbt, Azure Data Factory; a minimal orchestration sketch follows this list)
Programming & data modelling: Strong programming skills in Python, SQL, and Scala, with solid data modelling and database design experience; ability to ensure data quality, integrity, and lineage
Data migration experience: Proven experience leading large-scale data migrations from on-premises databases (e.g., Oracle, Teradata, SQL Server) to cloud-based data platforms such as Snowflake, with minimal downtime and no compromise on data integrity or compliance
Modern data architecture understanding: Familiarity with modern data architecture patterns (streaming data, data lakes, etc.) and technologies; ability to build resilient data services supporting advanced analytics, AI/ML models, and event-driven use cases
Leadership & Strategy: Demonstrated ability to lead and mentor engineering teams, establish high coding standards, and drive continuous improvement in platform reliability and developer efficiency; excellent collaboration and communication skills
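To make the pipeline expectations above concrete, here is a minimal Airflow DAG sketch of an ingest-transform-publish workflow. It is an illustration only: the DAG id, task names, and stubbed callables are hypothetical placeholders, not a prescribed design.

```python
# Minimal Airflow DAG sketch: ingest -> transform -> publish.
# DAG id, task names, and the stubbed callables below are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_raw_trades(**context):
    """Land the day's raw records in a staging zone (stubbed)."""
    # A real task would call an extractor library (REST client, JDBC reader, ...).
    print("ingesting raw trades for", context["ds"])


def transform_trades(**context):
    """Cleanse and conform records to the warehouse model (stubbed)."""
    print("transforming trades for", context["ds"])


def publish_marts(**context):
    """Publish curated marts for analytics consumers (stubbed)."""
    print("publishing marts for", context["ds"])


with DAG(
    dag_id="trades_daily",            # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ keyword
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_raw_trades)
    transform = PythonOperator(task_id="transform", python_callable=transform_trades)
    publish = PythonOperator(task_id="publish", python_callable=publish_marts)

    # Linear ingestion -> transformation -> delivery dependency chain.
    ingest >> transform >> publish
```

Keeping each task a thin wrapper around tested, versioned library code is what lets CI/CD and code review apply to data workflows the same way they apply to services.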
Responsibilities
Architect and build the data infrastructure that underpins markets & investment analytics
Design scalable data frameworks
Integrate firm-wide data assets
Enable teams to deliver data solutions efficiently and safely
Spearhead the creation of a next-generation data ecosystem grounded in DevOps practices
Shape analytics that power critical market and investment systems
Balance hands-on execution, strategic direction, and team leadership to ensure a reliable, scalable, and secure data platform
Establish standard methodologies such as CI/CD pipelines, observability, and version control for data workflows
Partner with Data Quality, DevOps, and Infrastructure teams to ensure seamless data flow and governance across the platform
Achieve success metrics: reduced time to build and deploy data pipelines; increased adoption of common tooling across teams; high system reliability and observability coverage; and migration away from legacy systems toward a modern, cloud-native, AI-friendly data architecture (a minimal migration-validation sketch follows this list)
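As a hedged illustration of the legacy-to-cloud migration metric above, the sketch below reconciles row counts between a legacy Oracle source and a Snowflake target. All connection parameters, credentials, and table names are hypothetical.

```python
# Minimal migration-validation sketch: reconcile row counts between a legacy
# Oracle source table and its Snowflake target. Connection parameters and
# table names are hypothetical; credentials would come from a secret store.
import oracledb               # pip install oracledb
import snowflake.connector    # pip install snowflake-connector-python


def count_rows(cursor, table: str) -> int:
    # Table names come from trusted migration config, not user input.
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]


def reconcile(table: str) -> None:
    src = oracledb.connect(user="etl", password="***", dsn="legacy-db:1521/ORCL")
    tgt = snowflake.connector.connect(
        user="etl", password="***", account="my_account",
        warehouse="MIGRATION_WH", database="ANALYTICS", schema="PUBLIC",
    )
    try:
        src_count = count_rows(src.cursor(), table)
        tgt_count = count_rows(tgt.cursor(), table)
        status = "OK" if src_count == tgt_count else "MISMATCH"
        print(f"{table}: source={src_count} target={tgt_count} -> {status}")
    finally:
        src.close()
        tgt.close()


if __name__ == "__main__":
    for table in ("TRADES", "POSITIONS"):  # hypothetical table list
        reconcile(table)
```

A real migration gate would extend this with checksums, column-level aggregates, and spot sampling; count reconciliation is simply a cheap first check.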