Typically a minimum of 8 years of relevant experience with the installation, customization, testing, and maintenance of database systems in a production environment
Proven expertise in Snowflake or equivalent data platform technologies including administration, performance tuning, and workload management (Preferred)
Strong proficiency in Python, with a passion for automation and process optimization (Preferred)
Experience implementing modern data platform security (RBAC, masking policies, network policies, MFA/SSO) (Preferred)
Hands-on experience with Linux environments and CLI tools rather than traditional UI-based tools (Preferred)
Moderate to advanced SQL skills, including window functions, CTEs, and performance optimization (Preferred)
Strong understanding of enterprise cloud security and access control practices (Preferred)
Ability to handle ambiguity and thrive in a fast-paced, dynamic environment (Preferred)
Excellent problem-solving skills and the ability to figure things out independently (Preferred)
A strong eagerness to help people and contribute (Preferred)
Deep understanding of enterprise-grade security concepts including SSO, networking, SCIM, IAM, network policies, SSL, and advanced encryption mechanisms
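As an illustration of the SQL proficiency listed above (CTEs and window functions), here is a minimal, self-contained sketch; Python's built-in sqlite3 stands in for the data platform, and every table and column name is hypothetical:

```python
import sqlite3

# Hypothetical query log: rank slow queries per warehouse using a CTE
# plus a window function. sqlite3 supports both (SQLite >= 3.25).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE query_log (warehouse TEXT, query_id TEXT, runtime_ms INTEGER)"
)
conn.executemany(
    "INSERT INTO query_log VALUES (?, ?, ?)",
    [
        ("wh_small", "q1", 120),
        ("wh_small", "q2", 450),
        ("wh_large", "q3", 80),
        ("wh_large", "q4", 300),
    ],
)

# The CTE filters long-running queries; ROW_NUMBER() ranks them
# within each warehouse, slowest first.
rows = conn.execute("""
    WITH slow AS (
        SELECT warehouse, query_id, runtime_ms
        FROM query_log
        WHERE runtime_ms > 100
    )
    SELECT warehouse, query_id,
           ROW_NUMBER() OVER (
               PARTITION BY warehouse ORDER BY runtime_ms DESC
           ) AS rnk
    FROM slow
""").fetchall()
```

The same pattern (CTE for filtering, window function for per-partition ranking) carries over to Snowflake or any equivalent platform.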
Responsibilities
Plan databases, including definition, structure, documentation, long-range requirements, operational guidelines, and protection
Design and implement security and governance controls
Formulate and monitor policies, procedures, and standards relating to database management
Propose and implement enhancements that improve the performance and reliability of the system
Design, build, and maintain scalable data solutions and pipelines using Snowflake or equivalent modern data platforms to support activities such as data ingestion, transformation, sharing, and analytics
Design and implement role-based access control (RBAC) and object-level security policies aligned with enterprise governance standards
Monitor and optimize system performance through query tuning, warehouse sizing, clustering, materialized views, and data pruning strategies
Develop and manage back-end administrative tasks such as scaling policies, user provisioning, cost controls, and usage auditing
Define and manage infrastructure as code (IaC) using Terraform for deploying cloud database objects and associated cloud infrastructure
Lead Proof of Concepts (PoCs) to explore and implement new technologies and methodologies
Automate repetitive tasks and enhance existing processes through innovative Python-based solutions
Embrace changing tasks and requirements with agility, adapting quickly to evolving priorities
Collaborate with Engineering, DevOps, and Security teams to deploy and manage secure, production-grade data solutions
Work primarily with Linux/CLI environments, minimizing reliance on point-and-click interfaces to achieve streamlined results
Integrate the data platform with cloud storage (S3, Azure Blob, GCS), ETL tools, and BI platforms to enable seamless end-to-end workflows
Strive for continuous improvement and optimization of data engineering workflows
Stay current with new modern data platform features and assess their applicability to the organization’s data strategy
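As a hedged sketch of the Python-based automation responsibilities above (user provisioning and RBAC administration), the helper below generates GRANT statements from a role-to-users mapping; the function name, role names, and Snowflake-style syntax are illustrative assumptions, not an existing API:

```python
def build_grant_statements(role_map):
    """Generate GRANT ROLE statements from a {role: [users]} mapping.

    Hypothetical helper: in practice these statements would be executed
    through the platform's connector inside a change-controlled pipeline,
    not printed as raw strings.
    """
    statements = []
    for role, users in sorted(role_map.items()):
        for user in sorted(users):
            # Snowflake-style syntax; adjust for the target platform.
            statements.append(f"GRANT ROLE {role} TO USER {user};")
    return statements


grants = build_grant_statements({
    "analyst_ro": ["alice", "bob"],
    "engineer_rw": ["carol"],
})
```

Sorting both roles and users makes the output deterministic, which keeps diffs small when the generated statements are tracked alongside Terraform-managed infrastructure.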