Candidates should possess deep expertise in handling large-scale data systems, including Snowflake, Kafka, dbt, and Airflow/Snowpark, along with proficiency in Python, Java, and SQL. Experience with modern data ecosystems, cloud-based architectures (AWS, Azure, GCP), and data warehousing/dimensional modeling is required. Strong analytical, debugging, and communication skills are also essential.
The Data Platform Engineer will design, develop, and maintain scalable data pipelines and systems, covering both real-time and batch processing with Kafka, dbt, and Airflow/Snowpark. They will implement and optimize data models in Snowflake, build ELT processes, and develop Reverse ETL solutions for operational analytics. Responsibilities also include integrating data from diverse source systems, automating workflows, collaborating with cross-functional teams, and ensuring data governance and security.
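To ground these responsibilities, the sketch below shows the flavor of orchestration work involved: an hourly Airflow DAG that runs a dbt transformation and then a Reverse ETL sync. It is a minimal sketch assuming Airflow 2.x with dbt invoked from the CLI; the DAG id, dbt selector, and sync script are hypothetical placeholders, not a prescribed implementation.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical ELT pipeline: raw, Kafka-sourced data is assumed to already
# land in Snowflake via a separate connector; this DAG only orchestrates
# the dbt transformation and a downstream Reverse ETL sync.
with DAG(
    dag_id="elt_orders_pipeline",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    # Transform raw tables into dimensional models with dbt.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --select marts.orders",  # hypothetical selector
    )

    # Push curated results back into an operational tool (Reverse ETL).
    reverse_etl_sync = BashOperator(
        task_id="reverse_etl_sync",
        bash_command="python sync_to_crm.py",  # hypothetical sync script
    )

    dbt_run >> reverse_etl_sync  # transform first, then sync downstream
```

Keeping the heavy lifting in dbt and Snowflake while Airflow handles only scheduling and dependencies is a common ELT pattern, and it matches the toolchain named above.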
API management and connectivity solutions provider
Kong focuses on API management and connectivity, providing tools that help businesses manage, secure, and optimize the APIs their software uses to communicate. Its flagship product, Kong Gateway, is a high-performance API gateway capable of handling up to 50,000 transactions per second; Kong Konnect offers a SaaS platform for API management, and Kong Mesh provides a service mesh for securing and observing traffic between microservices. Kong stands out by pairing open-source technology with enterprise offerings, keeping core features freely available while selling premium capabilities to businesses. The company's goal is to improve developer productivity, security, and performance for a diverse range of clients in a rapidly growing market.
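As a small illustration of what "managing APIs" looks like in practice, the sketch below registers a service and a public route through Kong Gateway's Admin API (which listens on port 8001 by default). The service name, upstream URL, and route path are hypothetical, chosen only for the example.

```python
import requests

ADMIN_URL = "http://localhost:8001"  # Kong Admin API default address

# Register the upstream service the gateway should proxy to
# (service name and upstream URL are hypothetical).
requests.post(
    f"{ADMIN_URL}/services",
    data={"name": "orders-api", "url": "http://orders.internal:8080"},
).raise_for_status()

# Attach a public route so clients reach the service through the gateway.
requests.post(
    f"{ADMIN_URL}/services/orders-api/routes",
    data={"paths[]": "/orders"},
).raise_for_status()

# Clients can now call the gateway's proxy (port 8000 by default) at
# /orders, and plugins such as rate limiting or authentication can be
# layered onto the same route.
```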