Platform Architect
TetraScience · Full Time
Expert & Leadership (9+ years)
Key technologies and capabilities for this role
Common questions about this position
The role requires 5+ years of experience in Data Engineering, Data Architecture, or a similar role building production systems at scale.
Candidates need expert-level proficiency in Python- or Java-based frameworks such as Apache Spark, Apache Flink, or Kafka Streams, along with strong experience in real-time data modeling, ETL/ELT practices, streaming architectures, and columnar stores.
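To make those requirements concrete, here is a minimal sketch of the kind of streaming ETL work the role describes: consuming events from Kafka with Spark Structured Streaming (one of the frameworks named above) and landing them in a columnar format. The topic name, broker address, schema, and paths are illustrative assumptions, not details from the job description.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("streaming-etl-sketch").getOrCreate()

# Assumed event schema for incoming readings (placeholder fields).
schema = StructType([
    StructField("device_id", StringType()),
    StructField("reading", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read raw events from a Kafka topic (broker and topic are assumptions).
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "instrument-readings")
       .load())

# Parse the JSON payload and project the typed columns.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

# Write to a columnar store (Parquet) with checkpointing for fault tolerance.
query = (events.writeStream
         .format("parquet")
         .option("path", "/data/lake/instrument_readings")
         .option("checkpointLocation", "/data/checkpoints/instrument_readings")
         .outputMode("append")
         .start())

query.awaitTermination()
```

In practice the same pipeline could target Flink or Kafka Streams; the sketch simply illustrates the streaming-ingest-to-columnar-store pattern the requirements list implies.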
A BSc/MSc/PhD degree in Computer Science or a related field, or equivalent work experience, is required.
Developers are expected to drive innovation, stay ahead of technology trends, embrace challenges, and deliver simple, seamless solutions.
Strong candidates have proven experience architecting real-time analytics systems at scale, the ability to discuss technical tradeoffs, and experience with production systems that handle large data volumes.
Unified data quality platform for testing
Datafold provides a platform that focuses on maintaining high data quality through proactive and automated testing. The platform works by integrating into the development cycle, allowing data teams to test their data at various stages, such as during code deployments and migrations, to catch potential issues before they affect the data warehouse. This approach is different from traditional data observability tools, which mainly identify problems after they occur. Datafold aims to help data teams across different industries ensure the integrity and reliability of their data, ultimately speeding up their development processes. The company operates on a subscription-based model, generating revenue through recurring payments from its clients.
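As a rough illustration of the workflow described above, the sketch below compares a staging table against its production counterpart before a deployment so that regressions are caught ahead of the warehouse. This is a hypothetical example under assumed table names and a stand-in SQLite connection, not Datafold's actual API or checks.

```python
import sqlite3  # stand-in for a real warehouse connection

def table_profile(conn, table):
    """Row count plus a per-column NULL count, as a simple data profile."""
    rows = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    cols = [r[1] for r in conn.execute(f"PRAGMA table_info({table})")]
    nulls = {
        c: conn.execute(f"SELECT COUNT(*) FROM {table} WHERE {c} IS NULL").fetchone()[0]
        for c in cols
    }
    return {"rows": rows, "nulls": nulls}

def diff_tables(conn, prod, staging):
    """Return human-readable differences between two table profiles."""
    a, b = table_profile(conn, prod), table_profile(conn, staging)
    issues = []
    if a["rows"] != b["rows"]:
        issues.append(f"row count changed: {a['rows']} -> {b['rows']}")
    for col, n in b["nulls"].items():
        if n > a["nulls"].get(col, 0):
            issues.append(f"new NULLs in column {col}: {n}")
    return issues

if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")  # placeholder database
    problems = diff_tables(conn, "orders", "orders_staging")
    if problems:
        raise SystemExit("data diff failed:\n" + "\n".join(problems))
    print("data diff passed")
```

Run as a step in a CI pipeline, a check like this blocks a deployment when the staging data diverges from production in unexpected ways, which is the "catch issues before they reach the warehouse" idea the paragraph describes.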