Principal Data Architect
Access Systems
Full Time
Expert & Leadership (9+ years)
Key technologies and capabilities for this role
Common questions about this position
You will design, build, and maintain core small-business data products; engineer scalable data pipelines using Databricks SQL, PySpark, and MLflow; collaborate with cross-functional teams; and help foster a collaborative team culture.
Candidates must have 5+ years of engineering or data science experience, including 3+ years working on distributed data systems, practical experience extracting value from data, and strong collaboration skills.
The role involves Databricks SQL, PySpark, MLflow, Dagster for orchestration, and lakeFS for data versioning, with bonus points for Python, Spark, and Databricks experience.
Enigma embraces customer focus, end-to-end ownership, high-trust and responsibility, collaboration with cross-functional teams, and a supportive team culture where impact is measured by value delivered and technical quality.
Ideal candidates have 5+ years of experience, including work on distributed systems; extract value from data creatively; care about creating value and iterating quickly; and bring strong collaboration skills with a metrics-driven approach.
Provides financial data for SMBs
Enigma offers detailed insights into the financial health and identity of small and medium-sized businesses (SMBs) by collecting and analyzing data from millions of them. Its proprietary panel covers nearly half of all U.S. card transactions, providing timely data that is refreshed monthly. Companies use this information to enhance their sales and marketing strategies, onboard clients, and monitor risk. Enigma differentiates itself with comprehensive data that improves client targeting and acquisition, and generates revenue by selling access to this information.