Senior Databricks Engineer - Fully Remote
Thermo Fisher Scientific · Full Time
Senior (5 to 8 years)
Key technologies and capabilities for this role
Common questions about this position
The role requires excellent programming and debugging skills in Python, strong hands-on experience with PySpark for data transformation and validation, and proficiency in at least one cloud platform (AWS, GCP, or Azure). Also essential are experience with Databricks technologies such as Delta Lake, Auto Loader, Delta Live Tables (DLT), and Unity Catalog, along with an understanding of data modeling and data warehousing principles.
A minimum of 2 years of experience is required.
Nice-to-haves that can strengthen your candidacy include certifications such as Databricks Certified Professional, knowledge of machine learning and big data processing (Spark, Hadoop, Hive, Kafka), experience with data orchestration tools such as Apache Airflow, CI/CD pipelines, ETL tools (Informatica, Talend), and familiarity with dbt.