Cloud Engineering Architect
Accurate Background - Full Time
- Expert & Leadership (9+ years)
Candidates should hold a Bachelor’s degree in Computer Science, Information Systems, or a related field, and have at least 5 years of experience in cloud technologies and data engineering. Strong hands-on experience with cloud platforms such as AWS, Azure, or GCP is required, along with proficiency in DevOps practices and automation tools. Familiarity with Big Data technologies such as Hadoop, Spark, and Kafka is also desirable.
As a Data DevOps Engineer, you will design, develop, and implement scalable, clustered Big Data solutions with a focus on automated dynamic scaling and self-healing systems, ensuring they meet performance and reliability standards. You will also maintain and optimize existing data infrastructure, contribute to the design and implementation of automated processes, and collaborate with Enterprise Data Engineering consultants to accelerate data engineering opportunities for clients.
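The automated dynamic scaling and self-healing mentioned above can be realized in many ways (for example Kubernetes autoscalers, Spark dynamic allocation, or custom controllers). The minimal Python sketch below is only an illustration of the underlying control-loop pattern; the Cluster class, lag thresholds, and health checks are hypothetical placeholders, not anything specified in this posting.

```python
import random
import time
from dataclasses import dataclass

# Hypothetical thresholds -- real values would come from capacity planning.
SCALE_UP_LAG = 1000      # scale out when consumer lag exceeds this many records
SCALE_DOWN_LAG = 100     # scale in when lag drops below this
MIN_WORKERS, MAX_WORKERS = 2, 20


@dataclass
class Cluster:
    """Toy stand-in for a managed worker pool (e.g. Spark executors or Kafka consumers)."""
    workers: int = MIN_WORKERS

    def consumer_lag(self) -> int:
        # Placeholder: a real system would query Kafka or a metrics store.
        return random.randint(0, 2000)

    def unhealthy_workers(self) -> set:
        # Placeholder: a real system would probe worker heartbeats or liveness endpoints.
        return {w for w in range(self.workers) if random.random() < 0.05}

    def scale_to(self, target: int) -> None:
        # Clamp the desired size to the allowed range before applying it.
        self.workers = max(MIN_WORKERS, min(MAX_WORKERS, target))

    def restart(self, worker: int) -> None:
        # Placeholder: a real system would reschedule the container or VM.
        print(f"restarting unhealthy worker {worker}")


def control_loop(cluster: Cluster, iterations: int = 5) -> None:
    """One reconcile pass per iteration: scale on lag, then self-heal failed workers."""
    for _ in range(iterations):
        lag = cluster.consumer_lag()
        if lag > SCALE_UP_LAG:
            cluster.scale_to(cluster.workers + 2)   # scale out in steps of two
        elif lag < SCALE_DOWN_LAG:
            cluster.scale_to(cluster.workers - 1)   # scale in conservatively

        for worker in cluster.unhealthy_workers():
            cluster.restart(worker)

        print(f"lag={lag} workers={cluster.workers}")
        time.sleep(0.1)  # a real controller would reconcile on a longer interval


if __name__ == "__main__":
    control_loop(Cluster())
```

In production, such a loop would read real metrics (for example Kafka consumer lag or YARN queue depth) and act through the platform's scaling and scheduling APIs rather than printing to the console.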
Provides enterprise IT solutions and services
Hewlett Packard Enterprise provides enterprise IT solutions with a focus on cloud services, artificial intelligence, and edge computing. Its products include HPE Ezmeral for container management, HPE GreenLake for cloud services, and HPE Aruba for networking. These solutions help businesses improve performance and adapt to digital change. HPE's business model spans hardware, software, and services, including subscription-based offerings and long-term contracts. What sets HPE apart from competitors is its commitment to open-source projects and its active developer community, which support collaboration and innovation. The company's goal is to empower organizations to transform digitally and optimize their operations.