Job Description
- Design and deploy cloud infrastructure using Terraform/CloudFormation.
- Ensure efficient performance of Big Data applications on AWS/GCP.
- Implement automation for resource scaling and management.
- Create and maintain CI/CD pipelines for seamless data workflow deployment.
- Automate testing, building, and deployment processes.
- Establish robust security measures for protecting sensitive Big Data (Hadoop/Trino/Spark/Kafka/Airflow).
- Design scalable architecture using techniques like autoscaling and load balancing.
- Optimize system performance to handle varying workloads.
- Proactively address issues to ensure continuous Big Data application operation.
- Collaborate closely with data engineers, data scientists, and cross-functional teams.
Qualifications
- A degree in Computer Science, Engineering, or a related field (or equivalent experience).
- Strong expertise in AWS or GCP services.
- Proficiency in scripting languages such as Python and Bash.
- Familiarity with key Big Data tools and technologies is a plus.
- Experience with infrastructure-as-code tools (Terraform/CloudFormation) and container orchestration (Kubernetes).
- Strong problem-solving skills and ability to troubleshoot complex issues.
- Effective communication and teamwork skills in a collaborative environment.
What about languages?
Excellent written and verbal English for clear and effective communication is a must!
How much experience must I have?
We are looking for 8+ years of hands-on DevOps experience.
See more jobs at Blend36