Job Description
- Contributing to investigations of new technologies and the design of complex solutions, supporting a culture of innovation with attention to security, scalability, and reliability, with a focus on building out our ETL processes
- Working with a modern data stack, producing well-designed technical solutions and robust code, and implementing data governance processes
- Working with the customer’s team and communicating professionally
- Taking responsibility for delivering major solution features
- Participating in the requirements gathering and clarification process, proposing optimal architecture strategies, leading the data architecture implementation
- Developing core modules and functions, designing scalable and cost-effective solutions
- Performing code reviews, writing unit and integration tests
- Scaling the distributed system and infrastructure to the next level
- Building a data platform using the power of modern cloud providers (AWS/GCP/Azure)
Extra Responsibilities
- These responsibilities will help you grow professionally and may vary depending on the project and your desire to extend your role in the company
- Being part of the AWS cloud team and actively contributing to the AWS partnership
- Developing micro-batch and real-time streaming pipelines (Lambda architecture)
- Working on proofs of concept (POCs) to validate proposed solutions and migrations
- Leading the migration to modern technology platforms, providing technical guidance
- Adhering to CI/CD methods, helping to implement best practices in the team
- Contributing to the unit’s growth, mentoring other team members (optional)
- Owning the entire pipeline and optimizing engineering processes
- Designing complex ETL processes for analytics and data management, driving their implementation at scale
Qualifications
- 5+ years of experience with Python and SQL
- Experience with AWS, specifically API Gateway, Kinesis, Athena, RDS, and Aurora
- Experience building ETL pipelines for analytics and internal operations
- Experience building internal APIs and integrating with external APIs
- Experience working with the Linux operating system
- Effective communication skills, especially for explaining technical concepts to nontechnical business leaders
- Desire to work in a dynamic, research-oriented team
- Experience with distributed application concepts and DevOps tooling
- Excellent writing and communication skills
- Strong troubleshooting and debugging skills
Will Be a Plus
- 2+ years of experience with Hadoop, Spark, and Airflow
- Experience with DAGs and orchestration tools
- Practical experience with developing Snowflake-driven data warehouses
- Experience with developing event-driven data pipelines