DevOps Engineer (Cloud) at phData
Posted: 2024-12-18 | Location: Bangalore, India

We are seeking qualified DevOps engineers to join our growing Cloud Data Operations and services team in Bangalore, India, as we continue our rapid growth with an expansion of our Indian subsidiary, phData Solutions Private Limited. This expansion comes at the right time, with increasing customer demand for data and platform solutions.

In addition to the phenomenal growth and learning opportunities, we offer a competitive compensation plan, including base salary, annual bonus, training, certifications, and equity.

As a DevOps Engineer on our Consulting Team, you will be responsible for technical delivery on technology projects related to Snowflake, cloud platforms (AWS/Azure), and services hosted in the cloud.

Responsibilities:

- Operate and manage modern data platforms - from streaming, to data lakes, to analytics, and beyond - across a progressively evolving technical stack.
- Learn new technologies in a quickly changing field.
- Own the execution of tasks and field questions about tasks other engineers are working on within the project.
- Respond to pager incidents, solve hard and challenging problems, and go deep into customer processes and workflows to resolve issues.
- Demonstrate clear ownership of tasks on multiple simultaneous customer accounts across a variety of technical stacks.
- Continually grow, learn, and stay up to date with the MS technology stack.
- Work 24/7 rotational shifts.

Required Experience:

- Working knowledge of SQL and the ability to write, debug, and optimize SQL queries.
- Good understanding of writing and optimizing Python programs.
- Experience providing operational support across a large user base for a cloud-native data warehouse (Snowflake and/or Redshift).
- Experience with cloud-native data technologies in AWS or Azure.
- Proven experience learning new technology stacks.
- Strong troubleshooting and performance-tuning skills.
- Client-facing written and verbal communication skills and experience.

Preferred Experience:

- Production experience and certifications in core data platforms such as Snowflake, AWS, Azure, GCP, Hadoop, or Databricks.
- Production experience with cloud and distributed data storage technologies such as S3, ADLS, HDFS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems.
- Production experience with data integration technologies such as Spark, Kafka, event/streaming platforms, StreamSets, Matillion, Fivetran, HVR, NiFi, AWS Database Migration Service, Azure Data Factory, or others.
- Production experience with workflow management and orchestration tools such as Airflow, AWS Managed Airflow, Luigi, or NiFi.
- Working experience with infrastructure as code using Terraform or CloudFormation.
- Expertise in a scripting language to automate repetitive tasks (Python preferred).
- Well versed in continuous integration and deployment frameworks, with hands-on experience using CI/CD tools like Bitbucket, GitHub, Flyway, or Liquibase.
- Bachelor's degree in Computer Science or a related field.

Perks and Benefits