airflow Remote Jobs

100 Results

+30d

Sr. Site Reliability Engineer IV

Signify Health, Dallas TX, Remote
terraform, airflow, Design, mobile, azure, c++, kubernetes, python, AWS

Signify Health is hiring a Remote Sr. Site Reliability Engineer IV

How will this role have an impact?

Join Signify Health's vibrant Site Reliability Engineering team. We’re seeking passionate individuals from diverse technical backgrounds. Reporting to the Manager of Site Reliability Engineering, you’ll work in a collaborative environment that values each team member's unique contribution and fosters an inclusive culture.

Your Role:

  • Develop strategies to improve the stability, scalability, and availability of our products.
  • Maintain and deploy observability solutions to optimize system performance.
  • Collaborate with cross-functional teams to enhance operational processes and service management.
  • Design, build, and maintain application stacks for product teams.
  • Create sustainable systems and services through automation.

Skills We’re Seeking:

  • An eagerness to collaborate with and mentor others in the field of Site Reliability Engineering.
  • Strong familiarity with cloud environments (Azure, AWS, or GCP) and a desire to develop further expertise.
  • Advanced understanding of scripting languages (preferably Bash or Python) and programming languages (preferably Go).
  • Advanced grasp of infrastructure as code, preferably with Terraform experience.
  • Advanced understanding of Kubernetes and containerization technologies.
  • Advanced understanding of CI/CD principles and willingness to guide and enforce best practices.
  • Advanced understanding of Site Reliability and observability principles, preferably with New Relic experience.
  • A proactive approach to identifying problems, performance bottlenecks, and areas for improvement.
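
The observability and reliability skills listed above often come down to working against availability targets. As a rough, hypothetical illustration (not taken from the posting), an availability SLO translates into an error budget like this:

```python
# Back-of-the-envelope SRE calculation: how much downtime a given
# availability SLO allows per window. Numbers are illustrative only.
def error_budget_minutes(slo, days=30):
    """Allowed downtime (in minutes) per `days`-day window for an SLO."""
    total_minutes = days * 24 * 60
    return total_minutes * (1 - slo)

budget = error_budget_minutes(0.999)  # "three nines" over 30 days
```

A 99.9% SLO over a 30-day window leaves roughly 43 minutes of allowable downtime, which is the budget an on-call team spends on incidents and risky changes.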

The base salary hiring range for this position is $108,900 to $189,700. Compensation offered will be determined by factors such as location, level, job-related knowledge, skills, and experience. Certain roles may be eligible for incentive compensation, equity, and benefits.
In addition to your compensation, enjoy the rewards of an organization that puts our heart into caring for our colleagues and our communities.  Eligible employees may enroll in a full range of medical, dental, and vision benefits, 401(k) retirement savings plan, and an Employee Stock Purchase Plan.  We also offer education assistance, free development courses, paid time off programs, paid holidays, a CVS store discount, and discount programs with participating partners.  

About Us:

Signify Health is helping build the healthcare system we all want to experience by transforming the home into the healthcare hub. We coordinate care holistically across individuals’ clinical, social, and behavioral needs so they can enjoy more healthy days at home. By building strong connections to primary care providers and community resources, we’re able to close critical care and social gaps, as well as manage risk for individuals who need help the most. This leads to better outcomes and a better experience for everyone involved.

Our high-performance networks are powered by more than 9,000 mobile doctors and nurses covering every county in the U.S., 3,500 healthcare providers and facilities in value-based arrangements, and hundreds of community-based organizations. Signify’s intelligent technology and decision-support services enable these resources to radically simplify care coordination for more than 1.5 million individuals each year while helping payers and providers more effectively implement value-based care programs.

To learn more about how we’re driving outcomes and making healthcare work better, please visit us at www.signifyhealth.com

Diversity and Inclusion are core values at Signify Health, and fostering a workplace culture reflective of that is critical to our continued success as an organization.

We are committed to equal employment opportunities for employees and job applicants in compliance with applicable law and to an environment where employees are valued for their differences.

See more jobs at Signify Health

Apply for this job

+30d

Google Pillar | Data Project

Devoteam, Lisbon, Portugal, Remote
Bachelor degree, terraform, airflow, sql, java, docker, python

Devoteam is hiring a Remote Google Pillar | Data Project

Job Description

Devoteam G Cloud is our Google strategy and identity within the Devoteam group. We focus on developing end-to-end solutions on Google Cloud Platform and its technologies.

Devoteam G Cloud is looking for Cloud Data Engineers to join our team of Data Engineering specialists.

  • Delivering data projects focused on the engineering component;
  • Working with GCP Data Services such as BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, and Dataplex;
  • Writing efficient SQL queries;
  • Developing data processing pipelines using programming frameworks like Apache Beam, with CI/CD automation;
  • Automating data engineering tasks;
  • Building and managing data pipelines, with a deep understanding of workflow orchestration, task scheduling, and dependency management;
  • Data integration and streaming, including data ingestion from various sources (such as databases, APIs, or logs) into GCP.
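
The workflow orchestration, task scheduling, and dependency management called out above are the core ideas behind tools like Airflow and Cloud Composer. A minimal pure-Python sketch of the concept, with hypothetical task names (this is an illustration, not a real pipeline):

```python
# Tasks plus a dependency graph, executed in topological order --
# the essence of what a workflow orchestrator automates.
from graphlib import TopologicalSorter

def extract():
    return [1, 2, 3]                      # pretend source rows

def transform(rows):
    return [r * 10 for r in rows]         # pretend business logic

def load(rows):
    return f"loaded {len(rows)} rows"     # pretend sink

# Each key runs only after every task in its dependency set.
dag = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}

def run(dag):
    results = {}
    for task in TopologicalSorter(dag).static_order():
        if task == "extract":
            results[task] = extract()
        elif task == "transform":
            results[task] = transform(results["extract"])
        elif task == "load":
            results[task] = load(results["transform"])
    return results

results = run(dag)
```

Real orchestrators add scheduling, retries, and backfills on top of this ordering, but the dependency graph is the same mental model.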

Qualifications

  • Bachelor's degree in IT or similar;
  • More than 2 years of professional experience, with expertise in the delivery of Data Engineering projects;
  • Experience with GCP Data Services: BigQuery, Looker, Cloud Storage, Dataflow, Dataproc, Pub/Sub, and Dataplex;
  • Knowledge of programming languages: Python, Java, or SQL;
  • Experience with tools like Apache Airflow, Google Cloud Composer, or Cloud Data Fusion;
  • Code-review mindset;
  • Experience with Terraform, GitHub, GitHub Actions, Bash, and/or Docker will be valued;
  • Knowledge of streaming data processing using tools like Apache Kafka;
  • GCP certifications: Professional Data Engineer, Professional Cloud Database Engineer, and/or Associate Cloud Engineer (nice to have);
  • Proficiency in English (written and spoken).

See more jobs at Devoteam

Apply for this job

+30d

Google Pillar | Data Architect

Devoteam, Lisbon, Portugal, Remote
Bachelor degree, terraform, airflow, sql, java, docker, python

Devoteam is hiring a Remote Google Pillar | Data Architect

Job Description

Devoteam G Cloud is our Google strategy and identity within the Devoteam group. We focus on developing end-to-end solutions on Google Cloud Platform and its technologies.

Devoteam G Cloud is looking for Cloud Data Engineers to join our team of Data Engineering specialists.

  • Delivering data projects focused on the engineering component;
  • Working with GCP Data Services such as BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, and Dataplex;
  • Writing efficient SQL queries;
  • Developing data processing pipelines using programming frameworks like Apache Beam, with CI/CD automation;
  • Automating data engineering tasks;
  • Building and managing data pipelines, with a deep understanding of workflow orchestration, task scheduling, and dependency management;
  • Data integration and streaming, including data ingestion from various sources (such as databases, APIs, or logs) into GCP.

Qualifications

  • Bachelor's degree in IT or similar;
  • More than 2 years of professional experience, with expertise in the delivery of Data Engineering projects;
  • Experience with GCP Data Services: BigQuery, Looker, Cloud Storage, Dataflow, Dataproc, Pub/Sub, and Dataplex;
  • Knowledge of programming languages: Python, Java, or SQL;
  • Experience with tools like Apache Airflow, Google Cloud Composer, or Cloud Data Fusion;
  • Code-review mindset;
  • Experience with Terraform, GitHub, GitHub Actions, Bash, and/or Docker will be valued;
  • Knowledge of streaming data processing using tools like Apache Kafka;
  • GCP certifications: Professional Data Engineer, Professional Cloud Database Engineer, and/or Associate Cloud Engineer (nice to have);
  • Proficiency in English (written and spoken).

See more jobs at Devoteam

Apply for this job

+30d

Senior Data Engineer

Instacart, Canada - Remote
airflow, sql, Design

Instacart is hiring a Remote Senior Data Engineer

We're transforming the grocery industry

At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.

Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.

Instacart is a Flex First team

There’s no one-size-fits-all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work—whether it’s from home, an office, or their favorite coffee shop—while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.

Overview

 

At Instacart, our mission is to create a world where everyone has access to the food they love and more time to enjoy it together. Millions of customers every year use Instacart to buy their groceries online, and the Data Engineering team is building the critical data pipelines that underpin the myriad ways data is used across Instacart to support our customers and partners.

About the Role 

 

The Finance data engineering team plays a critical role in defining how financial data is modeled and standardized for uniform, reliable, timely and accurate reporting. This is a high impact, high visibility role owning critical data integration pipelines and models across all of Instacart’s products. This role is an exciting opportunity to join a key team shaping the post-IPO financial data vision and roadmap for the company.

 

About the Team 

 

Finance data engineering is part of the Infrastructure Engineering pillar, working closely with accounting, billing & revenue teams to support the monthly/quarterly book close, retailer invoicing and internal/external financial reporting. Our team collaborates closely with product teams to capture critical data needed for financial use cases.

 

About the Job 

  • You will be part of a team with a large amount of ownership and autonomy.
  • Large scope for company-level impact working on financial data.
  • You will work closely with engineers and both internal and external stakeholders, owning a large part of the process from problem understanding to shipping the solution.
  • You will ship high quality, scalable and robust solutions with a sense of urgency.
  • You will have the freedom to suggest and drive organization-wide initiatives.

 

About You

Minimum Qualifications

  • 6+ years of working experience in a Data/Software Engineering role, with a focus on building data pipelines.
  • Expertise with SQL and knowledge of Python.
  • Experience building high quality ETL/ELT pipelines.
  • Past experience with data immutability, auditability, slowly changing dimensions or similar concepts.
  • Experience building data pipelines for accounting/billing purposes.
  • Experience with cloud-based data technologies such as Snowflake, Databricks, Trino/Presto, or similar.
  • Adept at fluently communicating with many cross-functional stakeholders to drive requirements and design shared datasets.
  • A strong sense of ownership, and an ability to balance a sense of urgency with shipping high quality and pragmatic solutions.
  • Experience working with a large codebase on a cross functional team.
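
The "slowly changing dimensions" concept in the qualifications above can be shown with a toy Type 2 update: instead of overwriting an attribute, the current row is end-dated and a new version is appended, so history stays auditable. This is an illustration only, not Instacart's implementation:

```python
# Type 2 slowly changing dimension, sketched with plain dicts.
from datetime import date

def scd2_update(rows, key, new_value, today):
    """Close the current row for `key` and append the new version."""
    for row in rows:
        if row["key"] == key and row["valid_to"] is None:
            row["valid_to"] = today          # end-date the old version
    rows.append({"key": key, "value": new_value,
                 "valid_from": today, "valid_to": None})
    return rows

# Hypothetical dimension row: a seller's plan as of Jan 2023.
dim = [{"key": "seller_1", "value": "standard_plan",
        "valid_from": date(2023, 1, 1), "valid_to": None}]
scd2_update(dim, "seller_1", "premium_plan", date(2024, 6, 1))
current = [r for r in dim if r["valid_to"] is None]
```

A query "as of" any past date can then join against the validity interval, which is exactly what financial reporting needs for auditability.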

 

Preferred Qualifications

  • Bachelor’s degree in Computer Science, Computer Engineering, or Electrical Engineering, OR equivalent work experience.
  • Experience with Snowflake, dbt (data build tool), and Airflow.
  • Experience with data quality monitoring/observability, either using custom frameworks or tools like Great Expectations and Monte Carlo.

 

#LI-Remote

See more jobs at Instacart

Apply for this job

+30d

Jr. Back-End/Data Engineer: Internship

airflow, sql, api, git, elasticsearch, python, AWS

The Lifetime Value Co. is hiring a Remote Jr. Back-End/Data Engineer: Internship

Jr. Back-End/Data Engineer: Internship - The Lifetime Value Co.

See more jobs at The Lifetime Value Co.

Apply for this job

+30d

Jr. Back-End/Data Engineer: Practica

Bachelor's degree, remote-first, airflow, sql, api, git, elasticsearch, python, AWS

The Lifetime Value Co. is hiring a Remote Jr. Back-End/Data Engineer: Practica

Jr. Back-End/Data Engineer: Practica - The Lifetime Value Co.

See more jobs at The Lifetime Value Co.

Apply for this job

+30d

Application Development Engineer (Python)

Western Digital, Shanghai, China, Remote
airflow, mariadb, sql, oracle, api, docker, css, kubernetes, jenkins, python, javascript, PHP

Western Digital is hiring a Remote Application Development Engineer (Python)

Job Description

  • Excellent interpersonal communication and organizational skills to contribute as a leading member of global, distributed teams focused on delivering quality services and solutions
  • Able to distill complex technical challenges into actionable, explainable decisions
  • Work in DevOps teams by building consensus and mediating compromises when necessary
  • Demonstrate excellent engineering & automation skills in the context of application development using continuous integration (CI) and continuous deployment (CD)
  • Demonstrated ability to rapidly learn new and emerging technologies, define engineering standards, and produce automation code
  • Operational abilities including early software release support and driving root cause analysis and remediation
  • Ability to work with and engage multiple functional groups

Qualifications

  • Bachelor’s Degree or equivalency (CS, CE, CIS, IS, MIS, or engineering discipline)
  • 5+ years overall IT industry experience
  • 3+ years in an engineering role using service and hosting solutions such as factory dashboards
  • 3+ years functional knowledge of server-side languages: Python, PHP
  • 3+ years functional knowledge of client-side programming: JavaScript, HTML, CSS
  • Experience with relational SQL database: MariaDB, MSSQL, Oracle
  • Experience with data pipeline and workflow management tools: Airflow
  • Solid understanding of containerization and orchestration tools: Docker, Kubernetes
  • Experience with version control systems: BitBucket
  • Experience with Dash framework (Python web framework) for building interactive web applications
  • Exposure to common web frameworks & REST APIs
  • Experience with continuous integration and deployment using Jenkins is a plus

See more jobs at Western Digital

Apply for this job

+30d

Software Engineer, Payment Pricing & Cost Platform

Square, Atlanta, GA, Remote
airflow, Design, ruby, java, mysql, python, AWS, backend

Square is hiring a Remote Software Engineer, Payment Pricing & Cost Platform

Job Description

Square processes billions of dollars worth of transactions every year. At that scale, even small optimizations can lead to huge business impact. Payment Pricing & Cost Platform makes data-driven improvements to core payment profitability by focusing on the pricing and cost of a payment. This work has a direct impact on Square’s profitability and financial success. The team's engineers work closely with data scientists and analysts. The platform offers competitive pricing optionality that serves all Square sellers, provides understandable and predictable pricing, and supports reliable and transparent costs.

We are looking for a senior backend Software Engineer to help us expand our pricing and cost solutions and scale our platform. As a senior member of the team, you will:

  • Design scalable and reliable systems that solve the problems related to pricing, and costs.

  • Implement the designed solutions and new features and optimize them

  • Work cross-functionally with our product, business, and finance teams to develop Square’s global payments strategy

  • Grow as an engineer and lift others around you in the process by providing technical mentorship and guidance

  • Help contribute to a culture of positivity, psychological safety, and inclusivity

 

Qualifications

You have:

  • BA/BS degree or equivalent practical, working experience (5+ years preferred)

  • Experience working cross-functionally and building and architecting data-driven solutions

  • Innate curiosity and a desire to be responsible for all aspects of reliably moving billions of dollars in a small, highly focused team

  • Understanding and curiosity in creating highly available, scalable, low-latency data systems

  • Interest in changing the payments landscape in the US and globally

Technologies we use:

  • Java, Python, Ruby

  • Guice, Guava, gRPC, Protocol Buffers

  • MySQL, Aurora, Airflow, Tensorflow, Kafka, Google Cloud Platform, AWS

See more jobs at Square

Apply for this job

+30d

Machine Learning Engineer III, HD Maps

MapBox, Remote, UK
airflow, Design, python, AWS

MapBox is hiring a Remote Machine Learning Engineer III, HD Maps

Mapbox is the leading real-time location platform for a new generation of location-aware businesses. Mapbox is the only platform that equips organizations with the full set of tools to power the navigation of people, packages, and vehicles everywhere. More than 3.9 million registered developers have chosen Mapbox because of the platform’s flexibility, security, and privacy compliance. Organizations use Mapbox applications, data, SDKs, and APIs to create customized and immersive experiences that delight their customers.

 

What We Do

On the HD Maps team, we are at the forefront of geospatial big-data analytics and insights for customer market segments and product offerings. Our expertise is pivotal in deploying GIS algorithmic stages into scalable production cloud applications, leveraging platforms like AWS and Spark. We work on Mapbox's award-winning high-precision maps (“HD Maps”) product family, spanning ADAS, AV, and non-automotive GIS data customers in numerous projects. We cover everything from data and systems analysis to automotive and cloud application architecture, including compute, storage, cost, and performance assessments.

What You'll Do

  • Lead the research and development of state-of-the-art computer vision and geospatial models for our high-precision maps.
  • Design and manage datasets for training and testing our models in collaboration with the internal labeling team.
  • Implement machine learning models that are efficient, scalable, and capable of handling high load in production.
  • Collaborate with cross-functional partners to ensure that our machine learning features align with customer needs and market trends.

What We Believe are Important Traits for This Role

  • Extensive experience developing and optimizing deep learning algorithms for image segmentation, object detection, and pattern recognition.
  • Strong programming skills in Python and familiarity with libraries such as OpenCV and PyTorch.
  • Experience deploying and scaling systems with cloud providers such as AWS.
  • Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.

 

Preferred Qualifications

  • Prior experience in the mapping, navigation, or automotive industry.
  • Understanding of geospatial data concepts and tools (GeoJSON, PostGIS, QGIS, etc.).
  • Experience with distributed processing pipelines (Hadoop, Spark, Airflow, Dask).
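
For readers unfamiliar with the GeoJSON format mentioned in the preferred qualifications, a minimal Feature looks like this (the coordinates are an arbitrary example, not from the posting):

```python
# A GeoJSON Feature with a Point geometry, parsed with the stdlib
# json module. GeoJSON orders coordinates as [longitude, latitude].
import json

feature = json.loads("""
{
  "type": "Feature",
  "geometry": {"type": "Point", "coordinates": [-0.1276, 51.5072]},
  "properties": {"name": "London"}
}
""")
lon, lat = feature["geometry"]["coordinates"]
```

Tools like PostGIS and QGIS, also listed above, ingest and query exactly this structure at scale.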

What We Value

In addition to our core values, which are not unique to this position and are necessary for Mapbox leaders:

  • We value high-performing creative individuals who dig into problems and opportunities.
  • We believe in individuals being their whole selves at work. We commit to this through supportive health care, parental leave, flexibility for the things that come up in life, and innovating on how we think about supporting our people.
  • We emphasize an environment of teaching and learning to equip employees with the tools needed to be successful in their function and the company.
  • We strongly believe in the value of growing a diverse team and encourage people of all backgrounds, genders, ethnicities, abilities, and sexual orientations to apply.

By applying for this position, you acknowledge that you have received the Mapbox Non-US Privacy Notice for applicants, which is linked here. Completing this application requires you to provide personal data, such as your name and contact information, which is mandatory for Mapbox to process your application.

Mapbox is an EEO Employer - Minority/Female/Veteran/Disabled/Sexual Orientation/Gender Identity


#LI-Remote

Apply for this job

+30d

Senior Data Engineer (Automated restaurant system)

Sigma Software, Kyiv, Ukraine, Remote
airflow, sql, oracle, Design, azure, AWS

Sigma Software is hiring a Remote Senior Data Engineer (Automated restaurant system)

Job Description

  • Design, develop, and maintain ETL pipelines using Azure Data Factory, ensuring efficient data flow and transformation
  • Utilize Databricks for data processing tasks, leveraging PySpark for advanced data manipulation and analysis
  • Develop and optimize ETL processes within Azure Synapse Analytics, focusing on performance and scalability
  • Apply expertise in MS SQL Server for data modeling, implementing stored procedures and analytical views to support business requirements
  • Create visually appealing and insightful reports using Microsoft Power BI, enabling stakeholders to derive actionable insights from the data

Qualifications

  • At least 5 years of professional experience in data engineering
  • Advanced experience with enterprise Data Warehouse modeling
  • Experience with Complex ETL process implementation
  • Experience with Power BI Data Modeling and Reporting (DAX, Power Query, Dashboards Design)
  • Experience with SQL Development (both MS SQL and Oracle)
  • Solid experience with Azure Synapse/Data Factory Pipelines, Azure Notebooks/Databricks
  • Familiarity with Cloud stack technology (AWS, Airflow)
  • Deep understanding of SAP BW/HANA Modeling, SAP ECC/S4 Corporate Data Extraction
  • Experience with PySpark Data Processing
  • Experience with CI/CD
  • At least an Upper-Intermediate level of written and spoken English

See more jobs at Sigma Software

Apply for this job

+30d

Senior Data Engineer

Procore Technologies, Bangalore, India, Remote
Bachelor's degree, tableau, airflow, sql, AWS, javascript

Procore Technologies is hiring a Remote Senior Data Engineer

Job Description

We’re looking for a Data Engineer who wants to make an impact to join Procore’s data platform team. This is a great opportunity to work with an amazing team and help them achieve their mission to deliver actionable insights from data to the business. As a member of our Data team, you’ll help define the future of our next-generation global data infrastructure to help Procore connect everyone in construction on a worldwide platform.

The construction vertical is ripe for technological innovation. Construction impacts the lives of nearly everyone in the world, and yet it’s one of the least digitized industries, not to mention one of the most dangerous. Procore is leading the market with our SaaS construction platform. We build for real people with real experiences, empowering Groundbreakers to develop and transform the communities where we all live.

What you’ll do:

  • As a Data engineer, you will work closely with Architects, Data scientists, Product Managers and analysts to understand the needs of the business and deliver actionable insights via Data products
  • Use your technical knowledge to help drive scalable, performant data solutions by articulating how our solution meets the needs of our customers and partners
  • Partner closely with cross-functional platform teams, and stakeholders on data management, data governance, reliability, security, and quality to deliver a data platform that scales with Procore’s growth
  • Build and maintain batch and streaming data pipelines, and CI/CD pipelines, using cloud technologies
  • Build data visualizations using tools such as Tableau, Power BI, or Looker to make data-driven decisions
  • Partner with the internal consumers of data to define the requirements and success criteria to help them achieve their mission to deliver actionable insights from data to the business.
  • Maintain the user stories in a prioritized backlog and effectively divide overall project goals into prioritized deliverables for execution.
  • Participate in daily standups, team meetings, sprint planning, and demos/retrospectives while working cross-functionally with other teams to drive the innovation of our products
  • Develop user experience designs in the form of wireframes and mock-ups for stakeholder validation and execution by the team.

What we’re looking for:

  • Bachelor's degree in Computer Science or a similar technical field of study 
  • Strong expertise in Data Engineering with 3+ years of experience in building efficient and scalable data infrastructure, data pipelines, automated deployments, data lifecycle policies using modern data stack such as Snowflake, Airflow, Spark, dbt, AWS, Gitlab, Databricks
  • Experience developing and maintaining tables and data models in SQL, abstracting multiple sources and historical data across varied schemas to a format suitable for reporting and analysis
  • Experience in extracting, processing and storing structured and unstructured data
  • Experience building integrations between enterprise systems to automate manual processes
  • Strong experience with AWS services including EC2, EKS, ECS, Lambda, S3, Opensearch, Kafka, MWAA, Cloud watch
  • Experience building data visualizations in Tableau, JavaScript, or Power BI to improve operational efficiencies and foster data-driven decision making
  • Exceptional Communication skills including working with onshore and offshore stakeholders
  • Strong interpersonal skills with the ability to manage ambiguity and conflicts


See more jobs at Procore Technologies

Apply for this job

+30d

Data Scientist - Support

Square, San Francisco, CA, Remote
Bachelor degree, tableau, airflow, sql, Design, python

Square is hiring a Remote Data Scientist - Support

Job Description

The Cash App Support organization is growing, and we are looking for a Data Scientist (DS) to join the team. The DS team at Cash App derives valuable insights from our unique datasets and turns those insights into actions that improve the experience for our customers every day. In this role, you’ll be embedded in our Support org and work closely with operations and other cross-functional partners to drive meaningful change in how our customers interact with the Support team and resolve issues with their accounts.

You will:

  • Partner directly with a Cash App customer support team, working closely with operations, engineers, and machine learning
  • Analyze large datasets using SQL and scripting languages to surface actionable insights and opportunities to the operations team and other key stakeholders
  • Approach problems from first principles, using a variety of statistical and mathematical modeling techniques to research and understand advocate and customer behavior
  • Design and analyze A/B experiments to evaluate the impact of changes we make to our operational processes and tools
  • Work with engineers to log new, useful data sources as we evolve processes, tooling, and features
  • Build, forecast, and report on metrics that drive strategy and facilitate decision making for key business initiatives
  • Write code to effectively process, cleanse, and combine data sources in unique and useful ways, often resulting in curated ETL datasets that are easily used by the broader team
  • Build and share data visualizations and self-serve dashboards for your partners
  • Effectively communicate your work with team leads and cross-functional stakeholders on a regular basis
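
The A/B-experiment analysis mentioned in the responsibilities above typically reduces to a significance test on a conversion-style metric. A hedged sketch using a two-proportion z-test, with made-up numbers (not Cash App data):

```python
# Two-proportion z-test: did the treatment change the resolution rate?
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Z statistic for the difference between two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: control resolved 420/1000 tickets,
# treatment resolved 470/1000.
z = two_proportion_z(420, 1000, 470, 1000)
```

Here z is about 2.25, beyond the usual 1.96 threshold for a two-sided test at the 5% level, so this hypothetical treatment effect would be judged statistically significant.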

Qualifications

You have:

  • An appreciation for the connection between your work and the experience it delivers to customers. Previous exposure to or interest in customer support problems would be great to have
  • A bachelor's degree in statistics, data science, or a similar STEM field with 5+ years of experience in a relevant role OR
  • A graduate degree in statistics, data science, or similar STEM field with 2+ years of experience in a relevant role
  • Advanced proficiency with SQL and data visualization tools (e.g. Looker, Tableau, etc)
  • Experience with scripting and data analysis programming languages, such as Python or R
  • Experience with cohort and funnel analyses, and a deep understanding of statistical concepts such as selection bias, probability distributions, and conditional probabilities
  • Experience in a high-growth tech environment

Technologies we use and teach:

  • SQL, Snowflake, etc.
  • Python (Pandas, Numpy)
  • Looker, Mode, Tableau, Prefect, Airflow

See more jobs at Square

Apply for this job

+30d

Analyst 2, Data Analytics

Western Digital, Batu Kawan, Malaysia, Remote
Bachelor's degree, airflow, mariadb, sql, oracle, api, docker, css, kubernetes, jenkins, python, javascript, PHP

Western Digital is hiring a Remote Analyst 2, Data Analytics

Job Description

  • Excellent interpersonal communication and organizational skills to contribute as a leading member of global, distributed teams focused on delivering quality services and solutions
  • Able to distill complex technical challenges into actionable, explainable decisions
  • Work in DevOps teams by building consensus and mediating compromises when necessary
  • Demonstrate excellent engineering & automation skills in the context of application development using continuous integration (CI) and continuous deployment (CD)
  • Demonstrated ability to rapidly learn new and emerging technologies, define engineering standards, and produce automation code
  • Operational abilities including early software release support and driving root cause analysis and remediation
  • Ability to work with and engage multiple functional groups

Qualifications

  • Bachelor's Degree in Computer Science, Software Engineering, Computer Engineering or a related field or equivalent work experience. 
  • 5+ years overall IT industry experience
  • 3+ years in an engineering role using service and hosting solutions such as factory dashboards
  • 3+ years functional knowledge of server-side languages: Python, PHP
  • 3+ years functional knowledge of client-side programming: JavaScript, HTML, CSS
  • Experience with relational SQL database: MariaDB, MSSQL, Oracle
  • Experience with data pipeline and workflow management tools: Airflow
  • Solid understanding of containerization and orchestration tools: Docker, Kubernetes
  • Experience with version control systems: BitBucket
  • Experience with Dash framework (Python web framework) for building interactive web applications
  • Exposure to common web frameworks & REST APIs
  • Experience with continuous integration and deployment using Jenkins is a plus

See more jobs at Western Digital

Apply for this job

+30d

Senior Software Engineer, Data

JW Player, United States - Remote
agile, airflow, java, docker, elasticsearch, kubernetes, python, AWS, backend

JW Player is hiring a Remote Senior Software Engineer, Data

About JWP:

JWP is transforming the Digital Video Economy as a trusted partner for over 40,000 broadcasters, publishers, and video-driven brands through our cutting-edge video software and data insights platform. JWP empowers customers with unprecedented independence and control over their digital video content. Established in 2004 as an open-source video player, JWP has evolved into the premier force driving digital video for businesses worldwide. With a rich legacy of pioneering video technology, JWP customers currently generate 8 billion video impressions/month and 5 billion minutes of videos watched/month. At JWP, everyone shares a passion for revolutionizing the digital video landscape. If you are ready to be a part of a dynamic and collaborative team, then join us in shaping the future of video!

The Data Engineering Team: 

At JWP, our data team is a dynamic and innovative team, managing the data lifecycle, from ingestion to processing and analysis, touching every corner of our thriving business ecosystem. Engineers on the team play a pivotal role in shaping the company's direction by making key decisions about our infrastructure, technology stack, and implementation strategies. 

The Opportunity: 

We are looking to bring on a Senior Software Engineer to join our Data Engineering team. As an Engineer on the team, you will work with cutting-edge big data tools and technologies. In this role, you will have the opportunity to partner closely with various teams to tackle crucial challenges for one of the world's largest and most rapidly expanding video companies. Join us and make an impact at the forefront of digital innovation.

As a Senior Data Engineer, you will:

  • Contribute to the development of distributed batch and real-time data infrastructure.
  • Mentor and work closely with junior engineers on the team. 
  • Perform code reviews with peers. 
  • Lead small to medium-sized projects, including documentation and ticket writing. 
  • Collaborate closely with Product Managers, Analysts, and cross-functional teams to gather insights and drive innovation in data products. 

Requirements for the role:

  • 5+ years of backend engineering experience and a passionate interest in big data.
  • Expertise with Python or Java and SQL. 
  • Familiarity with Kafka
  • Experience with a range of datastores, from relational to key-value to document
  • Demonstrate humility, empathy, and a collaborative spirit that fuels team success. 

Bonus Points:

  • Data engineering experience, specifically with data modeling, warehousing and building ETL pipelines
  • Familiarity with AWS - in particular, EC2, S3, RDS, and EMR
  • Familiarity with Snowflake
  • Familiarity with Elasticsearch
  • Familiarity with data processing tools like Hadoop, Spark, Kafka, and Flink
  • Experience with Docker, Kubernetes, and application monitoring tools
  • Experience and/or training with agile methodologies
  • Familiarity with Airflow for task and dependency management

Perks of being at JWP, United States

Our goal is to take care of you and ensure you will be successful in your new role. Your success is our success! 

As a full time employee, you will qualify for:

  • Private Medical, Vision and Dental Coverage for you and your family
  • Unlimited Paid Time Off
  • Stock Options Purchase Program
  • Quarterly and Annual Team Events
  • Professional Career Development Program and Career Development Progression
  • New Employee Home Office Setup Stipend
  • Monthly Connectivity Stipend
  • Free and discounted perks through JW Player's benefit partners
  • Bi-Annual Hack Weeks for those who are interested in using their coding knowledge
  • Fireside chats with individuals throughout JW Player

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

See more jobs at JW Player

Apply for this job

+30d

Principal Software Engineer

Procore TechnologiesUS - Remote TX - Austin, TX
agileMaster’s DegreescalanosqlairflowDesignscrumjavadockerpostgresqlkubernetesjenkinspython

Procore Technologies is hiring a Remote Principal Software Engineer

Job Description

Procore’s Business Systems Technology group is looking for a Principal Software Engineer to elevate our business systems technology landscape, enhance scalability, drive operational excellence, and enable efficient growth for the business.

 

As a Principal Software Engineer, you’ll use your expert-level technical skills to craft innovative solutions while influencing and mentoring other technical leaders. You’ll collaborate with cross-functional teams and play a pivotal role in designing, developing, and optimizing business systems, platforms, services, integrations, and transactional data across diverse domains including finance, accounting, e-commerce, billing, payments, expenses, tax, and talent. To be successful in this role, you’re passionate about domain-driven design, systems optimization, event-based integrations, and configurable cloud services, with a strong bias for action and outcomes. If you’re an inspirational technology leader comfortable translating vague problems into pragmatic solutions that push the boundaries of technical possibility, we’d love to hear from you!

 

This role is based out of our Austin, Texas office, reports into the VP Technology of DTS Business Systems and offers flexibility to work remotely as schedule permits.

 

What you’ll do:

  • Lead the design, development, and implementation of scalable software and data solutions to meet business needs.
  • Optimize performance and scalability of existing systems to support business growth.
  • Architect and implement robust integrations between diverse systems and services.
  • Collaborate with cross-functional teams to define technical strategies, roadmaps, and drive outcome delivery.
  • Contribute to setting standards and development principles across multiple teams and the larger organization.
  • Champion best practices for software development, code reviews, and quality assurance processes.
  • Generate technical documentation and presentations to communicate architectural and design options, and educate development teams and business users.
  • Mentor and guide junior engineers to foster their growth and development.
  • Roughly 40-60% hands-on coding.

 

What we’re looking for:

  • Bachelor’s or Master’s degree in Computer Science or related field.
  • 10+ years of experience designing & implementing complex systems and business application integrations with SaaS applications (including enterprise integration patterns, middleware frameworks, SOA web services) 
  • 10+ years of demonstrated success in software development and building cloud-based, highly available, and scalable online services or streaming systems 
  • Deep understanding of microservices architecture and containerization technologies (e.g., Docker, Kubernetes, Mesos).
  • Expertise with diverse database technologies such as RDBMS (PostgreSQL), Graph, NoSQL (document, columnar, key-value), and Snowflake. 
  • Strength in the majority of commonly used data technologies and languages such as Python, Java, Go or Scala, Kafka, Spark, Flink, Airflow, Splunk, Datadog, Jenkins, or similar
  • Skilled in software development lifecycle processes and experience with scrum, agile and iterative approaches 
  • Excellent communication skills: Demonstrated ability to explain complex technical issues to both technical and non-technical audiences.
  • Knowledge of accounting, billing and payment processing concepts and experience with finance (ERP), billing applications and payment processors preferred

See more jobs at Procore Technologies

Apply for this job

+30d

Staff Data Scientist - Marketing

SquareSan Francisco, CA, Remote
Bachelor degreetableauairflowsqlDesignpython

Square is hiring a Remote Staff Data Scientist - Marketing

Job Description

The Data Science team at Cash App derives valuable insights from our extremely unique datasets and turns those insights into actions that improve the experience for our customers every day. As a Marketing Data Scientist, you will play a critical role in accelerating Cash App’s growth by creating and improving how we measure the impact of all our marketing efforts.

In this role, you’ll be embedded in our Marketing organization and work closely with marketers, product management as well as other cross-functional partners to make effective spend decisions across marketing channels, understand the impact of incentives and explore new opportunities to enable Cash App to become the top provider of primary banking services to our customers.

You will:

  • Build models to optimize our marketing efforts to ensure our spend has the best possible ROI
  • Design and analyze A/B experiments to evaluate the impact of marketing campaigns we launch
  • Analyze large datasets using SQL and scripting languages to surface actionable insights and opportunities to the Marketing product team and other key stakeholders
  • Partner directly with the Cash App Marketing org to influence their roadmap and define success metrics to understand the impact to the business
  • Approach problems from first principles, using a variety of statistical and mathematical modeling techniques to research and understand customer behavior & segments
  • Build, forecast, and report on metrics that drive strategy and facilitate decision making for key business initiatives
  • Write code to effectively process, cleanse, and combine data sources in unique and useful ways, often resulting in curated ETL datasets that are easily used by the broader team
  • Build and share data visualizations and self-serve dashboards for your partners
  • Effectively communicate your work with team leads and cross-functional stakeholders on a regular basis

Qualifications

You have:

  • A bachelor's degree in statistics, data science, or a similar STEM field with 7+ years of experience in a relevant role OR
  • A graduate degree in statistics, data science, or a similar STEM field with 5+ years of experience in a relevant role
  • Advanced proficiency with SQL and data visualization tools (e.g. Tableau, Looker, etc)
  • Experience with scripting and data analysis programming languages, such as Python or R
  • Worked extensively with causal inference techniques and off-platform data
  • A knack for turning ambiguous problems into clear deliverables and actionable insights 
  • Gone deep with cohort and funnel analyses, and have a deep understanding of statistical concepts such as selection bias, probability distributions, and conditional probabilities

Technologies we use and teach:

  • SQL, Snowflake, etc.
  • Python (Pandas, Numpy)
  • Tableau, Airflow, Looker, Mode, Prefect

See more jobs at Square

Apply for this job

Genesis is hiring a Remote User Acquisition Specialist (Paid Social) at HolyWater

See more jobs at Genesis

Apply for this job

+30d

Staff Data Engineer, Banking

SquareSan Francisco, CA, Remote
airflowDesignazurerubyjavapythonAWS

Square is hiring a Remote Staff Data Engineer, Banking

Job Description

The Square Banking team is building a suite of new financial products for Square sellers. We offer business checking accounts, savings accounts, credit cards, and loans to help our sellers manage their business cash flow. Investing in a Financial Data Mesh Platform is not just about managing data; it's about unleashing the full potential of our organization's most valuable asset. It’s a critical strategic move that empowers us to use the distinctive value of data and extends its positive impact to our customers, our Sellers, and users of the Banking platform.

As an Engineer focused on Data for Square Banking, you will help us build our own Square Banking Financial Data Mesh Platform, using real-time Big Data technologies and a Medallion architecture. You will work directly with product, engineering, data science, and machine learning teams to understand their use cases and develop reliable, trusted datasets that accelerate decision-making for important products.

You will:

  • Design large-scale, distributed data processing systems and pipelines to ensure efficient and reliable data ingestion, storage, transformation, and analysis

  • Promote high-quality software engineering practices for building data infrastructure and pipelines at scale

  • Build core datasets to serve as unique sources of truth for products and departments (product, marketing, sales, finance, customer experience, data science, business operations, IT, engineering)

  • Partner with data scientists and other cross-functional partners to understand their needs and build pipelines that scale

  • Identify and address data quality and integrity issues through data validation, cleansing, and data modeling techniques; implement automated workflows that lower manual/operational cost for team members, define and uphold SLAs for timely delivery of data, and move us closer to democratizing data and a self-serve model (query exploration, dashboards, data catalog, data discovery)

  • Learn about Big Data architecture via technologies such as AWS, Databricks, and Kafka

  • Stay up to date with emerging technologies, best practices, and industry trends in data engineering and software development

  • Mentor and provide guidance to junior data engineers, fostering inclusivity and growth

  • Work remotely with a team of distributed colleagues #LI-Remote

  • Report to the Engineering Manager of Banking - Data Engineering

Qualifications

You Have:

  • 8+ years as a data engineer or software engineer, with a focus on large-scale data processing and analytics

  • 4+ years as a data engineer building core datasets

  • A passion for analytics use cases, data models, and solving complex data problems

  • Hands-on experience shipping scalable data solutions in the cloud (e.g. AWS, GCP, Azure), across multiple data stores (e.g. Databricks, Snowflake, Redshift, Hive, SQL/NoSQL, columnar storage formats) and methodologies (e.g. dimensional modeling, data marts, star/snowflake schemas)

  • Hands-on experience building highly scalable and reliable data pipelines with big data tools (e.g. Airflow, DBT, Spark, Hive, Parquet/ORC, Protobuf/Thrift)

  • Experience optimizing and tuning data pipelines to enhance overall system performance, reliability, and scalability

  • Knowledge of programming languages (e.g. Go, Ruby, Java, Python)

  • Willingness to participate in professional development activities to stay current on industry knowledge, and a passion for trying new things

See more jobs at Square

Apply for this job

+30d

Data Integration Engineer (Req #1713)

Clover HealthRemote - USA
Master’s DegreetableauairflowpostgressqlDesignc++python

Clover Health is hiring a Remote Data Integration Engineer (Req #1713)

Location: 3401 Mallory Lane, Suite 210, Franklin, TN 37067; Telecommuting permissible from any location in the U.S.


Salary Range: $132,974 /yr - $161,250 /yr


Job Description: Create and manage ETL packages, triggers, stored procedures, views, and SQL transactions. Develop new secure data feeds with external parties as well as internal applications, including the data warehouse and business intelligence applications. Perform analysis and QA. Diagnose ETL and database-related issues, perform root cause analysis, and recommend corrective actions to management. Work with a small project team to support the design, development, implementation, monitoring, and maintenance of new ETL programs. Telecommuting is permissible from any location in the US. 

Requirements: Bachelor’s degree or foreign degree equivalent in Computer Science, Information Systems, or a related field and five (5) years of progressive, post-baccalaureate experience in IT development or in the job offered or a related role. Alternatively, the employer will accept a Master’s degree or foreign equivalent in Computer Science, Information Systems, or a related field and two (2) years of experience in IT development or in the job offered or a related role. Any suitable combination of education, experience, or training is acceptable.

Skills: Experience and/or education must include: 

1. Python & Postgres; 
2. Snowflake, DBT, Airflow, BigQuery, Data Governance; 
3. Analytics and data science through SQL optimization; 
4. Database design and modeling; and 
5. ML/collaboration tools such as Tableau, Mode, and Looker.

 

#LI-DNI

See more jobs at Clover Health

Apply for this job

+30d

Data Engineer - Senior 0010ALIS - 151

Global InfoTek, Inc.Huntsville, AL Remote
agilejiraairflowsqlDesigndockerelasticsearchpostgresqlkubernetesAWSjavascript

Global InfoTek, Inc. is hiring a Remote Data Engineer - Senior 0010ALIS - 151

Clearance Level: TS/SCI

US Citizenship: Required

Job Classification: Full-time

Location: District of Columbia

Experience: 5-7 years

Education: Master's degree or equivalent experience in a related field.

As a Data Engineer, you will be required to interpret business needs and select appropriate technologies and have experience in implementing data governance of shared and/or master sets of data. You will work with key business stakeholders, IT experts, and subject-matter experts to plan and deliver optimal data solutions. You will create, maintain, and optimize data pipelines as workloads move from development to production for specific use cases to ensure seamless data flow for the use case. You will perform technical and non-technical analyses on project issues and help to ensure technical implementations follow quality assurance metrics. You will analyze data and systems architecture, create designs, and implement information systems solutions.

Responsibilities:

  • Define and communicate a clear product vision for our client’s software products, aligning with user needs and business objectives.
  • Create and manage product roadmaps that reflect both innovation and growth strategies.
  • Partner with a government product owner and a product team of 7-8 FTEs.
  • Develop and design data pipelines to support an end-to-end solution.
  • Develop and maintain artifacts (e.g. schemas, data dictionaries, and transforms related to ETL processes).
  • Integrate data pipelines with AWS cloud services to extract meaningful insights.
  • Manage production data within multiple datasets ensuring fault tolerance and redundancy.
  • Design and develop robust and functional dataflows to support raw data and expected data.
  • Provide Tier 3 technical support for deployed applications and dataflows.
  • Collaborate with the rest of data engineering team to design and launch new features.
  • Coordinate and document dataflows, capabilities, etc.
  • Occasionally (as needed) support off-hours deployments, such as evenings or weekends.

Qualifications:

  • Understanding of cloud architectures and enabling tools and technologies, such as AWS Cloud (GovCloud/C2S).
  • Familiarity with Amazon Web Services (AWS) managed services.
  • Working knowledge of software platforms and services such as Docker, Kubernetes, JMS/SQS, SNS, Kafka, AWS Lambda, NiFi, Airflow, or similar.
  • Proficiency with JavaScript, Elasticsearch, JSON, SQL, and XML.
  • Working knowledge of datastores: MongoDB/DynamoDB, PostgreSQL, S3, Redshift, JDBC/ODBC, and Redis.
  • Familiarity with Linux/Unix server environments.
  • Experience with Agile development methodology.
  • Publishing and/or presenting design reports.
  • Coordinating with other team members to reach project milestones and deadlines.
  • Working knowledge of collaboration tools such as Jira and Confluence.

Preferred Qualifications:

  • Familiarity and experience with the Intelligence Community (IC), and the intel cycle.
  • Familiarity and experience with the Department of Homeland Security (DHS).
  • Direct Experience with DHS and Intelligence Community (IC) component's data architectures and environments (IC-GovCloud experience preferred).
  • Experience with cloud message APIs and usage of push notifications.
  • Keen interest in learning and using the latest software tools, methods, and technologies to solve real world problem sets vital to national security.
  • Working knowledge of public keys and digital certificates.
  • Experience with DevOps environments.
  • Expertise in various COTS, GOTS, and open-source tools which support development of data integration and visualization applications.
  • Specialization in Object Oriented Programming languages, scripting, and databases.

Global InfoTek, Inc. is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or based on disability.

About Global InfoTek, Inc. Global InfoTek Inc. has an award-winning track record of designing, developing, and deploying best-of-breed technologies that address the nation's pressing cyber and advanced technology needs. GITI has rapidly merged pioneering technologies, operational effectiveness, and best business practices for over two decades.

See more jobs at Global InfoTek, Inc.

Apply for this job