airflow Remote Jobs

102 Results

1d

Senior Software Engineer, Data Platform

Thumbtack | Remote, Ontario
scala, airflow, B2C, python

Thumbtack is hiring a Remote Senior Software Engineer, Data Platform

A home is the biggest investment most people make, and yet, it doesn’t come with a manual. That's why we’re building the only app homeowners need to effortlessly manage their homes —  knowing what to do, when to do it, and who to hire. With Thumbtack, millions of people care for what matters most, and pros earn billions of dollars through our platform. And as one of the fastest-growing companies in a $600B+ industry — we must be doing something right. 

We are driven by a common goal and the deep satisfaction that comes from knowing our work supports local economies, helps small businesses grow, and brings homeowners peace of mind. We’re seeking people who continually put our purpose first: advocating for pros and customers, embracing change, and choosing teamwork every day.

At Thumbtack, we're creating a new era of home care. If making an impact and the chance to do good inspires you, join us. Imagine what we’ll build together. 

Thumbtack by the Numbers

  • Available nationwide in every U.S. county
  • 80 million projects started on Thumbtack
  • 10 million 5-star reviews and counting
  • Pros earn billions on our platform
  • 1000+ employees 
  • $3.2 billion valuation (June, 2021) 

About the Data Platform Team

Data is the lifeblood of modern companies, and for a two-sided digital marketplace like Thumbtack, even more so. The Data Platform team is a central team of software engineers building the platform on which employees with various backgrounds can easily build pipelines, systems, and models, all while keeping Customer and Pro privacy in mind. You’ll work deeply with Data Scientists, Machine Learning Engineers, and other Software Engineers from across the company as customers, and collaborate closely with the Site Reliability and core service Engineering teams as partners.

Challenge

In 2024, Thumbtack is significantly investing in Data initiatives and the Engineering teams that support them, as a strategic growth area for the company. While there are interesting and difficult challenges across the entire focus area, this team is specifically poised to build the deep technological foundation to empower all others. It’s highly leveraged, intricate, and interesting work for the right type of engineer. We are the caretakers and creators of the foundational building blocks of the modern Thumbtack data system, and we need your help to build and deliver it.

Responsibilities

  • Collaboratively refine and evangelize a comprehensive framework for integrating data-thinking into the software development lifecycle across Engineering
  • Work with the Data Engineering and Machine Learning teams as stakeholders to identify gaps in our current capabilities, and help build and execute on a multi-year roadmap to close them
  • Work directly with teams of product engineers, analysts, data scientists, and machine learning engineers throughout the company to understand their data needs, and make recommendations not only on how to build for those needs but also on the processes and knowledge bases needed to support them
  • Drive data quality and best practices across the company
  • Help build the next generation of marketing data products at Thumbtack, based on real-time data products on top of Apache Kafka

Must-Have Qualifications

If you don't think you meet all of the criteria below but still are interested in the job, please apply. Nobody checks every box, and we're looking for someone excited to join the team.

  • 5+ years of experience in software engineering, at least 2 of which focused in the data domain
  • Excellent ability to understand the needs of, and collaborate with, stakeholders in other functions, especially other Engineering teams
  • Strong fluency in at least one major programming language and the ability to switch between multiple languages. Thumbtack’s main production stack is Go; however, we tend to use Python with some Scala
  • Experience designing, architecting, and maintaining data systems that serve deeply technical customers
  • Strong sense of ownership and pride in your work, from ideation and requirements-gathering to project completion and maintenance

Nice-to-Have Qualifications

  • Experience building ETL data pipelines in a modern programming language, such as Python or Scala, ideally with Apache Airflow (a minimal sketch follows this list)
  • Understanding of database internals and query optimization
  • Experience working with semi-structured or unstructured data in a data lake or similar
  • Experience working in engineering at a two-sided marketplace or B2C technology company
  • Experience mentoring and coaching data engineers and/or analysts
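For readers unfamiliar with the orchestration tool named in the first bullet, here is a minimal sketch of what a daily ETL pipeline looks like as an Airflow 2.x DAG. The task names, data shapes, and helper functions are hypothetical and are not drawn from the posting.

    # Minimal, hypothetical Airflow 2.x DAG sketching a daily extract-transform-load flow.
    # Dataset contents and helper logic are illustrative only.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract(**context):
        # Pull raw rows from a source system (stubbed here).
        return [{"user_id": 1, "amount": 42.0}]


    def transform(ti, **context):
        # Read the upstream task's output from XCom and reshape it.
        rows = ti.xcom_pull(task_ids="extract")
        return [r for r in rows if r["amount"] > 0]


    def load(ti, **context):
        # In a real pipeline this would write to the warehouse.
        rows = ti.xcom_pull(task_ids="transform")
        print(f"Would load {len(rows)} rows")


    with DAG(
        dag_id="example_daily_etl",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> transform_task >> load_task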

Thumbtack is a virtual-first company, meaning you can live and work from any one of our approved locations across the United States, Canada or the Philippines.* Learn more about our virtual-first working model here.

#LI-Remote

Benefits & Perks
  • Virtual-first working model coupled with in-person events
  • 20 company-wide holidays including a week-long end-of-year company shutdown
  • Library (optional use collaboration & connection hub) in San Francisco
  • WiFi reimbursements 
  • Cell phone reimbursements (North America) 
  • Employee Assistance Program for mental health and well-being 

Learn More About Us

Thumbtack embraces diversity. We are proud to be an equal opportunity workplace and do not discriminate on the basis of sex, race, color, age, pregnancy, sexual orientation, gender identity or expression, religion, national origin, ancestry, citizenship, marital status, military or veteran status, genetic information, disability status, or any other characteristic protected by federal, provincial, state, or local law. We also will consider for employment qualified applicants with arrest and conviction records, consistent with applicable law. 

Thumbtack is committed to working with and providing reasonable accommodation to individuals with disabilities. If you would like to request a reasonable accommodation for a medical condition or disability during any part of the application process, please contact: recruitingops@thumbtack.com

If you are a California resident, please review information regarding your rights under California privacy laws contained in Thumbtack’s Privacy policy available at https://www.thumbtack.com/privacy/.

See more jobs at Thumbtack

Apply for this job

1d

Senior Machine Learning Engineer

Resultant | Indianapolis, IN, Remote
nosql, airflow, postgres, sql, Design, mongodb, azure, api, docker, linux, python, AWS

Resultant is hiring a Remote Senior Machine Learning Engineer

Job Description

At Resultant, we are fearless problem solvers. We are passionate about helping our clients solve their toughest problems, while empathizing with the people our solutions will help. Data analytics is a core component of how we do this. We are looking for a Machine Learning Engineer to add to our talented team of Data Scientists, Data Engineers, and Data Architects to solve some of the most exciting and challenging problems faced by companies and governments in today’s fast-paced environment. As a machine learning engineer, you will have the opportunity to develop novel machine learning and AI (Artificial Intelligence) solutions and deploy them to production applications and systems.

Here is what a typical day for you might look like:   

  • Design, develop, and deploy efficient data pipelines for both structured and unstructured data

  • Productionize machine learning research in both on-premises and cloud environments

  • Rapid prototyping and adoption of new data sources  

  • Maintain best software engineering practices  

  • Drive optimization, testing and tooling to improve quality of solutions  

  • Develop and apply machine learning and advanced analytics algorithms to solve problems you have never seen before   

  • Collaborate with data architects and software developers to plan and construct the architecture of model deployment   

  • Develop and support APIs to serve models and other data solutions (see the sketch after this list)

  • Learn new tools and technologies to solve the problems faced by our clients  
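As a rough illustration of the API-serving bullet above, here is a minimal sketch of a model prediction endpoint. FastAPI is used only because it appears later in the qualifications; the input schema, scoring logic, and endpoint path are hypothetical.

    # Minimal, hypothetical FastAPI service exposing a model prediction endpoint.
    # The "model" is a stub; a real service would load one from a registry or artifact store.
    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()


    class Features(BaseModel):
        # Illustrative input schema; real feature names come from the model contract.
        age: float
        income: float


    def predict_stub(features: Features) -> float:
        # Placeholder for a real model call, e.g. model.predict_proba(...).
        return 0.5 if features.income > 50000 else 0.2


    @app.post("/predict")
    def predict(features: Features) -> dict:
        return {"score": predict_stub(features)}

Run locally with, for example, uvicorn app:app --reload (assuming the file is named app.py).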

Qualifications

This is not an entry-level/recent-graduate role and requires at least 6 years of related industry experience. Some of the qualifications and skills we are expecting are:

  • Bachelor’s degree (and preferably an advanced degree) in Computer Science, Engineering, or in a quantitative field is required.   

  • 4+ years of experience with software engineering in Python (additional languages like Scala/C++/R are a plus)  

  • 2+ years of experience with Databricks and/or Spark  

  • 2+ years of experience with advanced analytics and modern machine learning techniques  

  • 4+ years of experience with APIs, containerization (Docker), and orchestration (Airflow)  

  • 5+ years of experience with relational databases and SQL (Postgres, Redshift, SQL Server)  

  • 3+ years of experience with Cloud platforms (Azure, AWS, GCP) and services  

  • 3+ years of experience with Python API development frameworks (FastAPI, Django, Flask)  

  • Experience with NoSQL and streaming platforms, e.g., Kafka, MongoDB, Neo4j is a plus  

  • Experience with cloud native services such as AWS EMR, Databricks, HDInsight, Sagemaker is a plus  

  • Proficient with version control and CI/CD 

  • Proficient in software architecture and design patterns  

  • Proficient on Linux for both development and operations  

Strong working knowledge of the following:   

  • Modern data mining and machine learning methods   

  • Statistics, mathematics, computer science, numerical methods, and data visualization   

  • Linear algebra and probability   

  • Rapid prototyping and development   

  • Solution creativity and effective written, verbal, and presentation skills   

  • Progressive mindset particularly around deployment models and emerging technologies   

  • Collaborative team player who is detailed oriented, focused on solution quality and execution   

  • Comfortable working across a wide range of project sizes and industries   

  • Entrepreneurial inclination to discover novel opportunities for applying analytical techniques to business and social problems   

See more jobs at Resultant

Apply for this job

3d

Staff Software Engineer, Data Platform

Thumbtack | Remote, United States
scala, airflow, B2C, python

Thumbtack is hiring a Remote Staff Software Engineer, Data Platform

A home is the biggest investment most people make, and yet, it doesn’t come with a manual. That's why we’re building the only app homeowners need to effortlessly manage their homes —  knowing what to do, when to do it, and who to hire. With Thumbtack, millions of people care for what matters most, and pros earn billions of dollars through our platform. And as one of the fastest-growing companies in a $600B+ industry — we must be doing something right. 

We are driven by a common goal and the deep satisfaction that comes from knowing our work supports local economies, helps small businesses grow, and brings homeowners peace of mind. We’re seeking people who continually put our purpose first: advocating for pros and customers, embracing change, and choosing teamwork every day.

At Thumbtack, we're creating a new era of home care. If making an impact and the chance to do good inspires you, join us. Imagine what we’ll build together. 

Thumbtack by the Numbers

  • Available nationwide in every U.S. county
  • 80 million projects started on Thumbtack
  • 10 million 5-star reviews and counting
  • Pros earn billions on our platform
  • 1000+ employees 
  • $3.2 billion valuation (June, 2021) 

About the Data Platform Team

Data is the lifeblood of modern companies, and for a two-sided digital marketplace like Thumbtack, even more so. The Data Platform team is a central team of software engineers building the platform on which employees with various backgrounds can easily build pipelines, systems, and models, all while keeping Customer and Pro privacy in mind. You’ll work deeply with Data Scientists, Machine Learning Engineers, and other Software Engineers from across the company as customers, and collaborate closely with the Site Reliability and core service Engineering teams as partners.

Challenge

In 2024, Thumbtack is significantly investing in Data initiatives and the Engineering teams that support them, as a strategic growth area for the company. While there are interesting and difficult challenges across the entire focus area, this team is specifically poised to build the deep technological foundation to empower all others. It’s highly leveraged, intricate, and interesting work for the right type of engineer. We are the caretakers and creators of the foundational building blocks of the modern Thumbtack data system, and we need your help to build and deliver it.

Responsibilities

  • Collaboratively refine and evangelize a comprehensive framework for integrating data-thinking into the software development lifecycle across Engineering
  • Work with the Data Engineering and Machine Learning teams as stakeholders to identify gaps in our current capabilities, and help build and execute on a multi-year roadmap to close them
  • Work directly with teams of product engineers, analysts, data scientists, and machine learning engineers throughout the company to understand their data needs, and make recommendations not only on how to build for those needs but also on the processes and knowledge bases needed to support them
  • Drive data quality and best practices across the company
  • Help build the next generation of marketing data products at Thumbtack, based on real-time data products on top of Apache Kafka

Must-Have Qualifications

If you don't think you meet all of the criteria below but still are interested in the job, please apply. Nobody checks every box, and we're looking for someone excited to join the team.

  • 5+ years of experience in software engineering, at least 2 of which focused in the data domain
  • Excellent ability to understand the needs of, and collaborate with, stakeholders in other functions, especially other Engineering teams
  • Strong fluency in at least one major programming language and the ability to switch between multiple languages. Thumbtack’s main production stack is Go; however, we tend to use Python with some Scala
  • Experience designing, architecting, and maintaining data systems that serve deeply technical customers
  • Strong sense of ownership and pride in your work, from ideation and requirements-gathering to project completion and maintenance

Nice-to-Have Qualifications

  • Experience building ETL data pipelines in a modern programming language, like Python or Scala, ideally with Apache Airflow
  • Understanding of database internals and query optimization
  • Experience working with semi-structured or unstructured data in a data lake or similar
  • Experience working in engineering at a two-sided marketplace or B2C technology company
  • Experience mentoring and coaching data engineers and/or analysts

Thumbtack is a virtual-first company, meaning you can live and work from any one of our approved locations across the United States, Canada or the Philippines.* Learn more about our virtual-first working model here.

For candidates living in San Francisco / Bay Area, New York City, or Seattle metros, the expected salary range for the role is currently $202,000 - $248,000. Actual offered salaries will vary and will be based on various factors, such as calibrated job level, qualifications, skills, competencies, and proficiency for the role.

For candidates living in all other US locations, the expected salary range for this role is currently $172,000 - $210,000. Actual offered salaries will vary and will be based on various factors, such as calibrated job level, qualifications, skills, competencies, and proficiency for the role.

#LI-Remote

Benefits & Perks
  • Virtual-first working model coupled with in-person events
  • 20 company-wide holidays including a week-long end-of-year company shutdown
  • Library (optional use collaboration & connection hub) in San Francisco
  • WiFi reimbursements 
  • Cell phone reimbursements (North America) 
  • Employee Assistance Program for mental health and well-being 

Learn More About Us

Thumbtack embraces diversity. We are proud to be an equal opportunity workplace and do not discriminate on the basis of sex, race, color, age, pregnancy, sexual orientation, gender identity or expression, religion, national origin, ancestry, citizenship, marital status, military or veteran status, genetic information, disability status, or any other characteristic protected by federal, provincial, state, or local law. We also will consider for employment qualified applicants with arrest and conviction records, consistent with applicable law. 

Thumbtack is committed to working with and providing reasonable accommodation to individuals with disabilities. If you would like to request a reasonable accommodation for a medical condition or disability during any part of the application process, please contact: recruitingops@thumbtack.com

If you are a California resident, please review information regarding your rights under California privacy laws contained in Thumbtack’s Privacy policy available at https://www.thumbtack.com/privacy/.

See more jobs at Thumbtack

Apply for this job

3d

Senior Software Engineer - Data

JW Player | United States - Remote
agile, airflow, java, docker, elasticsearch, kubernetes, python, AWS, backend

JW Player is hiring a Remote Senior Software Engineer - Data

About JWP:

JWP is transforming the Digital Video Economy as a trusted partner for over 40,000 broadcasters, publishers, and video-driven brands through our cutting-edge video software and data insights platform. JWP empowers customers with unprecedented independence and control over their digital video content. Established in 2004 as an open-source video player, JWP has evolved into the premier force driving digital video for businesses worldwide. With a rich legacy of pioneering video technology, JWP customers currently generate 8 billion video impressions/month and 5 billion minutes of videos watched/month. At JWP, everyone shares a passion for revolutionizing the digital video landscape. If you are ready to be a part of a dynamic and collaborative team then join us in shaping the future of video! 

The Data Engineering Team: 

At JWP, our data team is a dynamic and innovative team, managing the data lifecycle, from ingestion to processing and analysis, touching every corner of our thriving business ecosystem. Engineers on the team play a pivotal role in shaping the company's direction by making key decisions about our infrastructure, technology stack, and implementation strategies. 

The Opportunity: 

We are looking to bring on a Senior Software Engineer to join our Data Engineering team. As an Engineer on the team, you will be diving into the forefront of cutting-edge big data tools and technology. In this role, you will have the opportunity to partner closely with various teams to tackle crucial challenges for one of the world's largest and rapidly expanding video companies. Join us and make an impact at the forefront of digital innovation.

As a Senior Data Engineer, you will:

  • Contribute to the development of distributed batch and real-time data infrastructure.
  • Mentor and work closely with junior engineers on the team. 
  • Perform code reviews with peers. 
  • Lead small to medium-sized projects, including documenting them and writing tickets.
  • Collaborate closely with Product Managers, Analysts, and cross-functional teams to gather insights and drive innovation in data products. 

Requirements for the role:

  • 5+ years of backend engineering experience with a passionate interest in big data.
  • Expertise with Python or Java and SQL. 
  • Familiarity with Kafka
  • Experience with a range of datastores, from relational to key-value to document
  • Demonstrate humility, empathy, and a collaborative spirit that fuels team success. 

Bonus Points:

  • Data engineering experience, specifically with data modeling, warehousing and building ETL pipelines
  • Familiarity with AWS - in particular, EC2, S3, RDS, and EMR
  • Familiarity with Snowflake
  • Familiarity with Elasticsearch
  • Familiarity with data processing tools like Hadoop, Spark, Kafka, and Flink
  • Experience with Docker, Kubernetes, and application monitoring tools
  • Experience and/or training with agile methodologies
  • Familiarity with Airflow for task and dependency management

Perks of being at JWP, United States

Our goal is to take care of you and ensure you will be successful in your new role. Your success is our success! 

As a full time employee, you will qualify for:

  • Private Medical, Vision and Dental Coverage for you and your family
  • Unlimited Paid Time Off
  • Stock Options Purchase Program
  • Quarterly and Annual Team Events
  • Professional Career Development Program and Career Development Progression
  • New Employee Home Office Setup Stipend
  • Monthly Connectivity Stipend
  • Free and discounted perks through JWP's benefit partners
  • Bi-Annual Hack Weeks for those who are interested in using their coding knowledge
  • Fireside chats with individuals throughout JWP

*Benefits are subject to location and can change at the discretion of the Company. 


We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

See more jobs at JW Player

Apply for this job

3d

Senior Data Engineer, Finance

Instacart | United States - Remote
airflow, sql, Design

Instacart is hiring a Remote Senior Data Engineer, Finance

We're transforming the grocery industry

At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.

Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.

Instacart is a Flex First team

There’s no one-size fits all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work—whether it’s from home, an office, or your favorite coffee shop—while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.

Overview

 

At Instacart, our mission is to create a world where everyone has access to the food they love and more time to enjoy it together. Millions of customers every year use Instacart to buy their groceries online, and the Data Engineering team is building the critical data pipelines that underpin all of the myriad of ways that data is used across Instacart to support our customers and partners.

About the Role 

 

The Finance data engineering team plays a critical role in defining how financial data is modeled and standardized for uniform, reliable, timely and accurate reporting. This is a high impact, high visibility role owning critical data integration pipelines and models across all of Instacart’s products. This role is an exciting opportunity to join a key team shaping the post-IPO financial data vision and roadmap for the company.

 

About the Team 

 

Finance data engineering is part of the Infrastructure Engineering pillar, working closely with accounting, billing & revenue teams to support the monthly/quarterly book close, retailer invoicing and internal/external financial reporting. Our team collaborates closely with product teams to capture critical data needed for financial use cases.

 

About the Job 

  • You will be part of a team with a large amount of ownership and autonomy.
  • Large scope for company-level impact working on financial data.
  • You will work closely with engineers and both internal and external stakeholders, owning a large part of the process from problem understanding to shipping the solution.
  • You will ship high quality, scalable and robust solutions with a sense of urgency.
  • You will have the freedom to suggest and drive organization-wide initiatives.

 

About You

Minimum Qualifications

  • 6+ years of working experience in a Data/Software Engineering role, with a focus on building data pipelines.
  • Expertise with SQL and knowledge of Python.
  • Experience building high quality ETL/ELT pipelines.
  • Past experience with data immutability, auditability, slowly changing dimensions or similar concepts.
  • Experience building data pipelines for accounting/billing purposes.
  • Experience with cloud-based data technologies such as Snowflake, Databricks, Trino/Presto, or similar.
  • Adept at fluently communicating with many cross-functional stakeholders to drive requirements and design shared datasets.
  • A strong sense of ownership, and an ability to balance a sense of urgency with shipping high quality and pragmatic solutions.
  • Experience working with a large codebase on a cross functional team.

 

Preferred Qualifications

  • Bachelor’s degree in Computer Science, computer engineering, electrical engineering OR equivalent work experience.
  • Experience with Snowflake, dbt (data build tool) and Airflow
  • Experience with data quality monitoring/observability, either using custom frameworks or tools like Great Expectations, Monte Carlo etc

 

#LI-Remote

Instacart provides highly market-competitive compensation and benefits in each location where our employees work. This role is remote and the base pay range for a successful candidate is dependent on their permanent work location. Please review our Flex First remote work policy here.

Offers may vary based on many factors, such as candidate experience and skills required for the role. Additionally, this role is eligible for a new hire equity grant as well as annual refresh grants. Please read more about our benefits offerings here.

For US based candidates, the base pay ranges for a successful candidate are listed below.

  • CA, NY, CT, NJ: $192,000 - $213,000 USD
  • WA: $184,000 - $204,000 USD
  • OR, DE, ME, MA, MD, NH, RI, VT, DC, PA, VA, CO, TX, IL, HI: $176,000 - $196,000 USD
  • All other states: $159,000 - $177,000 USD

See more jobs at Instacart

Apply for this job

4d

Senior Software Engineer - US Remote

Experian | New York, NY, Remote
scala, airflow, sql, Design, java, typescript, kubernetes, angular, python, AWS

Experian is hiring a Remote Senior Software Engineer - US Remote

Job Description

Role Summary

Experian Marketing Services is looking for a Senior Software Engineer to join our global engineering team. We need a person who can solve complex problems and build advanced software systems. We face daily challenges that are both unique and engaging, while processing data at petabyte scale. That is over one trillion data points in any given 60-day period — with consumer privacy and data security at the heart of everything we do. We use Scala, in combination with large-scale data processing and open-source technologies, to build our device graph. Across our engineering teams, we also use Scala, GCP, Spark, Kubernetes, Python, TypeScript, Angular, and anything else that helps us get the job done.

Knowledge, Skills and Experience

  • Google Cloud Platform (GCP) or AWS
  • Scala, sbt, cats
  • Airflow, Kubernetes
  • Google Dataflow/Beam or Spark, SQL, BigQuery or similar data warehouse

Key Responsibilities

  • Serve as a Senior member of the team by contributing to the architecture, design, and implementation of EMS systems
  • Mentor junior engineers and enable their growth
  • Lead and drive technical projects. Take responsibility for the planning, execution and success of complex technical projects
  • Ability to be on call if required and to accommodate east coast time zone
  • Collaborate with other engineering, product, and data science teams to ensure we’re building the best products

Qualifications

Qualifications

  • 5+ years of experience making significant contributions in the form of code
  • Strong understanding of algorithms and data structures, and of when to apply them. Deep familiarity with Scala or Java
  • Experience working with high-scale systems: realtime and batch
  • Interested in data engineering to develop ingestion engines, ETL pipelines, and organizing the data to expose it in a consumable format
  • Passionate about helping your teammates grow by providing insightful code reviews and feedback

See more jobs at Experian

Apply for this job

6d

DevOps Support Engineer

NielsenIQ | Belgrade, Serbia, Remote
agile, terraform, airflow, sql, oracle, azure, git, docker, postgresql, kubernetes, linux, jenkins, python

NielsenIQ is hiring a Remote DevOps Support Engineer

Job Description

As a Systems Engineer in the company’s Business Process Automation CoE Team, you will play a pivotal role in ensuring the overall health, performance, availability, and capacity of several automation tools and platforms. Your responsibilities will encompass support and proactive system improvements, incident and change management, automation, and ensuring the availability and performance of mission-critical services.

  • Ensure end-to-end availability and performance of mission-critical services.
  • Provide L2 support for users.
  • Handle incident and change management requests.
  • Proactively seek systems improvements through performance monitoring and capacity analysis.
  • Implement and manage container architecture using tools like Docker and Kubernetes.
  • Administer and support SQL database engines such as PostgreSQL, MSSQL and Oracle.
  • Utilize Azure Cloud services including Virtual Machine, AKS, and Virtual Network.
  • Work with Linux-based infrastructure and scripting languages like Bash, Python, and PowerShell.
  • Utilize configuration management and CI/CD tools such as GIT, ELK, and Helm Charts.
  • Automate testing, code deployment, and monitoring using tools like Terraform, Jenkins, and GitLab.
  • Work closely with the development team to ensure seamless integration and deployment.
  • Ensure security and compliance of cloud assets by integrating security best practices and deploying cloud security tools.

Qualifications

  • Bachelor’s degree in computer science, Engineering, or related field.
  • 3+ years of experience as a DevOps/Systems Engineer.
  • Experience with container orchestration tools (e.g., Kubernetes, Docker).
  • Experience with SQL databases (e.g., PostgreSQL, MSSQL, Oracle).
  • Familiarity with Azure Cloud services and networking configurations.
  • Experience with Linux and Windows operating systems.
  • Strong knowledge of scripting languages such as: Bash, PowerShell and Python.
  • Experience and understanding of monitoring tools and practices (LogicMonitor, Nagios, Zabbix).
  • Experience with configuration management, CI/CD tools, and automation tools.
  • Knowledge of BPM, Workflow and Workload Automation tools like Airflow, KissFlow, RunMyJobs, ActiveBatch.
  • Experience in Cloud Networking area and Hybrid network configurations.
  • Knowledge of RPA automation tools such as UiPath and Microsoft Power Automate.
  • You have a willingness to work as a part of a distributed and agile international team.
  • Good communication skills and fluency in English, both written and spoken.

See more jobs at NielsenIQ

Apply for this job

11d

Senior Big Data Engineer (Web and Mobile unit needs)

Sigma Software | Kyiv, Ukraine, Remote
airflow, sql, Design, api, linux, python, AWS

Sigma Software is hiring a Remote Senior Big Data Engineer (Web and Mobile unit needs)

Job Description

  • Contributing to investigations of new technologies and to complex solution design, supporting a culture of innovation that considers security, scalability, and reliability, with a focus on building out our ETL processes
  • Working with a modern data stack, producing well-designed technical solutions and robust code, and implementing data governance processes
  • Working and professionally communicating with the customer’s team
  • Taking up responsibility for delivering major solution features
  • Participating in requirements gathering & clarification process, proposing optimal architecture strategies, leading the data architecture implementation
  • Developing core modules and functions, designing scalable and cost-effective solutions
  • Performing code reviews, writing unit and integration tests
  • Scaling the distributed system and infrastructure to the next level
  • Building data platform using the power of modern cloud providers (AWS/GCP/Azure)

Extra Responsibilities 

  • These responsibilities will help you to grow professionally and can vary depending on the project and the desire to extend your role in the company
  • Being in the AWS cloud team and actively contributing to the partnership
  • Developing the Micro Batch/Real-Time streaming pipelines (Lambda architecture)
  • Working on POCs for validating proposed solutions and migrations
  • Leading the migration to modern technology platforms, providing technical guidance
  • Adhering to CI/CD methods, helping to implement best practices in the team
  • Contributing to unit growth, mentoring other members of the team (optional)
  • Owning the whole pipeline and optimizing the engineering processes
  • Designing complex ETL processes for analytics and data management, driving the massive implementation

Qualifications

  • 5+ years of experience with Python and SQL
  • Experience with AWS, specifically API Gateway, Kinesis, Athena, RDS, and Aurora
  • Experience building ETL pipelines for analytics and internal operations
  • Experience building internal APIs and integrating with external APIs
  • Experience working with the Linux operating system
  • Effective communication skills, especially for explaining technical concepts to nontechnical business leaders
  • Desire to work in a dynamic, research-oriented team
  • Experience with distributed application concepts and DevOps tooling
  • Excellent writing and communication skills
  • Troubleshooting and debugging ability

WILL BE A PLUS

  • 2+ years of experience with Hadoop, Spark, and Airflow
  • Experience with DAGs and orchestration tools
  • Practical experience with developing Snowflake-driven data warehouses
  • Experience with developing event-driven data pipelines

See more jobs at Sigma Software

Apply for this job

11d

Data Engineer, Staff

Stay22 | Montréal, QC, Remote
10 years of experience, airflow, sql, Design, python, AWS

Stay22 is hiring a Remote Data Engineer, Staff

Job Description

Stay22 is a fast-growing and profitable Travel Tech startup in Montreal that helps content creators better monetize their content while improving the overall experience of their end users, through innovative solutions that make it easy to search for travel-related services. Over 100 million unique users go through our products every month, embedded in the top pages and platforms in travel discovery. Our Data team serves as a cornerstone for various stakeholders, including Finance, Customer Service, Sales, and Product. Join us to push the boundaries of predictive modeling and AI in travel technology.

Job Summary

We are searching for a Data Engineer with strong back-end experience to join our growing Data team. You will support data structure and maintain data products related to Stay22 partners, user interactions, and transactions data. In this role, you will be responsible for processing, reviewing and organizing data. You will provide accurate, timely, and consistent data to the organization and will create data management strategies including data acquisition and cleansing, data validation activities, writing specifications for data verifications, etc.

Key Responsibilities

  • Design and implement efficient ETL/ELT processes to ingest, transform, and load data
  • Help define and build key datasets across all Stay22 product areas, and lead the evolution of these datasets as use cases grow
  • Analyze data flows and dependencies, and design and implement data models for optimal storage and retrieval
  • Optimize data pipelines for performance, scalability, and reliability
  • Develop integrations with third-party systems to source and/or distribute various datasets
  • Collaborate with cross-functional teams and multiple disciplines to understand data requirements and provide technical expertise on data-related projects
  • Implement data governance best practices to ensure data quality, integrity, and security
  • Build and maintain documentation for data processes, data dictionaries, and data lineage
  • Apply highly developed knowledge of complex algorithms, data science methods, and statistical analysis techniques to prepare and present data visualizations to key stakeholders and leaders of relevant business units
  • Remain current with industry trends and best practices in data engineering, cloud technologies, and data analytics
  • Coach and train junior colleagues in techniques, processes, and responsibilities

Requirements

  • B.S. or M.S in Computer Science or equivalent experience
  • 8-10 years of experience in data engineering or related roles
  • Proven experience (8+ years) as a Data Engineer or in a similar role, with a focus on data modeling, SQL (6+ years), Python, and Snowflake or BigQuery.
  • Experience with workflow orchestration and transformation tools such as Airflow, dbt, etc.
  • Strong proficiency in SQL for data manipulation and querying
  • Hands-on experience with cloud-based data platforms (AWS, Google Cloud)
  • Solid understanding of data warehousing concepts, dimensional modeling, and ETL/ELT processes
  • Strong communication and collaboration skills, with the ability to work effectively in a team environment
  • Self-starter who takes ownership, gets results, and enjoys moving at a fast pace
  • Excellent problem-solving skills and the ability to translate business requirements into technical solutions

Why join Stay22?

  • We are dedicated to revolutionizing the way we book travel, using ML & AI to develop new solutions to help content creators monetize their content.
  • Growing fast means growth opportunities.
  • We make our stars run their own show in the Stay22 universe.
  • We also have the coolest (and brightest) lair, available if you want to mingle, in the center of Montreal’s Plateau Mont-Royal, surrounded by the best shops and restaurants in town.
  • We take people as they are: come-as-you-are dress code, personalized work schedules…
  • We give them what they want: health & dental benefits, learning & development opportunities, social & team building activities including cool retreats…

See more jobs at Stay22

Apply for this job

13d

Senior Machine Learning Engineer, Financial Crimes (Cash App)

Square | Remote, CA, Remote
Bachelor's degree, airflow, Design, git, mysql, python, AWS

Square is hiring a Remote Senior Machine Learning Engineer, Financial Crimes (Cash App)

Job Description

The Financial Crimes Technology team at Cash App detects and reports illegal and suspicious activity on Cash App. We work globally with partners in Product, Counsel and Engineering to ensure we are providing a safe user experience for our customers while minimizing or eliminating bad activity on our platform.

We use Machine Learning and Generative AI as an important part of our toolkit. As Cash App scales, we monitor hundreds of billions of dollars in transactions across traditional payment and blockchain networks. Our machine learning systems monitor and surface suspicious activity (money laundering, illegal activity and terms of service violations) for agent review. Our systems block payments in real-time where appropriate. We use generative AI technologies to improve agent workflow and case review tools, by adding features that accelerate agent productivity and allow them to make more informed and accurate decisions. We are looking for a senior MLE that can integrate vertically into the ML sub-team and focus on building/enhancing tools, libraries, frameworks, developer environments etc. for ML modeling workflows.

This is an IC role reporting to the Data Science and ML Modeling Manager, with leadership responsibilities including driving strategic roadmaps and priorities to completion by collaborating with cross-functional stakeholders.

You will:

  • Design, build and enhance batch and real-time inference services and tooling that support our ML use cases
  • Facilitate modelers on the team by unblocking access to the infrastructure/tools necessary for development including MLOps
  • Develop prototypes and partner with ML modelers to encourage adoption of new tools and technologies and plan for future needs of our ML teams
  • Join a new and growing team and have a significant impact on influencing team culture.

Qualifications

You have:

  • 4+ years of combined Machine Learning and Engineering industry experience (full stack ML experience)
  • A Bachelor's degree in computer science, data science, operations research, applied math, stats, physics, or related technical field
  • Familiarity with Linux/OS X command line, version control software (git), and software development principles with a machine learning software development life-cycle orientation.
  • Experience working with product, business, and engineering to prioritize, scope, design, and deploy ML models
  • Familiarity with Python computing stack, MySQL, Snowflake, Airflow, Java/Go
  • Experience hosting models for inference on public clouds like GCP and AWS, and/or building micro-services to facilitate event-based triggering, feature generation, model inference, and downstream actioning.

See more jobs at Square

Apply for this job

13d

Programmatic Ads Data Science Lead

Square | San Francisco, CA, Remote
Bachelor degree, tableau, airflow, sql, Design, python

Square is hiring a Remote Programmatic Ads Data Science Lead

Job Description

The Data Science team at Cash App is customer-obsessed, collaborates intensely with other key disciplines, and always makes decisions with an eye towards Cash App’s business as a whole. The goal of the DS team is to derive valuable insights from our extremely unique datasets and turn those insights into actions that improve the experience for our customers every day. 

We’re hiring a Data Science Lead to join the Commerce team, supporting our programmatic ads business. We have developed a robust onsite advertising business on Afterpay and now we’re looking to build the next generation of our advertising business that will connect our merchants and customers more deeply off-platform. 

We’re seeking an exceptional Data Scientist to help define the strategy for this business and lay the foundation for transformative, highly complex, “0 to 1” features. You will play a critical role in accelerating Cash App’s growth by developing a strategic thought partnership with our teams in Product and Sales, and by leveraging data to enable us to achieve our roadmap goals, make effective spend decisions across marketing channels, and understand the impact of incentives on Cash App users.

You will:

  • Build models to optimize our programmatic marketing efforts to ensure our spend has the best possible ROI
  • Design and analyze experiments to evaluate the impact of marketing campaigns we launch
  • Analyze large datasets using SQL and scripting languages to surface actionable insights and opportunities to the Marketing product team and other key stakeholders
  • Approach problems from first principles, using a variety of statistical and mathematical modeling techniques to research and understand customer behavior & segments
  • Build, forecast, and report on metrics that drive strategy and facilitate decision making for key business initiatives
  • Write code to effectively process, cleanse, and combine data sources in unique and useful ways, often resulting in curated ETL datasets that are easily used by the broader team
  • Build and share data visualizations and self-serve dashboards for your partners
  • Effectively communicate your work with team leads and cross-functional stakeholders on a regular basis

Qualifications

You have:

  • A bachelor’s degree in statistics, data science, or a similar STEM field with 7+ years of experience in a relevant role OR
  • A graduate degree in statistics, data science, or similar STEM field with 5+ years of experience in a relevant role
  • Advanced proficiency with SQL and data visualization tools (e.g. Tableau, Looker, etc)
  • Experience with scripting and data analysis programming languages, such as Python or R
  • Extensive experience with causal inference techniques and off-platform data
  • A knack for turning ambiguous problems into clear deliverables and actionable insights 
  • Deep experience with cohort and funnel analyses, and a deep understanding of statistical concepts such as selection bias, probability distributions, and conditional probabilities

Technologies we use and teach:

  • SQL, Snowflake, etc.
  • Python (Pandas, Numpy)
  • Tableau, Airflow, Looker, Mode, Prefect

See more jobs at Square

Apply for this job

13d

Senior Software / Data Engineer

DataTribe | Remote
agile, nosql, airflow, sql, c++, docker, postgresql, kubernetes, linux, python, AWS, backend

DataTribe is hiring a Remote Senior Software / Data Engineer


See more jobs at DataTribe

Apply for this job

13d

Data Analyst, Marketing

Bachelor's degree, tableau, airflow, sql, Design, c++, python, AWS

hims & hers is hiring a Remote Data Analyst, Marketing

Hims & Hers Health, Inc. (better known as Hims & Hers) is the leading health and wellness platform, on a mission to help the world feel great through the power of better health. We are revolutionizing telehealth for providers and their patients alike. Making personalized solutions accessible is of paramount importance to Hims & Hers and we are focused on continued innovation in this space. Hims & Hers offers nonprescription products and access to highly personalized prescription solutions for a variety of conditions related to mental health, sexual health, hair care, skincare, heart health, and more.

Hims & Hers is a public company, traded on the NYSE under the ticker symbol “HIMS”. To learn more about the brand and offerings, you can visit hims.com and forhers.com, or visit our investor site. For information on the company’s outstanding benefits, culture, and its talent-first flexible/remote work approach, see below and visit www.hims.com/careers-professionals.

​​About the Role:

As an Analyst on the Marketing Analytics team at Hims & Hers, you would be responsible for analyzing marketing campaign performance, implementing measurement and attribution strategies, and delivering insights and recommendations around business growth drivers. You will partner cross-functionally with Engineering, Product, and Marketing to drive acquisition and customer value growth for the business.

You Will:

  • Analyze detailed marketing campaign performance, customer data, and business growth levers to identify insights and tactical recommendations
  • Investigate the root causes of data anomalies and trends
  • Present key findings and recommendations to a wide range of audiences
  • Develop and automate dashboards and reporting that are critical to business stakeholder decision-making
  • Contribute to marketing attribution models and measurement initiatives such as test planning and design
  • Enable and evangelize self-service analytics through which we empower stakeholders to explore data and perform high-level analysis

You Have:

  • 1-3+ years experience as an analyst, writing complex queries, visualizing results, and delivering insights
  • Bachelor's degree in Business Administration, Marketing, Computer Science, Engineering, or related field, or relevant years of work experience
  • Strong SQL skills are a requirement; must be able to understand and author complex SQL queries, understand database schemas, and optimize query performance
  • Hands-on experience working with a modern analytics tech stack spanning comparable tools across the following categories:
    • Business intelligence/reporting tools: (Looker, Tableau, Mode, etc.)
    • Data warehouses: (Google BigQuery, AWS, Snowflake, etc.)
    • Scheduling or orchestration: (dbt, Airflow, etc.)
  • An understanding of the marketing business domain, particularly applications such as conversion lift testing, multi-touch attribution, and Marketing Mix Modeling (MMM)
  • Experience with web analytics (e.g. Amplitude, Google Analytics, etc.) and app analytics tools (e.g. Branch, AppsFlyer, etc.)
  • Familiarity with statistical experimentation and A/B testing principles
  • Python proficiency preferred: familiarity with Jupyter notebooks, pandas, scikit-learn, etc.
  • Results-oriented with strong project management and communications skills

Our Benefits (there are more but here are some highlights):

  • Competitive salary & equity compensation for full-time roles
  • Unlimited PTO, company holidays, and quarterly mental health days
  • Comprehensive health benefits including medical, dental & vision, and parental leave
  • Employee Stock Purchase Program (ESPP)
  • Employee discounts on hims & hers & Apostrophe online products
  • 401k benefits with employer matching contribution
  • Offsite team retreats

#LI-Remote

Outlined below is a reasonable estimate of H&H’s compensation range for this role for US-based candidates. If you're based outside of the US, your recruiter will be able to provide you with an estimated salary range for your location.

The actual amount will take into account a range of factors that are considered in making compensation decisions including but not limited to skill sets, experience and training, licensure and certifications, and location. H&H also offers a comprehensive Total Rewards package that may include an equity grant.

Consult with your Recruiter during any potential screening to determine a more targeted range based on location and job-related factors. We don’t ever want the pay range to act as a deterrent from you applying!

An estimate of the current salary range for US-based employees is
$90,000 - $117,000 USD

We are focused on building a diverse and inclusive workforce. If you’re excited about this role, but do not meet 100% of the qualifications listed above, we encourage you to apply.

Hims is an Equal Opportunity Employer and considers applicants for employment without regard to race, color, religion, sex, orientation, national origin, age, disability, genetics or any other basis forbidden under federal, state, or local law. Hims considers all qualified applicants in accordance with the San Francisco Fair Chance Ordinance.

Hims & hers is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, you may contact us at accommodations@forhims.com. Please do not send resumes to this email address.

For our California-based applicants – Please see our California Employment Candidate Privacy Policy to learn more about how we collect, use, retain, and disclose Personal Information. 

See more jobs at hims & hers

Apply for this job

13d

Manager, Product Analytics

tableau, airflow, sql, Design, c++, python

hims & hers is hiring a Remote Manager, Product Analytics

Hims & Hers Health, Inc. (better known as Hims & Hers) is the leading health and wellness platform, on a mission to help the world feel great through the power of better health. We are revolutionizing telehealth for providers and their patients alike. Making personalized solutions accessible is of paramount importance to Hims & Hers and we are focused on continued innovation in this space. Hims & Hers offers nonprescription products and access to highly personalized prescription solutions for a variety of conditions related to mental health, sexual health, hair care, skincare, heart health, and more.

Hims & Hers is a public company, traded on the NYSE under the ticker symbol “HIMS”. To learn more about the brand and offerings, you can visit hims.com and forhers.com, or visit our investor site. For information on the company’s outstanding benefits, culture, and its talent-first flexible/remote work approach, see below and visit www.hims.com/careers-professionals.

​​About the Role:

As a Manager of Product Analytics you and your team will shape the customer experience through high quality experimental design and hypothesis testing. You will work cross-functionally with product managers, growth leads, and marketing managers in a fast paced collaborative environment. Your knowledge of A/B testing and digital analytics combined with your background in experimental design will allow Hims and Hers to build best-in-class customer experiences. This position will report to the Senior Manager of Product Analytics.

You Will:

  • Build best-in-class customer experiences by testing into sticky product experiences
  • Build business-facing dashboards and do in-depth analyses that state both the statistical significance and the business impact of an experiment (a minimal significance-test sketch follows this list)
  • Work with your team to define and curate the experimentation roadmap for the product and growth teams
  • Enable data self-service by designing templates that are easy to understand using relevant KPIs
  • Define the success of tests as well as recommended improvements to the product from the results
  • Collaborate cross-functionally across analytics teams, engineering teams, and the growth team to improve the customer experience
  • Distill our knowledge of tests into playbooks that can be implemented and utilized to help us transform our digital experience
  • Segment users based on demographic, behavioral, and psychographic attributes to tailor product experiences and lifecycle communications
  • Partner with cross-functional teams including product, engineering, growth, and finance to align analytics initiatives with business objectives
  • Conduct deep-dive analyses to answer specific business questions and provide actionable recommendations to product, marketing, and operational teams
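For concreteness on the significance-and-impact bullet above, here is a minimal sketch of a two-proportion z-test for a conversion-rate experiment. The counts are invented for illustration, and the use of statsmodels is an assumption rather than a tool named in the posting.

    # Hypothetical A/B test readout: statistical significance plus business-facing lift.
    # Conversion counts below are made up for illustration.
    from statsmodels.stats.proportion import proportions_ztest

    conversions = [480, 540]   # control (A), treatment (B)
    samples = [10000, 10000]   # users exposed to each variant

    z_stat, p_value = proportions_ztest(count=conversions, nobs=samples)

    rate_a = conversions[0] / samples[0]
    rate_b = conversions[1] / samples[1]
    absolute_lift = rate_b - rate_a

    print(f"control={rate_a:.2%}  treatment={rate_b:.2%}")
    print(f"absolute lift={absolute_lift:.2%}  relative lift={absolute_lift / rate_a:.1%}")
    print(f"z={z_stat:.2f}  p={p_value:.4f}  significant at 5%: {p_value < 0.05}")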

You Have:

  • 5+ years of analytics experience
  • 4+ years of experience in A/B testing
  • Experience working with subscription metrics
  • Experience working with CRM data and lifecycle communication
  • A strong work ethic and the drive to learn more and understand a problem in detail
  • Strong organizational skills with an aptitude to manage long-term projects from end to end
  • Strong SQL skills 
  • Expertise working with GitHub
  • Experience programming in Python, SAS, or R 
  • Experience in data modeling and statistics with a strong knowledge of experimental design and statistical inference 
  • Advanced knowledge of data visualization and BI in Looker or Tableau
  • Ability to explain technical analyses to non-technical audiences

A Big Plus If You Have:

  • 6+ years of analytics experience
  • Advanced degree in Statistics, Mathematics, or a related field
  • Experience with incentives and loyalty programs
  • Experience with price testing and modeling price elasticity
  • Experience with telehealth concepts
  • Project management experience 
  • Extensive experience working with Data Engineering
  • Model development and training (Predictive Modeling)
  • dbt, Airflow, and Databricks experience

Our Benefits (there are more but here are some highlights):

  • Competitive salary & equity compensation for full-time roles
  • Unlimited PTO, company holidays, and quarterly mental health days
  • Comprehensive health benefits including medical, dental & vision, and parental leave
  • Employee Stock Purchase Program (ESPP)
  • Employee discounts on hims & hers & Apostrophe online products
  • 401k benefits with employer matching contribution
  • Offsite team retreats

#LI-Remote

Outlined below is a reasonable estimate of H&H’s compensation range for this role for US-based candidates. If you're based outside of the US, your recruiter will be able to provide you with an estimated salary range for your location.

The actual amount will take into account a range of factors that are considered in making compensation decisions including but not limited to skill sets, experience and training, licensure and certifications, and location. H&H also offers a comprehensive Total Rewards package that may include an equity grant.

Consult with your Recruiter during any potential screening to determine a more targeted range based on location and job-related factors. We don’t ever want the pay range to deter you from applying!

An estimate of the current salary range for US-based employees is
$144,000 - $166,500 USD

We are focused on building a diverse and inclusive workforce. If you’re excited about this role, but do not meet 100% of the qualifications listed above, we encourage you to apply.

Hims is an Equal Opportunity Employer and considers applicants for employment without regard to race, color, religion, sex, sexual orientation, national origin, age, disability, genetics, or any other basis forbidden under federal, state, or local law. Hims considers all qualified applicants in accordance with the San Francisco Fair Chance Ordinance.

Hims & hers is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, you may contact us at accommodations@forhims.com. Please do not send resumes to this email address.

For our California-based applicants – Please see our California Employment Candidate Privacy Policy to learn more about how we collect, use, retain, and disclose Personal Information. 

See more jobs at hims & hers

Apply for this job

16d

Sr Data Engineer GCP

Ingenia AgencyMexico Remote
Bachelor's degree5 years of experience3 years of experienceairflowsqlapipython

Ingenia Agency is hiring a Remote Sr Data Engineer GCP


At Ingenia Agency we’re looking for a Sr Data Engineer to join our team.

Responsible for creating and sustaining pipelines that allow for the analysis of data.

What will you be doing?

  • Sound understanding of Google Cloud Platform.
  • Should have worked on BigQuery, Workflows, or Composer.
  • Should know how to reduce BigQuery costs by reducing the amount of data processed by queries (a brief illustration follows this list).
  • Should be able to speed up queries by using denormalized data structures, with or without nested repeated fields.
  • Exploring and preparing data using BigQuery.
  • Experience in delivering artifacts such as Python scripts, Dataflow components, SQL, Airflow DAGs, and Bash/Unix scripts.
  • Building and productionizing data pipelines using Dataflow.
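
As a rough sketch of the cost-control and denormalization ideas mentioned above, the snippet below queries a date-partitioned table with a nested repeated field and uses a dry run to check how many bytes the query would scan. The project, dataset, table, and field names are invented for illustration:

```python
# Illustrative only: project, dataset, table, and field names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

# Filtering on the partition column limits the bytes scanned (and therefore cost);
# UNNEST reads the nested repeated field without joining a second table.
sql = """
SELECT
  order_id,
  item.sku,
  item.quantity
FROM `my_project.sales.orders`,   -- date-partitioned on order_date
  UNNEST(line_items) AS item      -- nested, repeated STRUCT field
WHERE order_date BETWEEN '2024-01-01' AND '2024-01-31'
"""

# Dry run: estimate bytes processed before actually running (and paying for) the query.
dry_run = client.query(
    sql, job_config=bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
)
print(f"Estimated bytes processed: {dry_run.total_bytes_processed:,}")

for row in client.query(sql).result():
    print(row.order_id, row.sku, row.quantity)
```

Partition filters and dry-run estimates like these are the usual levers for keeping BigQuery spend proportional to the data a query actually needs.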

What are we looking for?

  • Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or related field.
  • Open to candidates of any age.
  • 3 to 5 years of experience in GCP is required.
  • Must have excellent GCP, BigQuery, and SQL skills.
  • Should have at least 3 years of experience with BigQuery and Dataflow, plus experience with Python and Google Cloud SDK/API scripting to create reusable frameworks.
  • Candidate should have strong hands-on experience in PowerCenter.
  • In depth understanding of architecture, table partitioning, clustering, type of tables, best practices.
  • Proven experience as a Data Engineer, Software Developer, or similar.
  • Expert proficiency in Python, R, and SQL.
  • Candidates with Google Cloud certification will be preferred
  • Excellent analytical and problem-solving skills.
  • A knack for independent and group work.
  • Capacity to successfully manage a pipeline of duties with minimal supervision.
  • Advanced English.
  • Be Extraordinary!

What are we offering?

  • Competitive salary
  • Law benefits:
    • 10 days of vacation once the first year is completed
    • IMSS
  • Additional benefits:
    • Contigo Membership (minor medical expenses insurance)
      • Personal accident policy.
      • Funeral assistance.
      • Dental and visual health assistance.
      • Emotional wellness.
      • Benefits & discounts.
      • Network of medical services and providers with a discount.
      • Medical network with preferential prices.
      • Roadside assistance at a preferential price, among others.
    • 3 special permits a year (half-day each) to attend personal appointments or procedures
    • Half day off for birthdays
    • 5 additional vacation days in case of marriage
    • 50% scholarship for language courses at the Anglo
    • Partial scholarship for graduate or master's studies with the Tec. de Mty.
    • Agreement with a ticketing company for preferential rates on entertainment events.

See more jobs at Ingenia Agency

Apply for this job

16d

Data Scientist

In All Media IncMexico Remote
7 years of experienceairflowsqljenkinspythonAWS

In All Media Inc is hiring a Remote Data Scientist

As a Data Scientist, your expertise lies in comprehending both the technical and business facets of Data Analysis, Artificial Intelligence, and Machine Learning within the digital media industry. Your responsibility is to lead the execution of data science projects within the defined scope, while also anticipating and handling unforeseen DS/AI/ML needs in a dynamic setting.

Responsibilities Overview:

  1. Collaborate with business stakeholders to gather, examine, and transform business goals into problem statements for data science, artificial intelligence (AI), and machine learning (ML) projects.
  2. Convert problem statements in data science into technical specifications by defining calculations, custom groups, parameters, filtering criteria, aggregations, clusters, and other relevant components.
  3. Coordinate with the data engineering team to collect, process, cleanse, and validate multivariate data from internal or external systems, adhering to the technical specifications.
  4. Choose features, optimize classifiers, perform statistical analysis, develop diverse data models using machine learning techniques to identify data patterns and trends, and generate actionable insights and key performance indicators (KPIs).
  5. Deliver insights through intuitive presentations, interfaces, infographics, and visualizations, clearly conveying the business implications to stakeholders and leadership.
  6. Leverage consumer behavior data to uncover innovative product insights, driving consumer engagement and revenue growth for the business.
  7. Assist in segmenting consumers, predicting churn, and constructing recommendation systems.
  8. Build automated anomaly detection systems and continuously monitor their performance.
  9. Conduct ongoing evaluations of machine learning technology solutions, ensuring alignment with business objectives, identifying potential risks, and identifying areas for improvement within the current environment.

Must Haves:

To be successful in this role, we require an individual with the following qualifications and skills:

  • A minimum of 4 years of top-tier experience in Data Science/Machine Learning, and 7 years of experience in Data Analysis on cloud Infrastructure (predominantly AWS and preferred GCP), using languages such as R, Python, and SQL, among others.
  • An excellent understanding of the practical application of machine learning techniques and algorithms, including k-NN, Naive Bayes, SVM, Decision Tree, Random Forests, and others.
  • The ability to create, execute, and analyze complex AB/MVT test constructs to test hypotheses.
  • Experience using Amazon SageMaker/Tensorflow/Google Cloud AutoML/Keras.
  • Experience with Spark Streaming (Databricks) or other data science focused data processing platforms for real-time big data analytics.
  • Familiarity with other data engineering tools and systems such as Snowflake, Airflow, Jenkins, Github, and others.
  • Comfortable working in a fast-paced, high-tech environment (preferably in software development) and able to navigate conflicting priorities and ambiguous problems.
  • Proficient with data visualization tools such as Looker and Tableau.
  • Working knowledge of digital media ecosystems, including how digital video streaming, ad servers, DSPs, SSPs, Log analytics, and other related areas function.
  • Possesses a data-driven mindset, with excellent communication and collaboration skills that enable interaction with technical and non-technical stakeholders.

See more jobs at In All Media Inc

Apply for this job

20d

Data Engineer with TS/SCI Clearance

Maania Consultancy ServicesPartial Remote/Washington DC, DC
agilejiraairflowsqldockerelasticsearchpostgresqlkubernetesAWSjavascript

Maania Consultancy Services is hiring a Remote Data Engineer with TS/SCI Clearance


See more jobs at Maania Consultancy Services

Apply for this job

22d

Senior Data Engineer

Nile BitsCairo, Egypt, Remote
agileairflowsqlDesigndockerlinuxpythonAWS

Nile Bits is hiring a Remote Senior Data Engineer

Job Description

  • Designing and implementing core functionality within our data pipeline in order to support key business processes
  • Shaping the technical direction of the data engineering team
  • Supporting our Data Warehousing approach and strategy
  • Maintaining our data infrastructure so that our jobs run reliably and at scale
  • Taking responsibility for all parts of the data ecosystem, including data governance, monitoring and alerting, data validation, and documentation
  • Mentoring and upskilling other members of the team

Qualifications

  • Experience building data pipelines and/or ETL processes
  • Experience working in a Data Engineering role
  • Confident writing performant and readable code in Python, building upon the rich Python ecosystem wherever it makes sense to do so.
  • Good software engineering knowledge & skills: OO programming, design patterns, SOLID design principles and clean code
  • Confident writing SQL and good understanding of database design.
  • Experience working with web APIs.
  • Experience leading projects from a technical perspective
  • Knowledge of Docker, shell scripting, working with Linux
  • Experience with a cloud data warehouse
  • Experience in managing deployments and implementing observability and fault tolerance in cloud based infrastructure (i.e. CI/CD, Infrastructure as Code, container-based infrastructure, auto-scaling, monitoring and alerting)
  • Pro-active with a self-starter mindset; able to identify elegant solutions to difficult problems and able to suggest new and creative approaches.
  • Analytical, problem-solving and an effective communicator; leveraging technology and subject matter expertise in the business to accelerate our roadmap.
  • Able to lead technical discussions, shape the direction of the team, identify opportunities for innovation and improvement
  • Able to lead and deliver projects, ensuring stakeholders are kept up-to-date through regular communication
  • Willing to support the rest of the team when necessary, sharing knowledge and best practices, documenting design decisions, etc.
  • Willing to step outside your comfort zone to broaden your skills and learn new technologies.
  • Experience working with open-source orchestration frameworks like Airflow (a minimal example follows this list) or data analytics tools such as dbt
  • Experience with AWS services or those of another cloud provider
  • Experience with Snowflake
  • Good understanding of Agile
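
For orientation, a minimal Airflow DAG of the kind this role would own might look like the sketch below; the schedule, task names, and data are invented, and the TaskFlow decorators shown assume a recent Airflow 2.x release:

```python
# Minimal, illustrative Airflow DAG; schedule, task names, and data are invented.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def nightly_orders_pipeline():
    @task
    def extract():
        # A real pipeline would pull from a web API or object storage here.
        return [{"order_id": 1, "amount": 42.0}, {"order_id": 2, "amount": 0.0}]

    @task
    def transform(rows):
        # Drop zero-value orders; real logic would live in tested, reusable modules.
        return [r for r in rows if r["amount"] > 0]

    @task
    def load(rows):
        # A real pipeline would write to the warehouse (e.g. Snowflake) here.
        print(f"loaded {len(rows)} rows")

    load(transform(extract()))


nightly_orders_pipeline()
```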

See more jobs at Nile Bits

Apply for this job

24d

Senior Site Reliability Engineer

AdwerxDurham, NC Remote
terraformairflowRabbitMQDesignqarubydockermysqlkubernetesNode.js

Adwerx is hiring a Remote Senior Site Reliability Engineer

Durham, NC or Remote

Adwerx is on the lookout for a Site Reliability Engineer to join our small and talented infrastructure team and help us design, build, and automate performant, resilient, and highly-available systems that our teams and customers rely on. In this role, you’ll help us run a handful of mature (and in some cases brand-new) services in the cloud and apply your skills to make them resilient, performant, and highly-available during the rapid adoption of our products. The infrastructure you’ll build has a large impact on an organization that is focused on software development best practices and standards.

The starting title for this experienced role will be based on tenure/experience/work history.

Our culture

Adwerx is a place where you can thrive in our highly collaborative teams and where everyone is encouraged to contribute ideas across all levels of the organization.

Our engineering charter is centered around humility, respect, and trust. We abide by the mantra “if it’s not in version control, it doesn’t exist”, strive to write documentation our peers will love, and always try to leave things better than we found them. We employ testing and continuous delivery for all our services and empower our developers to iterate and deploy as often as they need.

Infrastructure engineers share an on-call schedule, but our systems are stable and fire drills are rare. We host lunch and learns, conduct blameless post-mortems and regularly recognize our peers with shout outs and a fun badge program to recognize leaders in specific technical disciplines.

How we work

We apply the Agile/Scrum methodology to run day-to-day projects at Adwerx, and our product development process is heavily inspired by the “Shape Up” process. In addition we:

  • Utilize a mature CI/CD process and deploy to production many times a day.
  • Have production-like QA environments with a culture of writing automated tests.
  • Define department SLOs and Engineering KPIs to better understand how we work.
  • Relentlessly strive for excellence with not only the products we build but also the health of our codebase and our developer ecosystem.

Technologies we work with

  • Our primary application is built with Ruby on Rails. You’ll also encounter or work with Node.js, Go, and Python.
  • Our production systems run primarily in Google Cloud Platform, though we also have a small footprint in Amazon Web Services.
  • Besides our primary application, some services you will support include our VPN/Tailscale, CI/CD pipelines, Google Kubernetes Engine clusters, MySQL databases, Airflow, RabbitMQ, and Redshift.
  • Some tools we use include Terraform, Kubernetes, Datadog, Helm, Nginx, Docker, NewRelic, and CircleCI.

In this mission-critical role, you will:

  • Design, build, and maintain the core infrastructure for Adwerx
  • Create, maintain, and/or iterate on various workloads in Google Kubernetes Engine
  • Contribute to the Ruby on Rails monolith to upgrade dependencies, integrate with infrastructure features, or optimize performance
  • Maintain reliable network paths and connections between all external and internal services (DNS, VPN, VPC peering)
  • Participate and run point in handling production incidents
  • Participate in solution design for new features, products, systems, and tooling
  • Find new ways to use existing systems to improve scalability and performance for our platform
  • Interact with the larger organization to ensure the uptime and reliability of our infrastructure
  • Iterate on security standards and review code for secure coding practices
  • Partner with engineering teams closely to educate and consult
  • Continually monitor application/system performance and costs (SLOs), generate actionable insights and either implement or advocate for them
  • Participate in on-call rotations, along with every member of the engineering team
  • Work closely with engineering teams to conduct root cause analyses for production incidents and make plans to remediate or prevent recurrences
  • Collaboratively plot the course and document Adwerx infrastructure
  • Build a great customer experience for people using your infrastructure

What You’ll Get:

  • Competitive salary and potential for equity.
  • Comprehensive medical, dental, and vision plan options (100% of basic plan premiums paid by company)
  • 401(k) plan with a company match of up to 4%
  • A collaborative work environment where you’ll learn about and influence every aspect of the business
  • The opportunity to work with and learn from talented leaders, developers, marketers, and designers, plus advancement opportunities.
  • The ability to help define the foundational technology that will power the growth of our business
  • Flexible work scheduling

See more jobs at Adwerx

Apply for this job

25d

Software Engineer, Data

JW PlayerUnited States - Remote
agileairflowjavadockerelasticsearchkubernetespythonAWSbackend

JW Player is hiring a Remote Software Engineer, Data

About JWP:

JWP is transforming the Digital Video Economy as a trusted partner for over 40,000 broadcasters, publishers, and video-driven brands through our cutting-edge video software and data insights platform. JWP empowers customers with unprecedented independence and control over their digital video content. Established in 2004 as an open-source video player, JWP has evolved into the premier force driving digital video for businesses worldwide. With a rich legacy of pioneering video technology, JWP customers currently generate 8 billion video impressions/month and 5 billion minutes of videos watched/month. At JWP, everyone shares a passion for revolutionizing the digital video landscape. If you are ready to be a part of a dynamic and collaborative team then join us in shaping the future of video! 

The Data Engineering Team: 

At JWP, our data team is a dynamic and innovative team, managing the data lifecycle, from ingestion to processing and analysis, touching every corner of our thriving business ecosystem. Engineers on the team play a pivotal role in shaping the company's direction by making key decisions about our infrastructure, technology stack, and implementation strategies. 

The Opportunity: 

We are looking to bring on a Software Engineer to join our Data Engineering team. As an Engineer on the team, you will be diving into the forefront of cutting-edge big data tools and technology. In this role, you will have the opportunity to partner closely with various teams to tackle crucial challenges for one of the world's largest and rapidly expanding video companies. Join us and make an impact at the forefront of digital innovation.

As a Data Engineer, you will:

  • Contribute to the development of distributed batch and real-time data infrastructure.
  • Mentor and work closely with junior engineers on the team. 
  • Perform code reviews with peers. 
  • Lead small to medium-sized projects, including documentation and ticket writing.
  • Collaborate closely with Product Managers, Analysts, and cross-functional teams to gather insights and drive innovation in data products. 

Requirements for the role:

  • Minimum 3+ years of backend engineering experience with a passionate interest in big data.
  • Expertise with Python or Java and SQL. 
  • Familiarity with Kafka
  • Experience with a range of datastores, from relational to key-value to document
  • Demonstrate humility, empathy, and a collaborative spirit that fuels team success. 

Bonus Points:

  • Data engineering experience, specifically with data modeling, warehousing and building ETL pipelines
  • Familiarity with AWS - in particular, EC2, S3, RDS, and EMR
  • Familiarity with Snowflake
  • Familiarity with Elasticsearch
  • Familiarity with data processing tools like Hadoop, Spark, Kafka, and Flink
  • Experience with Docker, Kubernetes, and application monitoring tools
  • Experience and/or training with agile methodologies
  • Familiarity with Airflow for task and dependency management

Perks of being at JWP, United States

Our goal is to take care of you and ensure you will be successful in your new role. Your success is our success! 

As a full time employee, you will qualify for:

  • Private Medical, Vision and Dental Coverage for you and your family
  • Unlimited Paid Time Off
  • Stock Options Purchase Program
  • Quarterly and Annual Team Events
  • Professional Career Development Program and Career Development Progression
  • New Employee Home Office Setup Stipend
  • Monthly Connectivity Stipend
  • Free and discounted perks through JWP's benefit partners
  • Bi-Annual Hack Weeks for those who are interested in using their coding knowledge
  • Fireside chats with individuals throughout JWP

*Benefits are subject to location and can change at the discretion of the Company. 


We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

See more jobs at JW Player

Apply for this job