airflow Remote Jobs

140 Results

1h

Distributed Cloud | Google Data Project

Devoteam | Lisboa, Portugal, Remote
Bachelor's degree, terraform, airflow, sql, azure, java, docker, python, AWS

Devoteam is hiring a Remote Distributed Cloud | Google Data Project

Job Description

Devoteam Distributed Cloud is our Google, AWS, and Azure strategy and identity within the Devoteam group. We focus on developing end-to-end solutions across all three major cloud platforms and their technologies.

Our Devoteam Google Cloud Team is looking for a Cloud Data Engineer to join our team of Data Engineering specialists.

  • Delivery of Data projects more focused on the Engineering component;
  • Working with GCP Data Services such as BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub and Dataplex;
  • Write efficient SQL queries;
  • Develop data processing pipelines using programming frameworks like Apache Beam, along with CI/CD automation;
  • Automate data engineering tasks;
  • Building and managing data pipelines, with a deep understanding of workflow orchestration, task scheduling, and dependency management;
  • Data Integration and Streaming, including data ingestion from various sources (such as databases, APIs, or logs) into GCP. 
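For a sense of what a simple Beam-based pipeline of the kind listed above can look like, here is a minimal illustrative sketch in Python; the sample events, field names, and output path are invented, and a real GCP deployment would typically swap the Create source and text sink for the Pub/Sub and BigQuery connectors in apache_beam.io.

```python
# Minimal, illustrative Apache Beam pipeline (Python SDK).
# The input records and output path are made up for the example.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def to_user_amount(raw: str):
    """Parse a raw JSON event into a (user_id, amount) pair."""
    event = json.loads(raw)
    return event["user_id"], float(event["amount"])


def run() -> None:
    options = PipelineOptions()  # with Dataflow you would set the runner/project here
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # In a real GCP pipeline this Create(...) would typically be
            # beam.io.ReadFromPubSub or a read from Cloud Storage.
            | "ReadRawEvents" >> beam.Create([
                '{"user_id": "u1", "amount": "10.5"}',
                '{"user_id": "u2", "amount": "3.0"}',
                '{"user_id": "u1", "amount": "7.5"}',
            ])
            | "ParseJson" >> beam.Map(to_user_amount)
            | "SumPerUser" >> beam.CombinePerKey(sum)
            | "Format" >> beam.MapTuple(lambda user, total: f"{user},{total}")
            # WriteToBigQuery would be the usual sink in the stack described above.
            | "WriteCsv" >> beam.io.WriteToText("/tmp/user_totals", file_name_suffix=".csv")
        )


if __name__ == "__main__":
    run()
```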

Qualifications

  • Bachelor's degree in IT or a similar field;
  • 2+ years of professional experience in a data engineering role;
  • Experience with GCP Data Services;
  • Data warehousing knowledge;
  • Knowledge of programming languages such as Python, Java, and SQL (mandatory);
  • Experience with tools like Apache Airflow, Google Cloud Composer, or Cloud Data Fusion;
  • A code-review mindset;
  • Experience with Terraform, GitHub, Github Actions, Bash, and/or Docker;
  • Knowledge of streaming data processing using tools like Apache Kafka;
  • GCP certifications (a plus);
  • Proficiency in English (written and spoken).

Note: All of the above qualifications are optional, but it is preferred that candidates have some of them.

See more jobs at Devoteam

Apply for this job

1d

Senior Site Reliability Engineer

Catalyst | Remote (US & Canada)
kotlin, terraform, airflow, Design, ansible, ruby, java, docker, elasticsearch, postgresql, kubernetes, linux, python, AWS, backend, Node.js

Catalyst is hiring a Remote Senior Site Reliability Engineer

Company Overview

Totango + Catalyst have joined forces to build a leading customer growth platform that helps businesses protect and grow their revenue. Built by an experienced team of industry leaders, our software integrates with all the tools CS teams already use to provide one centralized view of customer data.  Our modern and intuitive dashboards help CS leaders develop impactful workflows and take the right actions to understand health, prevent churn, increase adoption, and drive expansion.

Position Overview

As a Senior Site Reliability Engineer at Totango + Catalyst, you will help shape our infrastructure and build the foundation our team relies on for the rapid delivery of our product. We’ll depend on you to instill best practices for building scalable distributed systems, emphasizing development experience, observability and fault tolerance. Our current stack consists of technologies such as Ruby on Rails, RDS, Elasticsearch, Java, and Kubernetes, and we are moving towards microservices and serverless.  If you thrive in a growth-stage startup environment and are looking for more ownership and the ability to have a significant impact, we would love to meet you.

This role is open to candidates working remotely anywhere in Canada and the U.S.

What You’ll Do

  • Manage our AWS infrastructure, with an emphasis on configuration as code.
  • Keep our site and our services up and running, or get them back up and running quickly when a failure occurs
  • Improve monitoring and work with developers to improve performance and reliability
  • Participate in technical design reviews and architecture planning
  • Debug complex problems across the entire stack and create solid solutions
  • Collaborate with product managers and developers to evolve our delivery pipeline
  • Work closely with internal partners and teams to ensure that we ship software that meets security, SLA, performance, and budget requirements
  • Help build our on-call policies and runbooks
  • Take ownership of projects and demonstrate a high level of accountability
  • Manage our data infrastructure and pipeline
  • Focus on quality, cost-effective scalability, and distributed system reliability and establish automated mechanisms
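As a rough illustration of the monitoring side of this list, the sketch below probes an HTTP endpoint and publishes the result as a custom CloudWatch metric with boto3. The URL, namespace, and dimension values are hypothetical; this is not Catalyst's tooling, just one common pattern.

```python
# Illustrative sketch only: probe an HTTP endpoint and publish the result as a
# custom CloudWatch metric. The URL, namespace, and dimension values are made up.
import time

import boto3
import requests

SERVICE_URL = "https://status.example.com/healthz"  # hypothetical endpoint
cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")


def probe_and_report() -> None:
    start = time.monotonic()
    try:
        response = requests.get(SERVICE_URL, timeout=5)
        healthy = 1.0 if response.status_code == 200 else 0.0
    except requests.RequestException:
        healthy = 0.0
    latency_ms = (time.monotonic() - start) * 1000.0

    # Custom metrics like these can back dashboards and alarms.
    cloudwatch.put_metric_data(
        Namespace="Example/SiteHealth",
        MetricData=[
            {"MetricName": "Healthy", "Value": healthy, "Unit": "Count",
             "Dimensions": [{"Name": "Service", "Value": "web"}]},
            {"MetricName": "ProbeLatency", "Value": latency_ms, "Unit": "Milliseconds",
             "Dimensions": [{"Name": "Service", "Value": "web"}]},
        ],
    )


if __name__ == "__main__":
    probe_and_report()
```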

Who You Are:

  • You are passionate about learning. Obstacles and challenges don’t deter you, you find these as opportunities to learn and grow.
  • You have a positive demeanor and a go-getter attitude! 
  • You are a strong team player. You collaborate well with others and want to work together toward common goals.
  • You are proactive in seeking opportunities to learn and identifying opportunities to improve our processes.



What You’ll Need

  • 5+ years of experience building and maintaining cloud infrastructure for distributed production systems
  • 1+ year of experience as a backend engineer developing enterprise web applications
  • Excellent communication skills, both verbal and written
  • You know your way around a Unix/Linux shell, can write shell scripts, and understand Linux internals
  • Experience debugging complex problems
  • Experience designing, building, and operating large-scale production systems
  • Proficiency in Bash, Python, or other scripting languages
  • Experience in databases and data warehouses
  • Experience with security requirements for SOC2/ISO
  • FinOps experience
  • Strong Project Management skills
  • A strong desire to show ownership of problems you identify
  • Optional: CKAD, CKS, or CKA certifications, AWS certifications

Technologies You’ll Need

  • Demonstrated experience with configuration and orchestration tools such as Terraform, CloudFormation and Ansible
  • Experience with containers, such as Docker 
  • Experience with administering, securing, and optimizing Kubernetes clusters
  • Experience building monitoring, observability, logging, and developer tooling
  • Experience with Helm, Kustomize, ArgoCD, Grafana, Prometheus, Thanos, VictoriaMetrics, Cilium, Linkerd, Envoy, AWS App Mesh, CoreDNS
  • Experience creating CI/CD Pipelines for different coding languages
  • Experience with one or more: Ruby on Rails, Python, Java, Kotlin, Go, Node.js
  • Experience with version control systems like GitHub
  • Familiarity with AWS services, AWS best practices and securing AWS accounts
  • Experience operating and tuning data stores such as PostgreSQL and Elasticsearch
  • Experience managing the infrastructure that backs data pipelines and data lakes, such as Airflow
  • Experience managing streaming infrastructure such as Kafka or Kinesis

Why You’ll Love Working Here!

  • Work from anywhere!
  • Highly competitive compensation package, including equity 
  • Comprehensive benefits, including up to 100% paid medical, dental, & vision insurance coverage for you & your loved ones
  • Open vacation policy, encouraging you to take the time you need
  • Monthly Mental Health Days and Mental Health Weeks twice per year 
  • Ability to influence and drive key technical and architectural decisions
  • High visibility and impact across the whole company

 

Your base pay is one part of your total compensation package and is determined within a range. The base salary for this role is from $140,000.00 - $175,000.00 per year. We take into account numerous factors in deciding on compensation, such as experience, job-related skills, relevant education or training, and other business and organizational requirements. The salary range provided corresponds to the level at which this position has been defined.

Totango + Catalyst is an equal opportunity employer, meaning that we do not discriminate based on race, religion, national origin, gender identity, age, sexual orientation, or any other protected class. Diversity is more than just good intentions; we are committed to creating an inclusive environment for all employees.

See more jobs at Catalyst

Apply for this job

2d

Senior Software Engineer I, Data

Narvar | Hybrid - Bangalore
golang, Bachelor's degree, scala, airflow, sql, Design, java, python, AWS

Narvar is hiring a Remote Senior Software Engineer I, Data

Narvar is growing! We are looking for a highly skilled and experienced Senior Software Engineer to join our Data Engineering team. In this role, you will lead, design and build data pipelines and systems that can efficiently store, process, and analyze large and complex datasets. 

Data products are at the heart of Narvar’s core business strategy and competitive advantage. The work you’ll do will impact Narvar’s whole business, our partners, and the lives of millions of consumers globally!

Narvar handles transactional data for more than 1,200 leading brands and retailers worldwide, which use our shipment tracking, returns, customer care, bidirectional multi-channel communication, and analytics products to transform their customers' post-purchase experiences.

Day-to-day

  • Develop and implement data pipelines and systems that can handle large volumes of data
  • Process TBs of data to deliver actionable insights and intelligence, using technologies such as Spark, Airflow, Google Pub/Sub, Pulsar, BigQuery, and DBT.
  • Collaborate with data scientists and other teams to integrate data into business processes and decision making
  • Maintain and optimize existing data systems for costs, ease of access, and data governance
  • Improve data quality by building tooling, testing, and observability pipelines.
  • Stay up to date with the latest advances in data engineering and implement new technologies as needed.
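As a small illustration of the Spark-based processing mentioned in the list above, here is a hedged PySpark sketch that rolls up raw tracking events into a daily aggregate; the bucket paths and column names are placeholders, not Narvar's actual schema.

```python
# Illustrative PySpark batch job; the input path and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("shipment-event-rollup").getOrCreate()

# Raw tracking events landed as JSON (e.g. from Pub/Sub or Pulsar via a landing zone).
events = spark.read.json("gs://example-bucket/raw/tracking_events/dt=2024-01-01/")

# Daily rollup of events per retailer and carrier status.
daily_rollup = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("retailer_id", "carrier_status", "event_date")
    .agg(
        F.count("*").alias("event_count"),
        F.countDistinct("order_id").alias("order_count"),
    )
)

# Write partitioned Parquet that a warehouse (e.g. BigQuery external tables) can pick up.
(daily_rollup
    .write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("gs://example-bucket/curated/tracking_daily_rollup/"))

spark.stop()
```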

What we're looking for

  • Bachelor's degree in Computer Science, Engineering, or a similar field
  • You have 7+ years of relevant experience 
  • Proficiency with Java, Golang, Scala, or Python
  • Strong knowledge of computer science fundamentals and data structures.
  • Expert SQL skills.
  • Hands-on experience building big data processing systems
  • Experience with Cloud technology stacks (e.g., GCP or AWS and their product offerings)
  • You have dealt with large amounts of data in production and have built distributed data processing using frameworks like Spark, Hadoop, Apache Beam, or Flink
  • Experience with large-scale data warehousing architecture, data lakes, and data modeling
  • Experience with Data Ops and data reliability
  • Experience with error handling, data validation, and dbt models.
  • Previous startup experience strongly preferred

Why Narvar?

We're on a mission to simplify the everyday lives of consumers. Post-purchase is a critical phase of the customer journey. That's why we created Narvar - a platform focused on driving customer loyalty through seamless post-purchase experiences that allow retailers to retain, engage, and delight customers. If you've ever bought something online, there's a good chance you've used our platform!

From the hottest new direct-to-consumer companies to retail’s most renowned brands, Narvar works with GameStop, Neiman Marcus, Sonos, Nike, and 1300+ other brands. With hubs in San Francisco, Atlanta, London, and Bangalore, we've served over 125 million consumers worldwide across 10+ billion interactions, 38 countries, and 55 languages.

Pioneering the post-purchase movement means navigating into the unknown. Our team thrives on this sense of adventure while nurturing a mindset of innovation. We're a home for big hearts and we leave our egos at the door. We work hard but we always make time to celebrate professional wins, baby showers, birthday parties, and everything in between.

We are an equal-opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

#LI-PR1

#LI-Hybrid

Please read our Privacy Policy to learn what personal information we collect in connection with your job application, and how we may use and share it. 

See more jobs at Narvar

Apply for this job

2d

Senior Data Engineer (Pessoa Engenheira de Dados Sênior)

Experian | São Paulo, Brazil, Remote
S3, Lambda, scala, airflow, java, python, AWS

Experian is hiring a Remote Senior Data Engineer (Pessoa Engenheira de Dados Sênior)

Job Description

Have you ever thought about helping to scale the data and MLOps platform behind the largest database in Latin America? If you are driven by innovation and enjoy working in continuous partnership with every area of the company, come join our team!
 
What will your day-to-day look like?

  • Direct involvement in the platform's architecture discussions;
  • Development of data products;
  • Interaction with the platform's main users to map and understand pain points, translating them into the platform's development backlog;
  • Identification and resolution of problems related to the projects you are working on;
  • Identification of opportunities to improve the operational and financial efficiency of the platforms you work on.

You will be responsible for:

  • Developing a data platform that consumes data from multiple sources, both batch and real-time;
  • Developing APIs for data access and consumption (data products);
  • Disseminating knowledge about data best practices (data contracts, federated governance, and platform engineering);
  • Ensuring the architecture supports the needs and technical requirements of the delivery team, working together with IT teams (Architecture, Security, Infrastructure) and with business areas;
  • Providing technical leadership for the team, steering it toward good data and software engineering practices.

Qualifications

What are we looking for in you?

  • Autonomy and initiative in solving problems;
  • Good communication and a solution-oriented profile;
  • Spark (Batch & Streaming);
  • Proficiency in at least one programming language such as Scala, Java, or Python;
  • Data pipeline orchestration (Airflow);
  • AWS data stack (DynamoDB, S3, Lambda, EMR, MSK, Glue, Athena, Lake Formation, RDS);
  • Data lake / lakehouse (Iceberg/Delta);
  • Data architecture;
  • Software engineering best practices;
  • Automated testing;
  • Knowledge of CI/CD.
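To illustrate the combination of Spark streaming, Kafka-style ingestion, and an S3-based lake listed above, here is a minimal Spark Structured Streaming sketch; the broker, topic, schema, and bucket paths are placeholders rather than Experian's real configuration.

```python
# Illustrative Spark Structured Streaming job: Kafka topic -> partitioned Parquet on S3.
# Broker address, topic, bucket, and schema are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

order_schema = StructType([
    StructField("order_id", StringType()),
    StructField("customer_id", StringType()),
    StructField("created_at", TimestampType()),
])

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")  # placeholder broker
    .option("subscribe", "orders")                       # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

orders = (
    raw.select(F.from_json(F.col("value").cast("string"), order_schema).alias("order"))
    .select("order.*")
    .withColumn("event_date", F.to_date("created_at"))
)

query = (
    orders.writeStream
    .format("parquet")
    .option("path", "s3a://example-lake/bronze/orders/")             # placeholder bucket
    .option("checkpointLocation", "s3a://example-lake/checkpoints/orders/")
    .partitionBy("event_date")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```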

See more jobs at Experian

Apply for this job

3d

Senior Data Engineer, Core

Instacart | United States - Remote
airflow, sql, Design

Instacart is hiring a Remote Senior Data Engineer, Core

We're transforming the grocery industry

At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.

Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.

Instacart is a Flex First team

There’s no one-size fits all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work—whether it’s from home, an office, or your favorite coffee shop—while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.

Overview

At Instacart, our mission is to create a world where everyone has access to the food they love and more time to enjoy it together. Millions of customers every year use Instacart to buy their groceries online, and the Data Engineering team is building the critical data pipelines that underpin all of the myriad of ways that data is used across Instacart to support our customers and partners.

 

About the Role 

Instacart’s Core Data Engineering team plays a critical role in defining and maintaining company-wide datasets, standardized for uniform, reliable, timely and accurate insights from our data. This is a high impact, high visibility role owning critical data integration pipelines and models across all of Instacart’s products. This role is an exciting opportunity to join a key team shaping our most critical data.



About the Team 

Core Data Engineering is part of the Infrastructure Engineering pillar, working closely with data engineers, data scientists and senior leaders across the company on developing and standardizing critical company-wide datasets. Our team also collaborates closely with other data infrastructure teams on designing and building key data platforms, systems and tools to make everyone at Instacart more productive with data.



About the Job 

  • You will be part of a team with a large amount of ownership and autonomy.
  • Large scope for company-level impact working on critical data.
  • You will work closely with engineers and both internal and external stakeholders, owning a large part of the process from problem understanding to shipping the solution.
  • You will ship high quality, scalable and robust solutions with a sense of urgency.
  • You will have the freedom to suggest and drive organization-wide initiatives.




About You

Minimum Qualifications

  • 6+ years of working experience in a Data/Software Engineering role, with a focus on building data pipelines.
  • Expert knowledge of SQL and Python.
  • Experience building high quality ETL/ELT pipelines.
  • Experience with cloud-based data technologies such as Snowflake, Databricks, Trino/Presto, or similar.
  • Adept at fluently communicating with many cross-functional stakeholders to drive requirements and design shared datasets.
  • A strong sense of ownership, and an ability to balance a sense of urgency with shipping high quality and pragmatic solutions.
  • Experience working with cross functional stakeholders on metric development, including data scientists, analysts, finance and senior leaders.
  • Experience working with a large codebase on a cross functional team.

 

Preferred Qualifications

  • Bachelor’s degree in Computer Science, Computer Engineering, Electrical Engineering OR equivalent work experience.
  • Experience with Snowflake, dbt and Airflow
  • Experience with data quality monitoring/observability, either using custom frameworks or tools like Great Expectations, Monte Carlo or similar
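As a rough sketch of the data quality monitoring mentioned in the preferred qualifications, the snippet below implements a few plain pandas checks as a custom "quality gate"; in practice a framework such as Great Expectations or dbt tests would usually take this role, and the table and thresholds here are invented.

```python
# Minimal, framework-free sketch of a data quality gate; column names and
# thresholds are made up for the example.
import pandas as pd


def check_orders_batch(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data quality failures for an orders batch."""
    failures = []

    if df.empty:
        failures.append("batch is empty")
        return failures

    if df["order_id"].duplicated().any():
        failures.append("duplicate order_id values found")

    if df["order_total"].lt(0).any():
        failures.append("negative order_total values found")

    null_rate = df["customer_id"].isna().mean()
    if null_rate > 0.01:  # arbitrary illustrative threshold
        failures.append(f"customer_id null rate too high: {null_rate:.2%}")

    return failures


if __name__ == "__main__":
    sample = pd.DataFrame({
        "order_id": [1, 2, 2],
        "customer_id": ["a", None, "c"],
        "order_total": [10.0, -5.0, 7.5],
    })
    for problem in check_orders_batch(sample):
        print("FAILED:", problem)
```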

Instacart provides highly market-competitive compensation and benefits in each location where our employees work. This role is remote and the base pay range for a successful candidate is dependent on their permanent work location. Please review our Flex First remote work policy here.

Offers may vary based on many factors, such as candidate experience and skills required for the role. Additionally, this role is eligible for a new hire equity grant as well as annual refresh grants. Please read more about our benefits offerings here.

For US based candidates, the base pay ranges for a successful candidate are listed below.

CA, NY, CT, NJ
$192,000 - $213,000 USD
WA
$184,000 - $204,000 USD
OR, DE, ME, MA, MD, NH, RI, VT, DC, PA, VA, CO, TX, IL, HI
$176,000 - $196,000 USD
All other states
$159,000 - $177,000 USD

See more jobs at Instacart

Apply for this job

3d

Data Engineer

Maker&Son Ltd | Balcombe, United Kingdom, Remote
golang, tableau, airflow, sql, mongodb, elasticsearch, python, AWS

Maker&Son Ltd is hiring a Remote Data Engineer

Job Description

We are looking for a highly motivated individual to join our team as a Data Engineer.

We are based in Balcombe [40 mins from London by train, 20 minutes from Brighton] and we will need you to be based in our offices at least 3 days a week.

You will report directly to the Head of Data.

Candidate Overview

As a part of the Technology Team your core responsibility will be to help maintain and scale our infrastructure for analytics as our data volume and needs continue to grow at a rapid pace. This is a high impact role, where you will be driving initiatives affecting teams and decisions across the company and setting standards for all our data stakeholders. You’ll be a great fit if you thrive when given ownership, as you would be the key decision maker in the realm of architecture and implementation.

Responsibilities

  • Understand our data sources, ETL logic, and data schemas and help craft tools for managing the full data lifecycle
  • Play a key role in building the next generation of our data ingestion pipeline and data warehouse
  • Run ad hoc analysis of our data to answer questions and help prototype solutions
  • Support and optimise existing ETL pipelines
  • Support technical and business stakeholders by providing key reports and supporting the BI team to become fully self-service
  • Own problems through to completion both individually and as part of a data team
  • Support digital product teams by performing query analysis and optimisation

 

Qualifications

Key Skills and Requirements

  • 3+ years experience as a data engineer
  • Ability to own data problems and help to shape the solution for business challenges
  • Good communication and collaboration skills; comfortable discussing projects with anyone from end users up to the executive company leadership
  • Fluency with a programming language - we use NodeJS and Python but are looking to adopt Golang
  • Ability to write and optimise complex SQL statements
  • Familiarity with ETL pipeline tools such as Airflow or AWS Glue
  • Familiarity with data visualisation and reporting tools, like Tableau, Google Data Studio, Looker
  • Experience working in a cloud-based software development environment, preferably with AWS or GCP
  • Familiarity with no-SQL databases such as ElasticSearch, DynamoDB, or MongoDB
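For readers less familiar with the ETL tooling named above, here is a minimal illustrative Airflow 2.x DAG showing scheduling and task dependencies; the DAG id, schedule, and task bodies are placeholders, not an actual Maker&Son pipeline.

```python
# Minimal, illustrative Airflow 2.x DAG showing scheduling and task dependencies.
# The task logic is stubbed out; names and schedule are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context) -> None:
    print("pull yesterday's orders from the source database")


def load_warehouse(**context) -> None:
    print("load the transformed orders into the warehouse")


with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 3 * * *",   # run daily at 03:00
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_warehouse", python_callable=load_warehouse)

    extract >> load  # load runs only after extract succeeds
```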

See more jobs at Maker&Son Ltd

Apply for this job

3d

Data Engineer / Developer (Engenheiro de Dados / Desenvolvedor)

HolosMedia | São Paulo, Brazil, Remote
Django, nosql, airflow, sql, linux, python

HolosMedia is hiring a Remote Data Engineer / Developer

Job Description

Your main responsibilities will be tied to the development side of our tech area, more specifically the implementation, management, and maintenance of different data pipelines and the development of process automation solutions. You will be responsible for keeping the company's current pipelines running, pursuing continuous improvement and bringing potential innovations by adopting frameworks and stacks that can raise the quality of our processes. It is important that you can move comfortably across different business areas, especially those related to Marketing and Communications, and that you bring a sharp developer's eye and strong technical skills.

  • Scoping, architecture, analysis, development, and deployment of projects based on Business Intelligence and Data Science technologies;
  • Extraction, transformation, and storage of data across the various cloud-hosted data layers, populated by batch or streaming data pipelines;
  • Identification of infrastructure requirements for Business Intelligence and Data Science solutions;
  • Definition of components, integration, and implementation of Business Intelligence and Data Science solutions;
  • Development of services using APIs;
  • Designing, developing, testing, monitoring, managing, and validating data warehouse activities, including data extraction, transformation, movement, loading, cleansing, and refresh processes.

Qualifications

  • Completed higher education degree;
  • Solid experience with Python and SQL;
  • Solid experience building and maintaining data transformation pipelines (ETL/ELT);
  • Experience in data analysis and modeling;
  • Experience developing REST APIs with FastAPI, Django, or Flask;
  • Solid experience with Apache Airflow;
  • Solid experience with Apache Spark;
  • Solid experience with SQL and NoSQL databases (at least one of each);
  • Solid experience with Linux servers;
  • Solid knowledge of the Linux shell;
  • Experience with columnar data warehouses (implementation and maintenance);
  • Experience with a dashboarding tool (preferably DataStudio or PowerBI);
  • Familiarity with software development best practices;
  • Experience with GitHub and versioning workflows;
  • Experience with software development or similar;
  • Advanced technical English;
  • Motivated and self-managed.
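As a small illustration of the REST API work mentioned in the qualifications, here is a minimal FastAPI sketch that serves a pipeline-produced metric; the endpoint, data, and service name are hypothetical.

```python
# Minimal, illustrative FastAPI service exposing a pipeline-produced metric.
# Endpoint names and the in-memory data are placeholders.
from fastapi import FastAPI, HTTPException

app = FastAPI(title="campaign-metrics-api")

# In a real service this would come from a warehouse or cache, not a dict.
CAMPAIGN_SPEND = {"cmp-001": 1520.75, "cmp-002": 310.00}


@app.get("/campaigns/{campaign_id}/spend")
def get_campaign_spend(campaign_id: str) -> dict:
    """Return the accumulated spend for one campaign."""
    if campaign_id not in CAMPAIGN_SPEND:
        raise HTTPException(status_code=404, detail="unknown campaign")
    return {"campaign_id": campaign_id, "spend": CAMPAIGN_SPEND[campaign_id]}

# Run locally with: uvicorn main:app --reload  (assuming this file is main.py)
```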

See more jobs at HolosMedia

Apply for this job

4d

Software Engineer (Backend)

redis, agile, Bachelor's degree, scala, nosql, airflow, postgres, sql, Design, c++, jenkins, AWS, backend

SecurityScorecard is hiring a Remote Software Engineer (Backend)

About SecurityScorecard:

SecurityScorecard is the global leader in cybersecurity ratings, with over 12 million companies continuously rated, operating in 64 countries. Founded in 2013 by security and risk experts Dr. Alex Yampolskiy and Sam Kassoumeh and funded by world-class investors, SecurityScorecard’s patented rating technology is used by over 25,000 organizations for self-monitoring, third-party risk management, board reporting, and cyber insurance underwriting; making all organizations more resilient by allowing them to easily find and fix cybersecurity risks across their digital footprint. 

Headquartered in New York City, our culture has been recognized by Inc Magazine as a “Best Workplace,” by Crain’s NY as a “Best Places to Work in NYC,” and as one of the 10 hottest SaaS startups in New York for two years in a row. Most recently, SecurityScorecard was named to Fast Company’s annual list of the World’s Most Innovative Companies for 2023 and to the Achievers 50 Most Engaged Workplaces in 2023 award recognizing “forward-thinking employers for their unwavering commitment to employee engagement.” SecurityScorecard is proud to be funded by world-class investors including Silver Lake Waterman, Moody’s, Sequoia Capital, GV and Riverwood Capital.

About the Team

The Data Analytics Engineering team is responsible for developing and managing the core data platform for ratings infrastructure, architecting and implementing business-critical data solutions and pipelines, and enabling data-driven decisions within the organization and for our customers.

About the Role

As a Software Engineer (Backend), you will work alongside outstanding engineers, refine requirements with product management, and implement new products and features focused on meeting the evolving needs of our customers. All team members actively participate in product definition, technical architecture review, iterative development, code review, and operations. Along with this, you’ll have the opportunity to interact with customers to ensure their needs are met. You will be working in a high-performance, fast-paced environment and contributing to an inclusive work environment.

Responsibilities:

  • Collaborate with engineers to deliver projects from inception to successful execution
  • Write well-crafted, well-tested, readable, maintainable code
  • Participate in code reviews to ensure code quality and distribute knowledge
  • Share engineering support, release, and on-call responsibilities for an always-on 24x7 site
  • Participate in Technical Design Review sessions, and have the ability to explain the various trade-offs made in decisions
  • Maintain existing APIs and data pipelines, contribute to increasing code-coverage 
  • Understand requirements, build business logic, and be able to learn and quickly adapt to changing needs
  • Automate and improve existing processes to sustainably maintain the current features and pipelines
  • Analyze our internal systems and processes and locate areas for improvement/automation

Requirements

  • BS/MS in computer science or equivalent technical experience, and 2+ years of experience working in the data engineering space
  • Must have experience in full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations 
  • Technical requirements:
    • Must have experience in building and maintaining big data pipelines using Scala with Spark, Airflow, Hive, Presto, Redis
    • Experience in developing batch/real-time data streams to create meaningful analytics
    • Worked with NoSQL databases, preferably Clickhouse, Cassandra / Scylla; and SQL databases, preferably Postgres
    • Worked with CI/CD pipelines using Jenkins
    • Experience with cloud environments, preferably AWS
    • Worked with a variety of data (structured/unstructured) and data formats (flat files, XML, JSON, relational, parquet)
  • Worked in Agile methodology

Benefits:

Specific to each country, we offer a competitive salary, stock options, health benefits, unlimited PTO, parental leave, tuition reimbursement, and much more!

SecurityScorecard is committed to Equal Employment Opportunity and embraces diversity. We believe that our team is strengthened through hiring and retaining employees with diverse backgrounds, skill sets, ideas, and perspectives. We make hiring decisions based on merit and do not discriminate based on race, color, religion, national origin, sex or gender (including pregnancy) gender identity or expression (including transgender status), sexual orientation, age, marital, veteran, disability status or any other protected category in accordance with applicable law. 

We also consider qualified applicants regardless of criminal histories, in accordance with applicable law. We are committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. If you need assistance or accommodation due to a disability, please contact talentacquisitionoperations@securityscorecard.io.

Any information you submit to SecurityScorecard as part of your application will be processed in accordance with the Company’s privacy policy and applicable law. 

SecurityScorecard does not accept unsolicited resumes from employment agencies.  Please note that we do not provide immigration sponsorship for this position. 

See more jobs at SecurityScorecard

Apply for this job

4d

Data Engineer

Blue Orange Digital | Brazil - Remote
DevOPS, terraform, airflow, sql, Design, azure, docker, linux, python, AWS

Blue Orange Digital is hiring a Remote Data Engineer

Company Overview:

Blue Orange Digital is a cloud-based data transformation and predictive analytics development firm with offices in NYC and Washington, DC. From startups to Fortune 500s, we help companies make sense of their business challenges by applying modern data analytics techniques, visualizations, and AI/ML. Founded by engineers, we love passionate technologists and data analysts. Our startup DNA means everyone on the team makes a direct contribution to the growth of the company.

Position Overview:

Blue Orange is seeking a Data Engineer to join our team to help build up our data engineering practice. Our engineers require a diverse skill set including system administration, DevOps, infrastructure automation, data modeling, and workflow orchestration. Blue Orange builds enterprise data platforms and systems for a variety of clients, so this candidate should have experience with supporting modern data technologies. The ideal candidate will have experience with multiple data engineering technologies across multiple clouds and deployment scenarios. In particular, we’re looking for someone with experience with Azure DevOps, Databricks, and Python.

Responsibilities:

  • Work with data teams to help design, build, and deploy data platforms in the cloud (Azure, AWS, GCP) and automate their operation.
  • Work with Azure DevOps, Azure Pipelines, Terraform, CloudFormation, and other Automation and infrastructure tools to build robust systems.
  • Work with Databricks, Spark, Python, and other data orchestration and ETL tools to build high-performance data pipelines.
  • Provide leadership in applying software development principles and best practices, including Continuous Integration, Continuous Delivery/Deployment, and managing Infrastructure as Code and automated Testing across multiple software applications.
  • Support heterogeneous technology environments, including both Windows and Linux systems.
  • Develop reusable, automated processes, and custom tools.
  • Any other duties as directed by your direct manager.

Requirements:

  • BA/BS degree in Computer Science or a related technical field, or equivalent practical experience.
  • At least 2 years experience building and supporting data platforms; exposure to data technologies like Azure Data Factory, Azure Synapse Analytics, Airflow, Spark, etc.
  • Experience with Cloud Data Platforms, like Snowflake and Databricks.
  • Advanced level Python, SQL, and Bash scripting.
  • Experience designing and building robust CI/CD pipelines.
  • Strong Linux system administration skills.
  • Comfortable with Docker, configuration management, and monitoring tools.
  • Knowledge of best practices related to security, performance, and disaster recovery.
  • Experience working in cloud environments, at a minimum experience in Azure and AWS.
  • Enjoys collaborating with other engineers on architecture and sharing designs with the team.
  • Excellent verbal and written English communication.
  • Interacts with others using sound judgment, good humor, and consistent fairness in a fast-paced environment.

Preferred qualifications:

  • Hold certifications for Azure DevOps, Azure Data Fundamentals, Databricks, Snowflake

Benefits:

  • Fully remote
  • Flexible Schedule
  • Unlimited Paid Time Off (PTO)
  • Paid parental/bereavement leave
  • Worldwide recognized clients to build skills for an excellent resume
  • Top-notch team to learn and grow with

Salary: USD $4850 - $5250 (monthly salary range)

Background checks may be required for certain positions/projects.

Blue Orange Digital is an equal opportunity employer.

See more jobs at Blue Orange Digital

Apply for this job

4d

Principal Data Engineer

ML, airflow, sql, B2C, RabbitMQ, Design, java, c++, python, AWS

hims & hers is hiring a Remote Principal Data Engineer

Hims & Hers Health, Inc. (better known as Hims & Hers) is the leading health and wellness platform, on a mission to help the world feel great through the power of better health. We are revolutionizing telehealth for providers and their patients alike. Making personalized solutions accessible is of paramount importance to Hims & Hers and we are focused on continued innovation in this space. Hims & Hers offers nonprescription products and access to highly personalized prescription solutions for a variety of conditions related to mental health, sexual health, hair care, skincare, heart health, and more.

Hims & Hers is a public company, traded on the NYSE under the ticker symbol “HIMS”. To learn more about the brand and offerings, you can visit hims.com and forhers.com, or visit our investor site. For information on the company’s outstanding benefits, culture, and its talent-first flexible/remote work approach, see below and visit www.hims.com/careers-professionals.

​​About the Role:

We're looking for an experienced Principal Data Engineer to join our Data Platform Engineering team. Our team is responsible for enabling H&H business (Product, Analytics, Operations, Finance, Data Science, Machine Learning, Customer Experience, Engineering) by providing a platform with a rich set of data and tools to leverage.

You Will:

  • Serve as a technical leader within the Data Platform org. Provide expert guidance and hands-on development of complex engineering problems and projects
  • Collaborate with cross-functional stakeholders including product management, engineering, analytics, and key business representatives to align the architecture, vision, and roadmap with stakeholder needs
  • Establish guidelines, controls, and processes to make data available for developing scalable data-driven solutions for Analytics and AI
  • Create and set best practices for data ingestion, integration, and access patterns to support both real-time and batch-based consumer data needs
  • Implement and maintain data governance practices to ensure compliance, data security, and privacy.
  • Design and lead development on scalable, high-performance data architecture solutions that support both the consumer side of the business as well as analytic use cases
  • Plan and oversee large-scale and complex technical migrations to new data systems and platforms
  • Drive continuous data transformation to minimize technical debt
  • Display strong thought leadership and execution in pursuit of modern data architecture principles and technology modernization
  • Define and lead technology proof of concepts to ensure feasibility of new data technology solutions
  • Provide technical leadership and mentorship to the members of the team, fostering a culture of technical excellence
  • Create comprehensive documentation for designs and processes to support ongoing maintenance and knowledge sharing
  • Conduct design reviews to ensure that proposed solutions address platform and stakeholder pain points, as well as meet business, and technical requirements, with alignment to standards and best practices
  • Prepare and deliver efficient communications to convey architectural direction and how it aligns with company strategy. Be able to explain the architectural vision and implementation to executives

You Have:

  • Bachelor's or Master's degree in Computer Science or equivalent, with over 12 years of Data Architecture and Data Engineering experience, including team leadership
  • Proven expertise in designing data platforms for large-scale data and diverse data architectures, including warehouses, lakehouses, and integrated data stores.
  • Proficiency and hands-on knowledge in a variety of technologies such as SQL, Bash, Python, Java, Presto, Spark, AWS, and data streaming technologies like Kafka and RabbitMQ
  • Hands-on experience and proficiency with data stacks including Airflow, Databricks, and dbt, as well as data stores such as Cassandra, Aurora, and ZooKeeper
  • Experience with data security (including PHI and PII), as well as data privacy regulations (CCPA and GDPR)
  • Proficient in addressing data-related challenges through analytical problem-solving and aligning data architecture with organizational business goals and objectives
  • Exposure to analytics techniques using ML and AI to assist data scientists and analysts in deriving insights from data
  • Analytical and problem-solving skills to address data-related challenges and find optimal solutions
  • Ability to manage projects effectively, plan tasks, set priorities, and meet deadlines in a fast-paced and ever-changing environment

Nice To Have:

  • Experience working in healthcare or in a B2C company

Our Benefits (there are more but here are some highlights):

  • Competitive salary & equity compensation for full-time roles
  • Unlimited PTO, company holidays, and quarterly mental health days
  • Comprehensive health benefits including medical, dental & vision, and parental leave
  • Employee Stock Purchase Program (ESPP)
  • Employee discounts on hims & hers & Apostrophe online products
  • 401k benefits with employer matching contribution
  • Offsite team retreats

 

#LI-Remote

 

Outlined below is a reasonable estimate of H&H’s compensation range for this role for US-based candidates. If you're based outside of the US, your recruiter will be able to provide you with an estimated salary range for your location.

The actual amount will take into account a range of factors that are considered in making compensation decisions including but not limited to skill sets, experience and training, licensure and certifications, and location. H&H also offers a comprehensive Total Rewards package that may include an equity grant.

Consult with your Recruiter during any potential screening to determine a more targeted range based on location and job-related factors.

An estimate of the current salary range for US-based employees is
$210,000 - $250,000 USD

We are focused on building a diverse and inclusive workforce. If you’re excited about this role, but do not meet 100% of the qualifications listed above, we encourage you to apply.

Hims is an Equal Opportunity Employer and considers applicants for employment without regard to race, color, religion, sex, orientation, national origin, age, disability, genetics or any other basis forbidden under federal, state, or local law. Hims considers all qualified applicants in accordance with the San Francisco Fair Chance Ordinance.

Hims & hers is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, you may contact us at accommodations@forhims.com. Please do not send resumes to this email address.

For our California-based applicants – Please see our California Employment Candidate Privacy Policy to learn more about how we collect, use, retain, and disclose Personal Information. 

See more jobs at hims & hers

Apply for this job

6d

Big Data Engineer

Plain Concepts | Spain, Remote
agile, scala, airflow, azure, c++, python, AWS

Plain Concepts is hiring a Remote Big Data Engineer

We are expanding our Engineering teams. We don't care much about the title, but we call this role Big Data Engineer, and the key is experience with Python or Scala, Spark, and the cloud.

Our vision is to build multidisciplinary teams that directly self-manage their projects in an agile way in order to find and deliver the best solutions.

What will you do?

  • You will develop projects from scratch, with minimal supervision and the collaboration of the team.
  • You will take part in architecture design and decision making in a constructive, co-creative environment.
  • You will play a key role in developing good practices and clean, reusable code.
  • You will build ETLs with Spark (Python / Scala).
  • You will develop projects in the cloud (Azure / AWS).
  • You will build scalable pipelines with different technologies: Airflow, Data Factory, and more.
  • At least 3 years of experience in software/data engineering.
  • Solid experience with Python or Scala and with Spark, processing large volumes of data.
  • Experience in the cloud (Azure, AWS, or GCP).
  • Experience creating data pipelines (CI/CD).
  • Knowledge of SQL and NoSQL databases.
  • Experience with testing (unit, integration, etc.).
  • Experience with Databricks or Snowflake is a plus.
  • Experience with IaC is a plus.
  • Experience with or knowledge of Power BI is a plus.
  • English is a must.
  • Team player.
  • Salary in line with the market and your experience
  • Flexible 35-hour week (with no salary reduction)
  • 100% remote work (optional)
  • Flexible compensation (restaurant, transport, and childcare)
  • Medical and dental insurance (completely free for the employee)
  • Individual training budget and free Microsoft certifications
  • English classes
  • A day off on your birthday
  • Monthly bonus to cover home electricity and internet
  • Discount on gym plans and sports activities
  • Plain Camp (annual team-building event)

➕ The pleasure of always working with the latest technology tools.

With all this info you already know a lot about us; will you let us get to know more about you?

The selection process? Simple: 3 steps, a call and 2 interviews with the team.

And you may be wondering… who is Plain Concepts?

Plain Concepts is more than 400 people who are passionate about technology and driven by change in the search for the best solutions for our clients and projects.

Over the years, the company has grown thanks to the great technical potential we have in-house, always backed by our wildest and most innovative ideas. We have more than 14 offices in 6 different countries. Our main goal is to keep growing as a team while delivering the best and most advanced projects on the market.

We truly believe in the importance of bringing together people from different backgrounds and countries to form the best team, with a plural and inclusive culture.

What do we do at Plain?

We are characterized by a 100% technical DNA. We build custom projects from scratch, provide technical consulting and training, and develop our own product, Sidra.

  • We do not do body shopping or outsourcing
  • Our teams are multidisciplinary and the organizational structure is flat and horizontal
  • Deeply committed to agile values
  • Living is sharing: we help, support, and encourage each other to broaden our knowledge internally and toward the community (through conferences, events, and talks)
  • We always pursue creativity and innovation, even when the idea sounds crazy to others
  • Transparency, the key to any relationship.

We bring our clients' ideas and solutions to life with a high degree of technical excellence. For more information, visit our website:

https://www.plainconcepts.com/es/casos-estudio/

At Plain Concepts we are committed to offering equal opportunities. We welcome diverse applicants regardless of race, color, gender, religion, national origin, citizenship, disability, age, sexual orientation, or any other characteristic protected by law.

See more jobs at Plain Concepts

Apply for this job

6d

Senior Big Data Engineer

Plain Concepts | Spain, Remote
agile, scala, airflow, azure, c++, python, AWS

Plain Concepts is hiring a Remote Senior Big Data Engineer

We are expanding our Engineering teams. We don't care much about the title, but we call this role Big Data Engineer, and the key is experience with Python or Scala, Spark, and the cloud.

Our vision is to build multidisciplinary teams that directly self-manage their projects in an agile way in order to find and deliver the best solutions.

What will you do?

  • You will develop projects from scratch, with minimal supervision and the collaboration of the team.
  • You will take part in architecture design and decision making in a constructive, co-creative environment.
  • You will play a key role in developing good practices and clean, reusable code.
  • You will build ETLs with Spark (Python / Scala).
  • You will develop projects in the cloud (Azure / AWS).
  • You will build scalable pipelines with different technologies: Airflow, Data Factory, and more.
  • At least 5 years of experience in software/data engineering.
  • Solid experience with Python or Scala and with Spark, processing large volumes of data.
  • Experience in the cloud (Azure, AWS, or GCP).
  • Experience creating data pipelines (CI/CD).
  • Knowledge of SQL and NoSQL databases.
  • Experience with testing (unit, integration, etc.).
  • Experience with or knowledge of IaC.
  • Experience with Databricks and Snowflake is a plus.
  • Experience with or knowledge of Power BI is a plus.
  • English is a must.
  • Team player.
  • Salary in line with the market and your experience
  • Flexible 35-hour week (with no salary reduction)
  • 100% remote work (optional)
  • Flexible compensation (restaurant, transport, and childcare)
  • Medical and dental insurance (completely free for the employee)
  • Individual training budget and free Microsoft certifications
  • English classes
  • A day off on your birthday
  • Monthly bonus to cover home electricity and internet
  • Discount on gym plans and sports activities
  • Plain Camp (annual team-building event)

➕ The pleasure of always working with the latest technology tools.

With all this info you already know a lot about us; will you let us get to know more about you?

The selection process? Simple: 3 steps, a call and 2 interviews with the team.

And you may be wondering… who is Plain Concepts?

Plain Concepts is more than 400 people who are passionate about technology and driven by change in the search for the best solutions for our clients and projects.

Over the years, the company has grown thanks to the great technical potential we have in-house, always backed by our wildest and most innovative ideas. We have more than 14 offices in 6 different countries. Our main goal is to keep growing as a team while delivering the best and most advanced projects on the market.

We truly believe in the importance of bringing together people from different backgrounds and countries to form the best team, with a plural and inclusive culture.

What do we do at Plain?

We are characterized by a 100% technical DNA. We build custom projects from scratch, provide technical consulting and training, and develop our own product, Sidra.

  • We do not do body shopping or outsourcing
  • Our teams are multidisciplinary and the organizational structure is flat and horizontal
  • Deeply committed to agile values
  • Living is sharing: we help, support, and encourage each other to broaden our knowledge internally and toward the community (through conferences, events, and talks)
  • We always pursue creativity and innovation, even when the idea sounds crazy to others
  • Transparency, the key to any relationship.

We bring our clients' ideas and solutions to life with a high degree of technical excellence. For more information, visit our website:

https://www.plainconcepts.com/es/casos-estudio/

At Plain Concepts we are committed to offering equal opportunities. We welcome diverse applicants regardless of race, color, gender, religion, national origin, citizenship, disability, age, sexual orientation, or any other characteristic protected by law.

See more jobs at Plain Concepts

Apply for this job

7d

Data Analyst, Platform Excellence Ops Analytics

Instacart | United States - Remote
tableau, jira, airflow, sql, Design, python

Instacart is hiring a Remote Data Analyst, Platform Excellence Ops Analytics

We're transforming the grocery industry

At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.

Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.

Instacart is a Flex First team

There’s no one-size fits all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work—whether it’s from home, an office, or your favorite coffee shop—while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.

Overview

About the Role 

As a data analyst, you will be responsible for understanding and improving the quality of Instacart’s online grocery business. You will help set up scalable monitoring systems to understand our business performance for driving strategic decisions and help inform when operations or processes are broken. You will play a pivotal role in leveraging operational performance data to diagnose performance or user experience gaps in order to create data-driven recommendations to improve our product and processes. Your work will also entail supporting cross-functional data needs to facilitate improvements across multiple fronts.

About the Team 

The Platform Excellence Operations (PEO) team is dedicated to improving the quality of Instacart’s online grocery experience for our customers, shoppers, and retailers. The analytics team focuses on identifying, understanding, and eliminating defects in the online grocery experience by analyzing and interpreting operational data. The team ensures that stakeholders have a consistent understanding of why business performance is changing and advocates for improving poor experiences that may be overlooked. You’ll be part of a dynamic cross-functional team that collaborates closely with stakeholders to provide critical data insights to enhance our operations and overall customer experience.

As part of the Consumer Pillar within PEO Analytics, you will be working on problems anywhere between a customer landing on the homepage to when they contact Instacart for support after their order. Your goal is to make sense of how the customer uses Instacart and help the business understand where friction might occur in the shopping or contact funnel.

About the Job 

Metrics and Reporting 

  • Work with XFN stakeholders to create 'source of truth' dashboards and future-proof, flexible dashboards.
  • Monitor performance fluctuations and outliers in key performance metrics within the customer funnel to understand root causes and determine if corrective actions are required

Data Analysis

  • Analyze large text and numerical datasets within search, browse, order, and post-order contact to derive business insights
  • Design new data schemas matching business needs and link them to existing data for analysis
  • Define structured frameworks to measure and track against qualitative top customer problems and opportunities
  • Independently create and execute analytical plans to identify, understand, and reduce known customer problems
  • Provide support in sizing initiatives on product and analytical roadmaps

Presentations and Communication

  • Create and present insights decks to an XFN audience of business leaders, Product Managers, Product Operations, and engineers on a weekly cadence
  • Approach problems with a solutions-oriented mindset by developing business cases with clear logical narratives backed by data that result in actionable recommendations

Leadership and Mentorship

  • Help elevate the team’s technical acumen through mentorship and initiating learning opportunities
  • Provide thought leadership to Pillar Lead when planning future strategic initiatives for the team 

About You

Minimum Qualifications

  • 3+ years experience in a data analytics related role (e.g. data science, business intelligence, corporate strategy, consulting, etc)
  • Experience analyzing and visualizing large datasets
  • Ability to aggregate and synthesize large amounts of data and leverage a deep understanding of micro and macro fulfillment metrics and systems to drive insights
  • Strong problem solving skills - able to quickly detect anomalies in performance metrics and trends, form hypotheses for investigation, and draw conclusions by conducting nuanced analysis
  • Strong verbal and written communication skills, including the ability to lead meetings and synthesize complex topics by creating compelling narratives for various stakeholders
  • Advanced proficiency in SQL and Excel / GSheets for data analysis
  • Proficient with Powerpoint / GSlides for effective communication
  • A knack for identifying unusual patterns, a keen eye for detail, and a passion for investigating problems to determine root causes
  • Effective prioritization and ability to balance quality and speed in a time constrained environment
  • Ability to work independently with a strong sense of ownership
  • Comfortable proposing new ideas to challenge the business and rally stakeholder champions behind it

Preferred Qualifications

  • Experience working with consumer shopping funnel or post-order contact data
  • Experience in optimizing data models and queries for performance
  • Familiarity with: Snowflake/Databricks/BigQuery or similar data warehouses, DBT/Apache Airflow or similar orchestration tools, Github, and Jira 
  • Familiarity with Visualization Tools: Mode, Tableau, or similar
  • Proficiency in Python or R for data analysis and visualizations
  • Experience working directly with product/engineering teams on scalable process solutions

#LI-REMOTE

Instacart provides highly market-competitive compensation and benefits in each location where our employees work. This role is remote and the base pay range for a successful candidate is dependent on their permanent work location. Please review our Flex First remote work policy here.

Offers may vary based on many factors, such as candidate experience and skills required for the role. Additionally, this role is eligible for a new hire equity grant as well as annual refresh grants. Please read more about our benefits offerings here.

For US based candidates, the base pay ranges for a successful candidate are listed below.

CA, NY, CT, NJ
$126,000 - $140,000 USD
WA
$121,000 - $134,000 USD
OR, DE, ME, MA, MD, NH, RI, VT, DC, PA, VA, CO, TX, IL, HI
$116,000 - $129,000 USD
All other states
$104,000 - $116,000 USD

See more jobs at Instacart

Apply for this job

8d

Python Engineer - Analytics

AmpleInsightIncToronto, Canada, Remote
DevOPSairflowsqlDesignpython

AmpleInsightInc is hiring a Remote Python Engineer - Analytics

Job Description

We are looking for an ambitious Python Engineer/Developer. You are passionate about technology but very pragmatic in applying it to real-world engineering problems. You are experienced in launching new products and scaling them. Critical thinking and problem-solving skills are essential for this role.

As a Python Engineer, you will contribute in a multitude of ways: architecting phenomenal systems, creating and encouraging good software development practices, driving strategic technical improvements, and mentoring other engineers.

At Ample Insight, you will have a unique opportunity to work with best-in-class engineers on large engineering problems, but in an environment with small teams and abundant opportunities for personal impact and growth. 

Please note that although this role is remote, you are required to be located in Canada/US.

Qualifications

Responsibilities

  • You will be part of a small but highly impactful team, with a large amount of ownership and autonomy for managing things directly
  • You will architect important systems and anticipate strategic and scaling-related challenges via thoughtful long-term planning
  • You will need to design, prototype, and create solutions that support highly reliable, scalable, performant AI and analytics products


Requirements

  • BS (or MS, or PhD) in Computer Science or related engineering field involving coding
  • 3+ years of professional software development experience
  • 3+ years of experience working with Python and data/ML related Python libraries such as Pandas, NumPy and scikit-learn
  • Hands-on experience working with data and analytics relating to user engagement, social, marketing, and/or finance data
  • Extensive experience working on relational databases, designing complex data schemas, and writing SQL queries
  • Deep knowledge of performance tuning of ETL jobs, SQL, and databases
  • Solid CS fundamentals with thorough understanding of and demonstrated experience in Object-Oriented Design
  • Strong understanding of design patterns and capable of incorporating them in software design
  • Experience setting technical strategy for a large or important company initiative
  • Strong knowledge of shipping impactful and complex software projects
  • Experience working with Airflow is a strong plus (see the DAG sketch after this list)
  • DevOps experience is a plus
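
Since Airflow appears in the list above, here is a minimal DAG sketch for orientation. It assumes Airflow 2.4+ and is purely illustrative; the DAG id, schedule, and callables are not part of this posting.

    # Minimal Airflow 2.x DAG sketch: one extract task feeding one load task.
    # DAG id, schedule, and callables are illustrative assumptions.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract():
        # Placeholder: pull rows from a source system.
        print("extracting...")


    def load():
        # Placeholder: write transformed rows to the target store.
        print("loading...")


    with DAG(
        dag_id="example_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # a cron string also works here
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> load_task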

See more jobs at AmpleInsightInc

Apply for this job

8d

Data Engineer

AmpleInsightIncToronto, Canada, Remote
DevOPSairflowsqlpython

AmpleInsightInc is hiring a Remote Data Engineer

Job Description

We are looking for a data engineer who is passionate about analytics and helping companies build and scale data. You enjoy working with data and are motivated to produce high quality data tools and pipelines that help empower other data scientists. You are experienced in architecting data ETL workflows and schemas. Critical thinking and problem-solving skills are essential for this role.

Qualifications

  • BS (or higher, e.g., MS, or PhD) in Computer Science, Engineering, Math, or Statistics
  • Hands-on experience working with user engagement, social, marketing, and/or finance data
  • Proficient in Python (e.g. Pandas, NumPy, scikit-learn, etc.), R, and TensorFlow, amongst other data science related tools and libraries
  • Extensive experience working on relational databases, designing complex data schemas, and writing SQL queries
  • Deep knowledge of performance tuning of ETL jobs, SQL, and databases
  • Working knowledge of Snowflake (see the sketch after this list)
  • Experience working with Airflow is a strong plus
  • DevOps experience is a plus
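
To make the Snowflake bullet above concrete, here is a small sketch using the snowflake-connector-python package. The account, credentials, and table/column names are placeholders, not details from the posting.

    # Sketch: run an aggregation query against Snowflake.
    # Account, credentials, and table/column names are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="your_account",
        user="your_user",
        password="your_password",
        warehouse="ANALYTICS_WH",
        database="ANALYTICS",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        # Aggregate daily engagement events; the table is hypothetical.
        cur.execute(
            "SELECT event_date, COUNT(*) AS events "
            "FROM user_engagement GROUP BY event_date ORDER BY event_date"
        )
        for event_date, events in cur.fetchall():
            print(event_date, events)
    finally:
        conn.close()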

See more jobs at AmpleInsightInc

Apply for this job

8d

Senior Product Analyst

StyleSeat100% Remote (U.S. Based Only - Select States)
tableauairflowsqlB2CDesignqac++python

StyleSeat is hiring a Remote Senior Product Analyst

Senior Product Analyst

100% Remote (U.S. Based Only, Select States - See Below)

About the role

As a Senior Product Analyst, you will use proven, hands-on data analysis skills to help contribute to StyleSeat’s Analytics function. You'll be deeply involved in driving decision-making across all aspects of the business, from exploratory analysis to better understand our customers, to building data pipelines that democratize a standard level of data across the company.

You will work closely with Product and Engineering teams to define and answer key questions, as well as enable stakeholders and support a data-driven culture. This role will be fully embedded within a product squad (PM, Designer, QA, Engineers, and you - the Analyst), but will also offer the opportunity for personal growth by supporting other areas of the business: Customer Experience, Finance, Product Marketing, etc.

What you’ll do

  • Lead the ideation and execution of product changes that drive growth, by partnering with Product, Engineering, Design, and Marketing
  • Design A/B tests and analyze results to inform strategic decision-making & next steps (see the sketch after this list)
  • Translate analytical insights into actionable recommendations for business and process improvements, presenting all the way up to senior leadership
  • Design and assist in building analytical infrastructure (Reporting, Dashboards, Pipelines, and Analyses)
  • Work with business stakeholders to recommend data standards and best practices to align the way we measure, think, and talk about our Product + Business
  • Routinely communicate metrics, trends and other key indicators to Leadership
  • Utilize your personal data-driven tendencies to explore your own curiosities within the data – going off the beaten path to identify areas for improvement + growth
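
As a rough illustration of the A/B-test analysis mentioned in the list above, here is a two-proportion z-test sketch using statsmodels. The conversion counts are made up and are not StyleSeat data.

    # Sketch: compare conversion rates between control and variant with a
    # two-proportion z-test. The counts below are invented numbers.
    from statsmodels.stats.proportion import proportions_ztest

    conversions = [480, 530]      # converted users: [control, variant]
    exposures = [10_000, 10_050]  # users exposed:   [control, variant]

    z_stat, p_value = proportions_ztest(count=conversions, nobs=exposures)
    print(f"control rate: {conversions[0] / exposures[0]:.3%}")
    print(f"variant rate: {conversions[1] / exposures[1]:.3%}")
    print(f"z = {z_stat:.2f}, p = {p_value:.4f}")

In practice the test would be sized for adequate power up front, with the significance threshold agreed with stakeholders before launch.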

Who you are 

Successful candidates can come from a variety of backgrounds, yet here are some of the critical experiences we’re looking for:

Must haves:

  • 4+ years of relevant experience in product analytics/data science, or other quantitative disciplines
  • Experience working with large datasets and an ability to write complex SQL queries
  • Experience translating business objectives into actionable analyses, and explaining technical concepts and implications to a broad, non-technical audience
  • Experience with data visualization tools/techniques (Tableau preferred, Looker, Quicksight, Amplitude, etc)
  • Proficiency in designing/building data pipelines or using ETL tools
  • Experience working directly embedded within product squads, going deep into the user problems or pain points and solving them with data

Nice to haves:

  • Knowledgeable in one or more advanced data pipeline tools: Airflow, DBT, Hevo
  • Experience with Python
  • Experience in B2B2C marketplace, eCommerce, or B2C organization
  • Experience at a startup or late-stage growth company

Some year 1 deliverables:

  • Develop a framework (reporting, metrics, dimensions) from the ground up for a central & rapidly growing area of the business – our Client experience
  • Utilize said framework to generate insights and create actionable recommendations to inform our product roadmap, ranging from impacts on the current sprint to initiatives quarters away.
  • Actively present findings & recommendations to not only product stakeholders but senior stakeholders across the organization, following up and ensuring they are actionable

Desired traits 

  • Strong product mindset and knowledge, with the ability to drive roadmaps & PMs with data
  • Curiosity - a natural drive to find out and explain why things happen.
  • The ability to tell a story with data, and the ability to explain technical matters to less technical co-workers
  • A strong and adaptable communicator who can ably interact with executives
  • Ability to manage multiple projects simultaneously while understanding which to prioritize alongside their stakeholder partners

Salary Range

Our job titles may span more than one career level. The career level we are targeting for this role has a base pay between $112,000 and $140,000. The actual base pay is dependent upon many factors, such as training, transferable skills, work experience, business needs and market demands. Base pay ranges are subject to change and may be modified in the future.

Who we are 

StyleSeat is the premier business platform for SMBs in the beauty and wellness industry to run and grow their business, and a destination for consumers to discover, book, and pay. To date, StyleSeat has powered more than 200 million appointments totaling over $12 billion in revenue for small businesses. StyleSeat is a platform and marketplace designed to support and promote the beauty and personal care community.

Today, StyleSeat connects consumers with top-rated beauty professionals in their area for a variety of services, including hair styling, barbering, massage, waxing, and nail care, among others. Our platform ensures that Pros maximize their schedules and earnings by minimizing gaps and cancellations, effectively attracting and retaining clientele.

StyleSeat Culture & Values 

At StyleSeat, our team is committed to fostering a positive and inclusive work environment. We respect and value the unique perspectives, experiences, and skills of our team members and work to create opportunities for all to grow and succeed. 

  • Diversity - We celebrate and welcome diversity in backgrounds, experiences, and perspectives. We believe in the importance of creating an inclusive work environment where everyone can thrive. 
  • Curiosity - We are committed to fostering a culture of learning and growth. We ask questions, challenge assumptions, and explore new ideas.
  • Community - We are committed to making a positive impact on each other, even when win-win-win scenarios are not always clear or possible in every decision. We strive to find solutions that benefit the community as a whole and drive our shared success.
  • Transparency - We are committed to open, honest, and clear communication. We hold ourselves accountable for maintaining the trust of our customers and team.
  • Entrepreneurship - We are self-driven big-picture thinkers - we move fast and pivot when necessary to achieve our goals. 

Applicant Note: 

StyleSeat is a fully remote, distributed workforce; however, we only have business entities established in the below list of states and thus are unable to consider candidates who live in states not on this list for the time being. Applicants must be authorized to work for ANY employer in the U.S. We are unable to sponsor or take over sponsorship of an employment visa at this time.

* Arizona

* Alabama

* California

* Colorado

* Florida

* Georgia

* Illinois

* Indiana

* Massachusetts

* Maryland

* Michigan

* Nebraska

* New York

* New Jersey 

* Ohio

* Oregon

* Pennsylvania

* Washington

See more jobs at StyleSeat

Apply for this job

8d

Senior Data Engineer

carsalesMelbourne, Australia, Remote
SQSEC2LambdascalaairflowDesignpythonAWS

carsales is hiring a Remote Senior Data Engineer

Job Description

What you’ll do

  • Contributing to the delivery of scalable data architectures, and development & design best practices
  • Leading collaborations across data disciplines to develop, optimise and maintain data pipelines and solutions
  • Engaging actively in facilitating team-based problem-solving sessions and contributing to the development of best practices
  • Initiating and nurturing effective working relationships, acting as a trusted advisor on product analytics and commercial data solutions
  • Leading technical recommendations and decision-making while mentoring early-career engineers, playing a key role in growing the team's capabilities
  • Owning the delivery of allocated initiatives within the agreed scope, timeline, and budget

Qualifications

What are we looking for?

Critical to success in the role is the ability to operate in the liminal space between business, data and technical practice.

  • An all-of-business ownership mindset over siloed success; leading with high levels of personal integrity and accountability
  • Ability to distil business and analytics requirements into well-defined engineering problems
  • Skilled at identifying appropriate software engineering methods (e.g. modularisations, abstractions) that make data assets tractable
  • Strong software engineering fundamentals (e.g. data structures, principles of software design, build & testing)
  • Strong data engineering experience (e.g. transformations, modelling, pipelines), grounded in the basics of an analytical discipline (e.g. analytics or science)
  • Skilled in designing and building pipelines using cloud services such as AWS EC2, Glue, Lambda, SNS, SQS, IAM, ECS or equivalent (see the sketch after this list)
  • Demonstrated experience with distributed technologies such as Airflow, HDFS, EMR
  • Proficient in two or more programming languages such as Python, Spark, Scala or similar
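
To give the AWS bullet above some shape, here is a small boto3 sketch that starts an AWS Glue job and polls until it reaches a terminal state. The job name and region are placeholders, not details from the posting.

    # Sketch: kick off an AWS Glue job with boto3 and wait for a terminal state.
    # Job name and region are placeholders.
    import time

    import boto3

    glue = boto3.client("glue", region_name="ap-southeast-2")

    run = glue.start_job_run(JobName="nightly_listings_etl")
    run_id = run["JobRunId"]

    while True:
        status = glue.get_job_run(JobName="nightly_listings_etl", RunId=run_id)
        state = status["JobRun"]["JobRunState"]
        print("Glue job state:", state)
        if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
            break
        time.sleep(30)

In an Airflow-based pipeline, this polling loop would typically be replaced by the provider's Glue operator or sensor rather than hand-rolled code.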

See more jobs at carsales

Apply for this job

8d

Sr. Data Engineer - Data Analytics

R.S.ConsultantsPune, India, Remote
SQSLambdaBachelor's degreescalaairflowsqlDesigntypescriptpythonAWSNode.js

R.S.Consultants is hiring a Remote Sr. Data Engineer - Data Analytics

Job Description

We are looking for a Sr. Data Engineer for an international client. This is a 100% remote job. The person will be working from India and will be collaborating with a global team.

Total Experience: 7+ Years

Your role

  • Take key responsibilities across requirements analysis, scalable and low-latency streaming platform solution design, architecture, and end-to-end delivery of key modules to provide real-time data solutions for our product
  • Write clean, scalable code using Go, TypeScript / Node.js / Scala / Python / SQL, and test and deploy applications and systems
  • Solve our most challenging data problems in real time, utilizing optimal data architectures, frameworks, and query techniques, sourcing from structured and unstructured data sources.
  • Be part of an engineering organization delivering high quality, secure, and scalable solutions to clients
  • Involvement in product and platform performance optimization and live site monitoring
  • Mentor team members through giving and receiving actionable feedback.

Our tech. stack:

  • AWS (Lambda, SQS, Kinesis, KDA, Redshift, Athena, DMS, Glue, Go/TypeScript, DynamoDB), Airflow, Flink, Spark, Looker, EMR
  • A continuous deployment process based on GitLab

A little more about you:

  • A Bachelor's degree in a technical field (e.g. computer science or mathematics).
  • 3+ years experience with real-time, event-driven architecture (see the sketch after this list)
  • 3+ years experience with a modern programming language such as Scala, Python, Go, TypeScript
  • Experience designing complex data processing pipelines
  • Experience with data modeling (star schema, dimensional modeling, etc.)
  • Experience with query optimisation
  • Experience with Kafka is a plus
  • Shipping and maintaining code in production
  • You like sharing your ideas, and you're open-minded
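
The real-time requirement above (and the Kinesis entry in the tech stack) can be pictured with a minimal boto3 consumer loop. This is a sketch only: the stream name, shard id, and region are placeholders, and production code would typically use enhanced fan-out or a consumer library rather than raw polling.

    # Sketch: read records from one shard of a Kinesis stream with boto3.
    # Stream name, shard id, and region are placeholders.
    import time

    import boto3

    kinesis = boto3.client("kinesis", region_name="us-east-1")

    shard_iterator = kinesis.get_shard_iterator(
        StreamName="clickstream-events",
        ShardId="shardId-000000000000",
        ShardIteratorType="LATEST",
    )["ShardIterator"]

    while True:
        response = kinesis.get_records(ShardIterator=shard_iterator, Limit=100)
        for record in response["Records"]:
            print(record["Data"])  # raw bytes; decode and parse as needed
        shard_iterator = response["NextShardIterator"]
        time.sleep(1)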

Why join us?

Key moment to join in terms of growth and opportunities

Our people matter; work-life balance is important

Fast-learning environment, entrepreneurial and strong team spirit

45+ nationalities: cosmopolitan & multicultural mindset

Competitive salary package & benefits (health coverage, lunch, commute, sport)

DE&I Statement: 

We believe diversity, equity and inclusion, irrespective of origins, identity, background and orientations, are core to our journey. 

Qualifications

Hands-on experience in Scala / Python with data modeling and real-time / streaming data, including complex data processing pipelines.

BE/ BTech in Computer Science

See more jobs at R.S.Consultants

Apply for this job

9d

Databricks Data Engineer - Data & Analytics team (remote / Costa Rica- or LATAM-based)

HitachiSan Jose, Costa Rica, Remote
scalaairflowsqlDesignazuregitpythonAWS

Hitachi is hiring a Remote Databricks Data Engineer - Data & Analytics team (remote / Costa Rica- or LATAM-based)

Job Description

 

Please note: Although our position is primarily remote / virtual (there could be occasional onsite work in downtown San Jose, should you live close enough), you MUST live, and be authorized to work, in Costa Rica without sponsorship. Candidates in other Latin America (LATAM) countries can be considered as employees if willing to relocate to Costa Rica, or can work via our 3rd party payroll company.

 

DATA ENGINEER (DATABRICKS, PYTHON, SPARK) 

This is a full-time, well-benefited career opportunity in our Data & Analytics organization (Azure DataWarehouse / DataLakehouse and Business Intelligence) for a Data Engineer highly experienced in Big Data systems design, with hands-on knowledge of data architecture, especially Spark and Delta/Data Lake technologies.

Individuals in this role will assist in the design, development, enhancement, and maintenance of complex data pipeline products that manage business-critical operations and large-scale analytics pipelines. Qualified applicants will have a demonstrated capability to learn new concepts quickly, a data engineering background, and/or robust software engineering expertise.

Responsibilities

  • Scope and execute together with team leadership. Work with the team to understand platform capabilities and how to best improve and expand those capabilities.
  • Strong independence and autonomy.
  • Design, development, enhancement, and maintenance of complex data pipeline products which manage business-critical operations and large-scale analytics applications.
  • Experience leading mid- and senior-level data engineers. 
  • Support analytics, data science and/or engineering teams and understand their unique needs and challenges. 
  • Instill excellence into the processes, methodologies, standards, and technology choices embraced by the team.
  • Embrace new concepts quickly to keep up with fast-moving data engineering technology.
  • Dedicate time to continuous learning to keep the team apprised of the latest developments in the space.
  • Commitment to developing technical maturity across the company.

Qualifications

  • 5+ years of Data Engineering experience, including 2+ years designing and building Databricks data pipelines, is REQUIRED; Azure cloud is highly preferred; however, we will consider AWS, GCP, or other cloud platform experience in lieu of Azure
  • Experience with conceptual, logical and/or physical database designs is a plus
  • 2+ years of hands-on Python/Pyspark/SparkSQL and/or Scala experience is REQUIRED
  • 2+ years of experience with Big Data pipelines or DAG Tools (Data Factory, Airflow, dbt, or similar) is REQUIRED
  • 2+ years of Spark experience (especially Databricks Spark and Delta Lake) is REQUIRED (see the sketch after this list)
  • 2+ years of hands-on experience implementing Big Data solutions in a cloud ecosystem, including Data/Delta Lakes, is REQUIRED
  • Experience with source control (git) on the command line is REQUIRED
  • 2+ years of SQL experience, specifically to write complex, highly optimized queries across large volumes of data is HIGHLY DESIRED
  • Data modeling / data profiling capabilities with Kimball/star schema methodology is a plus
  • Professional experience with Kafka, or other live data streaming technology, is HIGHLY DESIRED
  • Professional experience with database deployment pipelines (i.e., dacpac’s or similar technology) is HIGHLY DESIRED
  • Professional experience with one or more unit testing or data quality frameworks is HIGHLY DESIRED
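
As a rough picture of the Databricks / Delta Lake pipeline work described above, here is a PySpark sketch. Paths, column names, and table layout are assumptions; on Databricks the SparkSession and Delta support are already provided.

    # Sketch: land raw JSON into a partitioned Delta table with PySpark.
    # Paths and column names are assumptions, not details from the posting.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_bronze_to_silver").getOrCreate()

    raw = spark.read.json("/mnt/raw/orders/")  # hypothetical landing path

    cleaned = (
        raw.dropDuplicates(["order_id"])
           .withColumn("order_date", F.to_date("order_ts"))
           .filter(F.col("order_total") > 0)
    )

    (
        cleaned.write.format("delta")
        .mode("append")
        .partitionBy("order_date")
        .save("/mnt/silver/orders/")  # hypothetical Delta location
    )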

#LI-CA1

#REMOTE

#databricks

#python

#spark

#dataengineer

#datawrangler

Apply for this job

10d

Lead Data Architect (AWS, Azure, GCP)

CapTech ConsultingDenver, CO, Remote
nosqlairflowsqlDesignmongodbazurepythonAWS

CapTech Consulting is hiring a Remote Lead Data Architect (AWS, Azure, GCP)

Job Description

CapTech Data Architects match our clients’ business goals with available technologies when developing a strategy for a successful data delivery implementation. We improve our clients’ business value by enhancing data use, improving effectiveness of information stewardship, and streamlining data flows. After gaining in-depth understanding of our client’s business challenges, our architects apply experience-based insight and use state-of-the-art tools and techniques to identify the best solutions. We view our Data Architects as thought leaders in the data space. We task them with growing CapTech talent and expanding data and analytics delivery capabilities. 

Specific responsibilities for the Data Architect position include:  

  • Assessing and advocating for data management technologies and practices, eliminating gaps between the current state and a well-targeted future state
  • Interpreting and delivering impactful strategic plans improving data integration, data quality, and data delivery in support of business initiatives and roadmaps 
  • Formulating and articulating architectural trade-offs across solution options before recommending an optimal solution ensuring technical requirements are met 
  • Leading teams of data engineers and other technical team members through design, implementation and best practices.
  • Providing insights to executive stakeholders and development teams to ensure data architecture recommendations maximize the value of client data across the organization
  • Working to ensure the solutions recommended provide business value and align with the client’s strategic goals
  • Communicating with non-technical executives, focusing on the value of modern enterprise-level data solutions
  • Driving innovative technology solutions through thought leadership on emerging trends 
  • Sharing project solutions and outcomes with colleagues to improve delivery on future projects 
  • Leadership within the Data & Analytics practice area focused on growing and developing capabilities and talent
  • Partnering with CapTech business development team to demonstrate CapTech’s technical capabilities, envision a proposed solution CapTech can offer, and estimate proposed work plans. 

Qualifications

Typical experience for successful candidates includes: 

  • 7+ years of experience implementing with a variety of on-premises and cloud data management, integration, visualization, and analytical technologies 
  • Advanced proficiency in the design and implementation of modern data architectures and concepts such as cloud services (e.g., AWS, Azure, GCP), real-time data distribution (e.g., Kafka, Kinesis, DataFlow, Airflow), NoSQL (e.g., MongoDB, DynamoDB, HBase, CosmosDB) and modern data warehouse tools including Snowflake and DataBricks
  • Advanced proficiency in end-to-end data architecture solutions including ingestion, storage and relational modeling leveraging industry standard languages including SQL and Python
  • Ability to think strategically and relate architectural decisions and recommendations to business needs and client culture 
  • Ability to assess traditional and modern data architectural components based on business needs
  • Experience in recommending data governance best practices including MDM, security, privacy and policies
  • Experience leading enterprise engineering teams through implementations serving as POC on design decisions and best practices

Preferred Qualifications

  • Previous consulting industry experience
  • Experience in recommending data governance best practices including MDM, security, privacy and policies
  • Providing thought leadership and internal engagement with leadership and innovation across enterprise
  • Participating in providing mentorship and investing talent growth within Data & Analytics practice area
  • Awareness and continued education around emerging technologies and skills in data landscape

See more jobs at CapTech Consulting

Apply for this job