Data Engineer Remote Jobs

109 Results

+30d

Data Engineer

SonderMind - Denver, CO or Remote
S3, scala, airflow, sql, Design, java, c++, python, AWS

SonderMind is hiring a Remote Data Engineer

About SonderMind

At SonderMind, we know that therapy works. SonderMind provides accessible, personalized mental healthcare that produces high-quality outcomes for patients. SonderMind's individualized approach to care starts with using innovative technology to help people not just find a therapist, but find the right, in-network therapist for them, should they choose to use their insurance. From there, SonderMind's clinicians are committed to delivering best-in-class care to all patients by focusing on high-quality clinical outcomes. To enable our clinicians to thrive, SonderMind defines care expectations while providing tools such as clinical note-taking, secure telehealth capabilities, outcome measurement, messaging, and direct booking.

To follow the latest SonderMind news, get to know our clients, and learn about what it’s like to work at SonderMind, you can follow us on Instagram, LinkedIn, and Twitter.

About the Role

In this role, you will be responsible for designing, building, and managing the information infrastructure systems used to collect, store, process, and distribute data. You will also be tasked with transforming data into a format that can be easily analyzed. You will work closely with data engineers on data architectures and with data scientists and business analysts to ensure they have the data necessary to complete their analyses.

Essential Functions

  • Strategically design, construct, install, test, and maintain highly scalable data management systems
  • Develop and maintain databases, data processing procedures, and pipelines
  • Integrate new data management technologies and software engineering tools into existing structures
  • Develop processes for data mining, data modeling, and data production
  • Translate complex functional and technical requirements into detailed architecture, design, and high-performing software and applications
  • Create custom software components and analytics applications
  • Troubleshoot data-related issues and perform root cause analysis to resolve them
  • Manage overall pipeline orchestration
  • Optimize data warehouse performance

 

What does success look like?

Success in this role will be measured by the seamless and efficient operation of data infrastructure. This includes minimal downtime; accurate, timely data delivery; and the successful implementation of new technologies and tools. The individual will have demonstrated their ability to collaborate effectively to define solutions with both technical and non-technical team members across data science, engineering, product, and our core business functions. They will have made significant contributions to improving our data systems, whether through optimizing existing processes or developing innovative new solutions. Ultimately, their work will enable more informed and effective decision-making across the organization.

 

Who You Are 

  • Bachelor’s degree in Computer Science, Engineering, or a related field
  • A minimum of three years’ experience as a Data Engineer or in a similar role
  • Experience with data science and analytics engineering is a plus
  • Experience with AI/ML in GenAI or data software - including vector databases - is a plus
  • Proficient with scripting and programming languages (Python, Java, Scala, etc.)
  • In-depth knowledge of SQL and other database-related technologies
  • Experience with Snowflake, DBT, BigQuery, Fivetran, Segment, etc.
  • Experience with AWS cloud services (S3, RDS, Redshift, etc.)
  • Experience with data pipeline and workflow management tools such as Airflow
  • Strong negotiation and interpersonal skills: written, verbal, analytical
  • Motivated and influential – proactive with the ability to adhere to deadlines; work to “get the job done” in a fast-paced environment
  • Self-starter with the ability to multi-task

Our Benefits 

The anticipated salary range for this role is $130,000 to $160,000 per year.

As a leader in redesigning behavioral health, we are walking the walk with our employee benefits. We want the experience of working at SonderMind to accelerate people’s careers and enrich their lives, so we focus on meeting SonderMinders wherever they are and supporting them in all facets of their life and work.

Our benefits include:

  • A commitment to fostering flexible hybrid work
  • A generous PTO policy with a minimum of three weeks off per year
  • Free therapy coverage benefits to ensure our employees have access to the care they need (must be enrolled in our medical plans to participate)
  • Competitive Medical, Dental, and Vision coverage with plans to meet every need, including HSA ($1,100 company contribution) and FSA options
  • Employer-paid short-term, long-term disability, life & AD&D to cover life's unexpected events. Not only that, we also cover the difference in salary for up to seven (7) weeks of short-term disability leave (after the required waiting period) should you need to use it.
  • Eight weeks of paid Parental Leave (if the parent also qualifies for short-term disability, this benefit is paid in addition, allowing between 8 and 16 weeks of paid leave)
  • 401(k) retirement plan with 100% matching on up to 4% of base salary, vested immediately
  • Travel to Denver once a year for the annual Shift gathering
  • Fourteen (14) company holidays
  • Company shutdown between Christmas and New Year’s
  • Supplemental life insurance, pet insurance coverage, commuter benefits and more!

Application Deadline

Recruitment for this position is ongoing, and it will remain open until filled.

 

Equal Opportunity 
SonderMind does not discriminate in employment opportunities or practices based on race, color, creed, sex, gender, gender identity or expression, pregnancy, childbirth or related medical conditions, religion, veteran and military status, marital status, registered domestic partner status, age, national origin or ancestry, physical or mental disability, medical condition (including genetic information or characteristics), sexual orientation, or any other characteristic protected by applicable federal, state, or local laws.

Apply for this job

+30d

Data Engineer

Tiger Analytics - London, England, United Kingdom, Remote Hybrid

Tiger Analytics is hiring a Remote Data Engineer

Tiger Analytics is pioneering what AI and analytics can do to solve some of the toughest problems faced by organizations globally. We develop bespoke solutions powered by data and technology for several Fortune 100 companies. We have offices in multiple cities across the US, UK, Canada, India, and Singapore, and a substantial remote global workforce.

If you are passionate about working on business problems that can be solved using structured and unstructured data on a large scale, Tiger Analytics would like to talk to you. We are seeking an experienced and dynamic Data Engineer to play a key role in designing and implementing robust data solutions that help solve our clients' complex business problems.

Responsibilities:

  • Design, develop, and maintain scalable data pipelines using Scala, DBT, and SQL (a brief sketch follows this list).
  • Implement and optimize distributed data processing solutions using MPP databases and technologies.
  • Build and deploy machine learning models using distributed processing frameworks such as Spark, Glue, and Iceberg.
  • Collaborate with data scientists and analysts to operationalize ML models and integrate them into production systems.
  • Ensure data quality, reliability, and integrity throughout the data lifecycle.
  • Continuously optimize and improve data processing and ML workflows for performance and scalability.
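
As a loose illustration of the pipeline and distributed-processing work described above, here is a minimal PySpark sketch (the posting emphasizes Scala, but the same pattern applies); the bucket paths, column names, and aggregation are hypothetical, not drawn from any Tiger Analytics project.

    # Hypothetical PySpark sketch of a single pipeline step: read raw events,
    # filter and aggregate them, and write a curated table for downstream SQL/DBT.
    # All paths and column names are illustrative placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily_order_aggregates").getOrCreate()

    orders = spark.read.parquet("s3://example-bucket/raw/orders/")

    daily_totals = (
        orders
        .filter(F.col("status") == "completed")             # keep completed orders only
        .withColumn("order_date", F.to_date("created_at"))  # derive a date column
        .groupBy("order_date", "merchant_id")
        .agg(
            F.sum("amount").alias("gross_revenue"),
            F.count("*").alias("order_count"),
        )
    )

    # Partition by date so downstream queries can prune efficiently
    daily_totals.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3://example-bucket/curated/daily_merchant_totals/"
    )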

Requirements:

  • 5+ years of experience in data engineering and machine learning.
  • Proficiency in Scala programming language for building data pipelines and ML models.
  • Hands-on experience with DBT (Data Build Tool) for data transformation and modeling.
  • Strong SQL skills for data querying and manipulation.
  • Experience with MPP (Massively Parallel Processing) databases and distributed processing technologies.
  • Familiarity with distributed processing frameworks such as Spark, Glue, and Iceberg.
  • Ability to work independently and collaboratively in a team environment.

Significant career development opportunities exist as the company grows. The position offers a unique opportunity to be part of a small, fast-growing, challenging and entrepreneurial environment, with a high degree of individual responsibility.

See more jobs at Tiger Analytics

Apply for this job

+30d

Senior Data Engineer

Expression Networks - DC, US - Remote
agile, Ability to travel, nosql, sql, Design, scrum, java, python, javascript

Expression Networks is hiring a Remote Senior Data Engineer

Expression is looking to hire a Senior Data Engineer (individual contributor) to add to the continued growth we are seeing in our Data Science department. This position will report daily to the program manager and data team manager on projects, and will be responsible for the design and execution of high-impact data architecture and engineering solutions for customers across a breadth of domains and use cases.

Location:

  • Remote with the ability to travel monthly when needed
    • Local (DC/VA/MD Metropolitan area) is preferred but not required
    • Relocation assistance available for highly qualified candidates

Security Clearance:

  • US Citizenship required
  • Ability to obtain Secret Clearance or higher

Primary Responsibilities:

  • Directly working and leading others on the development, testing, and documentation of software code and data pipelines for data extraction, ingestion, transformation, cleaning, correlation, and analytics
  • Leading end-to-end architectural design and development lifecycle for new data services/products, and making them operate at scale
  • Partnering with Program Managers, Subject Matter Experts, Architects, Engineers, and Data Scientists across the organization where appropriate to understand customer requirements, design prototypes, and optimize existing data services/products
  • Setting the standard for Data Science excellence in the teams you work with across the organization, and mentoring junior members in the Data Science department

Additional Responsibilities:

  • Participating in technical development of white papers and proposals to win new business opportunities
  • Analyzing and providing feedback on product strategy
  • Participating in research, case studies, and prototypes on cutting-edge technologies and how they can be leveraged
  • Working in a consultative fashion to improve communication, collaboration, and alignment amongst teams inside the Data Science department and across the organization
  • Helping recruit, nurture, and retain top data engineering talent

Required Qualifications:

  • 4+ years of experience bringing databases, data integration, and data analytics/ML technologies to production with a PhD/MS in Computer Science/Data Science/Computer Engineering or relevant field, or 6+ years of experience with a Bachelor’s degree
  • Mastery in developing software code in one or more programming languages (Python, JavaScript, Java, Matlab, etc.)
  • Expert knowledge in databases (SQL, NoSQL, Graph, etc.) and data architecture (Data Lake, Lakehouse)
  • Knowledgeable in machine learning/AI methodologies
  • Experience with one or more SQL-on-Hadoop technologies (Spark SQL, Hive, Impala, Presto, etc.)
  • Experience in short-release cycles and the full software lifecycle
  • Experience with Agile development methodology (e.g., Scrum)
  • Strong writing and oral communication skills to deliver design documents, technical reports, and presentations to a variety of audiences

Benefits:

  • 401k matching
  • PPO and HDHP medical/dental/vision insurance
  • Education reimbursement
  • Complimentary life insurance
  • Generous PTO and holiday leave
  • Onsite office gym access
  • Commuter Benefits Plan

About Expression:

Founded in 1997 and headquartered in Washington DC, Expression provides data fusion, data analytics, software engineering, information technology, and electromagnetic spectrum management solutions to the U.S. Department of Defense, Department of State, and national security community. Expression’s “Perpetual Innovation” culture focuses on creating immediate and sustainable value for our clients via agile delivery of tailored solutions built through constant engagement with our clients. Expression was ranked #1 on Washington Technology’s 2018 Fast 50 list of the fastest-growing small-business government contractors and was named a Top 20 Big Data Solutions Provider by CIO Review.

Equal Opportunity Employer/Veterans/Disabled

See more jobs at Expression Networks

Apply for this job

+30d

Senior Data Engineer

Tiger Analytics - Jersey City, New Jersey, United States, Remote

Tiger Analytics is hiring a Remote Senior Data Engineer

Tiger Analytics is a fast-growing advanced analytics consulting firm. Our consultants bring deep expertise in Data Science, Machine Learning and AI. We are the trusted analytics partner for several Fortune 100 companies, enabling them to generate business value from data. Our business value and leadership have been recognized by various market research firms, including Forrester and Gartner. We are looking for top-notch talent as we continue to build the best global analytics consulting team in the world.

We are seeking an experienced Data Engineer to join our data team. As a Data Engineer, you will be responsible for designing, building, and maintaining data pipelines, data integration processes, and data infrastructure using Dataiku. You will collaborate closely with data scientists, analysts, and other stakeholders to ensure efficient data flow and support data-driven decision making across the organization.

  • 8+ years of overall industry experience specifically in data engineering
  • Strong knowledge of data engineering principles, data integration, and data warehousing concepts.
  • Strong understanding of the pharmaceutical/life science domain, including knowledge of patient data, commercial data, drug development processes, and healthcare data.
  • Proficiency in data engineering technologies and tools, such as SQL, Python, ETL frameworks, data integration platforms, and data warehousing solutions.
  • Experience with data modeling, database design, and data architecture principles.
  • Familiarity with big data technologies (e.g., Hadoop, Spark) and cloud platforms - AWS, Azure
  • Strong analytical and problem-solving skills, with the ability to work with large and complex datasets.
  • Strong communication and collaboration abilities.
  • Attention to detail and a focus on delivering high-quality work.

Significant career development opportunities exist as the company grows. The position offers a unique opportunity to be part of a small, challenging, and entrepreneurial environment, with a high degree of individual responsibility.

See more jobs at Tiger Analytics

Apply for this job

+30d

Senior Data Engineer

Sendle - Australia (Remote)
Sales, tableau, sql, Design, git

Sendle is hiring a Remote Senior Data Engineer

Sendle builds shipping that is good for the world. We help small businesses thrive by making parcel delivery simple, reliable, and affordable. We’re a B Corp and the first 100% carbon neutral delivery service in Australia, Canada, and the United States, where we harness major courier networks to create a delivery service that levels the playing field for small businesses.

We envision a world where small businesses can compete on a level playing field with the big guys. Sendle is a fast-growing business with bold ambitions and big dreams. 

In the last few years, we’ve made huge strides towards our goal of becoming the largest SMB eCommerce courier in the world, moving from a single-country operation in Australia to a successful launch and operation in the US and Canada. We’ve also launched major partnerships with Vestiaire Collective, eBay, Shopify, and Etsy!

But most importantly, we’re a bunch of good people doing good work. Wanna join us?

A bit about the role

We are looking for a Senior Data Engineer who is passionate about building scalable data systems that will enable our vision of data democratization to drive value for the business.

As a company, data is at the center of every critical business decision we make. With this role, you will work across many different areas of the business, learning about everything from marketing and sales to courier logistics and network performance. Additionally, there is the opportunity to work directly with stakeholders, with you being a technical thought partner and working collaboratively to design and build solutions to address key business questions. 

What you’ll do

  • Develop, deploy, and maintain data models to support the data needs of various teams across the company
  • Build data models with DBT, utilizing git for source control
  • Ingest data from different sources (via Fivetran, APIs, etc.) into Snowflake for use by the DBT models (a rough sketch follows this list)
  • Collaborate with the Data Engineering team to brainstorm, scope, and implement process improvements
  • Work with the entire Data and Analytics team to enhance data observability and monitoring
  • Act as a thought partner for stakeholders and peers across the company on ad hoc data requests and identify the best approach and design for our near-term and long-term growth objectives
  • Understand the tradeoffs between technical possibilities and stakeholder needs and strive for balanced solutions
  • Hold self and others accountable to meet commitments and act with a clear sense of ownership
  • Demonstrate persistence in the face of obstacles, resolve them effectively, and involve others as needed
  • Contribute to our data literacy efforts by improving the accessibility, discoverability, and interpretability of our data
  • Research industry trends and introduce new methodologies and processes to the team
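
As a rough sketch of the ingestion bullet above, the snippet below pulls records from an API and lands them in a Snowflake staging table for DBT models to build on. The endpoint, credentials, and table name are placeholders (not Sendle's), and in practice a managed connector such as Fivetran would usually handle this step.

    # Hypothetical sketch: land API data in a Snowflake staging table for DBT to model.
    # Endpoint, credentials, and table are placeholders, not Sendle's actual setup.
    import requests
    import snowflake.connector

    rows = requests.get("https://api.example.com/v1/shipments", timeout=30).json()

    conn = snowflake.connector.connect(
        account="example_account", user="loader", password="***",
        warehouse="LOAD_WH", database="RAW", schema="STAGING",
    )
    cur = conn.cursor()
    try:
        # Insert one row per shipment record; DBT staging models read from this table
        cur.executemany(
            "INSERT INTO stg_shipments (id, status, created_at) VALUES (%s, %s, %s)",
            [(r["id"], r["status"], r["created_at"]) for r in rows],
        )
    finally:
        cur.close()
        conn.close()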

What you’ll need

  • Experience with data modeling, data warehousing, and building ETL pipelines (Dagster, DBT, and Snowflake experience a plus)
  • Advanced SQL knowledge
  • Experience with source control technologies such as git
  • Strong communication skills and the ability to partner with business stakeholders to translate business requirements into technical solutions
  • Ability to effectively communicate technical approach with teammates and leaders 
  • Ability to thrive in a remote environment through effective async communication and collaboration
  • Ability to manage multiple projects simultaneously
  • A can-do attitude and the flexibility to readily take on new opportunities and assist others
  • The 5Hs (our core values) in your approach to work and to building partnerships with stakeholders and teammates

What we’re offering

  • The chance to work with a creative team in a supportive environment
  • A personal development budget
  • The ability to create your own work environment, connecting to a remote team from anywhere in Australia
  • EAP access for you and your immediate family, because we care about your wellbeing
  • Options through participation in Sendle’s ESOP

What matters to us

We believe that our culture is one of our most important assets. We have 5 key values that we look for in every member of our team.

  • Humble - We put others first. We embrace and seek feedback from others.
  • Honest - We speak gently but frankly. We take ownership of our mistakes and speak the truth.
  • Happy - We enjoy the journey. We are optimistic and find opportunities in all things.
  • Hungry - We aspire to make a difference. We aim high, step out of our comfort zones, and tackle the hard problems.
  • High-Performing - We relentlessly deliver. We know the goal and work fearlessly towards it.

Legally, we need you to know this: 

We are an equal opportunity employer and value diversity. We do not discriminate on the basis of race, religion, colour, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

If you require accommodations due to a disability to participate in the application or interview process, please get in touch with our team at careers@sendle.com to discuss your needs.

But it’s important to us that you know this:

We strongly believe that diversity of experience contributes to a broader collective perspective that will consistently lead to a better company and better outcomes. We are working hard to increase the diversity of our team wherever we can and we actively encourage everyone to consider becoming a part of it.

If you want to be a part of something remarkable then we’re excited to hear from you.

Interested in knowing more? Check us out on our Careers Page, Being Carbon Neutral and LinkedIn.

 

#LI-Remote

See more jobs at Sendle

Apply for this job

FanDuel is hiring a Remote Staff Data Platform Engineer


See more jobs at FanDuel

Apply for this job

+30d

Data Engineer (Colombia)

Sezzle - Colombia, Remote
ML, Sales, golang, Bachelor's degree, terraform, sql, Design, ansible, c++, docker, kubernetes, python, AWS

Sezzle is hiring a Remote Data Engineer (Colombia)

About the Role: 

We are looking for a Data Engineer who will assist us in building, running, and improving the data infrastructure that our data and engineering teams use to power their services. Your duties will include the development, testing, and maintenance of data tooling and services, using a combination of cloud products, open-source tools, and internal applications. You should be able to build high-quality, scalable solutions for a variety of problems. We are seeking a talented and motivated Data Engineer who is best in class, with a high IQ and a high EQ. This role presents an exciting opportunity to thrive in a dynamic, fast-paced environment within a rapidly growing team, with abundant prospects for career advancement.

About Sezzle:

Sezzle is a cutting-edge fintech company dedicated to financially empowering the next generation. With only one in three millennials owning a credit card and the majority lacking their desired credit scores, Sezzle addresses these challenges through a payment platform that offers interest-free installment plans at online stores. By increasing consumers' purchasing power, Sezzle drives sales and basket sizes for thousands of eCommerce merchants that it partners with.

Key Responsibilities Include:

  • Work with a team to plan, design and build tools and services that improve our internal data infrastructure platform and the pipelines that feed it using Python, Go, AWS, Terraform, and Kubernetes.
  • Develop monitoring and alerting for our data infrastructure to detect problems (a sketch of one such check follows this list).
  • Perform ongoing maintenance of our data infrastructure, such as applying upgrades.
  • Assist product developers, data scientists, and machine learning engineers in debugging and triaging production issues.
  • Collaborate with cross-functional teams to integrate machine learning solutions into production systems.
  • Take part in postmortem reviews, suggesting ways we can improve the reliability of our platform.
  • Document the actions you take and produce both runbooks and automation to reduce day-to-day toil.
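
As a minimal sketch of the monitoring-and-alerting bullet above, the check below measures how stale a table is and publishes the lag as a CloudWatch metric that an alarm could page on. The connection object, table name, threshold, and metric namespace are all assumptions for illustration, not Sezzle's actual infrastructure.

    # Hypothetical data-freshness check behind a monitoring/alerting setup.
    # The DB connection, table name, and CloudWatch namespace are placeholders.
    import datetime

    import boto3


    def table_lag_minutes(conn, table="orders"):
        """Return minutes elapsed since the table last received data."""
        cur = conn.cursor()
        try:
            cur.execute(f"SELECT MAX(updated_at) FROM {table}")
            (last_update,) = cur.fetchone()
        finally:
            cur.close()
        return (datetime.datetime.utcnow() - last_update).total_seconds() / 60


    def publish_and_check(conn, table="orders", max_lag_minutes=60):
        lag = table_lag_minutes(conn, table)
        # Publish the lag so a CloudWatch alarm can notify on-call when it grows
        boto3.client("cloudwatch").put_metric_data(
            Namespace="DataPlatform/Freshness",
            MetricData=[{"MetricName": f"{table}_lag_minutes", "Value": lag}],
        )
        return lag > max_lag_minutes  # True means the table is stale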

Minimum Requirements:

  • Bachelor's in Computer Science, Data Science, Machine Learning or a related field

Preferred Knowledge and Skills:

  • Experience with AWS services such as Redshift, Glue, SageMaker, etc.
  • Experience with data-focused languages such as Python or SQL
  • Knowledge of ML model training/deploy is a plus
  • Knowledge of data analysis algorithms (e.g. statistics, machine learning)
  • Familiarity with machine learning frameworks and libraries, such as TensorFlow or PyTorch
  • Experience with MLOps principles is a plus
  • Familiarity with orchestration tools like Dagster/Airflow
  • Basic knowledge of Golang, Docker and Kubernetes
  • Familiarity with deployment/provisioning tools like Terraform, Helm, Ansible
  • Experience documenting requirements and specifications

About You: 

  • You have relentlessly high standards - many people may think your standards are unreasonably high. You are continually raising the bar and driving those around you to deliver great results. You make sure that defects do not get sent down the line and that problems are fixed so they stay fixed.
  • You’re not bound by convention - your success—and much of the fun—lies in developing new ways to do things
  • You need action - speed matters in business. Many decisions and actions are reversible and do not need extensive study. We value calculated risk-taking.
  • You earn trust - you listen attentively, speak candidly, and treat others respectfully.
  • You have backbone; disagree, then commit - you can respectfully challenge decisions when you disagree, even when doing so is uncomfortable or exhausting. You have conviction and are tenacious. You do not compromise for the sake of social cohesion. Once a decision is determined, you commit wholly.

What Makes Working at Sezzle Awesome? 

At Sezzle, we are more than just brilliant engineers, passionate data enthusiasts, out-of-the-box thinkers, and determined innovators. We believe in surrounding ourselves with only the best and the brightest individuals. Our culture is not defined by a certain set of perks designed to give the illusion of the traditional startup culture, but rather, it is the visible example living in every employee that we hire. 

Equal Employment Opportunity: Sezzle Inc. is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We do not discriminate based on race, color, religion, sex, national origin, age, disability, genetic information, pregnancy, or any other legally protected status. Sezzle recognizes and values the importance of diversity and inclusion in enriching the employment experience of its employees and in supporting our mission.

#Li-remote

See more jobs at Sezzle

Apply for this job

+30d

Data Engineer (m/w/d)

Gerresheimer - Essen, Germany, Remote
azure

Gerresheimer is hiring a Remote Data Engineer (m/w/d)

Job Description

  • You will help us build and further develop the Data Analytics / Business Intelligence area for the Moulded Glass business unit in order to generate added value and business insights.
  • You will develop, implement, and maintain ETL processes with Azure Data Factory and other tools, in close coordination with our Gerresheimer Data Science Center.
  • You will migrate existing reports into standardized, automated solutions with a focus on data modeling, and continuously optimize them.
  • You will develop and optimize data models to enable efficient data access and analysis, and integrate various data sources and databases (such as S4 HANA, plus internal and external sources) to ensure a consistent data foundation.
  • You will design, implement, and monitor data pipelines, data architecture, and mechanisms for monitoring and improving data quality, including error detection and resolution.
  • You will deliver AI/ML projects in close collaboration with our Gerresheimer Data Science Center.

The role can be performed remotely, if desired.

Qualifications

  • You hold a Bachelor's or Master's degree in computer science, mathematics, engineering, or a related field.
  • You have solid technical knowledge and, ideally, hands-on experience with the relevant Azure services such as Azure Data Factory, Microsoft Azure, SQL/DWH/Analysis Services, Azure Data Lake, Azure Synapse, and Azure DevOps.
  • Experience with large-scale data, data modeling, and data processing is essential for this position.
  • You are proficient in at least one programming language, ideally Python.
  • Ideally, you are familiar with the data structures of ERP systems such as SAP FI/CO/MM/SD.
  • Ideally, you have experience with BI tools such as Power BI and with agile project management methods such as Scrum.
  • You are enthusiastic about topics such as data analytics, data science, AI, machine learning, and deep learning.
  • You show a continuous willingness to learn and an interest in broadening your skills into other areas as well (e.g., RPA).
  • You speak German and English.

See more jobs at Gerresheimer

Apply for this job

+30d

Senior Data Engineer, Finance

Instacart - United States - Remote
airflow, sql, Design

Instacart is hiring a Remote Senior Data Engineer, Finance

We're transforming the grocery industry

At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.

Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.

Instacart is a Flex First team

There’s no one-size-fits-all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work—whether it’s from home, an office, or your favorite coffee shop—while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.

Overview

 

At Instacart, our mission is to create a world where everyone has access to the food they love and more time to enjoy it together. Millions of customers every year use Instacart to buy their groceries online, and the Data Engineering team is building the critical data pipelines that underpin the myriad ways data is used across Instacart to support our customers and partners.

About the Role 

 

The Finance data engineering team plays a critical role in defining how financial data is modeled and standardized for uniform, reliable, timely and accurate reporting. This is a high impact, high visibility role owning critical data integration pipelines and models across all of Instacart’s products. This role is an exciting opportunity to join a key team shaping the post-IPO financial data vision and roadmap for the company.

 

About the Team 

 

Finance data engineering is part of the Infrastructure Engineering pillar, working closely with accounting, billing & revenue teams to support the monthly/quarterly book close, retailer invoicing and internal/external financial reporting. Our team collaborates closely with product teams to capture critical data needed for financial use cases.

 

About the Job 

  • You will be part of a team with a large amount of ownership and autonomy.
  • Large scope for company-level impact working on financial data.
  • You will work closely with engineers and both internal and external stakeholders, owning a large part of the process from problem understanding to shipping the solution.
  • You will ship high quality, scalable and robust solutions with a sense of urgency.
  • You will have the freedom to suggest and drive organization-wide initiatives.

 

About You

Minimum Qualifications

  • 8+ years of working experience in a Data/Software Engineering role, with a focus on building data pipelines.
  • Expert in SQL, with working knowledge of Python.
  • Experience building high quality ETL/ELT pipelines.
  • Past experience with data immutability, auditability, slowly changing dimensions or similar concepts.
  • Experience building data pipelines for accounting/billing purposes.
  • Experience with cloud-based data technologies such as Snowflake, Databricks, Trino/Presto, or similar.
  • Adept at fluently communicating with many cross-functional stakeholders to drive requirements and design shared datasets.
  • A strong sense of ownership, and an ability to balance a sense of urgency with shipping high quality and pragmatic solutions.
  • Experience working with a large codebase on a cross functional team.

 

Preferred Qualifications

  • Bachelor’s degree in Computer Science, Computer Engineering, Electrical Engineering, OR equivalent work experience.
  • Experience with Snowflake, dbt (data build tool), and Airflow (a sketch of this pattern follows this list).
  • Experience with data quality monitoring/observability, either using custom frameworks or tools like Great Expectations, Monte Carlo, etc.
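
As a rough sketch of the dbt-plus-Airflow pattern referenced above, the DAG below runs the dbt models and then dbt tests as a lightweight data-quality gate; the schedule, project path, and selector are hypothetical, not Instacart's actual configuration.

    # Hypothetical Airflow DAG: build dbt models, then run dbt tests as a quality gate.
    # Schedule, paths, and selectors are placeholders, not Instacart's configuration.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="finance_daily_models",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        run_models = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt/finance --select finance_marts",
        )

        test_models = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/dbt/finance --select finance_marts",
        )

        run_models >> test_models  # only test after the models have built successfully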

 

#LI-Remote

Instacart provides highly market-competitive compensation and benefits in each location where our employees work. This role is remote and the base pay range for a successful candidate is dependent on their permanent work location. Please review our Flex First remote work policy here.

Offers may vary based on many factors, such as candidate experience and skills required for the role. Additionally, this role is eligible for a new hire equity grant as well as annual refresh grants. Please read more about our benefits offerings here.

For US based candidates, the base pay ranges for a successful candidate are listed below.

  • CA, NY, CT, NJ: $221,000 - $245,000 USD
  • WA: $212,000 - $235,000 USD
  • OR, DE, ME, MA, MD, NH, RI, VT, DC, PA, VA, CO, TX, IL, HI: $203,000 - $225,000 USD
  • All other states: $183,000 - $203,000 USD

See more jobs at Instacart

Apply for this job