airflow Remote Jobs

70 Results

3h

Analyst 2, Data Analytics

Western Digital - Batu Kawan, Malaysia, Remote
Bachelor's degree, airflow, mariadb, sql, oracle, api, docker, css, kubernetes, jenkins, python, javascript, PHP

Western Digital is hiring a Remote Analyst 2, Data Analytics

Job Description

  • Excellent interpersonal communication and organizational skills to contribute as a leading member of global, distributed teams focused on delivering quality services and solutions
  • Able to distill complex technical challenges to actionable and explainable decisions
  • Work in DevOps teams by building consensus and mediating compromises when necessary
  • Demonstrate excellent engineering & automation skills in the context of application development using continuous integration (CI) and continuous deployment (CD)
  • Demonstrated ability to rapidly learn new and emerging technologies, define engineering standards, and produce automation code
  • Operational abilities including early software release support and driving root cause analysis and remediation
  • Ability to work with and engage multiple functional groups

Qualifications

  • Bachelor's Degree in Computer Science, Software Engineering, Computer Engineering or a related field or equivalent work experience. 
  • 5+ years overall IT industry experience
  • 3+ years in an engineering role using service and hosting solutions such as factory dashboards
  • 3+ years functional knowledge of server-side languages: Python, PHP
  • 3+ years functional knowledge of client-side programming: JavaScript, HTML, CSS
  • Experience with relational SQL database: MariaDB, MSSQL, Oracle
  • Experience with data pipeline and workflow management tools: Airflow
  • Solid understanding of containerization and orchestration tools: Docker, Kubernetes
  • Experience with version control systems: BitBucket
  • Experience with Dash framework (Python web framework) for building interactive web applications
  • Exposure to common web frameworks and REST APIs
  • Experience with continuous integration and deployment using Jenkins is a plus
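For context on the Airflow and CI/CD items above: the core idea behind workflow tools like Airflow is declaring tasks as a dependency graph and executing them in topological order. A minimal stdlib-only sketch (task names are illustrative, not from the posting):

```python
from graphlib import TopologicalSorter

# Declare pipeline tasks and their upstream dependencies,
# the same way an Airflow DAG wires operators together.
dag = {
    "extract": set(),          # no upstream tasks
    "transform": {"extract"},  # runs after extract
    "load": {"transform"},     # runs after transform
    "report": {"load"},        # runs after load
}

# TopologicalSorter yields a valid execution order for the DAG.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'load', 'report']
```

In Airflow itself the same wiring is expressed with operators and the `>>` dependency syntax; the scheduling concept is identical.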

See more jobs at Western Digital

Apply for this job

3h

Principal Engineer

Western Digital - Shanghai, China, Remote
airflow, mariadb, sql, oracle, api, docker, css, kubernetes, jenkins, python, javascript, PHP

Western Digital is hiring a Remote Principal Engineer

Job Description

  • Excellent interpersonal communication and organizational skills to contribute as a leading member of global, distributed teams focused on delivering quality services and solutions
  • Able to distill complex technical challenges to actionable and explainable decisions
  • Work in DevOps teams by building consensus and mediating compromises when necessary
  • Demonstrate excellent engineering & automation skills in the context of application development using continuous integration (CI) and continuous deployment (CD)
  • Demonstrated ability to rapidly learn new and emerging technologies, define engineering standards, and produce automation code
  • Operational abilities including early software release support and driving root cause analysis and remediation
  • Ability to work with and engage multiple functional groups

Qualifications

  • Bachelor’s Degree or equivalency (CS, CE, CIS, IS, MIS, or engineering discipline)
  • 5+ years overall IT industry experience
  • 3+ years in an engineering role using service and hosting solutions such as factory dashboards
  • 3+ years functional knowledge of server-side languages: Python, PHP
  • 3+ years functional knowledge of client-side programming: JavaScript, HTML, CSS
  • Experience with relational SQL database: MariaDB, MSSQL, Oracle
  • Experience with data pipeline and workflow management tools: Airflow
  • Solid understanding of containerization and orchestration tools: Docker, Kubernetes
  • Experience with version control systems: BitBucket
  • Experience with Dash framework (Python web framework) for building interactive web applications
  • Exposure to common web frameworks and REST APIs
  • Experience with continuous integration and deployment using Jenkins is a plus
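The "common web frameworks & REST APIs" item above can be illustrated without any framework at all: Python web frameworks (including the Dash framework mentioned in the list) ultimately sit on the WSGI contract. A minimal stdlib-only JSON endpoint, exercised directly (the path and payload are invented for illustration):

```python
import json

def app(environ, start_response):
    """A tiny WSGI application returning JSON -- the low-level
    contract that Python web frameworks build on top of."""
    if environ.get("PATH_INFO") == "/health":
        body = json.dumps({"status": "ok"}).encode()
        start_response("200 OK", [("Content-Type", "application/json")])
        return [body]
    start_response("404 Not Found", [("Content-Type", "application/json")])
    return [json.dumps({"error": "not found"}).encode()]

# Exercise the app directly -- no server needed.
captured = {}
def start_response(status, headers):
    captured["status"] = status

result = app({"PATH_INFO": "/health"}, start_response)
print(captured["status"], result[0])  # 200 OK b'{"status": "ok"}'
```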

See more jobs at Western Digital

Apply for this job

9h

Senior Data Engineer, Modeling

Thumbtack - Remote, United States
tableau, airflow, sql, Design, python

Thumbtack is hiring a Remote Senior Data Engineer, Modeling

A home is the biggest investment most people make, and yet, it doesn’t come with a manual. That's why we’re building the only app homeowners need to effortlessly manage their homes —  knowing what to do, when to do it, and who to hire. With Thumbtack, millions of people care for what matters most, and pros earn billions of dollars through our platform. And as one of the fastest-growing companies in a $600B+ industry — we must be doing something right. 

We are driven by a common goal and the deep satisfaction that comes from knowing our work supports local economies, helps small businesses grow, and brings homeowners peace of mind. We’re seeking people who continually put our purpose first: advocating for pros and customers, embracing change, and choosing teamwork every day.

At Thumbtack, we're creating a new era of home care. If making an impact and the chance to do good inspires you, join us. Imagine what we’ll build together. 

Thumbtack by the Numbers

  • Available nationwide in every U.S. county
  • 80 million projects started on Thumbtack
  • 10 million 5-star reviews and counting
  • Pros earn billions on our platform
  • 1000+ employees 
  • $3.2 billion valuation (June, 2021) 

About the Data Engineering Team

Thumbtack’s Data Engineering team is a centralized team that works closely with engineers, analysts, data scientists, and machine learning engineers to help design and curate data sets originating from internal and third-party sources to meet current and future needs. Over the next year, it will continue to build on its prior successes in building a more cohesive data warehouse while starting to work more deeply upstream to build data best practices into the full software development lifecycle (SDLC).

About the Role

As a Senior Data Engineer, you will work closely with product and engineering teams throughout Thumbtack, helping turn data into insight, and insight into action. The Data Engineering team is a hybrid-embedded team of engineers, some of whom consult directly with product teams to integrate data into the development lifecycle, and others who help build core pipelines and data models for use across the entire company. You’ll work to understand requirements, then design, develop, test, and deploy data pipelines and transformations for use by Analysts, Machine Learning Engineers, Data Scientists, and Product Managers. Major project areas include: working across product and marketing data teams to build a centralized customer data warehouse, developing advanced ingress/egress validation in the data lake, and modeling cost of supply acquisition for our two-sided marketplace.

Challenge

In 2024, Thumbtack is significantly investing in Data and Data Engineering as a strategic growth area for the company. While there are interesting and difficult challenges across the entire focus area, we’re building on a solid foundation of the modern data stack, are committed to supporting each other, and have internal champions and strong advocates on our partner teams to ensure we succeed. Our primary mandate is to take these core building blocks of a modern data system, and extend them to make simple analysis simple, and deeply complex analysis easier. As a Senior Data Engineer, you will be instrumental in making this happen.

Responsibilities

  • Collaboratively refine and evangelize a comprehensive framework for integrating data-thinking into the software development lifecycle for product teams
  • Design, build, and maintain core datasets, data marts, and feature stores that support a blend of mature products and features with a rapidly evolving product line, in partnership with analytics, data science, and machine learning
  • Integrate with teams consisting of product engineers, analysts, data scientists, machine learning engineers throughout the organization to understand their data needs, and help design datasets with the same engineering rigor as any other software we design
  • Drive data quality and best practices across key product and business areas

What you’ll need

If you don't think you meet all of the criteria below but still are interested in the job, please apply. Nobody checks every box, and we're looking for someone excited to join the team.

  • 5+ years experience designing and building data sets and warehouses
  • Hands-on experience with SQL, ETLs, Python, data pipelines, distributed systems
  • Ability to understand the needs of and collaborate with stakeholders in other functions, especially Analytics, and identify opportunities for process improvements across teams
  • Familiarity building the above with a modern data stack based on a cloud-native data warehouse; we use BigQuery, dbt, and Apache Airflow, but experience with a similar stack is fine
  • Strong sense of ownership and pride in your work, from ideation and requirements-gathering to project completion and maintenance
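The SQL and ETL experience asked for above is the day-to-day of this role. A self-contained sketch of a typical transform step, using sqlite3 as a stand-in for a warehouse like BigQuery (the table and column names are made up for illustration):

```python
import sqlite3

# Stand-in for a warehouse table; in practice this would live in
# BigQuery, populated by an Airflow- or dbt-managed pipeline.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (user_id INT, event TEXT)")
con.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, "view"), (1, "book"), (2, "view"), (3, "view"), (3, "book")],
)

# A typical curated-dataset transform: distinct users and conversions.
row = con.execute(
    """SELECT COUNT(DISTINCT user_id),
              SUM(CASE WHEN event = 'book' THEN 1 ELSE 0 END)
       FROM events"""
).fetchone()
print(row)  # (3, 2)
```

The same shape of query, wrapped in a dbt model or an Airflow task, is what "building core datasets and data marts" usually looks like in practice.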

Bonus points if you have 

  • Domain experience working with data in a relevant area, such as Marketing, Customer Behavior & Engagement, Finance, et al.
  • Experience mentoring and coaching engineers
  • Experience using and/or configuring Business Intelligence tools (Looker, Tableau, Mode, et al.)
  • Experience working with semi-structured or unstructured data in a data lake or similar
  • Understanding of database internals and query optimization

Thumbtack is a virtual-first company, meaning you can live and work from any one of our approved locations across the United States, Canada or the Philippines.* Learn more about our virtual-first working model here.

For candidates living in San Francisco / Bay Area, New York City, or Seattle metros, the expected salary range for the role is currently $180,000 - $250,000. Actual offered salaries will vary and will be based on various factors, such as calibrated job level, qualifications, skills, competencies, and proficiency for the role.

For candidates living in all other US locations, the expected salary range for this role is currently $170,000 - $215,000. Actual offered salaries will vary and will be based on various factors, such as calibrated job level, qualifications, skills, competencies, and proficiency for the role.

#LI-Remote

Benefits & Perks
  • Virtual-first working model coupled with in-person events
  • 20 company-wide holidays including a week-long end-of-year company shutdown
  • Libraries (optional use collaboration & connection hubs) in San Francisco and Salt Lake City
  • WiFi reimbursements 
  • Cell phone reimbursements (North America) 
  • Employee Assistance Program for mental health and well-being 

Learn More About Us

Thumbtack embraces diversity. We are proud to be an equal opportunity workplace and do not discriminate on the basis of sex, race, color, age, pregnancy, sexual orientation, gender identity or expression, religion, national origin, ancestry, citizenship, marital status, military or veteran status, genetic information, disability status, or any other characteristic protected by federal, provincial, state, or local law. We also will consider for employment qualified applicants with arrest and conviction records, consistent with applicable law. 

Thumbtack is committed to working with and providing reasonable accommodation to individuals with disabilities. If you would like to request a reasonable accommodation for a medical condition or disability during any part of the application process, please contact: recruitingops@thumbtack.com

If you are a California resident, please review information regarding your rights under California privacy laws contained in Thumbtack’s Privacy policy available at https://www.thumbtack.com/privacy/.

See more jobs at Thumbtack

Apply for this job

9h

Senior Software Engineer, App Security

Muck Rack - Remote (US)
airflow, postgres, vue, c++, elasticsearch, mysql, python, AWS

Muck Rack is hiring a Remote Senior Software Engineer, App Security

Muck Rack is the leading SaaS platform for public relations and communications professionals. Our mission is to enable organizations to build trust, tell their stories and demonstrate the unique value of earned media. Muck Rack’s Public Relations Management (PRM) platform enables organizations to build relationships with the media, manage crisis risk and demonstrate PR’s impact on business outcomes.

Founder controlled, fully distributed, and growing sustainably, Muck Rack has received several awards for its unparalleled culture and product from organizations like Inc., Quartz, G2, and BuiltIn. We value resilience, transparency, ownership, & customer devotion and infuse these values into everything we do.

We’re looking for a collaborative and self-motivated Senior Software Engineer, App Security to join our quickly growing team and make a big impact. 

As a Senior Software Engineer, App Security on the Security Team, you’ll work closely with software engineers, product managers, and designers, to ensure that our infrastructure is secure. You’ll work on major technical projects with large data volumes, lead the building of new features, and help shape our engineering culture and processes. You will help to perform in-house vulnerability identification, triaging and remediation. You will help to automate our security processes and build out new alerting and detection mechanisms.  Our tech stack includes Python, Django, Celery, MySQL, Elasticsearch, Vue, and Webpack. Our technology team is focused on scale, quality, delivery, and thoughtful customer experience. We ship frequently without sacrificing work/life balance. 

To be set up for success in this role, you’ll need to have:

  • 5+ years total professional experience as an application security engineer
  • 5+ years experience in penetration testing and vulnerability remediation
  • Python or significant web experience in a similar framework
  • Experience in risk analysis and security frameworks
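One concrete flavor of the vulnerability identification and remediation work this role describes is replacing naive secret comparison with a constant-time one. A stdlib-only sketch (the token values are obviously illustrative):

```python
import hashlib
import hmac

def verify_token(supplied: str, expected: str) -> bool:
    """Compare secrets in constant time to avoid timing side
    channels; plain `==` on strings can short-circuit and leak
    information about length and matching prefixes."""
    return hmac.compare_digest(
        hashlib.sha256(supplied.encode()).digest(),
        hashlib.sha256(expected.encode()).digest(),
    )

print(verify_token("s3cr3t", "s3cr3t"))  # True
print(verify_token("guess", "s3cr3t"))   # False
```

Hashing both sides before comparing also ensures the two inputs have equal length, which is what `hmac.compare_digest` expects for its timing guarantee.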

If the details below describe you, you could be a great fit for this role:

  • Worked on a complex, high-traffic site at a startup or software-as-a-service company, ideally with large amounts of data
  • Experience with MySQL (or Postgres) and/or Elasticsearch
  • Any combination of the following: experience with Celery, Luigi or Airflow, Kafka, AWS, NLP, data model performance tuning, content extraction, application performance tuning
  • Interest in journalism, news, media or social media

In addition, we’re always looking for candidates who:

  • Have excellent communication skills, with an ability to explain ideas clearly, give and receive feedback, and work well with team members
  • Exhibit a willingness to learn in areas where they have less experience with our tech stack
  • Take pride in the quality of their code. (Your code should be readable, testable, and understandable years later. You adhere to the Zen of Python.)
  • Work well in a fast-paced development environment with testing, continuous integration and multiple daily deploys
  • Have the ability to manage complexity in a large project, and incur technical debt only after considering the tradeoffs
  • Take a logical approach to problem solving that combines analytical thinking and intuition

Interview Overview

  • 30 min interview with a member of our Talent Team
  • 1 hour zoom interview with the hiring manager
  • Take-home coding assignment (2 hours max)
  • Peer interviews, including a 30 min code review discussion
  • Final call(s) with executive team member(s) 

Salary

The starting salary for this role is between $140,000 - $170,000, depending on skills and experience. We take a geo-neutral approach to compensation within the US, meaning that we pay based on job function and level, not location. For all other countries, we have competitive pay bands based on market standards. 

Individual compensation decisions are based on a number of factors, including experience level, skillset, and balancing internal equity relative to peers at the company. We expect the majority of the candidates who are offered roles at our company to fall healthily throughout the range based on these factors. We recognize that the person we hire may be less experienced (or more senior) than this job description as posted. If that ends up being the case, the updated salary range will be communicated with you as a candidate.

Why Muck Rack?

Remote Work, Forever. We’re a fully distributed team and have pledged to remain that way forever. We offer employees a full home office setup, phone & internet reimbursement, and a monthly coworking membership. We build culture through virtual and in-person team bonding opportunities including team lunches, friendly competitions, and celebratory events!

Transparent Compensation. We offer competitive geo-neutral pay in the U.S. and review compensation at least once annually to ensure internal equity and alignment with the external market. Depending on the role, we offer either a standardized bonus program or attainable commission structure and an opportunity to earn equity in the company. All employees are eligible for our 401(k) plan* with employer contributions.

Health & Wellness*. Muck Rack provides comprehensive health, dental, vision, disability and life insurance for employees and their families. We offer a high-deductible health plan with 100% premium coverage for individuals, as well as a range of other plan options. Our team also has access to 24/7 Virtual Care, an Employee Assistance Program, employer-funded HSA contributions, and other pre-tax benefits. Team members have access to a quarterly wellness stipend and a free Headspace subscription.

PTO and Family Benefits. Our team enjoys 4+ weeks of off-the-grid PTO, plus paid sick/mental health days, summer Fridays, and 13 paid holidays. In order to combat Zoom fatigue and allow for deep work without interruption, we have implemented “No Internal Meeting Fridays” year round. We also provide up to 16 weeks of fully paid parental leave.

Personal & Professional Development. We grow talent by creating internal pathways for advancement and promotion. Muck Rack conducts bi-annual performance reviews, hosts team-wide workshops, and offers management training and leadership training opportunities. We also provide unlimited subscriptions to L&D platforms including Coursera & O’Reilly, as well as 2 additional days of PTO to dedicate to learning and development.

Culture of Inclusion. We know that diverse perspectives breed innovation and help us better serve our customers. We are committed to ensuring employees feel their identities are valued and that people of all backgrounds and points of view are treated equitably.

Customer-First. Founder-controlled means we have the freedom to be nimble, highly collaborative and innovative, building forward-thinking products that enable 3,000+ companies around the world to build trust, tell their stories and demonstrate the unique value of earned media.

*These benefits are specific to US-based employees. In some, but not all, cases we are able to offer equivalent benefits to employees located outside of the United States.

While we are a fully distributed team, we do have limitations on where we can hire and maintain a list of acceptable working locations based on job function. If we are unable to hire in your current location for the role for which you applied, you will be notified via email. While we enjoy many benefits as a permanently distributed and remote company, we cannot always support relocation or extended travel and have guidelines in place to ensure compliant work away from your designated permanent residence.

If you're excited about an opportunity at Muck Rack but your experience doesn't align perfectly with the requirements of the role outlined here, please don't let it stop you from applying. We're committed to building a diverse and inclusive workplace, and we want to hear from you. You may be a great fit for this role or another position on our team. We deliberately encourage individuals from all backgrounds, including race, gender identity, sexual orientation, and disability status to apply for positions. We are an equal opportunity employer and we're committed to a fair and consistent interview process and candidate experience.
 
#LI-Remote

See more jobs at Muck Rack

Apply for this job

1d

Senior Software Engineer, Data

JW Player - United States - Remote
agile, airflow, java, docker, elasticsearch, kubernetes, python, AWS, backend

JW Player is hiring a Remote Senior Software Engineer, Data

About JWP:

JWP is transforming the Digital Video Economy as a trusted partner for over 40,000 broadcasters, publishers, and video-driven brands through our cutting-edge video software and data insights platform. JWP empowers customers with unprecedented independence and control over their digital video content. Established in 2004 as an open-source video player, JWP has evolved into the premier force driving digital video for businesses worldwide. With a rich legacy of pioneering video technology, JWP customers currently generate 8 billion video impressions/month and 5 billion minutes of videos watched/month. At JWP, everyone shares a passion for revolutionizing the digital video landscape. If you are ready to be a part of a dynamic and collaborative team then join us in shaping the future of video! 

The Data Engineering Team: 

At JWP, our data team is a dynamic and innovative team, managing the data lifecycle, from ingestion to processing and analysis, touching every corner of our thriving business ecosystem. Engineers on the team play a pivotal role in shaping the company's direction by making key decisions about our infrastructure, technology stack, and implementation strategies. 

The Opportunity: 

We are looking to bring on a Senior Software Engineer to join our Data Engineering team. As an Engineer on the team, you will be diving into cutting-edge big data tools and technology. In this role, you will have the opportunity to partner closely with various teams to tackle crucial challenges for one of the world's largest and rapidly expanding video companies. Join us and make an impact at the forefront of digital innovation.

As a Senior Data Engineer, you will:

  • Contribute to the development of distributed batch and real-time data infrastructure.
  • Mentor and work closely with junior engineers on the team. 
  • Perform code reviews with peers. 
  • Lead small- to medium-sized projects, including writing the documentation and tickets for them.
  • Collaborate closely with Product Managers, Analysts, and cross-functional teams to gather insights and drive innovation in data products.

Requirements for the role:

  • 5+ years of backend engineering experience with a passion for big data.
  • Expertise with Python or Java and SQL. 
  • Familiarity with Kafka
  • Experience with a range of datastores, from relational to key-value to document
  • Demonstrate humility, empathy, and a collaborative spirit that fuels team success. 

Bonus Points:

  • Data engineering experience, specifically with data modeling, warehousing and building ETL pipelines
  • Familiarity with AWS - in particular, EC2, S3, RDS, and EMR
  • Familiarity with Snowflake
  • Familiarity with Elasticsearch
  • Familiarity with data processing tools like Hadoop, Spark, Kafka, and Flink
  • Experience with Docker, Kubernetes, and application monitoring tools
  • Experience and/or training with agile methodologies
  • Familiarity with Airflow for task and dependency management
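The batch-processing tools in the bonus list above (Hadoop, Spark, Flink) all generalize the same map/shuffle/reduce pattern. Stripped of the distributed machinery, it looks like this in plain Python (the event stream is invented for illustration):

```python
from collections import Counter
from itertools import chain

# Simulated video-impression events, partitioned across workers
# as a distributed engine would shard them.
partitions = [
    [("video_a", 1), ("video_b", 1)],
    [("video_a", 1), ("video_a", 1), ("video_b", 1)],
]

# The "map" step has already emitted (key, 1) pairs;
# shuffle the partitions together and reduce by key.
totals = Counter()
for key, count in chain.from_iterable(partitions):
    totals[key] += count

print(dict(totals))  # {'video_a': 3, 'video_b': 2}
```

Spark's `reduceByKey` or Flink's keyed aggregations perform exactly this, just across machines and with fault tolerance.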

Perks of being at JWP, United States

Our goal is to take care of you and ensure you will be successful in your new role. Your success is our success! 

As a full time employee, you will qualify for:

  • Private Medical, Vision and Dental Coverage for you and your family
  • Unlimited Paid Time Off
  • Stock Options Purchase Program
  • Quarterly and Annual Team Events
  • Professional Career Development Program and Career Development Progression
  • New Employee Home Office Setup Stipend
  • Monthly Connectivity Stipend
  • Free and discounted perks through JW Player's benefit partners
  • Bi-Annual Hack Weeks for those who are interested in using their coding knowledge
  • Fireside chats with individuals throughout JW Player

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

See more jobs at JW Player

Apply for this job

1d

Principal Software Engineer

Procore Technologies - US Remote - Austin, TX
agile, Master’s Degree, scala, nosql, airflow, Design, scrum, java, docker, postgresql, kubernetes, jenkins, python

Procore Technologies is hiring a Remote Principal Software Engineer

Job Description

Procore’s Business Systems Technology group is looking for a Principal Software Engineer to elevate our business systems technology landscape, enhance scalability, drive operational excellence, and enable efficient growth for the business.

 

As a Principal Software Engineer, you’ll use your expert-level technical skills to craft innovative solutions while influencing and mentoring other technical leaders. You’ll collaborate with cross-functional teams and play a pivotal role in designing, developing, and optimizing business systems, platforms, services, integrations, and transactional data across diverse domains including finance, accounting, e-commerce, billing, payments, expenses, tax, and talent. To be successful in this role, you’re passionate about domain-driven design, systems optimization, event-based integrations, and configurable cloud services, and you bring a strong bias for action and outcomes. If you’re an inspirational technology leader comfortable translating vague problems into pragmatic solutions that open up the boundaries of technical possibilities, we’d love to hear from you!

 

This role is based out of our Austin, Texas office, reports into the VP Technology of DTS Business Systems and offers flexibility to work remotely as schedule permits.

 

What you’ll do:

  • Lead the design, development, and implementation of scalable software and data solutions to meet business needs.
  • Optimize performance and scalability of existing systems to support business growth.
  • Architect and implement robust integrations between diverse systems and services.
  • Collaborate with cross-functional teams to define technical strategies, roadmaps, and drive outcome delivery.
  • Contribute to setting standards and development principles across multiple teams and the larger organization.
  • Champion best practices for software development, code reviews, and quality assurance processes.
  • Generate technical documentation and presentations to communicate architectural and design options, and educate development teams and business users.
  • Mentor and guide junior engineers to foster their growth and development.
  • Roughly 40-60% hands-on coding.

 

What we’re looking for:

  • Bachelor’s or Master’s degree in Computer Science or related field.
  • 10+ years of experience designing & implementing complex systems and business application integrations with SaaS applications (including enterprise integration patterns, middleware frameworks, SOA web services) 
  • 10+ years of demonstrated success in software development and building cloud-based, highly available, and scalable online services or streaming systems 
  • Deep understanding of micro-services architecture and containerization technologies (e.g., Docker, Kubernetes, Mesos).
  • Expertise with diverse database technologies such as RDBMS (PostgreSQL), graph, NoSQL (document, columnar, key-value), and Snowflake.
  • Strength in the majority of commonly used data technologies and languages such as Python, Java, Go or Scala, Kafka, Spark, Flink, Airflow, Splunk, Datadog, Jenkins, or similar
  • Skilled in software development lifecycle processes, with experience in scrum, agile, and iterative approaches
  • Excellent communication skills: Demonstrated ability to explain complex technical issues to both technical and non-technical audiences.
  • Knowledge of accounting, billing and payment processing concepts and experience with finance (ERP), billing applications and payment processors preferred
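The event-based integration work described above usually hinges on idempotent consumers, since message buses such as Kafka deliver at-least-once. A minimal dedupe sketch (the event shape and handler name are hypothetical, not Procore's):

```python
# In production the processed-ID set would be persisted
# transactionally alongside the ledger, not held in memory.
processed_ids = set()
ledger = []

def handle_payment_event(event: dict) -> bool:
    """Apply an event exactly once, even if the bus redelivers it."""
    if event["id"] in processed_ids:
        return False  # duplicate delivery, safely ignored
    processed_ids.add(event["id"])
    ledger.append(event["amount"])
    return True

# At-least-once delivery: event 'e1' arrives twice.
for ev in [{"id": "e1", "amount": 100},
           {"id": "e1", "amount": 100},
           {"id": "e2", "amount": 50}]:
    handle_payment_event(ev)

print(sum(ledger))  # 150
```

Without the ID check, the duplicate delivery would double-count the payment, which is exactly the failure mode that matters in billing and finance domains.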

See more jobs at Procore Technologies

Apply for this job

1d

Staff Data Scientist - Marketing

Square - San Francisco, CA, Remote
Bachelor degree, tableau, airflow, sql, Design, python

Square is hiring a Remote Staff Data Scientist - Marketing

Job Description

The Data Science team at Cash App derives valuable insights from our extremely unique datasets and turns those insights into actions that improve the experience for our customers every day. As a Marketing Data Scientist, you will play a critical role in accelerating Cash App’s growth by creating and improving how we measure the impact of all our marketing efforts.

In this role, you’ll be embedded in our Marketing organization and work closely with marketers, product management as well as other cross-functional partners to make effective spend decisions across marketing channels, understand the impact of incentives and explore new opportunities to enable Cash App to become the top provider of primary banking services to our customers.

You will:

  • Build models to optimize our marketing efforts to ensure our spend has the best possible ROI
  • Design and analyze A/B experiments to evaluate the impact of marketing campaigns we launch
  • Analyze large datasets using SQL and scripting languages to surface actionable insights and opportunities to the Marketing product team and other key stakeholders
  • Partner directly with the Cash App Marketing org to influence their roadmap and define success metrics to understand the impact to the business
  • Approach problems from first principles, using a variety of statistical and mathematical modeling techniques to research and understand customer behavior & segments
  • Build, forecast, and report on metrics that drive strategy and facilitate decision making for key business initiatives
  • Write code to effectively process, cleanse, and combine data sources in unique and useful ways, often resulting in curated ETL datasets that are easily used by the broader team
  • Build and share data visualizations and self-serve dashboards for your partners
  • Effectively communicate your work with team leads and cross-functional stakeholders on a regular basis
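The A/B experiment analysis described above typically starts with a two-proportion z-test on conversion rates. A dependency-free sketch (the conversion counts are invented):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion
    rates, using the pooled-proportion standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical campaign: control converts 200/1000, treatment 260/1000.
z = two_proportion_z(200, 1000, 260, 1000)
p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
print(round(z, 2), p_value < 0.05)  # 3.19 True
```

Real marketing measurement layers more on top (sequential testing, CUPED-style variance reduction, incrementality models), but this is the baseline calculation those methods refine.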

Qualifications

You have:

  • A bachelor's degree in statistics, data science, or a similar STEM field with 7+ years of experience in a relevant role, OR
  • A graduate degree in statistics, data science, or a similar STEM field with 5+ years of experience in a relevant role
  • Advanced proficiency with SQL and data visualization tools (e.g. Tableau, Looker, etc)
  • Experience with scripting and data analysis programming languages, such as Python or R
  • Worked extensively with causal inference techniques and off-platform data
  • A knack for turning ambiguous problems into clear deliverables and actionable insights 
  • Gone deep with cohort and funnel analyses, with a deep understanding of statistical concepts such as selection bias, probability distributions, and conditional probabilities

Technologies we use and teach:

  • SQL, Snowflake, etc.
  • Python (Pandas, Numpy)
  • Tableau, Airflow, Looker, Mode, Prefect

See more jobs at Square

Apply for this job

2d

Senior Backend Engineer

DataCamp - Europe Remote
airflow, sql, B2B, Design, slack, graphql, api, ruby, typescript, backend

DataCamp is hiring a Remote Senior Backend Engineer

Who We Are:

At DataCamp, we're not just a platform; we're the catalyst for a data-fluent world. We enable individuals and businesses to leap forward in data science, providing them with top-tier education, certification, and collaboration tools.

By the Numbers:

  • 400+ dynamic courses
  • 270+ renowned instructors from 35 Countries
  • 90+ hands-on projects
  • 12 million+ global learners

We're proud to be backed by Spectrum Equity, Accomplice, and Arthur Ventures, aiming to hit $100M ARR in the upcoming years. While our roots are in New York City, our presence spans London to Leuven, with a vibrant team of 200+ members working both on-site and remotely.

About the role:

We are looking for a talented Senior Backend Engineer to help us build integrations with our B2B customer platforms. 

The Enterprise Integrations Squad's goal is to bring our extensive learning catalog to our learners with as little friction as possible. To achieve this, our team is responsible for building seamless integrations with our B2B customers' platforms. We integrate DataCamp with Learning Management Systems (LMS/LXP), Single Sign-On providers (SSO), and communication tools (Slack and MS Teams). The Enterprise Integrations Squad is also responsible for maintaining our B2B customers' reporting data, as well as building reporting integrations that allow our customers to access their learning data in their own BI tools or data warehouse.

Excited about playing a critical role in contributing to the product's technical direction and expanding the learning experience? Find out more about the role below and apply to join our team!

Responsibilities:

  • Help us build the best platform to learn and teach Data Science.
  • Contribute to the technical direction and architecture of our integrations and reporting platform.
  • Work with the latest backend technologies to solve challenging problems.
  • Improve existing integrations as well as research, build and document new integrations with 3rd party systems.
  • Engage closely with peers within and outside your team to build scalable, fast and maintainable solutions.
  • Coordinate with our B2B customers and their technical teams to streamline integration setups and deployments
  • Act as a technical sidekick to our PM during customer discussions on new integration possibilities.

What we’re looking for:

  • 5+ years of experience with TypeScript & Node.js (required)
  • Experience building front-ends with GraphQL
  • Experience integrating third-party services via APIs
  • Experience building services in a highly distributed platform with async processing (SNS, SQS, BullMQ)
  • Good architectural knowledge of distributed systems and API design
  • Strong SQL skills are a must

Nice to have:

  • Ruby on Rails (not mandatory, experience is preferred)
  • Experience building front-ends with React
  • Experience with data pipelines (Airflow, DBT) and Data Marts

What's in it for you:

In addition to joining a creative and international start-up, as a permanent employee you’ll enjoy:

  • A very competitive salary
  • An exciting job that will offer you technical challenges every day
  • Flexible working hours
  • International company retreats 
  • Conference and hardware budget
  • Working with a great team (everyone says this, but we’re serious—we’re pretty great)

DataCamp is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

 

See more jobs at DataCamp

Apply for this job

Genesis is hiring a Remote User Acquisition Specialist (Paid Social) at HolyWater

WE SUPPORT UKRAINE

Holy Water condemns russia's war against Ukraine and supports the state. At the start of the full-scale war we launched the sale of an NFT collection about the events in Ukraine to raise $1 million for the needs of the Ukrainian army, and we joined the Genesis for Ukraine corporate charitable fund. The fund's team purchases essential gear, equipment, and medicine for employees and their relatives defending the country on the front line, and we donate to the Armed Forces of Ukraine on an ongoing basis.

MEET YOUR FUTURE TEAM!

You will work at Holy Water, a ContentTech startup that creates and publishes books, audiobooks, interactive stories, and video series. We build synergy between the efficiency of AI and the creativity of writers, helping them inspire tens of millions of users around the world with their content.

HolyWater was founded in 2020 within the Genesis ecosystem. Since then the team has grown from 6 to 90 specialists, and our apps have repeatedly become leaders in their categories in the US, Australia, Canada, and Europe.

Through our platform, we give any talented writer the chance to reach the multi-million audience of our apps and inspire them with their stories. More than 10 million users around the world already use our products.

OUR 2023 ACHIEVEMENTS:

1. Our interactive stories app was the world's top app by downloads in its niche for 3 months.
2. Our book app, Passion, became the top app in its niche in the US and Europe in December.
3. We launched a video series platform based on our books and produced our first successful pilot series.
4. New downloads and revenue nearly doubled compared to 2022.

HolyWater's core value is the people who work with us. That is why we make every effort to create conditions in which every employee can realize their full potential and achieve the most ambitious goals.

COMPANY CULTURE

In its work the team relies on six key values: constant growth, intrinsic motivation, grit and flexibility, mindfulness, freedom and responsibility, and focus on results.

On the Holy Water team you will work on products recognized all over the world! Together we will track current trends, analyze the market, create creatives that hook the audience, and work effectively with a variety of advertising platforms.

We are currently focused on scaling the team and finding people who will help take our apps to new heights. If you are a bold, hard-working, curious, self-aware person who is not afraid to make mistakes and learn from them, let's talk!

We are actively expanding the team to scale our existing and new products on Meta and TikTok, and we are looking for an ambitious User Acquisition Specialist to make all of these plans a reality.

YOUR RESPONSIBILITIES WILL INCLUDE:

Launching advertising campaigns on Meta/TikTok, analyzing and evaluating their results, and optimizing them;
Deep analysis and effective optimization of marketing campaigns;
Providing feedback and collaborating with the creative team on improving creatives;
Finding additional points of product growth by testing a variety of marketing hypotheses;
Creating and regularly updating project reports.

WHAT YOU NEED TO JOIN:

1+ years of experience buying traffic on Meta with a monthly budget of $600k+;
Experience with TikTok and Snapchat is a plus;
Experience with web2app and SKAN buying;
Proficiency with analytics tools, an understanding of conversions, and the ability to work with reports in AppsFlyer, Tableau, and Amplitude;
The ability to evaluate and break down creative performance and form hypotheses;
Analytical skills and well-developed critical thinking;
Proactivity and a readiness to propose and discuss ideas.

WHAT WE OFFER:

You will be part of a close-knit team of professionals where you can exchange knowledge and experience and get support and advice from colleagues.
A flexible schedule and the ability to work remotely from any safe place in the world.
The option to visit the office in Kyiv's Podil district and to decide for yourself whether to work remotely, from the office, or a mix of both. At the office you don't have to worry about the routine: breakfasts, lunches, plenty of snacks and fruit, lounge zones, massages, and other perks await you.
20 working days of paid vacation per year and an unlimited number of sick days.
The option to consult a psychologist.
All the equipment you need for work.
We actively use modern tools and technologies such as BigQuery, Tableau, Airflow, Airbyte, and DBT, giving you the opportunity to work with cutting-edge tools and expand your analytics skills.
An online library, regular lectures from top-level speakers, and compensation for conferences, trainings, and seminars.
A professional internal community (Analytics and Marketing) for your career development.
A culture of open feedback.

SELECTION STAGES:

1. Initial screening. A recruiter asks a few questions (by phone or messenger) to get a sense of your experience and skills before the interview.
2. Test assignment. It confirms your expertise and shows the approaches, tools, and solutions you use in your work. We do not limit your time and never use candidates' work without the appropriate agreements.
3. Interview with the manager. A comprehensive conversation about your professional competencies and the work of the team you are applying to.
4. Bar raising. For the final interview we invite one of the top managers of the Genesis ecosystem, someone who will not work directly with the candidate. The bar raiser focuses on your soft skills and values to understand how quickly you can grow together with the company.


Want to become part of a strong team? Send your resume or get in touch with me via LinkedIn.

See more jobs at Genesis

Apply for this job

8d

Senior Data Engineer with Cloud

Accesa - RatiodataRemote, Romania, Remote
tableauterraformscalanosqlairflowsqlDesignazurejavalinuxjenkinspythonAWS

Accesa - Ratiodata is hiring a Remote Senior Data Engineer with Cloud

Job Description

As part of our Data Engineering Team, you will help shape the future of our software.

You will develop testing tools and tests to keep data accessible and ready for analysis. Your tasks will include building testing tools, writing test data generators, performing ETL (Extraction, Transformation, and Load), and testing the database architecture.

  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional / non-functional business requirements
  • Identify, design, and implement process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Cloud ‘big data’ technologies.
  • Build/use analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Guide/mentor the development team and provide support in the technical implementation

Qualifications

Must have

  • 5+ years of professional experience 
  • Experience with
    • Object-oriented and scripting languages: Java or Python
    • SQL 
    • relational and NoSQL databases
    • Bash shell script (Linux)
    • Google (preferred), Azure, AWS - Cloud service providers
    • GCP Dataproc, BigQuery, Pub/Sub, Cloud Functions
    • Working knowledge of highly scalable ‘big data’ datastore and Apache Spark

Willing to develop

  • Scala
  • Terraform
  • CI/CD pipelines(Jenkins or similar)
  • Data pipeline and workflow management tools: Airflow, Azkaban, Luigi, etc.
  • Knowledge of Visualization tools:  Tableau, PowerBI, etc

Apply for this job

8d

Staff Data Engineer, Banking

SquareSan Francisco, CA, Remote
airflowDesignazurerubyjavapythonAWS

Square is hiring a Remote Staff Data Engineer, Banking

Job Description

The Square Banking team is building a suite of new financial products for Square sellers. We offer business checking accounts, savings accounts, credit cards, and loans to help our sellers manage their business cash flow. Investing in a Financial Data Mesh Platform is not just about managing data; it's about unleashing the full potential of our organization's most valuable asset. It's a critical strategic move that empowers us to use the distinctive value of data and extends its positive impact to our customers, our Sellers, and users of the Banking platform.

As an Engineer focused on Data for Square Banking, you will help us build our own Square Banking Financial Data Mesh Platform, using real-time Big Data technologies and the Medallion architecture. You will work directly with product, engineering, data science, and machine learning teams to understand their use cases and develop reliable, trusted datasets that accelerate decision-making for important products.

You will:

  • You'll design large-scale, distributed data processing systems and pipelines to ensure efficient and reliable data ingestion, storage, transformation, and analysis

  • Promote high-quality software engineering practices towards building data infrastructure and pipelines at scale

  • You'll build core datasets to serve as unique sources of truth for products and departments (product, marketing, sales, finance, customer experience, data science, business operations, IT, engineering)

  • You'll partner with data scientists and other cross-functional partners to understand their needs and build pipelines to scale.

  • Identify and address data quality and integrity issues through data validation, cleansing, and data modeling techniques. You'll implement automated workflows that lower manual/operational cost for team members, define and uphold SLAs for timely delivery of data, move us closer to democratizing data and a self-serve model (query exploration, dashboards, data catalog, data discovery)

  • Learn about Big Data architecture via technologies such as AWS, DataBricks and Kafka.

  • Stay up to date with emerging technologies, best practices, and industry trends in data engineering and software development

  • Mentor and provide guidance to junior data engineers fostering inclusivity and growth.

  • Work remotely with a team of distributed colleagues #LI-Remote

  • Report to the Engineering Manager of Banking - Data Engineering
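
The bronze/silver/gold layering behind the Medallion architecture this posting mentions can be illustrated with a small, stdlib-only Python sketch (the record fields are hypothetical; a real implementation would use Spark or Databricks tables rather than in-memory lists):

```python
import json
from collections import defaultdict

# Bronze layer: raw events exactly as they arrive (possibly malformed).
bronze = [
    '{"seller": "s1", "amount": "120.50", "type": "loan_payment"}',
    '{"seller": "s1", "amount": "30.00", "type": "deposit"}',
    'not json at all',
    '{"seller": "s2", "amount": "75.25", "type": "deposit"}',
]

def to_silver(raw_rows):
    """Silver layer: parsed, validated, typed records; bad rows are quarantined."""
    clean, quarantine = [], []
    for row in raw_rows:
        try:
            rec = json.loads(row)
            rec["amount"] = float(rec["amount"])
            clean.append(rec)
        except (ValueError, KeyError):
            quarantine.append(row)
    return clean, quarantine

def to_gold(silver_rows):
    """Gold layer: a business-level aggregate ready for reporting."""
    totals = defaultdict(float)
    for rec in silver_rows:
        totals[rec["seller"]] += rec["amount"]
    return dict(totals)

silver, bad = to_silver(bronze)
print(to_gold(silver))  # {'s1': 150.5, 's2': 75.25}
print(len(bad))         # 1 quarantined row
```

Each layer only ever reads from the layer below it, which is what makes the curated gold datasets trustworthy as "sources of truth".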

Qualifications

You Have:

  • 8+ years as a data engineer or software engineer, with a focus on large-scale data processing and analytics

  • You've spent 4+ years as a data engineer building core datasets.

  • You are passionate about analytics use cases, data models and solving complex data problems.

  • You have hands-on experience shipping scalable data solutions in the cloud (e.g AWS, GCP, Azure), across multiple data stores (e.g Databricks, Snowflake, Redshift, Hive, SQL/NoSQL, columnar storage formats) and methodologies (e.g dimensional modeling, data marts, star/snowflake schemas)

  • You have hands-on experience with highly scalable and reliable data pipelines using BigData (e.g Airflow, DBT, Spark, Hive, Parquet/ORC, Protobuf/Thrift, etc)

  • Optimized and tuned data pipelines to enhance overall system performance, reliability, and scalability

  • Knowledge of programming languages (e.g. Go, Ruby, Java, Python)

  • Willingness to participate in professional development activities to stay current on industry knowledge and passion for trying new things.

See more jobs at Square

Apply for this job

10d

Data Integration Engineer (Req #1713)

Clover HealthRemote - USA
Master’s DegreetableauairflowpostgressqlDesignc++python

Clover Health is hiring a Remote Data Integration Engineer (Req #1713)

Location: 3401 Mallory Lane, Suite 210, Franklin, TN 37067; Telecommuting
permissible from any location in the U.S.


Salary Range: $132,974 /yr - $161,250 /yr


Job Description: Create and manage ETL packages, triggers, stored procedures, views,
SQL transactions. Develop new secure data feeds with external parties as well as internal
applications including the data warehouse and business intelligence applications. Perform
analysis and QA. Diagnose ETL and database related issues, perform root cause analysis,
and recommend corrective actions to management. Work with a small project team to
support the design, development, implementation, monitoring, and maintenance of new
ETL programs. Telecommuting is permissible from any location in the US. 

Requirements: Bachelor’s degree or foreign degree equivalent in Computer Science,
Information Systems or related field and five (5) years of progressive, post-baccalaureate
experience in IT development or in the job offered or related role. Alternatively,
employer will accept a Master’s degree or foreign equivalent in Computer Science,
Information Systems or a related field and two (2) years of experience in IT development
or in the job offered or a related role. Any suitable combination of education, experience,
or training is acceptable.

Skills: Experience and/or education must include: 

1. Python & Postgres;
2. Snowflake, DBT, Airflow, BigQuery, Data Governance;
3. Analytics and data science through SQL optimization;
4. Database design and modeling; and
5. ML collaboration tools such as Tableau, Mode, and Looker.

 

#LI-DNI

See more jobs at Clover Health

Apply for this job

10d

Data Engineer - Senior 0010ALIS - 151

Global InfoTek, Inc.Huntsville, AL Remote
agilejiraairflowsqlDesigndockerelasticsearchpostgresqlkubernetesAWSjavascript

Global InfoTek, Inc. is hiring a Remote Data Engineer - Senior 0010ALIS - 151

Clearance Level: TS/SCI

US Citizenship: Required

Job Classification: Full-time

Location: District of Columbia

Experience: 5-7 years

Education: Masters or equivalent experience in a related field.

As a Data Engineer, you will be required to interpret business needs and select appropriate technologies and have experience in implementing data governance of shared and/or master sets of data. You will work with key business stakeholders, IT experts, and subject-matter experts to plan and deliver optimal data solutions. You will create, maintain, and optimize data pipelines as workloads move from development to production for specific use cases to ensure seamless data flow for the use case. You will perform technical and non-technical analyses on project issues and help to ensure technical implementations follow quality assurance metrics. You will analyze data and systems architecture, create designs, and implement information systems solutions.

Responsibilities:

  • Define and communicate a clear product vision for our client’s software products, aligning with user needs and business objectives.
  • Create and manage product roadmaps that reflect both innovation and growth strategies.
  • Partner with a government product owner and a product team of 7-8 FTEs.
  • Develop and design data pipelines to support an end-to-end solution.
  • Develop and maintain artifacts (e.g. schemas, data dictionaries, and transforms related to ETL processes).
  • Integrate data pipelines with AWS cloud services to extract meaningful insights.
  • Manage production data within multiple datasets ensuring fault tolerance and redundancy.
  • Design and develop robust and functional dataflows to support raw data and expected data.
  • Provide Tier 3 technical support for deployed applications and dataflows.
  • Collaborate with the rest of data engineering team to design and launch new features.
  • Coordinate and document dataflows, capabilities, etc.
  • Occasionally (as needed) support off-hours deployments, such as evenings or weekends.

Qualifications:

  • Understanding of cloud architectures and enabling tools and technologies, such as AWS Cloud (GovCloud/C2S).
  • Familiarity with AWS managed services.
  • Working knowledge of software platforms and services such as Docker, Kubernetes, JMS/SQS, SNS, Kafka, AWS Lambda, NiFi, Airflow, or similar.
  • Proficiency with JavaScript, Elasticsearch, JSON, SQL, and XML.
  • Working knowledge of datastores: MongoDB/DynamoDB, PostgreSQL, S3, Redshift, JDBC/ODBC, and Redis.
  • Familiarity with Linux/Unix server environments.
  • Experience with Agile development methodology.
  • Publishing and/or presenting design reports.
  • Coordinating with other team members to reach project milestones and deadlines.
  • Working knowledge of collaboration tools such as Jira and Confluence.

Preferred Qualifications:

  • Familiarity and experience with the Intelligence Community (IC), and the intel cycle.
  • Familiarity and experience with the Department of Homeland Security (DHS).
  • Direct Experience with DHS and Intelligence Community (IC) component's data architectures and environments (IC-GovCloud experience preferred).
  • Experience with cloud message APIs and usage of push notifications.
  • Keen interest in learning and using the latest software tools, methods, and technologies to solve real world problem sets vital to national security.
  • Working knowledge with public keys and digital certificates.
  • Experience with DevOps environments.
  • Expertise in various COTS, GOTS, and open-source tools which support development of data integration and visualization applications.
  • Specialization in Object Oriented Programming languages, scripting, and databases.

Global InfoTek, Inc. is an equal-opportunity employer. All qualified applicants will receive consideration for employment regardless of race, color, religion, sex, sexual orientation, gender identity, or national origin.

About Global InfoTek, Inc. Reston, VA-based Global InfoTek Inc. is a woman-owned small business with an award-winning track record of designing, developing, and deploying best-of-breed technologies that address the nation's pressing cyber and advanced technology needs. For more than two decades, GITI has merged pioneering technologies, operational effectiveness, and best business practices.

See more jobs at Global InfoTek, Inc.

Apply for this job

11d

Senior Analytics Engineer

RemoteRemote-Europasia
airflowsqlDesignjenkinspython

Remote is hiring a Remote Senior Analytics Engineer

About Remote

Remote is solving global remote organizations’ biggest challenge: employing anyone anywhere compliantly. We make it possible for businesses big and small to employ a global team by handling global payroll, benefits, taxes, and compliance. Check out remote.com/how-it-works to learn more or if you’re interested in adding to the mission, scroll down to apply now.

Please take a look at remote.com/handbook to learn more about our culture and what it is like to work here. Not only do we encourage folks from all ethnic groups, genders, sexuality, age and abilities to apply, but we prioritize a sense of belonging. You can check out independent reviews by other candidates on Glassdoor or look up the results of our candidate surveys to see how others feel about working and interviewing here.

All of our positions are fully remote. You do not have to relocate to join us!

The position

This is an exciting time to join Remote and make a personal difference in the global employment space as a Senior Analytics Engineer I, joining our Data Engineering team, composed of Analytics Engineers and Data Engineers. Together with the Data Analytics team, we support decision-making and reporting needs by translating data into actionable insights for non-data professionals at Remote. We mainly use SQL, dbt, Python, Meltano, Airflow, Redshift, Metabase, and Retool.

This role follows the Senior Engineer I role on the Remote Career paths.

Requirements

  • 3+ years of experience in analytics engineering; high-growth tech company experience is a plus
  • Strong experience using data transformation frameworks (e.g. dbt) and data warehouses (e.g. Redshift), strong proficiency in SQL
  • Strong knowledge in data modelling techniques (Kimball, Data Vault, etc)
  • Solid Experience with data visualization tools (e.g. Metabase)
  • Strong affinity towards well crafted software - testing, knowledge of best practices, experience with CI/CD (e.g. Gitlab, Github, Jenkins)
  • A self-starter mentality and the ability to thrive in an unstructured and fast-paced environment
  • Proven collaboration and communication skills
  • Experience in dealing with ambiguity, working together with stakeholders on taking abstract concepts and turning them into data models that can answer a variety of questions
  • Writes and speaks fluent English
  • It's not required to have experience working remotely, but it is considered a plus

Key Responsibilities

  • DBT Modelling:
    • Design, develop, and maintain dbt (Data Build Tool) models for data transformation and analysis, providing clean and reliable data to end users enabling them to get accurate and consistent answers by self-serving on BI tools.
    • Collaborate with Data Analysts and Business Stakeholders to understand their reporting and analysis needs and translate them into DBT models.
    • Own our internal dbt conventions and best practices, keeping our code-base clean and efficient (including code reviews for peers).
  • Data Analytics & Monitoring:
    • Ensure data quality and consistency by implementing data testing, validation and cleansing techniques.
    • Implement monitoring solutions to track the health and performance of the data present in our warehouse.
    • Train business users on how to use data visualisation tools.
  • Drive our Culture of Documentation:
    • Create and maintain data documentation & definitions, including data dictionaries and process flows.
    • Collaborate with cross-functional teams, including Data Analysts, Business stakeholders, to understand their data requirements and deliver effective data solutions.
    • Share knowledge and provide guidance to peers, creating an environment that empowers collective growth.
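
As a rough illustration of the data testing and validation work described above, here is a minimal, stdlib-only Python sketch (the column rules are hypothetical; in a dbt workflow the same checks would typically be declared as dbt tests such as not_null or accepted_values):

```python
# Hypothetical row-level data quality rules: column name -> predicate.
RULES = {
    "employee_id": lambda v: v is not None,
    "country":     lambda v: isinstance(v, str) and len(v) == 2,
    "salary_usd":  lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(rows):
    """Return a list of (row_index, column) pairs that fail a rule."""
    failures = []
    for i, row in enumerate(rows):
        for col, check in RULES.items():
            if not check(row.get(col)):
                failures.append((i, col))
    return failures

rows = [
    {"employee_id": 1, "country": "PT", "salary_usd": 52000},
    {"employee_id": None, "country": "Portugal", "salary_usd": -10},
]
print(validate(rows))
# [(1, 'employee_id'), (1, 'country'), (1, 'salary_usd')]
```

A monitoring job would run checks like these on a schedule and alert when the failure list is non-empty, rather than printing it.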

Compensation Philosophy 

Remote's Total Rewards philosophy is to ensure fair, unbiased compensation and fair equity pay along with competitive benefits in all locations in which we operate. We do not agree to or encourage cheap-labor practices and therefore we ensure to pay above in-location rates. We hope to inspire other companies to support global talent-hiring and bring local wealth to developing countries.

At first glance our salary bands seem quite wide, so here is some context. At Remote we have international operations and a globally distributed workforce. We use geo ranges to account for geographic pay differentials as part of our global compensation strategy and to remain competitive in various markets while hiring globally.

The base salary range for this full-time position is between $39500 USD - $133400 USD. Our salary ranges are determined by role, level and location, and our job titles may span more than one career level. The actual base pay for the successful candidate in this role is dependent upon many factors such as location, transferable or job-related skills, work experience, relevant training, business needs, and market demands. The base salary range may be subject to change.

Practicals

  • You'll report to: Engineering Manager - Data
  • Team: Data Engineering
  • Location: Anywhere in the World
  • Start date: As soon as possible

Application process

  1. (async) Profile review
  2. Interview with recruiter
  3. Interview with future manager
  4. (async) Small challenge
  5. (async) Challenge Review
  6. Interview with team members (no managers present)
  7. Prior employment verification check(s)
  8. (async) Offer

Benefits

Our full benefits & perks are explained in our handbook at remote.com/r/benefits. As a global company, each country works differently, but some benefits/perks are for all Remoters:
  • work from anywhere
  • unlimited personal time off (minimum 4 weeks)
  • quarterly company-wide day off for self care
  • flexible working hours (we are async)
  • 16 weeks paid parental leave
  • mental health support services
  • stock options
  • learning budget
  • home office budget & IT equipment
  • budget for local in-person social events or co-working spaces

How you’ll plan your day (and life)

We work async at Remote which means you can plan your schedule around your life (and not around meetings). Read more at remote.com/async.

You will be empowered to take ownership and be proactive. When in doubt you will default to action instead of waiting. Your life-work balance is important and you will be encouraged to put yourself and your family first, and fit work around your needs.

If that sounds like something you want, apply now!

How to apply

  1. Please fill out the form below and upload your CV in PDF format.
  2. We kindly ask you to submit your application and CV in English, as this is the standardised language we use here at Remote.
  3. If you don’t have an up to date CV but you are still interested in talking to us, please feel free to add a copy of your LinkedIn profile instead.

We will ask you to voluntarily tell us your pronouns at interview stage, and you will have the option to answer our anonymous demographic questionnaire when you apply below. As an equal employment opportunity employer it’s important to us that our workforce reflects people of all backgrounds, identities, and experiences and this data will help us to stay accountable. We thank you for providing this data, if you chose to.

See more jobs at Remote

Apply for this job

12d

Data Engineer - Contractor

carsalesSydney, Australia, Remote
airflowsqlDesign

carsales is hiring a Remote Data Engineer - Contractor

Job Description

We're looking for a contract data engineer to rebuild the core of our finance and billing infrastructure.

You'll be working with other data engineers and subject matter experts to design and construct a new system with full internal audit capability.


 

  • Building ETL pipelines (Airflow, Apache Beam, SQL, etc) to ingest data.
  • Working with business and technical teams to interpret data.
  • Identifying data quality issues and proactively developing quality strategies.
  • Building pipelines for machine learning models.
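
The ETL pattern those bullets describe, reduced to a stdlib-only Python toy (the invoice fields are hypothetical; in practice, steps like these would run as Airflow or Apache Beam tasks against real source systems):

```python
import csv
import io

# Hypothetical raw billing extract; the second row has a missing amount.
RAW = """invoice_id,amount,currency
INV-1,100.0,AUD
INV-2,,AUD
INV-3,55.5,AUD
"""

def extract(text):
    # Extract: read raw CSV rows from the source system.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: drop rows with missing amounts and cast types.
    out = []
    for r in rows:
        if r["amount"]:
            out.append({"invoice_id": r["invoice_id"], "amount": float(r["amount"])})
    return out

def load(rows):
    # Load: here just an aggregate; in production, a warehouse write.
    return sum(r["amount"] for r in rows)

print(load(transform(extract(RAW))))  # 155.5
```

The dropped row is exactly the kind of data quality issue the role asks you to surface proactively; a real pipeline would quarantine and report it rather than silently discard it.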

Qualifications

What you bring to the role

  • Hands-on experience constructing data pipelines using DBT is essential.
  • Similar experience in a hands-on role.
  • Hands-on experience constructing automated data validation systems to ensure output is correct.
  • Experience with the Google Cloud data stack is desirable, e.g. BigQuery, Composer, DataFlow, etc.

See more jobs at carsales

Apply for this job

13d

Senior Data Engineer (Taiwan)

GOGOXRemote
airflowsqlazureapijavapythonAWS

GOGOX is hiring a Remote Senior Data Engineer (Taiwan)


See more jobs at GOGOX

Apply for this job

15d

Senior Data Scientist - Health

SquareSan Francisco, CA, Remote
Bachelor degreetableauairflowsqlDesignpython

Square is hiring a Remote Senior Data Scientist - Health

Job Description

The Data Science team at Cash App derives valuable insights from our extremely unique datasets and turns those insights into actions that improve the experience for our customers every day. In this role, you’ll be embedded in our Health organization and work closely with product management as well as other cross-functional partners to drive meaningful change that helps protect our customers and their money. Because our Health DS team plays such a critical role in building and maintaining trust with our users, an appreciation for the connection between your work and the experience it delivers to customers is absolutely critical for this position.

As a Data Scientist, you will:

  • Partner directly with the Cash App Health org, working closely with operations, engineers, legal and compliance, and machine learning teams
  • Analyze large datasets using SQL and scripting languages to surface actionable insights and opportunities to the product team and other key stakeholders
  • Approach problems from first principles, using a variety of statistical and mathematical modeling techniques to research and understand customer behavior
  • Design and analyze A/B experiments to evaluate the impact of changes we make to the product
  • Work with engineers to log new, useful data sources as we build new product features
  • Build, forecast, and report on metrics that drive strategy and facilitate decision making for key business initiatives
  • Write code to effectively process, cleanse, and combine data sources in unique and useful ways, often resulting in curated ETL datasets that are easily used by the broader team
  • Build and share data visualizations and self-serve dashboards for your partners
  • Effectively communicate your work with team leads and cross-functional stakeholders on a regular basis
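Designing and analyzing A/B experiments, as mentioned above, often comes down to comparing conversion rates between two groups. A minimal sketch using a two-proportion z-test (the counts are invented illustration data, not from this posting):

```python
# Two-proportion z-test for an A/B experiment on conversion rates.
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pool the rate under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 4.8% vs 5.6% conversion on 10k users per arm.
z, p = two_proportion_z(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)
```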

Qualifications

You have:

  • Previous exposure to or interest in areas like anomaly detection or regulatory data science 
  • A bachelor degree in statistics, data science, or similar STEM field with 4+ years of experience in a relevant role OR
  • A graduate degree in statistics, data science, or similar STEM field with 2+ years of experience in a relevant role
  • Advanced proficiency with SQL and data visualization tools (e.g. Tableau, Looker, etc)
  • Experience with scripting and data analysis programming languages, such as Python or R
  • Gone deep with cohort and funnel analyses, and have a deep understanding of statistical concepts such as selection bias, probability distributions, and conditional probabilities
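A funnel analysis of the kind listed above can be sketched in a few lines of plain Python; the event log and step names here are hypothetical:

```python
# Count how many users survive each step of a funnel, in order.
FUNNEL = ["signup", "link_bank", "first_payment"]

events = [  # (user_id, step) pairs, hypothetical
    (1, "signup"), (1, "link_bank"), (1, "first_payment"),
    (2, "signup"), (2, "link_bank"),
    (3, "signup"),
]

def funnel_counts(events, funnel):
    # A user counts for a step only if they completed every prior step.
    users_by_step = {step: {u for u, s in events if s == step} for step in funnel}
    counts, survivors = [], None
    for step in funnel:
        survivors = users_by_step[step] if survivors is None else survivors & users_by_step[step]
        counts.append(len(survivors))
    return counts

counts = funnel_counts(events, FUNNEL)  # survivor count per step
```

The same computation is usually expressed as a windowed SQL query or a Pandas groupby at scale; the set-intersection logic is the core idea.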

Technologies we use and teach:

  • SQL, Snowflake, etc.
  • Python (Pandas, Numpy)
  • Tableau, Airflow, Looker, Mode, Prefect

See more jobs at Square

Apply for this job

15d

Staff Data Scientist - Controls/Access

SquareSan Francisco, CA, Remote
Bachelor degreetableauairflowsqlDesignpython

Square is hiring a Remote Staff Data Scientist - Controls/Access

Job Description

The Data Science team at Cash App derives valuable insights from our extremely unique datasets and turns those insights into actions that improve the experience for our customers every day. In this role, you’ll be embedded in our Health organization and work closely with product management as well as other cross-functional partners to drive meaningful change that helps protect our customers and their money. Because our Health DS team plays such a critical role in building and maintaining trust with our users, an appreciation for the connection between your work and the experience it delivers to customers is absolutely critical for this position.

As a Data Scientist, you will:

  • Partner directly with the Cash App Health org, working closely with operations, engineers, legal and compliance, and machine learning teams
  • Analyze large datasets using SQL and scripting languages to surface actionable insights and opportunities to the product team and other key stakeholders
  • Approach problems from first principles, using a variety of statistical and mathematical modeling techniques to research and understand customer behavior
  • Design and analyze A/B experiments to evaluate the impact of changes we make to the product
  • Work with engineers to log new, useful data sources as we build new product features
  • Build, forecast, and report on metrics that drive strategy and facilitate decision making for key business initiatives
  • Write code to effectively process, cleanse, and combine data sources in unique and useful ways, often resulting in curated ETL datasets that are easily used by the broader team
  • Build and share data visualizations and self-serve dashboards for your partners
  • Effectively communicate your work with team leads and cross-functional stakeholders on a regular basis

Qualifications

You have:

  • Previous exposure to or interest in areas like anomaly detection or regulatory data science 
  • A bachelor degree in statistics, data science, or similar STEM field with 4+ years of experience in a relevant role OR
  • A graduate degree in statistics, data science, or similar STEM field with 2+ years of experience in a relevant role
  • Advanced proficiency with SQL and data visualization tools (e.g. Tableau, Looker, etc)
  • Experience with scripting and data analysis programming languages, such as Python or R
  • Gone deep with cohort and funnel analyses, and have a deep understanding of statistical concepts such as selection bias, probability distributions, and conditional probabilities

Technologies we use and teach:

  • SQL, Snowflake, etc.
  • Python (Pandas, Numpy)
  • Tableau, Airflow, Looker, Mode, Prefect

See more jobs at Square

Apply for this job

15d

Senior Data Engineer

EquipmentShareRemote; Chicago; Denver; Kansas City; Columbia MO
agileairflowsqlDesignc++postgresqlpythonAWS

EquipmentShare is hiring a Remote Senior Data Engineer

EquipmentShare is Hiring a Senior Data Engineer.

Your role in our team

At EquipmentShare, we believe it’s more than just a job. We invest in our people and encourage you to choose the best path for your career. It’s truly about you, your future, and where you want to go.

We are looking for a Senior Data Engineer to help us continue to build the next evolution of our data platform in a scalable, performant, and customer-centric architecture.

Our main tech stack includes Snowflake, Apache Airflow, AWS cloud infrastructure (e.g., Kinesis, Kubernetes/EKS, Lambda, Aurora RDS PostgreSQL), Python and Typescript.

What you'll be doing

We are typically organized into agile cross-functional teams composed of Engineering, Product, and Design, which allows us to develop deep expertise and rapidly deliver high-value features and functionality to our next-generation T3 Platform.

You’ll be part of a close-knit team of data engineers developing and maintaining a data platform built with automation and self-service in mind to support analytics and machine learning data products for the next generation of our T3 Fleet, enabling end-users to track, monitor and manage the health of their connected vehicles and deployed assets.

We'll be there to support you as you become familiar with our teams, product domains, tech stack and processes — generally how we all work together.

Primary responsibilities for a Senior Data Engineer

  • Collaborate with Product Managers, Designers, Engineers, Data Scientists and Data Analysts to take ideas from concept to production at scale.
  • Design, build and maintain our data platform to enable automation and self-service for data scientists, machine learning engineers and analysts.
  • Design, build and maintain data product framework to support EquipmentShare application data science and analytics features.
  • Design, build and maintain CI/CD pipelines and automated data and machine learning deployment processes.
  • Develop data monitoring and alerting capabilities.
  • Document architecture, processes and procedures for knowledge sharing and cross-team collaboration.
  • Mentor peers to help them build their skills.
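The monitoring and alerting responsibility above often starts with simple data-freshness checks. A minimal sketch, with a hypothetical table and lag threshold:

```python
# Data-freshness monitor: alert when a table's last load is too old.
from datetime import datetime, timedelta, timezone

def check_freshness(last_loaded_at, max_lag=timedelta(hours=2)):
    """Return an alert message if the table is stale, else None."""
    lag = datetime.now(timezone.utc) - last_loaded_at
    if lag > max_lag:
        return f"telemetry table stale: last load {lag} ago exceeds {max_lag}"
    return None

# A load three hours ago should trigger the alert; ten minutes ago should not.
stale = check_freshness(datetime.now(timezone.utc) - timedelta(hours=3))
fresh = check_freshness(datetime.now(timezone.utc) - timedelta(minutes=10))
```

A check like this would typically run on a schedule (e.g. as an Airflow sensor or task) and route its message to a paging or chat integration.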

Why We’re a Better Place to Work

We can promise that every day will be a little different with new ideas, challenges and rewards.

We’ve been growing as a team and we are not finished just yet — there is plenty of opportunity to shape how we deliver together.

Our mission is to enable the construction industry with tools that unlock substantial increases to productivity. Together with our team and customers, we are building the future of construction.

T3 is the only cloud-based operating system that brings together construction workflows & data from constantly moving elements in one place.

  • Competitive base salary and market leading equity package.
  • Unlimited PTO.
  • Remote first.
  • True work/life balance.
  • Medical, Dental, Vision and Life Insurance coverage.
  • 401(k) + match.
  • Opportunities for career and professional development with conferences, events, seminars and continued education.
  • On-site fitness center at the Home Office in Columbia, Missouri, complete with weightlifting machines, cardio equipment, group fitness space, racquetball courts, a climbing wall, and much more!
  • Volunteering and local charity support that help you nurture and grow the communities you call home through our Giving Back initiative.
  • Stocked breakroom and full kitchen with breakfast and lunch provided daily by our chef and kitchen crew.

About You

You're a hands-on developer who enjoys solving complex problems and building impactful solutions.  Most importantly, you care about making a difference.

  • Take the initiative to own outcomes from start to finish — knowing what needs to be accomplished within your domain and how we work together to deliver the best solution.
  • You are passionate about developing your craft — you understand what it takes to build quality, robust and scalable solutions.
  • You’ll see the learning opportunity when things don’t quite go to plan — not only for you but for how we continuously improve as a team.
  • You take a hypothesis-driven approach — knowing how to source, create and leverage data to inform decision making, using data to drive how we improve, to shape how we evaluate and make platform recommendations.

So, what is important to us?

Above all, you’ll get stuff done. More importantly, you’ll collaborate to do the right things in the right way to achieve the right outcomes.

  • 7+ years of relevant data platform development experience building production-grade solutions.
  • Proficient with SQL and a high-order object-oriented language (e.g., Python).
  • Experience with designing and building distributed data architecture.
  • Experience building and managing production-grade data pipelines using tools such as Airflow, dbt, DataHub, and MLflow.
  • Experience building and managing production-grade data platforms using distributed systems such as Kafka, Spark, Flink and/or others.
  • Familiarity with event data streaming at scale.
  • Proven track record learning new technologies and applying that learning quickly.
  • Experience building observability and monitoring into data products. 
  • Motivated to identify opportunities for automation to reduce manual toil.

EquipmentShare is committed to a diverse and inclusive workplace. EquipmentShare is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.

 

#LI-Remote

 

See more jobs at EquipmentShare

Apply for this job

16d

Technical Team Lead, Data Group

WoltStockholm, Sweden, Remote
airflowkubernetespython

Wolt is hiring a Remote Technical Team Lead, Data Group

Job Description

We are putting together a new engineering team in our Data domain and we are looking for someone to take its lead! 

You won’t be starting entirely from scratch, as Wolt’s Data Group has already developed foundational tooling in the areas of data management, security, auditing, data catalog and quality monitoring, but through your technical contributions and leadership you will ensure our Data Governance tooling is state of the art.

In the past few years we integrated some of these tools within our core data infrastructure and thanks to these efforts the teams contributing to the data platform have developed a good understanding when it comes to the Data Governance domain. 

We’re starting this team to take this work to the next level and ensure Wolt’s data registries and services keep on meeting compliance and governance requirements that a modern data company needs to meet.

Together with your team you will be responsible for managing and improving the current Data Governance platform, making sure it can be further integrated with the rest of the Data Platform and Wolt Services in a scalable, secure, compliant way, without significant disruptions to the teams.

Furthermore, the team will also be responsible for concepting and building all new components and processes necessary for scaling these capabilities further as required.

This role can be based in one of our tech hubs in Helsinki or Stockholm, or you can work remotely anywhere in Finland, Sweden, Denmark, and Estonia. Read more about our remote setup here. If you live outside of these countries - not to worry! We provide relocation support to help you make your way to Finland or Sweden.

Qualifications

As a Team Lead you will be planning the work and focus areas for our Data Governance Platform efforts and help the team get settled with suitable ways of working. 

You want to take care of your team members and help them thrive and achieve goals together! You’ll be making personal development plans and hosting regular 1on1s to support their growth, and take part in recruitment.

Having previous experience in team leading is a big plus, but don’t hesitate to apply if you have previously mostly been a technical lead that led people in planning and executing complex projects and want to get into people management.

Good communication and collaboration skills are essential, and you shouldn’t shy away from problems, but be able to discuss them in a constructive way with your team and the Wolt Product team at large.


You are the rare gem who also has a solid engineering background and wants to stay hands-on, with an interest in Data Management challenges, so you will be able to speak the same language with your team and stakeholders, including our Legal, Security and Finance teams. 

In this domain we work with Python, Snowflake, Kubernetes and a variety of open source tools such as Kafka, DataHub, Airflow and Dagster.

See more jobs at Wolt

Apply for this job