airflow Remote Jobs

33 Results

8d

Senior Data Engineer (Cloud & Snowflake)- Remote

airflow, sql, salesforce, Design, python, AWS

Help At Home is hiring a Remote Senior Data Engineer (Cloud & Snowflake)- Remote

See more jobs at Help At Home

Apply for this job

8d

Data Engineer II - Remote

airflow, sql, Design, python, AWS

Help At Home is hiring a Remote Data Engineer II - Remote

See more jobs at Help At Home

Apply for this job

13d

Junior Developer

InnovateEDU - Remote, New York, United States
jira, airflow, postgres, sql, oracle, slack, python, backend

InnovateEDU is hiring a Remote Junior Developer

Who You Are

You are a mission-driven individual and believe in working to close the educational opportunity gap through the use of data and technical solutions. You are excited about bringing order to disparate data and writing data pipelines, and you don’t mind being relentless in pursuing data accuracy. You’ve previously worked with SQL and Python and written code that interacts with APIs.


You are an optimistic problem-solver. You believe that together we can create real solutions that help the entire education sector move forward despite its complexity. You are excited to join a small but growing team working on an early-stage product and are looking forward to working on many different pieces of that product. You are open to feedback, bring your best every day, and are ready to grow in all areas of your work. You want to join a team of folks who share your vision for mission-driven work at the intersection of education and technology. Finally, you know that sharing often is key to this work, and are ready to document everything that you do so that data people in schools everywhere can benefit.


Experience and Skills

You are a good fit if you:

  • Have strong computer science fundamentals and experience with Python and writing analytical SQL
  • Have a very high attention to detail
  • Have strong communication skills with both technical and non-technical people
  • Are passionate about making an impact in K-12 education
  • Are comfortable doing many different types of tasks and having to context switch between tasks relatively often
  • Are passionate about building the best version of whatever you’re working on
  • Are highly motivated to work autonomously, with strong organizational and time management skills


You’ll have an edge if you:

  • Have worked as a data analyst or data engineer in the past and are familiar with validating data and tools like Pandas, Jupyter Notebooks, Google BigQuery, and Google Data Studio
  • Have experience building pipelines using Apache Airflow
  • Have worked in K-12 education in the past


Responsibilities


The Junior Developer’s primary professional responsibilities will include, but not be limited to:

  • Implementing and maintaining Landing Zone for new and returning customers
  • Creating, troubleshooting, and maintaining data processing pipelines in Apache Airflow (ETL work; a minimal example DAG is sketched after this list)
  • Writing SQL queries against many different types of databases (Microsoft SQL Server, Oracle, Postgres) to extract data
  • Running reports and exports in edTech source systems as well as Landing Zone infrastructure to perform data validation checks and communicate those back to our customers
  • Maintaining Landing Zone documentation to ensure it is always up-to-date and reflects how integrations function.
  • Deploying code updates across the Landing Zone customer base
  • Assisting in the deployment of infrastructure on the Google Cloud Platform for new customers
  • Assisting in the development of a historical/longitudinal data storage system (data warehouse)
  • Responding to customer support tickets (this is a shared responsibility on our team)
  • Working with internal systems such as JIRA, Asana, Slack to stay organized and ensure communication with team members
  • Other duties as assigned 
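
For concreteness, here is a minimal sketch of the kind of Airflow ETL pipeline described above, assuming a daily extract-validate-load flow; the DAG id, task names, and validation rule are hypothetical, not taken from the posting.

```python
# Minimal illustrative Airflow 2.x DAG: extract rows from a source system,
# run a basic data-accuracy check, then hand off to a load step.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_students(**context):
    # In practice: query a district SIS database (MSSQL/Oracle/Postgres) here.
    return [{"student_id": 1, "grade": "08"}]


def validate_and_load(ti, **context):
    rows = ti.xcom_pull(task_ids="extract_students")
    # Simple data-accuracy check before loading into the warehouse.
    assert all(r.get("student_id") is not None for r in rows), "missing student_id"
    # Load into the target store here.


with DAG(
    dag_id="landing_zone_students",  # hypothetical name
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_students", python_callable=extract_students)
    load = PythonOperator(task_id="validate_and_load", python_callable=validate_and_load)
    extract >> load
```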


What to expect in the hiring process:

  • An introductory phone call with a Manager
  • A coding project that will take about 2 hours. This will be in Python and be related to processing data.
  • A project review and feedback call with two developers
  • Final round interviews, likely including our Executive Director


The salary range for this position is $58,500 to $75,000. Salary is commensurate with education and experience.


About InnovateEDU

InnovateEDU is a non-profit whose mission is to eliminate the opportunity gap by accelerating innovation in standards-aligned, next-generation learning models and tools that serve, inform, and enhance teaching and learning. InnovateEDU is committed to massively disrupting K-12 public education by focusing on developing scalable tools and practices that leverage innovation, technology, and new human capital systems to improve education for all students and close the opportunity gap.


About the Project

InnovateEDU strives to create real tooling and projects that greatly assist a school/district/state in moving toward embracing data standards, a data-driven culture, and data interoperability. Landing Zone, a project at InnovateEDU, provides school districts with a comprehensive cloud-based data infrastructure through the implementation of an Ed-Fi Operational Data Store (ODS), a data mart for analytics in Google BigQuery, and the necessary data workflows in Apache Airflow to connect previously siloed, disparate educational data systems. Landing Zone simplifies a district's process to implement an Ed-Fi ODS, connecting Ed-Fi certified data sources and consuming non-Ed-Fi certified data once it has been aligned to the standard. This project focuses heavily on data engineering, backend work, DevOps, and data analytics tools to verify data.
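
As an illustration of the data-verification side of that work, a check like the following could flag missing identifiers in a BigQuery data mart; the project, dataset, table, and column names are hypothetical.

```python
# Illustrative data-validation query against a BigQuery data mart.
from google.cloud import bigquery

client = bigquery.Client(project="district-landing-zone")  # hypothetical project

sql = """
    SELECT COUNT(*) AS missing_ids
    FROM `district-landing-zone.edfi_mart.students`
    WHERE student_unique_id IS NULL
"""
missing = list(client.query(sql).result())[0]["missing_ids"]
if missing:
    print(f"Validation failed: {missing} student rows lack a unique id")
```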


Application Instructions

Please submit an application on this platform. 

See more jobs at InnovateEDU

Apply for this job

13d

Senior Cloud Data Engineer

Nordcloud Finland - Helsinki, FI; Jyväskylä, FI; Salo, FI; Oulu, FI; Kuopio, FI; Remote
agile, terraform, scala, airflow, sql, Design, mongodb, azure, AWS

Nordcloud Finland is hiring a Remote Senior Cloud Data Engineer

We are digital builders born in the cloud, and we are currently looking for a Senior Cloud Data Engineer.

Joining Nordcloud is the chance of a lifetime to leave your mark on the European IT industry! We use an agile, cloud-native approach to empower clients to seize the full potential of the public cloud.

Your daily work:

  • Designing, architecting, and implementing modern cloud-based data pipelines for our customers
  • Selecting appropriate cloud-native technologies/services to provide the most efficient solution for the business use-case
  • Making data accessible and usable for a new wave of data-powered apps and services
  • Solutioning with and challenging customers to solve real use-case problems
  • Supporting pre-sales to design a proposal with the relevant teams (sales, pre-sales, product leads...)
  • Operating CI/CD (DevOps) pipelines and performing slight customization on them if/when needed
  • Understanding and writing small Infrastructure as Code modules/templates in Terraform if/when needed

Your skills and attributes of success:

  • Several years of overall professional programming experience and 3+ years of hands-on experience in building modern data platforms/pipelines in Google Cloud
  • At least one programming language: Python/Scala/Java
  • Experience in job orchestration (Airflow, Composer, Oozie, etc.)
  • At least two of the following skill sets including mandatory Google Cloud skills:
    • Experience in Google Cloud:
      • Google Cloud Professional Data Engineer certification
      • Other active certificates are a plus
      • BigQuery skills with querying tables, designing schemas, and structuring tables with partitioning/clustering when relevant (see the sketch after this list)
      • Pub/Sub experience
      • Dataflow (Apache Beam) or Dataproc (Apache Spark) framework experience, meaning architecting, creating, implementing, and maintaining data pipelines within these frameworks with streaming and batch workloads
      • Google Cloud Storage (Lifecycle policies, accesses...)
      • Datastore experience is a plus
      • Looker studio experience is a plus
      • Knowledge of more than one cloud is a plus (Azure is preferred in addition to Google Cloud)
    • Experience in Big Data technologies (in the cloud or on-prem):
      • Spark (Scala or pySpark)
      • Hadoop, HDFS, Hive/Impala, Pig, Hbase, Kafka, NiFi, ... (not all technologies are needed, this is just to give an idea)
      • Familiarity with Big Data file formats (parquet, AVRO, ORC, ...)
      • Experience with Lakehouse formats is a plus (Delta lake, Apache Iceberg, Apache Hudi)
      • MongoDB or Cassandra is a plus
      • Experience designing and working with data lake architectures
      • Familiarity with data-mesh architectures is a plus
  • Data warehousing experience:
    • Migrating from on-prem to cloud
    • Data modeling (Kimball, Inmon, Data Vault, etc) is a plus
    • Advanced SQL in any SQL dialect/framework
    • Experience building ETL processes for data warehousing solutions
  • Consultancy experience
  • Leadership and people skills are a strong plus
  • Previous experience gained in mid-size/large, international companies
  • Fluent communication skills in English and Finnish
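
To make the BigQuery partitioning/clustering point above concrete, here is a small sketch using the google-cloud-bigquery client; the table name, schema, and field choices are hypothetical.

```python
# Create a day-partitioned, clustered BigQuery table (illustrative only).
from google.cloud import bigquery

client = bigquery.Client()

schema = [
    bigquery.SchemaField("event_ts", "TIMESTAMP"),
    bigquery.SchemaField("user_id", "STRING"),
]
table = bigquery.Table("my-project.analytics.events", schema=schema)  # hypothetical name
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
table.clustering_fields = ["user_id"]  # cluster within each daily partition
client.create_table(table)
```

Partitioning on the event timestamp keeps date-bounded queries cheap, and clustering on user_id speeds up per-user lookups within a partition.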

If you don’t meet all of the desired criteria but still fit most of the requirements, we encourage you to apply anyway. Let’s find out together if we are a good fit for each other!

What do we offer in return?

  • A highly skilled multinational team
  • Individual training budget and exam fees for partner certifications (Azure, AWS, GCP) and additional certification bonus covered by Nordcloud
  • The opportunity to join, and create, knowledge-sharing sessions within a community of leading cloud professionals
  • Flexible working hours and freedom to choose your tools (laptop and smartphone) and ways of working
  • Freedom to work fully remotely within the country of Finland
  • Local benefits such as extensive private health care and wellness benefits

      Please read our Recruitment Privacy Policy before applying. All applicants must have the right to work in Finland.

      Learn more about #NordcloudCommunity. If you’d like to join us, please send us your CV or LinkedIn profile.

      About Nordcloud

Nordcloud, an IBM company, is a European leader in cloud advisory, implementation, application development, managed services, and training. It’s a recognized cloud-native pioneer with a proven track record of helping organizations leverage the public cloud in a way that balances quick wins, immediate savings, and sustainable value. Nordcloud is triple-certified across Microsoft Azure, Google Cloud Platform, and Amazon Web Services, and is a Visionary in Gartner’s Magic Quadrant for Public Cloud IT Transformation Services. Nordcloud has 10 European hubs and over 1,500 employees (and counting), and it has delivered over 1,000 successful cloud projects.

      Learn more at nordcloud.com

      #LI-Remote

      18d

      Analytics Engineer / Data Engineer

Education Analytics - Madison, WI (preferred); Remote
airflow, postgres, sql, Design, git, python, AWS

      Education Analytics is hiring a Remote Analytics Engineer / Data Engineer

      Analytics Engineer / Data Engineer

      Education Analytics strives to deliver sophisticated, research-informed analytics to educators and school administrators to support their work in improving student outcomes. To support them best, the data they receive must be accurate, up to date, secure, and easily accessible. The person in this role will be a critical team member who helps to make that a reality.

      We are seeking a full-time Data Engineer or Analytics Engineer to lead the design, build, and maintenance of automated data pipelines and analytic systems. An ideal candidate has strong SQL skills and experience with data warehousing concepts, familiarity with complex data integration and/or analysis, and an interest in improving K-12 education.

      This role supports the timely delivery of data and analytics to educators and administrators who use this data to drive change and improvement in education. We are looking for candidates who are innovative, hard-working, and curious to help us continue to develop our team's capacity in the development and use of cutting-edge tools. Our team is consistently evaluating tools for new projects and looking for the best tools for the job. Our current stack uses an ELT approach via Apache Airflow and dbt to create data warehouses in Snowflake or Postgres, depending on the scale of the data. These posts illustrate some projects that members of our team might work on.
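
As a sketch of that ELT pattern (Airflow orchestrating dbt), assuming a dbt project already configured for the Snowflake or Postgres target, a DAG fragment might look like this; the DAG id and project path are hypothetical.

```python
# Illustrative Airflow DAG that runs dbt transformations after raw loads.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="warehouse_elt",  # hypothetical
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Extract/load tasks for raw data would precede these steps.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/warehouse",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/warehouse",
    )
    dbt_run >> dbt_test  # transform, then test the models
```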

      Responsibilities

      • Lead the design and implementation of data warehousing structures for research, analytics, and reporting/dashboarding
      • Apply best practices from software engineering to data pipelines
      • Help implement code testing, continuous integration, and deployment strategies to ensure system reliability
      • Design and implement complex pipelines to integrate data coming from a mix of APIs, flat files, or other database sources
      • Develop and improve internal tools and systems to efficiently deliver high-quality, actionable metrics
      • Work collaboratively within a team of analysts, school system leaders, and other engineers to create analytics solutions that are scalable, easy to maintain, and support high quality research
      • Explore and apply new cutting-edge tools to drive innovation across a variety of projects

      Qualifications

      • Experience architecting data warehouse and data lake structures that are intuitive and performant
      • Knowledge of best design practices in modern cloud-based data warehouses
      • Experience designing, implementing, and maintaining modern ELT pipelines with a clean code-base
      • Fluency in SQL, experience with Python and Linux.
      • Knowledge of software engineering best practices, particularly in team-based development using Git
      • Ability to proactively identify and defend against potential data quality & processing issues

      Bonus Skills:

      • Experience with cloud-based columnar data warehouses (Snowflake, RedShift, BigQuery)
      • Experience with Data Build Tool (dbt)
      • Experience with Apache Airflow, or other modern data pipeline systems
      • Desire to work with cutting edge tools in a fast-paced environment
      • Familiarity with AWS tooling and best practices

      Hiring Process

      1. Hiring team reviews resumes and cover letters
      2. Selected candidates invited to 30-minute interview with Data Engineering team managers to discuss skills and experience alignment
      3. Selected candidates invited for a full-day final interview. Candidates are sent a skills exercise in advance that will be discussed in the interview; it doesn’t require any coding or pre-submitted work, and is about concepts and planning of data systems. In addition to discussing the exercise, there will be approximately 2-3 more hours of interviews to meet other Data Engineering team members and key members of other teams, and to help candidates learn more about Education Analytics & the role.

      How you will successfully onboard in this role

      In your first few weeks, you will work through a training exercise our team has developed that familiarizes our new hires with our development setup & tooling, and join team meetings and 1-1 check-ins. From there, you will likely work on 1-2 projects and begin joining project meetings to gain familiarity with the context of the work we do. Next you will start to take on smaller tasks in those projects, and by 3-6 months in, begin to take the lead on larger initiatives.

      Additional details

      The expectation is 45 hours per week, and nights and weekends are sometimes required. Our preference is for candidates to primarily work from EA’s office in Madison, WI.

      About us: Education Analytics is a non-profit organization that uses data analysis to inform education policy decisions. We work with school districts, regional offices of education, non-profits, and policymakers to identify ways to make education systems better.

      Benefits:

      • Competitive salary
      • Annual merit bonuses
      • Paid holidays and one month of paid vacation per year
      • Generous 401k and health benefits
      • Parental leave benefit of up to 26 weeks of paid leave
      • Free Madison Metro transit pass or subsidized office parking
      • Casual office environment
      • Location right in the heart of downtown Madison, WI

      Education Analytics is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.

      See more jobs at Education Analytics

      Apply for this job

      27d

      Senior Google Cloud Data Engineer

Nordcloud Finland - Helsinki, FI; Jyväskylä, FI; Salo, FI; Oulu, FI; Kuopio, FI; Remote
agile, terraform, scala, airflow, sql, Design, mongodb, azure, AWS

      Nordcloud Finland is hiring a Remote Senior Google Cloud Data Engineer

We are digital builders born in the cloud, and we are currently looking for a Senior Google Cloud Data Engineer.

      Joining Nordcloud is the chance of a lifetime to leave your mark on the European IT industry! We use an agile, cloud-native approach to empower clients to seize the full potential of the public cloud.

      Your daily work:

      • Designing, architecting, and implementing modern cloud-based data pipelines for our customers
      • Selecting appropriate cloud-native technologies/services to provide the most efficient solution for the business use-case
      • Making data accessible and usable for a new wave of data-powered apps and services
      • Solutioning with and challenging customers to solve real use-case problems
      • Supporting pre-sales to design a proposal with the relevant teams (sales, pre-sales, product leads...)
      • Operating CI/CD (DevOps) pipelines and performing slight customization on them if/when needed
      • Understanding and writing small Infrastructure as Code modules/templates in Terraform if/when needed

      Your skills and attributes of success:

      • Several years of overall professional programming experience and 3+ years of hands-on experience in building modern data platforms/pipelines in Google Cloud
      • At least one programming language: Python/Scala/Java
      • Experience in job orchestration (Airflow, Composer, Oozie, etc.)
      • At least two of the following skill sets including mandatory Google Cloud skills:
        • Experience in Google Cloud:
          • Google Cloud Professional Data Engineer certification
          • Other active certificates are a plus
          • BigQuery skills with querying tables, designing schemas and structuring tables with partitioning/clustering when relevant
          • Pub/Sub experience
          • Dataflow (Apache Beam) or Dataproc (Apache Spark) framework experience, meaning architecting, creating, implementing, and maintaining data pipelines within these frameworks with streaming and batch workloads
          • Google Cloud Storage (Lifecycle policies, accesses...)
          • Datastore experience is a plus
          • Looker studio experience is a plus
          • Knowledge of more than one cloud is a plus (Azure is preferred in addition to Google Cloud)
        • Experience in Big Data technologies (in the cloud or on-prem):
          • Spark (Scala or pySpark)
          • Hadoop, HDFS, Hive/Impala, Pig, Hbase, Kafka, NiFi, ... (not all technologies are needed, this is just to give an idea)
          • Familiarity with Big Data file formats (parquet, AVRO, ORC, ...)
          • Experience with Lakehouse formats is a plus (Delta lake, Apache Iceberg, Apache Hudi)
          • MongoDB or Cassandra is a plus
          • Experience designing and working with data lake architectures
          • Familiarity with data-mesh architectures is a plus
      • Data warehousing experience:
        • Migrating from on-prem to cloud
        • Data modeling (Kimball, Inmon, Data Vault, etc) is a plus
        • Advanced SQL in any SQL dialect/framework
        • Experience building ETL processes for data warehousing solutions
      • Consultancy experience
      • Leadership and people skills are a strong plus
      • Previous experience gained in mid-size/large, international companies
      • Fluent communication skills in English and Finnish

If you don’t meet all of the desired criteria but still fit most of the requirements, we encourage you to apply anyway. Let’s find out together if we are a good fit for each other!

      What do we offer in return?

      • A highly skilled multinational team
      • Individual training budget and exam fees for partner certifications (Azure, AWS, GCP) and additional certification bonus covered by Nordcloud
      • The opportunity to join, and create, knowledge-sharing sessions within a community of leading cloud professionals
      • Flexible working hours and freedom to choose your tools (laptop and smartphone) and ways of working
      • Freedom to work fully remotely within the country of Finland
      • Local benefits such as extensive private health care and wellness benefits

          Please read our Recruitment Privacy Policy before applying. All applicants must have the right to work in Finland.

          Learn more about #NordcloudCommunity. If you’d like to join us, please send us your CV or LinkedIn profile.

          About Nordcloud

Nordcloud, an IBM company, is a European leader in cloud advisory, implementation, application development, managed services, and training. It’s a recognized cloud-native pioneer with a proven track record of helping organizations leverage the public cloud in a way that balances quick wins, immediate savings, and sustainable value. Nordcloud is triple-certified across Microsoft Azure, Google Cloud Platform, and Amazon Web Services, and is a Visionary in Gartner’s Magic Quadrant for Public Cloud IT Transformation Services. Nordcloud has 10 European hubs and over 1,500 employees (and counting), and it has delivered over 1,000 successful cloud projects.

          Learn more at nordcloud.com

          #LI-Remote

          +30d

          Middle Data Analyst (Solidgate-fintech)

Genesis - Ukraine; Remote
kotlin, tableau, airflow, sql, RabbitMQ, git, java, docker, elasticsearch, postgresql, typescript, python, AWS, backend

          Genesis is hiring a Remote Middle Data Analyst (Solidgate-fintech)

          See more jobs at Genesis

          Apply for this job

          +30d

          Data Engineer (Hybrid or Remote)

agile, tableau, airflow, postgres, sql, mobile, scrum, mysql, python, AWS

          Kalkomey Enterprises, LLC is hiring a Remote Data Engineer (Hybrid or Remote)

          See more jobs at Kalkomey Enterprises, LLC

          Apply for this job

          +30d

          Remote Senior Software Engineer

3 years of experience, terraform, airflow, Design, azure, docker, kubernetes, python, AWS

          BlueVoyant is hiring a Remote Remote Senior Software Engineer

          See more jobs at BlueVoyant

          Apply for this job

          +30d

          Data Engineer

BlueLabs - Remote or Washington, District of Columbia, United States
tableau, airflow, sql, oracle, mobile, git, java, c++, postgresql, python, AWS

          BlueLabs is hiring a Remote Data Engineer

          About BlueLabs

BlueLabs is a leading provider of analytics services and technology for a variety of industry clients, including government, business, and political campaigns. We help our clients optimize their engagements with individual customers, supporters, and stakeholders to achieve their goals. Simply put: we help our partners do the most good by getting the most from their data.


Today, our team of data analysts, scientists, engineers, and strategists come together from diverse backgrounds to share a passion for using data to solve the world’s greatest social and analytical challenges. We’ve served more than 400 organizations, including government agencies, advocacy groups, unions, political campaigns, international groups, and companies. Along the way, we’ve developed some of the most innovative tools available in analytics, media optimization, reporting, and influencer outreach, serving a diverse set of industries including automotive, travel, consumer packaged goods, entertainment, healthcare, media, telecom, and more.


          About the team:

The Insights division creates and manages the underlying data that drives our day-to-day work. This is a new team at BlueLabs, created in response to our organization’s evolving client base and business needs, to ensure that BlueLabs is providing the most innovative data and analysis to our clients, all while developing and coaching team members to grow and respond to our company’s goals.

           

          The BlueLabs Insights practice works with non-profit, political, and private sector clients to provide them with high quality analysis to better understand the environment they operate in and inform future decision making. 


          Ripple is a proprietary, business-to-business technology product that helps our clients identify, engage, and measure impact on influencers who matter to their causes or brands. You will join a small and growing team of data, engineering, and product professionals to help us solve some exciting and challenging problems and to deploy the next generation of Ripple. 


          About the role:

As a Data Engineer, you will help the Ripple team establish and maintain data pipelines for internal data sources and data sources related to our client engagements. The Data Engineer is critical to our client work, which requires proactive and continuous improvement as well as prompt responsiveness to changing circumstances – particularly in handling data quality issues and adjusting the team’s end-to-end pipeline deployments. Your track record should reflect domain knowledge in data ingestion and transformation, including experience adapting to changing technologies and/or client priorities. The Data Engineer reports to the Ripple Product Manager.


          In this position you will:

          • Analyze, establish and maintain data pipelines that regularly deliver transformed data to data warehouses
          • Read in data, process and clean it, transform and recode it, merge different data sets together, reformat data between wide and long, etc. (see the sketch after this list)
          • Create documentation for data pipelines and data sets
          • Work closely with data analysts to understand, identify and effectively respond to their specific needs
          • Coordinate with vendors, clients, and other stakeholders as needed to stand up or respond to issues with data pipelines
          • Perform one-off data manipulation and analysis on a wide variety of data sets
          • Produce scalable, replicable code and engineering solutions that help automate repetitive data management tasks
          • Make recommendations and provide guidance on ways to make data collection more efficient and effective
          • Develop and streamline our internal data resources into more efficient and easier to understand taxonomies
          • Ensure a high level of data accuracy through regular quality control
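
As a sketch of the cleaning, recoding, reshaping, and merging listed above, assuming pandas and hypothetical file and column names:

```python
# Illustrative pandas workflow: read, reshape wide-to-long, clean, merge.
import pandas as pd

# Read in raw data (file names are hypothetical).
scores = pd.read_csv("scores_wide.csv")   # person_id, wave_1, wave_2, ...
people = pd.read_csv("people.csv")        # person_id, segment

# Reformat from wide to long: the wave_* columns become rows.
long = scores.melt(id_vars="person_id", var_name="wave", value_name="score")

# Clean and recode.
long = long.dropna(subset=["score"])
long["wave"] = long["wave"].str.replace("wave_", "", regex=False).astype(int)

# Merge different data sets together.
merged = long.merge(people, on="person_id", how="left")
```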


          What we are seeking:

          • About 2+ years of experience working with data pipeline solutions; beyond exact years, we seek candidates whose experience with data pipelines enables them to proactively advise on and then deploy solutions for our client-based teams
          • Experience processing data using scripting languages like Python or R, or compiled languages like Java, C++, or Scala. Python preferred
          • Understanding of how to manipulate data using SQL or Python
          • Experience designing data models that account for upstream and downstream system dependencies
          • Experience working in modern data processing stacks using tools like Apache Airflow, AWS Glue, and dbt
          • Experience with an MPP database such as Amazon Redshift, Vertica, BigQuery, or Snowflake and/or experience writing complex analytics queries in a general purpose database such as Oracle or Postgresql
          • Familiarity with Git, or experience with another version control system
          • A high attention to detail and ability to effectively manage and prioritize several tasks or projects concurrently 
          • Effective communication and collaboration skills when working with team members of varied backgrounds, roles, and functions
          • Passion in applying your skills to our social mission to problem-solve and collaborate within a cross-functional team environment
          • Ability to diagnose and improve database and query performance issues


          You may also have experience:

          • Working on political campaigns or in progressive advocacy
          • Working with voter files or large consumer datasets
          • Working with geospatial data
          • Working with business intelligence software like Tableau, PowerBi, or Data Studio


          Recruitment process

          We strive to hire efficiently and transparently. We expect to hire this position in January 2023. To get there, we anticipate the successful candidate will complete three interviews (HR 15 minutes, technical interview 45 minutes, and team interview 60 minutes), all virtually. 


          What We Offer:

          BlueLabs offers a friendly work environment and competitive benefits package including:

          • Premier health insurance plan
          • 401K matching
          • Unlimited vacation leave
          • Paid sick, personal, and volunteer leave
          • 15 weeks paid parental leave
          • Professional development & learning stipend
          • Macbook Pro laptop & tech accessories
          • Bring Your Own Device (BYOD) stipend for mobile device
          • Employee Assistance Program (EAP)
          • Supportive & collaborative culture 
          • Flexible working hours
          • Telecommuting/Remote options
          • Pre-tax transportation options 
          • Lunches and snacks
          • And more! 


The salary for this position is $85,000 annually.


          While we prefer this position to be in the Washington, DC area, we are open to considering candidates from within the U.S. 


At BlueLabs, we celebrate, support, and thrive on differences. Not only do they benefit our services, products, and community, but most importantly, they are to the benefit of our team. Qualified people of all races, ethnicities, ages, sexes, genders, sexual orientations, national origins, gender identities, marital statuses, religions, veteran statuses, disabilities, and any other protected classes are strongly encouraged to apply. As an equal opportunity workplace and an affirmative action employer, BlueLabs is committed to creating an inclusive environment for all employees. BlueLabs endeavors to make reasonable accommodations to the known physical or mental limitations of qualified applicants with a disability, unless the accommodation would impose an undue hardship on the operation of our business. If an applicant believes they require such assistance to complete the application or to participate in an interview, or has any questions or concerns, they should contact the Director, People Operations. BlueLabs participates in E-Verify.

          See more jobs at BlueLabs

          Apply for this job

          +30d

          Business Intelligence Developer (Hybrid or Remote)

agile, tableau, airflow, postgres, sql, mobile, scrum, mysql, python, AWS

          Kalkomey Enterprises, LLC is hiring a Remote Business Intelligence Developer (Hybrid or Remote)

          See more jobs at Kalkomey Enterprises, LLC

          Apply for this job

          +30d

          Senior Data Engineer

Instalent - Budapest, HU; Remote
agile, scala, airflow, docker

          Instalent is hiring a Remote Senior Data Engineer

Our partner is a rapidly expanding international technology and data science business. They build high-quality SaaS solutions which automate data science using advanced machine learning and deep learning techniques. They use some of the trendiest technology on the planet, so you will never get bored of doing the same thing.

          About the role

They are looking for a talented developer to join their team. The job entails building and operating high-performance big data pipelines to facilitate all their SaaS products for some of the world’s leading brands. You'll be part of a remote team of developers and data scientists based in the United Kingdom, South Africa and Hungary.

          Requirements:

          • Experience writing testable functional Scala in a production grade system.
          • To have utilized Apache Spark in a production system utilizing Scala
          • Having worked with Spark orchestration technologies like Apache Airflow is a plus.
          • Experience of using a cloud platform to architect and build data pipelines.
          • To be a developer who can quickly take up a new technology and deliver features in an extremely agile way.
          • To easily navigate the administration of a Hadoop cluster on a cloud platform such as Databricks
          • To have used Docker containers to deploy your systems
          • Having a strong JVM development background including use of Spring
          • Having worked with data streaming for example Kafka

          What they offer:

          • Remote and flexible working
          • Competitive Salary
          • Career Development
          • Exciting Clients and Projects
          • Talented Teams
          • Benefits: stock option plan, staff referral scheme, quarterly staff events, wellness day, volunteering opportunities, birthday lie in, sunshine hours, Christmas gift cards, Flexible Leave Policy, private health care, cafeteria system, enhanced maternity & paternity leave, drinks & snacks, fruit.

          See more jobs at Instalent

          Apply for this job

          +30d

          Senior Data Engineer

airflow, sql, Design, ruby, java, python, AWS, javascript

          Brightside is hiring a Remote Senior Data Engineer

          See more jobs at Brightside

          Apply for this job

          +30d

          PHP Laravel Web Developer

airflow, laravel, Design, mysql, css, backend, PHP

          Latitude, Inc. is hiring a Remote PHP Laravel Web Developer

          See more jobs at Latitude, Inc.

          Apply for this job

          +30d

          Software engineer (Web, Full-stack)

Ongo - San Francisco, CA; Remote
airflow, postgres, Design, mobile, typescript, python, AWS, javascript, Node.js

          Ongo is hiring a Remote Software engineer (Web, Full-stack)

          What is Ongo?

          Ongo is building the platform for reprogramming human behavior. We use code, content, and science to make the healthy choice the easy choice. We've built a range of beautiful, interactive mobile products that help health experts improve the reach and depth of the interactions they have with communities.

          We're backed by passionate investors who have built iconic companies, and we're working with some of the leading experts in the world. We're all excited about the opportunity to make a real and measurable impact on health!

          What's the position?

          Full Stack Developer.

You will be joining in a full-time operations role, with tremendous opportunity to grow. This is a chance to be part of the founding team and help us with a range of product development needs. Specifically, you will be a primary owner of our web development, with opportunities to expand into nearly any other part of the business where you have passion and skill.

          Impact Plan for this Role

          Within 1 month, you'll...

          • Become familiar with our web projects by investigating and resolving outstanding customer issues
          • Take end-to-end ownership of at least one major product feature from design to release
          • Get to know the team, our business, and our customers

          Within 3 months, you'll...

          • Take full ownership of at least one web project
          • Implement, release, and optimize a number of product features
          • Work closely with customers to resolve key issues
          • Recommend improvements to our tech stack or product features

          You'll be working closely with...

          • The founders
          • A supportive engineering team
          • Designers and product owners
          • Customers who have unique problems

          So who are you?

          You are someone who...

          • Is a product-minded engineer (minimal lovable product > shiny tools)
          • Has a strong grasp of engineering and computer science fundamentals
          • Has demonstrated a record of shipping high quality web products to consumers
          • Can work in a fast-paced environment (we ship every two weeks!)
          • Can communicate effectively (and kindly) with others to solve customer problems

          In particular, you will...

          • Own the full-stack development of new features for our customer-facing web products, from early brainstorming to shipping
          • Support existing products and features, with the opportunity to work closely with customers
          • Contribute to a positive, high-impact engineering culture by providing feedback and support to other team members via code and design reviews
          • Drive new features and products by collaborating with the product and design teams

          Some technologies we use...

          • Languages: JavaScript, TypeScript
          • Frameworks: React, Node.js, NestJS
          • Other tools: Storybook, Postgres, Airflow, various AWS services

          Bonus if you...

          • Are familiar with: Python, Flask
          • Are an expert in the specific tools and technologies we use
          • Have experience scaling up infrastructure for high-growth products
          • Have a good understanding of web security

          Location, Location, Location

Our primary HQ is in San Francisco (with another office in Sacramento and more coming soon), and ideally a candidate for this role would be available to relocate to one of our hubs. However, we are open to hiring high-performing candidates in any location if there is a strong fit.

          Of course, our office is closed to keep our team and the world safe during the Covid-19 pandemic, but we plan to re-open our offices once it is safe to do so. Our team has been happily and productively working remote-ish, but we can't wait to see each other again!

          Are there benefits?

You bet there are! You'll get a competitive salary and equity. We're a health company, so we care about that stuff - you'll also get great medical and dental coverage. And we emphasize strong work/life balance - we'd be hypocrites not to!

And there are plenty of opportunities for you to grow through conferences, training, and more. Plus you get to work with some great people!

          The Interview Process

          1. Apply to this job listing
          2. Quick call with the CTO
          3. Do a small project related to the role
          4. Meet the rest of the team
          5. Welcome!

          See more jobs at Ongo

          Apply for this job

          +30d

          Python Developer

iGaming - Remote
airflow, sql, git, python, AWS

          iGaming is hiring a Remote Python Developer

          Intro
iGaming.com is an international media group with 11 consecutive years of outstanding performance, offering business growth through affiliate marketing.

          Our team of over 300 talented and dedicated professionals develop, maintain and optimize websites ensuring they are well-designed and can be navigated intuitively. All content is tailored to experienced or interested players – we provide accurate, transparent, informative and up-to-date content around all aspects of igaming.  

          Why work with us at iGaming.com?  
          Because we are working to make a difference!

Not only are we driven to provide the best experience for our users and exceed our partners’ expectations, we know that our team is our most important asset. Therefore, we focus on creating a work environment where everyone can learn new skills and further develop their career – be it in-house workshops, training plans, online courses or external trainings. And we excel by providing a good work/life balance – giving you the flexibility to work where and when you want, and much more. In fact, you can decide if you want to work remotely or from one of our offices, for example our Varna office.
          We are continuing to grow and are hiring on all levels – Juniors, Experts and Managers.  

          We want to expand our team!  We are currently looking for a Python Developer to support our growth.
          These Tasks Await
          • Extract, transform and load data from differing external sources
          • Management of internal tools based on Django
          • Enrich our Business Intelligence Data Warehouse 
          • Build security-aware REST APIs (a minimal sketch follows this list)
          • Consolidate alerting and monitoring systems
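
As a sketch of what a security-aware endpoint in this Django stack might look like, assuming Django REST Framework and a hypothetical Campaign model:

```python
# Illustrative read-only REST endpoint: authenticated access, whitelisted fields.
from rest_framework import permissions, serializers, viewsets

from reporting.models import Campaign  # hypothetical app and model


class CampaignSerializer(serializers.ModelSerializer):
    class Meta:
        model = Campaign
        fields = ["id", "name", "clicks"]  # expose only these fields


class CampaignViewSet(viewsets.ReadOnlyModelViewSet):
    permission_classes = [permissions.IsAuthenticated]  # reject anonymous requests
    serializer_class = CampaignSerializer
    queryset = Campaign.objects.all()
```

(URL routing via a DRF router is omitted for brevity.)
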
This Profile Is a Match With Us
          • 3+ years of experience with Python
          • Experience in Django
          • Experience in SQL
          • Experience with AWS
          • Experience with decentralized version control systems like Git
          • Structured and independent working style as well as creativity, team spirit and initiative
          • Strong problem solving skills and eagerness to learn
          • Good knowledge of English

          What would be a plus:
          • Experience with server management
          • Experience with Big Data tools like Apache Airflow and AWS Redshift
          • Knowledge of HTML/CSS/JS
          • Orientation towards Business Intelligence
          We Offer You
          • Health Management: sports cards, additional medical coverage
          • Flexibility: flexible working hours. Remote work possible.
          • Location of the main office: in the centre of Varna, 5 minutes away from the beach, a lot of possibilities: restaurants, cafés and shops, free parking
          • Environment: open-door policy, relaxed atmosphere, no dress code
          • Entertainment: regular company events and team buildings, PlayStation, board games
          • Support: language classes, regular feedback, internal development will be promoted
          • Safety: permanent employment contract, fair vacation days, adequate salary
          • Internationality: an international team consisting of various professionals and highly motivated personalities
          +30d

          Data Platform Engineer (m/f/d)

Jimdo - Hamburg, DE; Remote
3 years of experience, kotlin, tableau, terraform, airflow, Design, qa, ruby, java, docker, kubernetes, linux, python, AWS

          Jimdo is hiring a Remote Data Platform Engineer (m/f/d)

          Our mission

          At Jimdo, we’re big on small. Our mission is to unleash the power of the self-employed and small businesses—and help them thrive. Small businesses are the backbone of the global economy, but they receive little support or recognition. We see them and are here to support them. Join us to help design intuitive tools that enable small businesses to solve complex problems.

          We run at a steady pace to achieve what we aim for. We learn best by digging deep into data, staying curious, taking calculated risks, and sometimes even falling down along the way. It’s the lessons we learn in the process that make us better problem-solvers for small business owners.

          If you’re motivated by our mission and excited to roll up your sleeves, experiment, learn from mistakes, and make a difference to small businesses around the world, we would love to get to know you.

          The Team

          The Data Platform team is developing, operating, and improving a highly scalable, robust, and resilient data infrastructure, which is the backbone of all data services, the central data warehouse, and our reporting & analytics infrastructure. As business needs are growing and becoming more diverse, the team plans to increase our systems' scalability and introduce new services for a variety of use cases, ranging from core infrastructure and Data/DevOps tasks to advanced monitoring and anomaly detection. The team cooperates with the Analytics teams in the Data Department to maximise the business impact and works closely with the Jimdo infrastructure teams.

          Our expectations

You have 3 years of experience in one or more of the following topics:

          • Operating Linux or Docker
          • AWS
          • Software development (Java or Python)
          • Infrastructure as code (Terraform, CloudFormation, etc.)
          • CI/CD pipelines
          • Data-related topics: Redshift, Snowflake, Airflow, dbt, etc.

You’ll be part of a team which does the following:

          • Design, build and operate a highly scalable data platform, further advancing our approach to designing robust, self-healing, resilient systems.
          • Implement advanced monitoring and alerting for the data infrastructure as well as the data, data flows, and pipelines; this includes anomaly detection (a toy example follows this list).
          • Ensure high test coverage and improve our QA and testing concepts with respect to the data pipelines and workflows.
          • Educate and consult data & analytics engineers on designing, building, and operating maintainable, scalable, and reliable data services and workflows.
          • Be responsible for the overall health of the data infrastructure.
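
As a toy illustration of the anomaly-detection idea mentioned above (not Jimdo's actual tooling), a monitoring task might flag a daily row count that strays too far from recent history:

```python
# Simple z-score check on daily row counts; data and threshold are hypothetical.
from statistics import mean, stdev


def is_anomalous(history: list[int], today: int, z_threshold: float = 3.0) -> bool:
    """Flag today's count if it deviates more than z_threshold sigmas from history."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return abs(today - mu) / sigma > z_threshold


counts = [10_120, 10_340, 9_980, 10_210, 10_050, 10_400, 10_150]
print(is_anomalous(counts, today=4_300))  # True: likely a broken pipeline
```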

          Some of the technologies you will work with and learn:

          • AWS
          • Kubernetes / Docker
          • Github-Actions / Terraform / Terragrunt / Atlantis
          • Kafka
          • Java / Python / Kotlin
          • Airflow / DBT / Redshift / Tableau

          What We Value

          • Jimdo's success is rooted in no small part in consistently using state-of-the-art cloud services. We are looking for engineers who have a solid grasp of cloud technologies and a strong interest in distributed systems.
          • Our data infrastructure and the services running on top of it ultimately contribute to the success of our several million customers, and we believe that in the future data will play an even more significant role both for our users and for Jimdo. You fit right in if you share the same view about creating value from data and have experience building and operating great tooling for this purpose.
          • We leverage different technologies and languages depending on the problem we try to solve, so we value people who are able to pick up new languages and tools when necessary and are able to find the right tool for the job at hand. Currently, we use e.g. Python and Java, but also some Ruby and Kotlin.
          • You have excellent problem-solving skills. You use a systematic and thorough approach. You think from the first principles. You have a bias for action and know how to diagnose and resolve problems within complex systems.

          Jimdo is proud to be an equal-opportunity employer. This means that we don't discriminate based on race or ethnic origin, color, the language(s) you speak, where you (or your parents) are from, or whether or not you consider yourself to have a disability. Neither will your age, gender, gender identity, sexual orientation, religion, beliefs, or political opinions play a part in your application with us. We're a diverse team in so many ways, and we love it that way.

          Vasiliki is looking forward to receiving your application.

          By sending your application, you declare that you have read and understood the Jimdo Applicant Privacy Policy.

          See more jobs at Jimdo

          Apply for this job

          +30d

          DevOps/ Data Engineer - (m/f/d)

Jimdo - Hamburg, DE; Remote
3 years of experience, kotlin, tableau, terraform, airflow, Design, qa, ruby, java, docker, kubernetes, linux, python, AWS

          Jimdo is hiring a Remote DevOps/ Data Engineer - (m/f/d)

          Our mission

          At Jimdo, we’re big on small. Our mission is to unleash the power of the self-employed and small businesses—and help them thrive. Small businesses are the backbone of the global economy, but they receive little support or recognition. We see them and are here to support them. Join us to help design intuitive tools that enable small businesses to solve complex problems.

          We run at a steady pace to achieve what we aim for. We learn best by digging deep into data, staying curious, taking calculated risks, and sometimes even falling down along the way. It’s the lessons we learn in the process that make us better problem-solvers for small business owners.

          If you’re motivated by our mission and excited to roll up your sleeves, experiment, learn from mistakes, and make a difference to small businesses around the world, we would love to get to know you.

          The Team

          The Data Platform team is developing, operating, and improving a highly scalable, robust, and resilient data infrastructure, which is the backbone of all data services, the central data warehouse, and our reporting & analytics infrastructure. As business needs are growing and becoming more diverse, the team plans to increase our systems' scalability and introduce new services for a variety of use cases, ranging from core infrastructure and Data/DevOps tasks to advanced monitoring and anomaly detection. The team cooperates with the Analytics teams in the Data Department to maximise the business impact and works closely with the Jimdo infrastructure teams.

          Our expectations

You have 3 years of experience in one or more of the following topics:

          • Operating Linux or Docker
          • AWS
          • Software development (Java or Python)
          • Infrastructure as code (Terraform, CloudFormation, etc.)
          • CI/CD pipelines
          • Data-related topics: Redshift, Snowflake, Airflow, dbt, etc.

You’ll be part of a team which does the following:

          • Design, build and operate a highly scalable data platform, further advancing our approach to designing robust, self-healing, resilient systems.
          • Implement advanced monitoring and alerting for the data infrastructure as well as the data, data flows, and pipelines; this includes anomaly detection.
          • Ensure high test coverage and improve our QA and testing concepts with respect to the data pipelines and workflows.
          • Educate and consult data & analytics engineers on designing, building, and operating maintainable, scalable, and reliable data services and workflows.
          • Be responsible for the overall health of the data infrastructure.

          Some of the technologies you will work with and learn:

          • AWS
          • Kubernetes / Docker
          • Github-Actions / Terraform / Terragrunt / Atlantis
          • Kafka
          • Java / Python / Kotlin
          • Airflow / DBT / Redshift / Tableau

          What We Value

          • Jimdo's success is rooted in no small part in consistently using state-of-the-art cloud services. We are looking for engineers who have a solid grasp of cloud technologies and a strong interest in distributed systems.
          • Our data infrastructure and the services running on top of it ultimately contribute to the success of our several million customers, and we believe that in the future data will play an even more significant role both for our users and for Jimdo. You fit right in if you share the same view about creating value from data and have experience building and operating great tooling for this purpose.
          • We leverage different technologies and languages depending on the problem we try to solve, so we value people who are able to pick up new languages and tools when necessary and are able to find the right tool for the job at hand. Currently, we use e.g. Python and Java, but also some Ruby and Kotlin.
          • You have excellent problem-solving skills. You use a systematic and thorough approach. You think from the first principles. You have a bias for action and know how to diagnose and resolve problems within complex systems.

          Jimdo is proud to be an equal-opportunity employer. This means that we don't discriminate based on race or ethnic origin, color, the language(s) you speak, where you (or your parents) are from, or whether or not you consider yourself to have a disability. Neither will your age, gender, gender identity, sexual orientation, religion, beliefs, or political opinions play a part in your application with us. We're a diverse team in so many ways, and we love it that way.

          Vasiliki is looking forward to receiving your application.

          By sending your application, you declare that you have read and understood the Jimdo Applicant Privacy Policy.

          See more jobs at Jimdo

          Apply for this job

          +30d

          Senior Data Engineer (USA Remote)

Blue Orange Digital - New York (Remote), NY; Remote
6 years of experience, terraform, airflow, sql, Design, azure, docker, linux, python, AWS

          Blue Orange Digital is hiring a Remote Senior Data Engineer (USA Remote)

          Blue Orange is seeking a Senior Azure Data Engineer to join our team to help build up our data engineering practice. Our Platform Engineers require a diverse skill set including system administration, DevOps, infrastructure automation, data modeling, and workflow orchestration. Blue Orange builds enterprise data platforms and systems for a variety of clients, so this candidate should have experience with supporting modern data technologies. The ideal candidate will have experience with multiple data engineering technologies across multiple clouds and deployment scenarios. In particular, we’re looking for someone with experience with Azure DevOps, Snowflake, Airflow, and dbt.

          This is a full-time fully remote position.

          Core Responsibilities & Skills:

          • Work with data teams to help design, build and deploy data platforms in the cloud (Azure, AWS, GCP) and automate their operation.
          • Work with Azure DevOps, Terraform, CloudFormation, and other Automation and infrastructure tools to build robust systems.
          • Work with Airflow, dbt, and other data orchestration and ETL tools to build high-performance data pipelines (a small load-step sketch follows this list).
          • Provide leadership in applying software development principles and best practices, including Continuous Integration, Continuous Delivery/Deployment, Infrastructure as Code, and automated testing across multiple software applications.
          • Support heterogeneous technology environments, including both Windows and Linux systems.
          • Develop reusable, automated processes and custom tools.
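
As a small sketch of the load step such a pipeline might end with, assuming the snowflake-connector-python package and hypothetical account, credentials, and object names:

```python
# Illustrative bulk load of staged CSV files into a Snowflake table.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.us-east-1",  # hypothetical
    user="etl_user",
    password="...",               # in practice, pull from a secrets manager
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # COPY INTO is Snowflake's bulk-load statement for staged files.
    cur.execute("COPY INTO raw.events FROM @etl_stage/events/ FILE_FORMAT = (TYPE = 'CSV')")
    print(cur.fetchall())  # per-file load results
finally:
    conn.close()
```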

          Qualifications:

          • BA/BS degree in Computer Science or a related technical field, or equivalent practical experience.
          • At least 6 years of experience building and supporting data platforms; exposure to data technologies like Azure Data Factory, Azure Synapse Analytics, AWS Glue, Airflow, Spark.
          • Experience with Cloud Data Warehouses, Snowflake in particular.
          • Advanced level Python, SQL, and Bash scripting.
          • Experience designing and building robust CI/CD pipelines.
          • Strong Linux system administration skills.
          • Comfortable with Docker, configuration management, and monitoring tools.
          • Knowledge of best practices related to security, performance, and disaster recovery.
          • Experience working in cloud environments, at a minimum experience in Azure and AWS.
          • Enjoys collaborating with other engineers on architecture and sharing designs with the team.
          • Excellent verbal and written English communication.
          • Interacts with others using sound judgment, good humor, and consistent fairness in a fast-paced environment.

          Bonus Points:

          • Hold certifications for Azure DevOps, Azure Data Fundamentals, or Snowflake.

          Our Benefits Include:

          • 401k Matching
          • PTO
          • 100% remote role with an option for hybrid
          • Healthcare, Dental, Vision, and Life Insurance

Salary: USD 130K - 160K per year

          Blue Orange Digital is an equal opportunity employer.

          See more jobs at Blue Orange Digital

          Apply for this job

          +30d

          Senior Software Engineer (Pyspark/SQL)

terraform, airflow, sql, api, docker, mysql, python, AWS

          Cerebral Staffing, LLC is hiring a Remote Senior Software Engineer (Pyspark/SQL)

          See more jobs at Cerebral Staffing, LLC

          Apply for this job