airflow Remote Jobs

125 Results

+30d

Senior Backend Developer (.NET)

Devoteam · Vilnius, Lithuania · Remote
DevOps, NoSQL, Airflow, Azure, C++, C#, AWS, Backend

Devoteam is hiring a Remote Senior Backend Developer (.NET)

Job Description

Imagine being part of one of the most successful IT companies in Europe and finding innovative solutions to technical challenges. Turn imagination into reality and apply for this exciting career opportunity at Devoteam.

Job Highlights:  

  • Joining more than 10,000 talented colleagues around Europe  
  • International career opportunity   
  • Cozy environment in Vilnius and Kaunas offices   

Your Highlights?   

  • You have extreme ownership and proven experience in leading complex projects and tasks  
  • You have a strong sense of honesty, responsibility and reliability  
  • You are ready to step out of your comfort zone and constantly work on improving your soft and hard skills  
  • You are an excellent team player and always ready to assist your colleagues  

Still with us? Then we might have a fantastic job opportunity for you!  

OUR NEW SENIOR BACKEND DEVELOPER  

As a Senior Backend Developer you will get to work on innovative Cloud-based software with the goal of full-scale autonomy when operating Public Clouds. You will also provide technical leadership and mentorship to other team members. 

SOME OF YOUR RESPONSIBILITIES:  

  • Develop Cloud-based software backend, ensuring high performance and scalability;
  • Help create, maintain and follow best practices for development work including coding, testing, source control, build automation, continuous deployment and continuous delivery  
  • Contribute to software engineering excellence by mentoring other team members  
  • Participate in the development process relying on your technical expertise  
  • Keep up to date on technology innovations 

SOME OF OUR REQUIREMENTS:  

  • Deep knowledge of software development methodologies and the ability to quickly adapt to new languages and platforms when needed  
  • Extensive development experience in C#/.NET  
  • Experience in writing REST APIs 
  • Experience in developing multi-tenant applications 
  • Experience with DevOps, including CI/CD and Configuration Management  
  • Experience in cloud application development for one of the major cloud providers (Microsoft Azure, AWS or GCP)  
  • Experience working with source code repository management systems such as GitHub and Azure DevOps  

It would be awesome if you have:  

  • Understanding of AI and Machine Learning  
  • Experience in building and supporting complex distributed  SaaS solutions  
  • Solid knowledge of database technologies (RDBMS and NoSQL)  
  • Familiarity with data engineering tools such as Databricks and BigQuery is a plus
  • Experience with workflow orchestration engines, e.g. Cadence, Temporal, Airflow, AWS Step Functions, etc. 
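Orchestration engines like Cadence, Temporal, and Airflow all build on the same core idea: a directed acyclic graph of tasks executed in dependency order. A minimal, engine-free sketch of that idea using only Python's standard library (the task names here are invented for illustration and not from the posting):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: task name -> set of upstream dependencies,
# the same shape an Airflow or Temporal workflow ultimately encodes.
dag = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load": {"transform"},
    "report": {"load"},
}

def run(dag):
    """Execute tasks in topological (dependency) order."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        print(f"running {task}")
    return order

order = run(dag)
```

A real engine adds scheduling, retries, and distributed execution on top of this ordering, but the dependency graph is the common vocabulary across all the tools listed above.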

WHAT YOU CAN LOOK FORWARD TO:   

  • Creating a purposeful set of software built on a modern tech stack
  • Becoming part of a highly specialized team who will support your ability to succeed  
  • A challenging and exciting career with an international perspective and opportunities  
  • Attractive compensation package with a mix of fixed and variable pay  
  • High level of trust and competency to make your own decisions  
  • A warm and talented culture with a focus on business, but knowing that family always comes first  
  • Access to an international network of specialists within the organization to build your reputation and skills  
  • Salary from 5,500 EUR gross (depending on experience and competencies)


At Devoteam we have created a culture of honesty, transparency, inclusion, and cooperation, which we value a lot. We are looking for colleagues who are highly motivated, proactive, and not afraid of challenges. We are highly invested in the career path development of our employees, and we offer and support possibilities for further training, certification, and specialization.

See more jobs at Devoteam

Apply for this job

+30d

Database Administrator (remote in Spain possible)

LanguageWire · Spain · Remote
DevOps, Airflow, SQL, Azure, QA, PostgreSQL

LanguageWire is hiring a Remote Database Administrator (remote in Spain possible)

Do you just love tweaking that one annoying query to perform just a little bit better?

Are you the go-to person for finding and using data in a complex distributed ecosystem with plenty of services and databases?

Are you interested in pushing organizations to use their data more effectively and become more data-driven?

Yes? You should definitely read on!

The role you’ll play

At LanguageWire, we offer a large product suite to cater to all our customers’ linguistic needs.

Our product suite includes many products powered by many microservices, each of them owning their own data. We also have our data warehouse and data lake solutions to develop and power our AI/ML developments.

We are looking for a seasoned DBA to support engineering teams in their day-to-day work. Technology selection, data modelling, query optimization, and monitoring & troubleshooting are continuous needs that you will help teams with.

You will not only support the teams but will also be responsible for evangelizing and leveling up our engineering teams regarding their data and databases, by providing guidance and training.

In parallel to that, as our system is complex and distributed, you will work closely with our Senior Director of Technology to build a solid data governance framework and define/execute our data strategy.

The team you’ll be a part of

We have 8 software teams working across 5 countries and taking care of the continuous development of our platform. We strongly believe in building our own tech so we can deliver the best solutions for our customers. Our teams cover the full technical scope needed to create advanced language solutions with AI, complex web-based tools, workflow engines, large scale data stores and much more. Our technology and linguistic data assets set us apart from the competition and we’re proud of that.

You will report directly to our Senior Director of Technology and work as part of our Technical Enablement team, which is a cross-functional team of specialists working closely with all our other engineering teams on core technical aspects (architecture, data engineering, QA automation, performance, cybersecurity, etc.). Our Technical Enablement team is key to ensuring that the LanguageWire platform is built, run, and maintained in a scalable, reliable, performant, and secure manner.

If you want to make a difference, make it with us by…

  • Ensuring the optimal operation of our products and services by being the hands-on expert who supports our teams with their database and data needs.
  • Defining LanguageWire’s data architecture framework, standards, and principles, including modeling, metadata, security, reference data, and master data.
  • Driving the strategy execution across the entire tech organization by closely collaborating with other teams.

In one year, you’ll know you were successful if…

  • All of LanguageWire’s data is well modelled and documented.
  • LanguageWire has a powerful core data engine that allows our ML/AI teams to effectively leverage all of our data.
  • You are regarded as the go-to person for all database and data needs.

 

Desired experience and competencies

What does it take to work for LanguageWire?

What you’ll need to bring

You are a hands-on technical expert

  • Expert knowledge of SQL (PostgreSQL, SQL Server, etc.)
  • Solid data modelling skills, including conceptual, logical and physical models.
  • Good knowledge of cloud services (Azure & GCP) and DevOps engineering

You are a team player 

  • Excellent communicator able to create engagement and commitment from teams around you
  • You love solving complex puzzles with engineers from different areas and different backgrounds 
  • You’re eager to understand how the different areas of the ecosystem connect to create the complete value chain

Fluent English (reading, writing, speaking) 

This will make you stand out

  • Experience working within a microservice-based architecture
  • Experience with Data Warehousing (BigQuery, Snowflake, Databricks, …)
  • Experience with Orchestration technology (Apache Airflow, Azure Data Factory, …)
  • Experience with Data Lakes and Data Warehouses

Your colleagues say you

  • Are approachable and helpful when needed
  • Know all the latest trends in the industry
  • Never settle for second best

Our perks

  • Enjoy flat hierarchies, responsibility and freedom, direct feedback, and room to stand up for your own ideas
  • Internal development opportunities, ongoing support from your People Partner, and an inclusive and fun company culture
  • International company with over 400 employees. Offices in Copenhagen, Aarhus, Stockholm, Varberg, London, Leuven, Lille, Paris, Munich, Hamburg, Zurich, Kiev, Gdansk, Atlanta, Finland and Valencia
  • For this role, we have a full-time FlexiWire@home option for remote work. Of course, you are always welcome at the office to collaborate and connect with your colleagues.
  • We take care of our people and initiate many social get-togethers, from Friday bars to summer and Christmas parties. We have fun!
  • 200 great colleagues in the Valencia office belonging to different business departments
  • Excellent location in cool and modern offices in the city center, with a great rooftop terrace and a view over the Town Hall Square
  • Working in an international environment—more than 20 different nationalities
  • A private health insurance
  • A dog friendly atmosphere
  • Big kitchen with access to organic fruit, nuts, biscuits, and coffee
  • Social area and game room (foosball table, darts, and board games)
  • Bike and car parking

 

About LanguageWire

At LanguageWire, we want to wire the world together with language. Why? Because we want to help people & businesses simplify communication. We are fueled by the most advanced technology (AI) and our goal is to make customers’ lives easier by simplifying their communication with any audience across the globe.

 

Our values drive our behavior

We are curious. We are trustworthy. We are caring. We are ambitious.

At LanguageWire, we are curious and intrigued by what we don’t understand. We believe relationships are based on honesty and responsibility, and being trustworthy reinforces an open, humble, and honest way of communicating. We are caring and respect each other personally and professionally. We encourage authentic collaboration, invite feedback and a positive social environment. Our desire to learn, build, and share knowledge is a natural part of our corporate culture.

 

Working at LanguageWire — why we like it: 

“We believe that we can wire the world together with language. It drives us to think big, follow ambitious goals, and get better every day. By embracing and solving the most exciting and impactful challenges, we help people to understand each other better and to bring the world closer together.”

(Waldemar, Senior Director of Product Management, Munich)

Yes, to diversity, equity & inclusion

At LanguageWire, we believe diversity in gender, age, background, and culture is essential for our growth. Therefore, we are committed to creating a culture that incorporates diverse perspectives and expertise in our everyday work.

LanguageWire’s recruitment process is designed to be transparent and fair for all candidates. We encourage candidates of all backgrounds to apply, and we ensure that candidates are provided with an equal opportunity to demonstrate their competencies and skills.

Want to know more?

We can’t wait to meet you! So, why wait 'til tomorrow? Apply today!

If you want to know more about LanguageWire, we encourage you to visit our website!

See more jobs at LanguageWire

Apply for this job

+30d

Sr. Data Engineer

DevOps, Terraform, Airflow, Postgres, SQL, Design, API, C++, Docker, Jenkins, Python, AWS, JavaScript

hims & hers is hiring a Remote Sr. Data Engineer

Hims & Hers Health, Inc. (better known as Hims & Hers) is the leading health and wellness platform, on a mission to help the world feel great through the power of better health. We are revolutionizing telehealth for providers and their patients alike. Making personalized solutions accessible is of paramount importance to Hims & Hers and we are focused on continued innovation in this space. Hims & Hers offers nonprescription products and access to highly personalized prescription solutions for a variety of conditions related to mental health, sexual health, hair care, skincare, heart health, and more.

Hims & Hers is a public company, traded on the NYSE under the ticker symbol “HIMS”. To learn more about the brand and offerings, you can visit hims.com and forhers.com, or visit our investor site. For information on the company’s outstanding benefits, culture, and its talent-first flexible/remote work approach, see below and visit www.hims.com/careers-professionals.

​​About the Role:

We're looking for a savvy and experienced Senior Data Engineer to join the Data Platform Engineering team at Hims. As a Senior Data Engineer, you will work with the analytics engineers, product managers, engineers, security, DevOps, analytics, and machine learning teams to build a data platform that backs the self-service analytics, machine learning models, and data products serving a million-plus Hims & Hers subscribers.

You Will:

  • Architect and develop data pipelines to optimize performance, quality, and scalability
  • Build, maintain & operate scalable, performant, and containerized infrastructure required for optimal extraction, transformation, and loading of data from various data sources
  • Design, develop, and own robust, scalable data processing and data integration pipelines using Python, dbt, Kafka, Airflow, PySpark, SparkSQL, and REST API endpoints to ingest data from various external data sources to Data Lake
  • Develop testing frameworks and monitoring to improve data quality, observability, pipeline reliability, and performance 
  • Orchestrate sophisticated data flow patterns across a variety of disparate tooling
  • Support analytics engineers, data analysts, and business partners in building tools and data marts that enable self-service analytics
  • Partner with the rest of the Data Platform team to set best practices and ensure the execution of them
  • Partner with the analytics engineers to ensure the performance and reliability of our data sources.
  • Partner with machine learning engineers to deploy predictive models
  • Partner with the legal and security teams to build frameworks and implement data compliance and security policies
  • Partner with DevOps to build IaC and CI/CD pipelines
  • Support code versioning and code deployments for data pipelines
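The data-quality gating this role calls for can be sketched in a few lines of plain Python. The record shape and validation rules below are invented for the example and are not from the posting:

```python
from dataclasses import dataclass

# Hypothetical record shape for an ingested subscription event.
@dataclass
class Event:
    user_id: str
    amount_cents: int

def quality_check(events):
    """Partition records into (valid, rejected) lists - the kind of
    gate a pipeline testing framework runs before loading data."""
    valid, rejected = [], []
    for e in events:
        if e.user_id and e.amount_cents >= 0:
            valid.append(e)
        else:
            rejected.append(e)
    return valid, rejected

valid, rejected = quality_check([
    Event("u1", 1999),
    Event("", 500),      # missing user_id -> rejected
    Event("u2", -100),   # negative amount -> rejected
])
```

In practice frameworks like Great Expectations or dbt tests express the same checks declaratively, and the rejected partition is routed to a quarantine table for observability rather than silently dropped.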

You Have:

  • 8+ years of professional experience designing, creating and maintaining scalable data pipelines using Python, API calls, SQL, and scripting languages
  • Demonstrated experience writing clean, efficient & well-documented Python code and are willing to become effective in other languages as needed
  • Demonstrated experience writing complex, highly optimized SQL queries across large data sets
  • Experience with cloud technologies such as AWS and/or Google Cloud Platform
  • Experience with Databricks platform
  • Experience with IaC technologies like Terraform
  • Experience with data warehouses like BigQuery, Databricks, Snowflake, and Postgres
  • Experience building event streaming pipelines using Kafka/Confluent Kafka
  • Experience with modern data stack like Airflow/Astronomer, Databricks, dbt, Fivetran, Confluent, Tableau/Looker
  • Experience with containers and container orchestration tools such as Docker or Kubernetes.
  • Experience with Machine Learning & MLOps
  • Experience with CI/CD tools (Jenkins, GitHub Actions, CircleCI)

Nice to Have:

  • Experience building data models using dbt
  • Experience with JavaScript and event tracking tools like GTM
  • Experience designing and developing systems with desired SLAs and data quality metrics
  • Experience with microservice architecture
  • Experience architecting an enterprise-grade data platform

Our Benefits (there are more but here are some highlights):

  • Competitive salary & equity compensation for full-time roles
  • Unlimited PTO, company holidays, and quarterly mental health days
  • Comprehensive health benefits including medical, dental & vision, and parental leave
  • Employee Stock Purchase Program (ESPP)
  • Employee discounts on hims & hers & Apostrophe online products
  • 401k benefits with employer matching contribution
  • Offsite team retreats

#LI-Remote

 

Outlined below is a reasonable estimate of H&H’s compensation range for this role for US-based candidates. If you're based outside of the US, your recruiter will be able to provide you with an estimated salary range for your location.

The actual amount will take into account a range of factors that are considered in making compensation decisions, including but not limited to skill sets, experience and training, licensure and certifications, and location. H&H also offers a comprehensive Total Rewards package that may include an equity grant.

Consult with your Recruiter during any potential screening to determine a more targeted range based on location and job-related factors.

An estimate of the current salary range is
$160,000 to $185,000 USD

We are focused on building a diverse and inclusive workforce. If you’re excited about this role, but do not meet 100% of the qualifications listed above, we encourage you to apply.

Hims considers all qualified applicants for employment, including applicants with arrest or conviction records, in accordance with the San Francisco Fair Chance Ordinance, the Los Angeles County Fair Chance Ordinance, the California Fair Chance Act, and any similar state or local fair chance laws.

It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.

Hims & Hers is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, please contact us at accommodations@forhims.com and describe the needed accommodation. Your privacy is important to us, and any information you share will only be used for the legitimate purpose of considering your request for accommodation. Hims & Hers gives consideration to all qualified applicants without regard to any protected status, including disability. Please do not send resumes to this email address.

For our California-based applicants – Please see our California Employment Candidate Privacy Policy to learn more about how we collect, use, retain, and disclose Personal Information. 

See more jobs at hims & hers

Apply for this job

+30d

Contractor: Lead Data Engineering Services

Newsela · Remote (Brazil or Argentina)
Terraform, Airflow, SQL, Design, C++, Python, AWS

Newsela is hiring a Remote Contractor: Lead Data Engineering Services

Seeking to hire a Contractor based out of Mexico or Argentina for Lead-Level Data Engineering Services.

Scope of Services:

  • As a Contractor, you will work alongside app developers and data stakeholders to make data system changes and to respond to data inquiries
  • You will lead initiatives and problem definition, scoping, design, and planning through epics and blueprints. 
  • You will share your deep domain knowledge through documentation, technical presentations, discussions, and incident reviews. 
  • You will build and maintain data pipelines and DAG tooling 
  • You will establish and maintain a data catalog with business-related metadata

Skills & Experience:

  • Expert proficiency in SQL, Python, and relational datastores (columnar and row databases)
  • Proficiency in building and maintaining data pipelines and DAG tooling (Dagster, Airflow, etc)
  • Advanced experience with event-based pipelines and CDC tooling
  • Advanced experience in managing large-scale data migrations in relational datastores
  • Advanced experience in optimizing SQL query performance
  • Advanced experience with data testing strategies to ensure resulting datastores are aligned with expected business logic
  • Experience with dbt orchestration and best practices
  • Experience with enabling monitoring, health checks and alerting on data systems and pipelines
  • Experience establishing and maintaining a data catalog with business-related metadata
  • Experience building tools and automation to run data infrastructure
  • Experience with writing and maintaining cloud-based infrastructure for data pipelines (AWS, GCP and Terraform) is a plus
  • Experience in document, graph or schema-less datastores is a plus
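The SQL query optimization named above usually comes down to giving the planner an index to use instead of a full table scan. A self-contained illustration using SQLite from Python's standard library (the production engines differ, but the principle carries over; the table and index names are made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
con.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

def plan(sql):
    """Return the query plan detail text as one string."""
    return " ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)  # full table scan: every row is examined
con.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)   # index lookup: only matching rows are touched
```

Reading the plan before and after adding the index makes the change visible without timing anything; on large tables the same switch from scan to index search is where most query-tuning wins come from.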

Please note that given the nature of the contract, this role will not be eligible to participate in company-sponsored benefits. 

See more jobs at Newsela

Apply for this job

+30d

Contractor: Lead Data Reporting Engineering Services

Newsela · Remote (Brazil or Argentina)
Airflow, SQL, API, C++, Python, Frontend

Newsela is hiring a Remote Contractor: Lead Data Reporting Engineering Services

Seeking to hire a Contractor based out of Mexico or Argentina for Lead-Level Data Reporting Engineering Services.

Scope of Services:

  • As a Contractor you will translate product data requests into insightful customer reporting metrics
  • You will communicate with app developers and data stakeholders regarding data system changes and data inquiries
  • You will lead initiatives, define problems, and provide scoping to break down work for the team
  • You will communicate with frontend engineers to create data visualization
  • You will lead and work alongside customer-facing business intelligence, reporting, and data visualization teams
  • You will conduct root-cause analyses relating to queries about data irregularities and product metric definition requests 

Skills & Experience:

  • Expert proficiency in SQL, Python, and relational datastores
    • Experience with columnar relational databases is a plus
  • Advanced experience in optimizing SQL query performance
  • Experience leading and working on customer-facing business intelligence, reporting, data modeling, and data visualization teams
    • Experience with maintaining custom, in-house tooling is a plus
  • Strong ability to conduct root-cause analyses relating to queries about data irregularities and product metric definition requests 
  • Experience with data testing strategies to ensure transformations and metrics are aligned with expected business logic
  • Experience with dbt orchestration and best practices
  • Experience with Python web frameworks (FastAPI, Flask)
  • Experience with enabling application monitoring and health checks for systems within the team’s domain
  • Experience with data pipelines and DAG tooling (Dagster, Airflow, etc) is a plus
  • Experience with event-based pipelines and CDC tooling is a plus

Please note that given the nature of the contract, this role will not be eligible to participate in company-sponsored benefits. 

See more jobs at Newsela

Apply for this job

+30d

Contractor: Data Engineering Services

Newsela · Remote (Brazil or Argentina)
Tableau, Airflow, SQL, C++, Python, AWS

Newsela is hiring a Remote Contractor: Data Engineering Services

Seeking to hire a Contractor based out of Brazil or Argentina for Mid-Senior Level Data Engineering Services.

Scope of Services: 

  • This Contractor will develop a thorough understanding of the various data sources, data pipelines, and enterprise data warehouse models. 
  • You will play a crucial role as we assess existing tools and processes, and will help the team with critical migrations such as Prefect 1 to Prefect 2, as well as incorporating tools like dbt into the analytics engineering cycle.
  • Help the team rapidly meet business needs by connecting new data sources as needed and building new data warehouse models
  • Maintain a reliable data and analytics platform by bringing in best practices and tools for data quality checks and monitoring, as well as helping troubleshoot and address production issues.

Skills & Experience:

  • 4+ years of experience in Data Engineering
  • Proficient in Python programming and hands-on experience building ETL/ELT pipelines
  • Experience using orchestration tools like Prefect, Airflow, etc.
  • Working experience with column-store analytical datastores like Snowflake, Redshift, or BigQuery
  • Hands-on experience in data modeling; including strong knowledge of SQL
  • Experience with source control systems like GitHub
  • Experience with a public cloud preferably AWS
  • Detail-oriented, taking pride in the quality of your work 
  • Experience working with a modern cloud-native data stack in a fast-paced environment
  • Experience with dbt is a plus
  • Experience with BI tools like Tableau is a plus

Please note that given the nature of the contract, this role will not be eligible to participate in company-sponsored benefits. 

See more jobs at Newsela

Apply for this job

+30d

Contractor: Senior-Level Site Reliability Engineering Services (Brazil or Argentina)

Newsela · Remote (Brazil or Argentina)
Terraform, Airflow, Design, C++, Docker, Kubernetes, Linux, AWS

Newsela is hiring a Remote Contractor: Senior-Level Site Reliability Engineering Services (Brazil or Argentina)

Seeking to hire a Contractor based out of Brazil or Argentina for Senior-Level Site Reliability Engineering Services.

Scope of Services:

  • Be on an on-call rotation to respond to incidents that impact Newsela.com availability and provide support for developers during internal and external incidents 
  • Maintain and assist in extending our infrastructure with Terraform, Github Actions CI/CD, Prefect, and AWS services
  • Build monitoring that alerts on symptoms rather than outages using Datadog, Sentry and CloudWatch
  • Look for ways to turn repeatable manual actions into automations to reduce toil
  • Improve operational processes (such as deployments, releases, migrations, etc) to make them run seamlessly with fault tolerance in mind 
  • Design, build and maintain core cloud infrastructure on AWS and GCP that enables scaling to support thousands of concurrent users
  • Debug production issues across services and levels of the stack 
  • Provide infrastructure and architectural planning support as an embedded team member within a domain of Newsela’s application developers 
  • Plan the growth of Newsela’s infrastructure 
  • Influence the product roadmap and work with engineering and product counterparts to influence improved resiliency and reliability of the Newsela product.
  • Proactively work on efficiency and capacity planning to set clear requirements and reduce the system resource usage to make Newsela cheaper to run for all our customers.
  • Identify parts of the system that do not scale, provide immediate palliative measures, and drive long-term resolution of these incidents.
  • Identify Service Level Indicators (SLIs) that will align the team to meet the availability and latency objectives.
  • For stable counterpart assignments, maintain awareness and actively influence stage group plans and priorities through participation in stage group meetings and async discussions. Act as a steward for reliability.

Skills / Experience:

  • 5+ years of experience in site-reliability 
  • You have advanced knowledge of Terraform syntax and CI/CD configuration (pipelines, jobs)
  • You have managed DAG tooling and data pipelines (ex: Airflow, Dagster, Prefect)
  • You have advanced knowledge and experience with maintaining data pipeline infrastructure and large scale data migrations 
  • You have advanced knowledge of cloud infrastructure services (AWS, GCP)
  • You are well versed in container orchestration technologies: cluster provisioning and new services (ECS, Kubernetes, Docker)
  • Background working with service catalog metrics and recording rules for alerts (Datadog, NewRelic, Sentry, Cloudwatch)
  • Experience with log shipping pipelines and incident debugging visualizations
  • Familiarity with operating system (Linux) configuration, package management, startup and troubleshooting, and comfort with Bash/CLI scripting
  • Familiarity with block and object storage configuration and debugging.
  • Ability to identify significant projects that result in substantial improvements in reliability, cost savings and/or revenue.
  • Ability to identify changes for the product architecture from the reliability, performance and availability perspectives with a data-driven approach.
  • Lead initiatives and problem definition and scoping, design, and planning through epics and blueprints.
  • You share deep domain knowledge through documentation, recorded demos, technical presentations, discussions, and incident reviews.
  • You can perform and run blameless RCAs on incidents and outages, aggressively looking for answers that will prevent the incident from ever happening again.
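The SLI work described in the bullets above reduces to simple arithmetic: measure a success ratio over a window and compare it to an objective. A minimal sketch follows; the availability SLI and the 99% SLO are illustrative assumptions, not Newsela's actual targets:

```python
# Hypothetical SLI computation: availability over a window of request
# outcomes (True = success), compared against a target objective (SLO).
def availability_sli(outcomes):
    """Fraction of successful requests in the window."""
    return sum(outcomes) / len(outcomes)

def error_budget_remaining(sli, slo=0.99):
    """Share of the window's error budget still unspent.

    With a 99% SLO, 1% of requests may fail; if only 0.1% failed,
    90% of the budget remains.
    """
    allowed = 1 - slo   # failure rate the SLO permits
    spent = 1 - sli     # failure rate actually observed
    return max(0.0, 1 - spent / allowed)

outcomes = [True] * 9990 + [False] * 10   # 99.9% availability
sli = availability_sli(outcomes)
budget = error_budget_remaining(sli, slo=0.99)
```

Alerting on a fast-burning error budget is one concrete way to "alert on symptoms rather than outages": the page fires while users are affected but before the objective is actually missed.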

Please note that given the nature of the contract, this role will not be eligible to participate in company-sponsored benefits. 

See more jobs at Newsela

Apply for this job

+30d

Data Engineer (January 2025 Internship) - M/F

Showroomprive.com · Saint-Denis, France · Remote
Airflow, SQL, C++

Showroomprive.com is hiring a Remote Data Engineer (January 2025 Internship) - M/F

Job Description

At the heart of Showroomprive's Data division, you will join the "Data Engineering" team.    
Your work will focus on extracting, processing, and storing data by maintaining and evolving a data warehouse used by the rest of the Data teams (BI, Data Science, Marketing Analysts).    

Your work will be split into two parts:    

  • A main project to carry out end to end around data: its processing, its quality control, and its accessibility.    

  • The team's day-to-day tasks (developing new data flows, exporting business data, ad hoc queries, access management, etc.).   

To carry out these tasks, our team uses state-of-the-art data processing tools, notably Dataiku and Airflow for data flows, and a market-leading cloud platform.   

You will join a team of Data Engineers who will support you day to day, as well as a broader Data department with diverse and deep expertise in their fields.   

Qualifications

Completing a higher-education program (Bac+4/+5) at an engineering school or an equivalent university program in a field related to Business Intelligence or Data Engineering.  

Through your studies or a previous experience, you have acquired solid foundations in SQL and Python. You have also developed a real appetite for learning on your own and are very curious when it comes to data.   

Your rigor and energy will be key assets in successfully completing the work entrusted to you.  

See more jobs at Showroomprive.com

Apply for this job

+30d

Senior Java Engineer (Cloud Native)

Experian · Heredia, Costa Rica · Remote
S3, EC2, Lambda, Agile, NoSQL, Airflow, SQL, Design, MongoDB, API, Java, PostgreSQL, Python, AWS

Experian is hiring a Remote Senior Java Engineer (Cloud Native)

Job Description

You will be involved in projects using modern technologies as part of a senior software engineering team. You will help design and implement product features. This is a technical role requiring excellent coding skills.

You will develop core functionality and processing for a new powerful, enterprise level data platform built with Java and using leading mainstream open-source technologies.

  • Hands-on collaboration as a primary member of a software engineering team focused on building event-driven services that deliver secure, efficient solutions in a bold timeframe.
  • Deliver highly available, scalable data streaming application functionality on an AWS cloud-based platform.
  • Diligently observe and maintain standards for regulatory compliance and information security.
  • Deliver and maintain accurate, complete, and current documentation.
  • Participate in full Agile cycle engagements, including meetings, iterative development, estimations, code reviews, and design sessions.
  • Contribute to team architecture, engineering, and product discussions to ensure the team delivers quality software.
  • Work with the service quality engineering team to ensure that only thoroughly tested code makes it to production.
  • Oversee deliverables from design through production operationalization.
  • Provide engineering support to the customer support team to resolve critical customer issues.
  • You will report to the Senior Software Development Director.

Qualifications

  • 5+ years of software development experience building and testing applications following secure coding practices
  • Hands-on experience collaborating as a team member developing a significant commercial software project in Java with the Spring Framework.
  • Proficiency in developing server-side Java applications using mainstream tools including the Spring framework and AWS SDK
  • Experience with event driven architectures using pub/sub message brokers such as Kafka, Kinesis, and NATS.io
  • Current cloud technology experience, preferably AWS (Fargate, EC2, S3, RDS PostgreSQL, Lambda, API Gateway, Airflow)
  • Experience developing web applications using Spring Reactive libraries such as WebFlux and Project Reactor, as well as standard Spring Web
  • Proficiency in SQL- and NoSQL-based data access and management on PostgreSQL and MongoDB or AWS DocumentDB.
  • Recent hands-on experience building and supporting commercial systems managing data and transactions including server-side development of Data Flow processes
  • Experience with Continuous Integration/Continuous Delivery (CI/CD) process and practices (CodeCommit, CodeDeploy, CodePipeline/Harness/Jenkins/Github Actions, CLI, BitBucket/Git)
  • Experience overseeing technologies including Splunk, Datadog, and Cloudwatch
  • Familiarity creating and using Docker/Kubernetes applications Additional Preferred Experience
  • Proficiency in developing server-side Python using mainstream tools including Pandas, SciPy, PySpark, and Pydantic
  • Experience building systems for financial services or tightly regulated businesses.
  • Security and privacy compliance (GDPR, CCPA, ISO 27001, PCI, HIPAA, etc.) experience is a plus.

See more jobs at Experian

Apply for this job

+30d

Head of Data

GeminiRemote (USA)
remote-firstairflowsqlDesignazurepythonAWS

Gemini is hiring a Remote Head of Data

About the Company

Gemini is a global crypto and Web3 platform founded by Tyler Winklevoss and Cameron Winklevoss in 2014. Gemini offers a wide range of crypto products and services for individuals and institutions in over 70 countries.

Crypto is about giving you greater choice, independence, and opportunity. We are here to help you on your journey. We build crypto products that are simple, elegant, and secure. Whether you are an individual or an institution, we help you buy, sell, and store your bitcoin and cryptocurrency. 

At Gemini, our mission is to unlock the next era of financial, creative, and personal freedom.

In the United States, we have a flexible hybrid work policy for employees who live within 30 miles of our office headquartered in New York City and our office in Seattle. Employees within the New York and Seattle metropolitan areas are expected to work from the designated office twice a week, unless there is a job-specific requirement to be in the office every workday. Employees outside of these areas are considered part of our remote-first workforce. We believe our hybrid approach for those near our NYC and Seattle offices increases productivity through more in-person collaboration where possible.

The Department: Data

The Role: Head of Data

As the leader, you’ll shape the way we approach data at Gemini by creating a strategic vision for how data can help drive our business growth. You will build and manage a high-performance machine learning, data engineering, platform, and analytics team and leverage your experience and communication skills to work across business teams to develop innovative data solutions.  You will inspire and mentor a strong data team through your passion for data and its ability to transform decision-making and generate solutions for this new, exciting asset class. Communicating your insights and driving new product development across the organization is paramount to success. You will also be looked upon to share Gemini’s data vision and products externally.

Responsibilities:

  • Lead team responsible for scaling our data infrastructure and optimizing our warehouse’s performance
  • Lead design, architecture and implementation of best-in-class Data Warehousing and reporting solutions
  • Define and drive the vision for data management, analytics, and data culture
  • Collaborate with executive leadership to integrate data initiatives into business strategies
  • Oversee the design, development, and maintenance of the company’s data platform, ensuring scalability, reliability, and security
  • Lead and participate in design discussions and meetings
  • Oversee the end-to-end management of our data ecosystem, ensuring data integrity, and driving data-driven decision-making across the organization including our products
  • Design, automate, build, and launch scalable, efficient and reliable data pipelines into production
  • Drive the development of advanced analytics, reporting solutions, and dashboards to provide actionable insights to stakeholders
  • Oversee design, development,  and maintenance of ETL processes, data warehouses, and data lakes
  • Research new tools and technologies to improve existing processes
  • Implement best practices for data modeling, architecture, and integration across various data sources
  • Develop new systems and tools to enable the teams to consume and understand data more intuitively
  • Partner with engineers, project managers, and analysts to deliver insights to the business
  • Own root cause analysis of production and data issues, such as validation

Minimum Qualifications:

  • 12-20+ years experience in data engineering with data warehouse technologies
  • 5-7+ years experience bringing a data infrastructure to the next level including the overhaul of data, pipelines, architecture and modeling
  • 10+ years experience in custom ETL design, implementation and maintenance
  • 10+ years experience with schema design and dimensional data modeling
  • Experience building real-time data solutions and processes
  • Experience building and integrating web analytics solutions
  • Experience with AI/ML tools and techniques, including experience deploying machine learning models in production
  • Strong knowledge of traditional and modern data tools and technologies, including SQL, Python, cloud platforms (AWS, Azure, GCP), and big data frameworks
  • Strong understanding of data governance, compliance, and security best practices
  • Experience with one or more MPP databases (Redshift, BigQuery, Snowflake, etc.)
  • Experience with one or more ETL tools (Airflow, AWS Glue, Informatica, Pentaho, SSIS, Alooma, etc.)
  • Experience building cross-functional teams across different departments
  • Exceptional leadership, communication, and stakeholder management skills

Preferred Qualifications:

  • Experience in a fast-paced, high-growth environment, particularly in tech or a data-driven industry
  • Hands-on experience with AI/ML frameworks like TensorFlow, PyTorch, or similar
It Pays to Work Here
 
The compensation & benefits package for this role includes:
  • Competitive starting salary
  • A discretionary annual bonus
  • Long-term incentive in the form of a new hire equity grant
  • Comprehensive health plans
  • 401K with company matching
  • Paid Parental Leave
  • Flexible time off

Salary Range: The base salary range for this role is between $269,000 - $336,000 in the State of New York, the State of California and the State of Washington. This range is not inclusive of our discretionary bonus or equity package. When determining a candidate’s compensation, we consider a number of factors including skillset, experience, job scope, and current market data.

At Gemini, we strive to build diverse teams that reflect the people we want to empower through our products, and we are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. Equal Opportunity is the Law, and Gemini is proud to be an equal opportunity workplace. If you have a specific need that requires accommodation, please let a member of the People Team know.

#LI-SM1

Apply for this job

+30d

Data Engineer (January 2025 Internship) M/F

Showroomprive.comSaint-Denis, France, Remote
airflowsqlc++

Showroomprive.com is hiring a Remote Data Engineer (January 2025 Internship) M/F

Job Description

At the heart of Showroomprive's Data division, you will join the "Data Engineering" team.
Your responsibilities will center on extracting, processing, and storing data by maintaining and evolving a data warehouse used by the rest of the Data teams (BI, Data Science, Marketing Analysts).

Your work will be split into two parts:

  • A main project to carry out end to end around data: its processing, its quality control, and its accessibility.
  • The team's day-to-day tasks (developing new data flows, business data exports, ad hoc queries, access management, etc.).

To carry out its missions, our team uses market-leading data-processing tools, with Airflow for pipelines and a leading cloud platform.

You will join a three-person Data Engineering team who will support you day to day, as well as a thirty-person Data department whose members have diverse and deep expertise in their fields.

Qualifications

You are completing a Master's-level (Bac+5) engineering-school program in a field related to Data or Software Engineering.

Through your studies or previous experience, you have built a solid foundation in SQL and Python. You have also developed a genuine appetite for learning on your own and are very curious when it comes to data.

Your rigor and energy will be key assets in successfully carrying out the missions entrusted to you.

See more jobs at Showroomprive.com

Apply for this job

+30d

Lead Data Analyst, Product

tableauairflowsqlDesignc++python

hims & hers is hiring a Remote Lead Data Analyst, Product

Hims & Hers Health, Inc. (better known as Hims & Hers) is the leading health and wellness platform, on a mission to help the world feel great through the power of better health. We are revolutionizing telehealth for providers and their patients alike. Making personalized solutions accessible is of paramount importance to Hims & Hers and we are focused on continued innovation in this space. Hims & Hers offers nonprescription products and access to highly personalized prescription solutions for a variety of conditions related to mental health, sexual health, hair care, skincare, heart health, and more.

Hims & Hers is a public company, traded on the NYSE under the ticker symbol “HIMS”. To learn more about the brand and offerings, you can visit hims.com and forhers.com, or visit our investor site. For information on the company’s outstanding benefits, culture, and its talent-first flexible/remote work approach, see below and visit www.hims.com/careers-professionals.

​​About the Role:

As the Lead Data Analyst of Product Analytics, you and your team will shape the customer experience through high-quality experimental design and hypothesis testing. You will work cross-functionally with product managers, growth leads, designers, and engineers in a fast-paced collaborative environment. Your knowledge of A/B testing and digital analytics combined with your background in experimental design will allow Hims and Hers to build best-in-class customer experiences. This position will report to the Senior Manager of Product Analytics.

You Will:

  • Design experiments and provide actionable and scalable recommendations from the results
  • Deliver in-depth analyses that are statistically sound and easily understood by non-technical audiences
  • Work with your team to curate the experimentation roadmap for the product and growth teams
  • Enable data self-service by designing templates that are easy to understand using relevant KPIs
  • Collaborate across analytics, engineering, and growth teams to improve the customer experience
  • Distill your knowledge of tests into playbooks that can be implemented and utilized to help us transform our digital experience
  • Identify causal relationships in our data using advanced statistical modeling
  • Segment users based on demographic, behavioral, and psychographic attributes to tailor product experiences and lifecycle communications
  • Align analytics initiatives with broad business objectives to build long-term value
  • Conduct deep-dive analyses to answer specific business questions and provide actionable recommendations to product and growth team

You Have:

  • 8+ years of analytics experience
  • 5+ years of experience in A/B testing
  • Experience working with subscription metrics
  • A strong work ethic and the drive to learn more and understand a problem in detail
  • Strong organizational skills with an aptitude to manage long-term projects from end to end
  • Expert SQL skills
  • Extensive experience working with data engineering teams and production data pipelines
  • Experience programming in Python, SAS, or R 
  • Experience in data modeling and statistics with a strong knowledge of experimental design and statistical inference 
  • Development and training of predictive models
  • Advanced knowledge of data visualization and BI in Looker or Tableau
  • Ability to explain technical analyses to non-technical audience

A Big Plus If You Have:

  • Advanced degree in Statistics, Mathematics, or a related field
  • Experience with price testing and modeling price elasticity
  • Experience with telehealth concepts
  • Project management experience 
  • DBT, airflow, and Databricks experience

Our Benefits (there are more but here are some highlights):

  • Competitive salary & equity compensation for full-time roles
  • Unlimited PTO, company holidays, and quarterly mental health days
  • Comprehensive health benefits including medical, dental & vision, and parental leave
  • Employee Stock Purchase Program (ESPP)
  • Employee discounts on hims & hers & Apostrophe online products
  • 401k benefits with employer matching contribution
  • Offsite team retreats

#LI-Remote

Outlined below is a reasonable estimate of H&H’s compensation range for this role for US-based candidates. If you're based outside of the US, your recruiter will be able to provide you with an estimated salary range for your location.

The actual amount will take into account a range of factors that are considered in making compensation decisions, including but not limited to skill sets, experience and training, licensure and certifications, and location. H&H also offers a comprehensive Total Rewards package that may include an equity grant.

Consult with your Recruiter during any potential screening to determine a more targeted range based on location and job-related factors.

An estimate of the current salary range is
$160,000 – $190,000 USD

We are focused on building a diverse and inclusive workforce. If you’re excited about this role, but do not meet 100% of the qualifications listed above, we encourage you to apply.

Hims considers all qualified applicants for employment, including applicants with arrest or conviction records, in accordance with the San Francisco Fair Chance Ordinance, the Los Angeles County Fair Chance Ordinance, the California Fair Chance Act, and any similar state or local fair chance laws.

It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.

Hims & Hers is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, please contact us at accommodations@forhims.com and describe the needed accommodation. Your privacy is important to us, and any information you share will only be used for the legitimate purpose of considering your request for accommodation. Hims & Hers gives consideration to all qualified applicants without regard to any protected status, including disability. Please do not send resumes to this email address.

For our California-based applicants – Please see our California Employment Candidate Privacy Policy to learn more about how we collect, use, retain, and disclose Personal Information. 

See more jobs at hims & hers

Apply for this job

+30d

Senior Analytics Engineer, MarTech

CLEAR - CorporateNew York, New York, United States (Hybrid)
tableauairflowsqlDesignjenkinspythonAWS

CLEAR - Corporate is hiring a Remote Senior Analytics Engineer, MarTech

At CLEAR, we are pioneers in digital and biometric identification, known for reducing friction wherever identity verification is needed. Now, we’re evolving further, building the next generation of products to go beyond ID, empowering our members to harness the power of a networked digital identity. As a Senior Analytics Engineer, you will play a pivotal role in designing and enhancing our data platform, implementing our MarTech products, and ensuring it supports data-driven insights while safeguarding member privacy and security.


A brief highlight of our tech stack:

  • SQL / Python / Looker / Snowflake / Airflow / Databricks / Spark / dbt

What you’ll do:

  • Design and maintain scalable, self-service data platforms enabling Analysts and Engineers to drive automation, testing, security, and high-quality analytics.
  • Develop robust processes for data transformation, structuring, metadata management, and workflow optimization.
  • Own and manage end-to-end data pipelines—from ingestion to transformation, modeling, and visualization—ensuring high data quality.
  • Collaborate with stakeholders across product and business teams to understand requirements and deliver actionable insights.
  • Lead the development of data models and analytics workflows that support strategic decision-making and reporting.
  • Maintain a strong focus on privacy, ensuring that member data is used securely and responsibly.
  • Drive architectural improvements in data processes, continuously improving CLEAR’s data infrastructure.

What makes you a great fit:

  • 6+ years of experience in data engineering, with a focus on data transformation, analytics, and cloud-based solutions.
  • Proficient in building and managing data pipelines using orchestration tools (Airflow, Dagster) and big data tools (Spark, Kafka, Snowflake, Databricks).
  • Expertise in modern data tools like dbt and data visualization platforms like Looker and Tableau.
  • Ability to communicate complex technical concepts clearly to both technical and non-technical stakeholders.
  • Experience mentoring and collaborating with team members to foster a culture of learning and development.
  • Comfortable working in a dynamic, fast-paced environment with a passion for leveraging data to solve complex business challenges.

How You'll be Rewarded:

At CLEAR we help YOU move forward - because when you’re at your best, we’re at our best. You’ll work with talented team members who are motivated by our mission of making experiences safer and easier. Our hybrid work environment provides flexibility. In our offices, you’ll enjoy benefits like meals and snacks. We invest in your well-being and learning & development with our stipend and reimbursement programs. 

We offer holistic total rewards, including comprehensive healthcare plans, family building benefits (fertility and adoption/surrogacy support), flexible time off, free OneMedical memberships for you and your dependents, and a 401(k) retirement plan with employer match. The base salary range for this role is $175,000 - $215,000, depending on levels of skills and experience.

The base salary range represents the low and high end of CLEAR’s salary range for this position. Salaries will vary depending on various factors which include, but are not limited to location, education, skills, experience and performance. The range listed is just one component of CLEAR’s total compensation package for employees and other rewards may include annual bonuses, commission, Restricted Stock Units.

About CLEAR

Have you ever had that green-light feeling? When you hit every green light and the day just feels like magic. CLEAR's mission is to create frictionless experiences where every day has that feeling. With more than 25 million passionate members and hundreds of partners around the world, CLEAR’s identity platform is transforming the way people live, work, and travel. Whether it’s at the airport, stadium, or right on your phone, CLEAR connects you to the things that make you, you - unlocking easier, more secure, and more seamless experiences - making them all feel like magic.

CLEAR provides reasonable accommodation to qualified individuals with disabilities or protected needs. Please let us know if you require a reasonable accommodation to apply for a job or perform your job. Examples of reasonable accommodation include, but are not limited to, time off, extra breaks, making a change to the application process or work procedures, policy exceptions, providing documents in an alternative format, live captioning or using a sign language interpreter, or using specialized equipment.

See more jobs at CLEAR - Corporate

Apply for this job

+30d

Engineering Manager, Data Platform

GrammarlySan Francisco; Hybrid
MLremote-firstairflowDesignazurec++AWS

Grammarly is hiring a Remote Engineering Manager, Data Platform

Grammarly offers a dynamic hybrid working model for this role. This flexible approach gives team members the best of both worlds: plenty of focus time along with in-person collaboration that helps foster trust, innovation, and a strong team culture.

About Grammarly

Grammarly is the world’s leading AI writing assistance company trusted by over 30 million people and 70,000 teams. From instantly creating a first draft to perfecting every message, Grammarly helps people at 96% of the Fortune 500 and teams at companies like Atlassian, Databricks, and Zoom get their point across—and get results—with best-in-class security practices that keep data private and protected. Founded in 2009, Grammarly is No. 14 on the Forbes Cloud 100, one of TIME’s 100 Most Influential Companies, one of Fast Company’s Most Innovative Companies in AI, and one of Inc.’s Best Workplaces.

The Opportunity

To achieve our ambitious goals, we’re looking for a Software Engineer to join our Data Platform team and help us build a world-class data platform. Grammarly’s success depends on its ability to efficiently ingest over 60 billion daily events while using our systems to improve our product. This role is a unique opportunity to experience all aspects of building complex software systems: contributing to the strategy, defining the architecture, and building and shipping to production.

Grammarly’s engineers and researchers have the freedom to innovate and uncover breakthroughs—and, in turn, influence our product roadmap. The complexity of our technical challenges is growing rapidly as we scale our interfaces, algorithms, and infrastructure. You can hear more from our team on our technical blog.

We are seeking a highly skilled and experienced Manager for our Data Platform team to achieve our ambitious objectives. This role is crucial in managing and evolving our data infrastructure, engineering, and governance processes to support modern machine learning (ML) use cases, self-serve analytics, and data policy management across the organization. The ideal candidate will possess strong technical expertise, exceptional leadership abilities, and the capability to mentor and develop a high-performing team that operates across data infrastructure, engineering, and governance.

This person will be integral to the larger data organization, reporting directly to the Director of Data Platform. They will have the opportunity to influence decisions and the direction of our overall data platform, including data processing, infrastructure, data governance, and analytics engineering.

As the Data Platform team manager, you will lead and mentor a team of data engineers, infrastructure engineers, and data governance specialists, fostering a collaborative and innovative environment focused on professional growth. You will oversee the design, implementation, and maintenance of secure, scalable, and optimized data platforms, ensuring high performance and reliability. Your role includes developing and executing strategic roadmaps aligned with business objectives and collaborating closely with cross-functional teams and the larger data organization to ensure seamless data integration, governance, and access. Additionally, you will provide technical leadership and play a pivotal role in resource management and recruiting efforts, driving the team’s success and aligning with the organization’s long-term data strategy.

In this role, you will:

  • Build a highly specialized data platform team to support the growing needs and complexity of our product, business, and ML organizations.
  • Oversee the design, implementation, and maintenance of a robust data infrastructure, ensuring high availability and reliability across ingestion, processing, and storage layers.
  • Lead the development of frameworks and tooling that enable self-serve analytics, policy management, and seamless data governance across the organization.
  • Ensure data is collected, transformed, and stored efficiently to support real-time, batch processing, and machine learning needs.
  • Act as a liaison between the Data Platform team and the broader organization, ensuring seamless communication, collaboration, and alignment with global data strategies.
  • Drive cross-functional meetings and initiatives to represent the Data Platform team’s interests and contribute to the organization’s overall data strategy, ensuring ML and analytics use cases are adequately supported.
  • Drive the evaluation, selection, and implementation of new technologies and tools that enhance the team’s capabilities and improve the organization’s overall data infrastructure and governance processes.
  • Implement and enforce data governance policies and practices to ensure data quality, privacy, security, and compliance with organizational standards.
  • Collaborate with stakeholders to define and refine data governance policies that align with business objectives and facilitate discoverability and accessibility of high-quality data.
  • Monitor and assess the data platform's performance to identify areas for optimization, cost management, and continuous improvement.
  • Foster a collaborative and high-performance culture within the team, emphasizing ownership and innovation.
  • Cultivate an ownership mindset and culture across product and platform teams by providing necessary metrics to drive informed decisions and continuous improvement.
  • Set high performance and quality standards, coaching team members to meet them, and mentoring and growing junior and senior IC talent.

Qualifications

  • 7+ years of experience in data engineering, infrastructure & governance, with at least 2-3 years in a leadership or management role.
  • Proven experience in building and managing large-scale data platforms, including data ingestion pipelines and infrastructure.
  • Experience with cloud platforms and data ecosystems such as AWS, GCP, Azure, and Databricks.
  • Familiarity with modern data engineering and orchestration tools and frameworks (e.g., Apache Kafka, Airflow, DBT, Spark).
  • Strong understanding of data governance frameworks, policy management, and self-serve analytics platforms.
  • Excellent leadership and people management skills, with a track record of mentoring and developing high-performing teams.
  • Experience working with geographically distributed teams and aligning with global data and governance strategies.
  • Strong problem-solving skills, with the ability to navigate and resolve complex technical challenges.
  • Excellent communication and collaboration skills, with the ability to work effectively with stakeholders across different locations and time zones.
  • Proven ability to operate in a fast-paced, dynamic environment where things change quickly.
  • Leads by setting well-understood goals and sharing the appropriate level of context for maximum autonomy, but is also profoundly technical and can dive in to help when necessary.
  • Embodies our EAGER values—ethical, adaptable, gritty, empathetic, and remarkable.
  • Is inspired by our MOVE principles: move fast and learn faster; obsess about creating customer value; value impact over activity; and embrace healthy disagreement rooted in trust.

Compensation and Benefits

Grammarly offers all team members competitive pay along with a benefits package encompassing the following and more: 

  • Excellent health care (including a wide range of medical, dental, vision, mental health, and fertility benefits)
  • Disability and life insurance options
  • 401(k) and RRSP matching 
  • Paid parental leave
  • 20 days of paid time off per year, 12 days of paid holidays per year, two floating holidays per year, and flexible sick time
  • Generous stipends (including those for caregiving, pet care, wellness, your home office, and more)
  • Annual professional development budget and opportunities

Grammarly takes a market-based approach to compensation, which means base pay may vary depending on your location. Our US locations are categorized into two compensation zones based on proximity to our hub locations.

Base pay may vary considerably depending on job-related knowledge, skills, and experience. The expected salary ranges for this position are outlined below by compensation zone and may be modified in the future.  

Zone 1: $285,000 – $325,000/year (USD)

We encourage you to apply

At Grammarly, we value our differences, and we encourage all to apply—especially those whose identities are traditionally underrepresented in tech organizations. We do not discriminate on the basis of race, religion, color, gender expression or identity, sexual orientation, ancestry, national origin, citizenship, age, marital status, veteran status, disability status, political belief, or any other characteristic protected by law. Grammarly is an equal opportunity employer and a participant in the US federal E-Verify program (US). We also abide by the Employment Equity Act (Canada).

#LI-Hybrid

 

Apply for this job

+30d

Senior Principal Architect - Cloud Engineering

IFSBengaluru, India, Remote
gRPCgolangagileairfloworacleDesignmobileazuregraphqljavac++.netdockerpostgresqlkubernetesangularjenkinspythonjavascript

IFS is hiring a Remote Senior Principal Architect - Cloud Engineering

Job Description

The Senior Principal Architect (“SPA”) will own overall architecture accountability for one or more portfolios within IFS Technology. The SPA builds and develops the technology strategy while growing, leading, and energising multi-faceted technical teams to design and deliver solutions that meet IFS's technology needs, supported by excellent data, methodology, systems and processes. The role works with a broad set of stakeholders, including product managers, engineers, and various R&D and business leaders. The occupant of this role diagnoses and solves significant, complex and non-routine problems; translates practices from other markets, countries and industries; provides authoritative technical recommendations with significant short- and medium-term impact on business performance; and contributes to company standards and procedures, including the IFS Technical Reference Architecture. The role actively identifies new approaches that enhance, and where possible simplify, complexities in the IFS suite. The SPA represents IFS as the authority in one or more technology areas or portfolios and acts as a role model for developing experts in those areas.

What is the role?

  • Build, nurture and grow high performance engineering teams using Agile Engineering principles.
  • Provide technical leadership for design and development of software meeting functional & nonfunctional requirements.
  • Provide multi-horizon technology thinking to broad portfolios and platforms in line with desired business needs.
  • Adopt a hands-on approach to develop the architecture runway for teams.
  • Set the technical agenda in close collaboration with Product and Program Managers
  • Ensure maintainability, security and performance in software components developed using well-established engineering/architectural principles.
  • Ensure software quality complying with shift left quality principles.  
  • Conduct peer reviews & provide feedback ensuring quality standards.
  • Engage with requirement owners and liaise with other stakeholders.
  • Contribute to improvements in IFS products & services.

Qualifications

What do we need from you? 

It’s your excellent influencing and communication skills that will really make the difference. Entrepreneurship and resilience will be required, to help drive and shape the technology strategy. You will need technical, operational, and commercial breadth to deliver a strategic technical vision alongside a robust, secure and cost-effective delivery platform and operational model.

  • Seasoned Leader with 15+ years of hands-on experience in Design, Development and Implementation of scalable cloud-based web and mobile applications.
  • Have strong software architectural, technical design and programming skills.
  • Experience in Application Security, Scalability and Performance.
  • Ability to envision the big picture and work on details. 
  • Can articulate technology vision and delivery strategy in a way that is understandable to technical and non-technical audiences.
  • Willingness to learn and adapt different technologies/work environments.
  • Knowledge of and skilled in various tools, languages, frameworks and cloud technologies with the ability to be hands-on where needed:
    • Programming languages - C++, C#, GoLang, Python, JavaScript and Java
    • JavaScript frameworks - Angular, Node.js, React, etc.
    • Back-end frameworks - .NET, GoLang, etc.
    • Middleware - REST, GraphQL, gRPC
    • Databases - Oracle, MongoDB, Cassandra, PostgreSQL, etc.
    • Azure and Amazon cloud services, with proven experience building cloud-native apps on either or both platforms
    • Kubernetes and Docker containerization
    • CI/CD tools - CircleCI, GitHub, GitLab, Jenkins, Tekton
  • Hands on experience in OOP concepts and design principles.
  • Good to have:
    • Knowledge of cloud-native big data tools (Hadoop, Spark, Argo, Airflow) and data science frameworks (PyTorch, Scikit-learn, Keras, TensorFlow, NumPy)
    • Exposure to ERP application development is advantageous.
  • Excellent communication and multi-tasking skills along with an innovative mindset.

See more jobs at IFS

Apply for this job

+30d

Lead Data Engineer (F/H)

ASINantes, France, Remote
S3agilenosqlairflowsqlazureapijava

ASI is hiring a Remote Lead Data Engineer (F/H)

Job Description

With Simon GRIFFON, head of the Nantes Data team, we are looking for a Lead Data Engineer to set up, integrate, develop and optimise data pipeline solutions in Cloud and on-premise environments for our client projects.

Within a dedicated team, mostly in an agile context, your assignments may include:

  • Helping understand business needs and running scoping workshops with the client

  • Helping write the functional and technical specifications of data flows

  • Mastering structured and unstructured data formats and knowing how to manipulate them

  • Modelling and implementing decision-support systems

  • Installing and connecting an ETL/ELT solution to a data source, taking the client's constraints and environment into account

  • Designing and building a data transformation and enrichment pipeline and scheduling its execution

  • Ensuring data pipelines are secured

  • Designing and building APIs that expose the enriched data

  • Defining test and integration plans

  • Handling corrective and evolutionary maintenance

  • Supporting junior colleagues as they build their skills

 

Depending on your skills and interests, you will work with one or more of the following technologies:

  • The data ecosystem, notably Microsoft Azure

  • Languages: SQL, Java

  • SQL and NoSQL databases

  • Cloud storage: S3, Azure Blob Storage…

  • ETL/ESB and other tools: Talend, Spark, Kafka, NiFi, Matillion, Airflow, Data Factory, Glue…
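As a rough illustration of the pipeline work described above (a minimal sketch with invented field names, not a real ASI project), a transformation step that normalises heterogeneous structured and semi-structured records onto a common schema might look like:

```python
import csv
import io
import json

def normalise(record: dict) -> dict:
    """Map a raw record onto a common downstream schema.
    Field names here are illustrative only."""
    return {
        "id": str(record.get("id", "")).strip(),
        "amount_eur": float(record.get("amount", 0) or 0),
        "country": str(record.get("country", "")).upper() or "UNKNOWN",
    }

def extract(raw: str, fmt: str) -> list[dict]:
    """Read structured (CSV) or semi-structured (JSON Lines) input."""
    if fmt == "csv":
        return [normalise(r) for r in csv.DictReader(io.StringIO(raw))]
    if fmt == "jsonl":
        return [normalise(json.loads(line)) for line in raw.splitlines() if line.strip()]
    raise ValueError(f"unsupported format: {fmt}")

rows = extract("id,amount,country\n1,12.5,fr\n2,3,de\n", "csv")
```

In a real pipeline this step would typically be wrapped in a scheduled task (for example an Airflow operator) between the extraction and load stages.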

 

By joining ASI:

  • You will work in a company with flexible internal ways of working, guaranteed by an attentive HR policy (3-days/week remote-work agreement, “congé parenthèse” career-break agreement, etc.)

  • You will join ASI's various expert communities to share best practices and take part in continuous-improvement initiatives.

  • You will work in a company soon to be recognised as a “Société à mission”, Team GreenCaring and not GreenWashing, with a CSR approach that has been embodied and actively driven for more than 10 years (dedicated CSR team, sustainable-mobility allowance agreement, etc.)

Qualifications

With a higher-education background in computer science or mathematics, or a Big Data specialisation, you have at least 10 years of experience in data engineering and a successful operational track record of building pipelines for structured and unstructured data.

The salary offered for this position is between €40,000 and €45,000, depending on experience and skills, while respecting pay equity within the team.

Committed to the quality of what you deliver, you carry out your work with rigour and organisation.

With a solid technology culture, you regularly keep up with industry developments to refresh your knowledge.

A good level of English, both written and spoken, is recommended.

You are a genuine team player, and your leadership lets you guide the team with kindness and pedagogy to help it grow.

Keen to join a company that reflects who you are, you recognise yourself in our values of trust, listening, enjoyment and commitment.

 

With equal qualifications, this position is open to people with disabilities.

See more jobs at ASI

Apply for this job

+30d

Senior Data Scientist

redisBachelor's degreeterraformairflowsqlansibledockerkubernetespython

ReCharge Payments is hiring a Remote Senior Data Scientist

Who we are

In a world where acquisition costs are skyrocketing, funding is scarce, and ecommerce merchants are forced to do more with less, the most innovative DTC brands understand that subscription strategy is business strategy.

Recharge is simplifying retention and growth for innovative ecommerce brands. As the #1 subscription platform, Recharge is dedicated to empowering brands to easily set up and manage subscriptions, create dynamic experiences at every customer touchpoint, and continuously evaluate business performance. Powering everything from no-code customer portals and personalized offers to dynamic bundles, Recharge helps merchants seamlessly manage, grow, and delight their subscribers while reducing operating costs and churn. Today, Recharge powers more than 20,000 merchants serving 100 million subscribers, including brands such as Blueland, Hello Bello, LOLA, Chamberlain Coffee, and Bobbie. Recharge doesn't just help you sell products; we help build buyer routines that last.

Recharge is recognized on Deloitte's Technology Fast 500 (3rd consecutive year) and is Great Place to Work Certified.

Senior Data Analyst, Product Analytics

Recharge is positioned to support the best Direct-To-Consumer ecommerce brands in the world. We are building multiple AI-based analytic products that revolutionize how our merchants leverage insight to retain and grow their business. 


We are looking for a data scientist who is value-driven and passionate about providing actionable insights and helping create data products that our product and growth teams can leverage. As a data scientist you will work on both product analytics and advanced analytics projects, collaborating closely with data engineering and product to deliver value to our merchants through analytics and insights.


You will be responsible for preparing data for prescriptive and predictive modeling, driving hypotheses, applying stats, and developing architecture for algorithms. 


What you’ll do

  • Live by and champion all of our core values (#ownership, #empathy, #day-one, and #humility).

  • Collaborate with stakeholders in cross-projects and team settings to identify and clarify business or product questions to answer. Provide feedback to translate and refine business questions into tractable analysis, evaluation metrics, or mathematical models.

  • Perform analysis utilizing relevant tools (e.g., SQL, Python). Provide analytical thought leadership through proactive and strategic contributions (e.g., suggests new analyses, infrastructure or experiments to drive improvements in the business).

  • Own outcomes for projects, covering problem definition, metrics development, data extraction and manipulation, visualization, creation and implementation of analytical/statistical models, and presentation to stakeholders.

  • Develop solutions for, lead, and manage problems that may be ambiguous and lack clear precedent by framing problems, generating hypotheses, and making recommendations from a perspective that combines both analytical and product-specific expertise.

  • Work independently to find creative solutions to difficult problems.

  • Effectively communicate analyses and experimental outcomes to business stakeholders, ensuring insights are presented with clear business context and relevance.

  • Write and maintain technical documentation for the data models and analytics solutions.
     

What you'll bring

  • Bachelor's degree, or equivalent work experience, in Statistics, Mathematics, Data Science, Engineering, Physics, Economics, or a related quantitative field.

  • 5+ years of work experience using analytics to solve product or business problems, performing statistical analysis, and coding (e.g., Python, R, SQL) 

  • Preferred experience in leveraging LLMs to address business challenges, and familiarity with frameworks such as Langchain.

  • Experience developing and operating within Snowflake

  • Expert in translating data findings to broader audiences including non-data stakeholders, engineering, and executive leadership to maximize impact

  • Preferred experience in dimensional modeling in dbt 

  • Experience working on advanced analytics models (machine learning or learning based models) that accomplish tasks such as making recommendations or scoring users.

  • Ability to demonstrate high self-sufficiency to take on complex problems in a timely manner

  • Consistently navigates ambiguous technical and business requirements while making flexible technical decisions

  • Consistently delivers technically challenging tasks efficiently with quality, speed, and simplicity

  • Payments and/or Ecommerce experience preferred


Our Stack

Vertex AI, Google Colab, Looker, dbt, Snowflake, Airflow, Fivetran, CloudSQL/MySQL, Python (Pandas, NumPy, Scikit-learn), GitLab, Flask, Jinja, ES6, Vue.js, Sass, Webpack, Redis, Docker, GCP, Kubernetes, Helmfile, Terraform, Ansible, Nginx
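As a flavour of the "scoring users" and retention analytics mentioned above (a toy sketch in plain Python, not Recharge's actual models or data), a cohort retention computation could look like:

```python
from collections import defaultdict

def cohort_retention(events):
    """events: iterable of (user_id, signup_month, active_month) tuples.
    Returns {signup_month: {months_since_signup: retained_fraction}}.
    A toy metric for illustration only."""
    cohorts = defaultdict(set)   # signup_month -> users in that cohort
    active = defaultdict(set)    # (signup_month, offset) -> users active then
    for user, signup, month in events:
        cohorts[signup].add(user)
        active[(signup, month - signup)].add(user)
    return {
        signup: {
            offset: len(active[(signup, offset)]) / len(users)
            for (s, offset) in active if s == signup
        }
        for signup, users in cohorts.items()
    }

r = cohort_retention([("a", 0, 0), ("a", 0, 1), ("b", 0, 0), ("c", 1, 1)])
```

In practice a model like this would read from the warehouse (Snowflake via dbt in the stack above) rather than from in-memory tuples.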

Recharge | Instagram | Twitter | Facebook

Recharge Payments is an equal opportunity employer. In addition to EEO being the law, it is a policy that is fully consistent with our principles. All qualified applicants will receive consideration for employment without regard to status as a protected veteran or a qualified individual with a disability, or other protected status such as race, religion, color, national origin, sex, sexual orientation, gender identity, genetic information, pregnancy or age. Recharge Payments prohibits any form of workplace harassment. 

Transparency in Coverage

This link leads to the Anthem Blue Cross machine-readable files that are made available in response to the federal Transparency in Coverage Rule and includes network negotiated rates for all items and services; allowed amounts for OON items, services and prescription drugs; and negotiated rates and historical prices for network prescription drugs (delayed). EIN 80-6245138. This link leads to the Kaiser machine-readable files.

#LI-Remote

See more jobs at ReCharge Payments

Apply for this job

+30d

(Senior) Data Engineer - France (F/M/D)

ShippeoParis, France, Remote
MLairflowsqlRabbitMQdockerkubernetespython

Shippeo is hiring a Remote (Senior) Data Engineer - France (F/M/D)

Job Description

The Data Intelligence Tribe is responsible for leveraging Shippeo’s data from our large shipper and carrier base to build data products that help our users (shippers and carriers alike), and ML models that provide predictive insights. The tribe typically ensures that users:

  • are accurately alerted, in advance, of any potential delays or anomalies on their multimodal flows, so they can proactively anticipate any resulting disruptions

  • can extract the data they need, access it directly, or analyze it on the platform to gain actionable insights that help them improve their operational performance and the quality and compliance of their tracking

  • benefit from best-in-class data quality, achieved by implementing advanced cleansing & enhancement rules

As a Data Engineer at Shippeo, your objective is to ensure that data is available and exploitable by our Data Scientists and Analysts on our different data platforms. You will contribute to the construction and maintenance of Shippeo’s modern data stack that’s composed of different technology blocks:

  • Data Acquisition (Kafka, KafkaConnect, RabbitMQ),

  • Batch data transformation (Airflow, DBT),

  • Cloud Data Warehousing (Snowflake, BigQuery),

  • Stream/event data processing (Python, Docker, Kubernetes), plus all the underlying infrastructure that supports these use cases.

 

Qualifications

Required:

  • You have a degree (MSc or equivalent) in Computer Science.

  • 3+ years of experience as a Data Engineer.

  • Experience building, maintaining, testing and optimizing data pipelines and architectures

  • Programming skills in Python 

  • Advanced working knowledge of SQL, experience working with relational databases and familiarity with a variety of databases.

  • Working knowledge of message queuing and stream processing.

  • Advanced knowledge of Docker and Kubernetes.

  • Advanced knowledge of a cloud platform (preferably GCP).

  • Advanced knowledge of a cloud based data warehouse solution (preferably Snowflake).

  • Experience with Infrastructure as code (Terraform/Terragrunt)

  • Experience building and evolving CI/CD pipelines (Github Actions).

Desired: 

  • Experience with Kafka and KafkaConnect (Debezium).

  • Monitoring and alerting on Grafana / Prometheus.

  • Experience working on Apache Nifi.

  • Experience working with workflow management systems such as Airflow.

See more jobs at Shippeo

Apply for this job

+30d

Senior Data Engineer (Taiwan)

GOGOXRemote
Full TimeairflowsqlazureapijavapythonAWS

GOGOX is hiring a Remote Senior Data Engineer (Taiwan)


See more jobs at GOGOX

Apply for this job

+30d

Data and Analytics Engineer

airflowsqlDesignpython

Cloudflare is hiring a Remote Data and Analytics Engineer

About Us

At Cloudflare, we are on a mission to help build a better Internet. Today the company runs one of the world’s largest networks that powers millions of websites and other Internet properties for customers ranging from individual bloggers to SMBs to Fortune 500 companies. Cloudflare protects and accelerates any Internet application online without adding hardware, installing software, or changing a line of code. Internet properties powered by Cloudflare all have web traffic routed through its intelligent global network, which gets smarter with every request. As a result, they see significant improvement in performance and a decrease in spam and other attacks. Cloudflare was named to Entrepreneur Magazine’s Top Company Cultures list and ranked among the World’s Most Innovative Companies by Fast Company. 

We realize people do not fit into neat boxes. We are looking for curious and empathetic individuals who are committed to developing themselves and learning new skills, and we are ready to help you do that. We cannot complete our mission without building a diverse and inclusive team. We hire the best people based on an evaluation of their potential and support them throughout their time at Cloudflare. Come join us! 

Available Locations: Lisbon, Portugal

About the team

You will be part of the Network Strategy team within Cloudflare’s Infrastructure Engineering department. The Network Strategy team focuses on building both external and internal relationships that allow for Cloudflare to scale and reach user populations around the world. Our group takes a long term and technical approach to forging mutually beneficial and sustainable relationships with all of our network partners. 

About the role

We are looking for an experienced Data and Analytics Engineer to join our team to scale our data insights initiatives. You will work with a wide array of data sources about network traffic, performance, and cost. You’ll be responsible for building data pipelines, doing ad-hoc analytics based on the data, and automating our analysis. Important projects include understanding the resource consumption and cost of Cloudflare’s broad product portfolio.

A candidate will be successful in this role if they’re flexible and able to match the right solution to the right problem. Flexibility is key. Cloudflare is a fast-paced environment and requirements change frequently.

What you'll do

  • Design and implement data pipelines that take unprocessed data and make it usable for advanced analytics
  • Work closely with other product and engineering teams to ensure our products and services collect the right data for our analytics
  • Work closely with a cross functional team of data scientists and analysts and internal stakeholders on strategic initiatives 
  • Build tooling, automation, and visualizations around our analytics for consumption by other Cloudflare teams

Examples of desirable skills, knowledge and experience

  • Excellent Python and SQL (one of the interviews will be a code review)
  • B.S. or M.S in Computer Science, Statistics, Mathematics, or other quantitative fields, or equivalent experience
  • Minimum 3 years of industry experience in software engineering, data engineering, data science or related field with a track record of extracting, transforming and loading large datasets 
  • Knowledge of data management fundamentals and data storage/computing principles
  • Excellent communication & problem solving skills 
  • Ability to collaborate with cross functional teams and work through ambiguous business requirements

Bonus Points

  • Familiarity with Airflow 
  • Familiarity with Google Cloud Platform or other analytics databases

What Makes Cloudflare Special?

We’re not just a highly ambitious, large-scale technology company. We’re a highly ambitious, large-scale technology company with a soul. Fundamental to our mission to help build a better Internet is protecting the free and open Internet.

Project Galileo: We equip politically and artistically important organizations and journalists with powerful tools to defend themselves against attacks that would otherwise censor their work, technology already used by Cloudflare’s enterprise customers--at no cost.

Athenian Project: We created Athenian Project to ensure that state and local governments have the highest level of protection and reliability for free, so that their constituents have access to election information and voter registration.

1.1.1.1: We released 1.1.1.1 to help fix the foundation of the Internet by building a faster, more secure and privacy-centric public DNS resolver. It is available publicly for everyone to use; it is the first consumer-focused service Cloudflare has ever released. Here's the deal: we never, ever store client IP addresses. We will continue to abide by our privacy commitment and ensure that no user data is sold to advertisers or used to target consumers.

Sound like something you’d like to be a part of? We’d love to hear from you!

This position may require access to information protected under U.S. export control laws, including the U.S. Export Administration Regulations. Please note that any offer of employment may be conditioned on your authorization to receive software or technology controlled under these U.S. export laws without sponsorship for an export license.

Cloudflare is proud to be an equal opportunity employer.  We are committed to providing equal employment opportunity for all people and place great value in both diversity and inclusiveness.  All qualified applicants will be considered for employment without regard to their, or any other person's, perceived or actual race, color, religion, sex, gender, gender identity, gender expression, sexual orientation, national origin, ancestry, citizenship, age, physical or mental disability, medical condition, family care status, or any other basis protected by law.We are an AA/Veterans/Disabled Employer.

Cloudflare provides reasonable accommodations to qualified individuals with disabilities. Please tell us if you require a reasonable accommodation to apply for a job. Examples of reasonable accommodations include, but are not limited to, changing the application process, providing documents in an alternate format, using a sign language interpreter, or using specialized equipment. If you require a reasonable accommodation to apply for a job, please contact us via e-mail at hr@cloudflare.com or via mail at 101 Townsend St., San Francisco, CA 94107.

See more jobs at Cloudflare

Apply for this job