airflow Remote Jobs

139 Results

+30d

Machine Learning Engineer (M/F)

Devoteam | Nantes, France, Remote
ML, DevOps, OpenAI, Lambda, agile, terraform, scala, airflow, ansible, scrum, git, c++, docker, kubernetes, jenkins, python, AWS

Devoteam is hiring a Remote Machine Learning Engineer (M/F)

Job Description

Responsibilities

  • Support the end-to-end machine learning development process to design, build, and manage reproducible, testable, and scalable software.

  • Work on setting up and using ML/AI/MLOps platforms (such as AWS SageMaker, Kubeflow, AWS Bedrock, AWS Titan).

  • Bring our clients best practices in organization, development, automation, monitoring, and security.

  • Explain and apply best practices for automation, testing, versioning, reproducibility, and monitoring of the deployed AI solution.

  • Mentor and supervise junior consultants, e.g. through peer code reviews and application of best practices.

  • Support our sales team in writing proposals and in pre-sales meetings.

  • Contribute to the development of our internal community (REX sessions, workshops, articles, hackerspaces).

  • Take part in recruiting our future talent.

Qualifications

Technical skills

REQUIRED:

  • Fluency in Python, PySpark, or Scala Spark, and in libraries such as Scikit-learn, MLlib, TensorFlow, Keras, PyTorch, LightGBM, XGBoost, and Spark (to name a few).

  • Ability to implement containerization architectures (Docker / containerd) and serverless and microservice environments using Lambda, ECS, and Kubernetes.

  • Fully operational in setting up DevOps and Infrastructure-as-Code environments and in using MLOps tools.

  • A good share of the following tools should be part of your daily work: Git, GitLab CI, Jenkins, Ansible, Terraform, Kubernetes, MLflow, Airflow, or their cloud-environment equivalents.

  • AWS Cloud (AWS Bedrock, AWS Titan, OpenAI, AWS SageMaker / Kubeflow)

  • Agile / Scrum methodology

  • Feature Store (any provider)

 

NICE TO HAVE:

  • Apache Airflow

  • AWS SageMaker / Kubeflow

  • Apache Spark

  • Agile / Scrum methodology

Growing within the community

Growing within the Data Tribe means playing an active part in building a stimulating environment where consultants keep pushing one another upward, in technical skills as well as soft skills. But that's not all: it also means regular events and dedicated Slack channels that let you call on the communities (data, AI/ML, DevOps, security, ...) as a whole!

Alongside this, you will have the opportunity to be a driving force in the development of our various internal communities (REX sessions, workshops, articles, podcasts, ...).

Compensation

The fixed salary offered for this position depends on your experience, within a range of €46.5k to €52.5k.

See more jobs at Devoteam

Apply for this job

+30d

Software Engineer (with overlap into ML Engineer) for Artificial Intelligence team (Engagement)

Bloomreach | Slovakia, Czechia, Remote
ML, DevOps, redis, agile, remote-first, jira, airflow, Design, mongodb, api, git, python, AWS

Bloomreach is hiring a Remote Software Engineer (with overlap into ML Engineer) for Artificial Intelligence team (Engagement)

Bloomreach is the world’s #1 Commerce Experience Cloud, empowering brands to deliver customer journeys so personalized, they feel like magic. It offers a suite of products that drive true personalization and digital commerce growth, including:

  • Discovery, offering AI-driven search and merchandising
  • Content, offering a headless CMS
  • Engagement, offering a leading CDP and marketing automation solutions

Together, these solutions combine the power of unified customer and product data with the speed and scale of AI optimization, enabling revenue-driving digital commerce experiences that convert on any channel and every journey. Bloomreach serves over 850 global brands including Albertsons, Bosch, Puma, FC Bayern München, and Marks & Spencer. Bloomreach recently raised $175 million in a Series F funding round, bringing its total valuation to $2.2 billion. The investment was led by Goldman Sachs Asset Management with participation from Bain Capital Ventures and Sixth Street Growth. For more information, visit Bloomreach.com.

 

Join our Artificial Intelligence team as a Software Engineer and help us revolutionize marketing with ML-powered solutions! You'll work on cutting-edge technologies, impacting millions of users, and contributing to a product that truly makes a difference. The salary range starts at €3,500 per month, along with restricted stock units and other benefits. Working in one of our Central European offices or from home on a full-time basis, you'll become a core part of the Engineering Team.

What challenge awaits you?

You'll face the exciting challenge of building and maintaining ML-powered features in a production environment, ensuring they are reliable, scalable, and deliver real value to our users. You'll work alongside a team to overcome the unique challenges of building and running ML models in a SaaS environment, including managing data complexity, optimizing for performance, and ensuring model robustness.

You will cooperate with your teammates, Data Science engineers, and Engineering and Product leaders to speed up ML-powered features' delivery (from ideation to production) by applying principles of continuous discovery, integration, testing, and other techniques from Agile, DevOps, and MLOps mindsets. This will involve building efficient workflows, automating processes, and fostering a culture of collaboration and innovation.

Your job will be to:

  1. Design & Deliver new features
  2. Ensure quality and performance of developed solution
  3. Support and Maintain owned components

a. Design & Deliver new features

  • Translate business requirements for ML-powered features into technical specifications and design documents.
  • Collaborate with data scientists to ensure new ML features' technical feasibility and scalability.
  • Define and develop back-office API endpoints (to configure the features) as well as the high-performance serving endpoints.
  • Develop and implement ML models, algorithms, and data pipelines to support new features.
  • Deploy and monitor new features in production, ensuring seamless integration with existing systems.

b. Ensure quality and performance of developed solution

  • Perform rigorous testing and quality assurance of ML models and code, including unit tests, integration tests, and A/B testing.
  • Implement monitoring systems and dashboards to track the performance of ML models in production, identify potential issues, and optimize for accuracy and efficiency.
  • Contribute to developing and implementing DevOps and MLOps best practices within the team.

c. Support and Maintain owned components

  • Maintain end-to-end features, encompassing back-office APIs, models, definitions, and high-performance serving APIs.
  • Provide ongoing support and maintenance for existing ML-powered features, including troubleshooting issues, fixing bugs, and implementing enhancements.
  • Support our client-facing colleagues in the investigation of possible issues (L3 support).
  • Document code, design decisions, and operational procedures to facilitate ongoing maintenance and knowledge sharing.

What technologies and tools does the AI team work with?

  • Programming languages - Python 
  • Google Cloud Platform services - GKE, BigQuery, BigTable, GCS, Dataproc, VertexAI 
  • Data Storage and Processing - MongoDB, Redis, Spark, TensorFlow 
  • Software and Tools - Grafana, Sentry, Gitlab, Jira, Productboard, PagerDuty 

The owned area encompasses various domains such as Recommendations, Predictions, Contextual bandits, MLOps. Therefore, having experience in these areas would be beneficial. The team also works with large amounts of data and utilizes platforms and algorithms for model training and data processing & ML pipelines. Experience in these areas is highly valued.

Your success story will be:

  • In 30 Days: Successfully onboard and contribute to ongoing tasks, demonstrating understanding of the codebase and team processes.
  • In 90 Days: Contribute to design discussions and independently deliver high-quality code for assigned features. Participate in investigating and resolving production issues.
  • In 180 Days: Independently manage larger tasks, contribute to team improvements, and confidently handle L3 support, investigating and resolving production issues.

You have the following experience and qualities:

  1. Professional: Proven experience in Python engineering, system design, and maintenance in the area of AI/ML-powered features.
  2. Personal: Demonstrates strong initiative, ability to work within a team, communication skills, and a commitment to continuous learning and improvement.

Professional experience

  • Proven experience in Python engineering, with a strong focus on designing and maintaining AI/ML-powered features in production environments.
  • Experience with cloud platforms (e.g., GCP, AWS) and relevant services for ML development and deployment.
  • Solid understanding of software architecture principles, particularly in the context of building and maintaining scalable and reliable APIs and microservices.
  • Experience with version control systems (e.g., Git) and CI/CD pipelines for efficient development and deployment.
  • Familiarity with common ML frameworks, libraries, and tools (e.g., TensorFlow, PyTorch, Scikit-learn, etc.) and with ML pipelines/orchestration frameworks (Kubeflow, Airflow, Prefect,... )

Personal qualities

  • Demonstrates strong initiative and a proactive approach to problem-solving.
  • Excellent communication and collaboration skills, with the ability to work effectively within a team.
  • A genuine passion for learning new technologies and keeping up-to-date with the latest advancements in AI/ML.
  • A commitment to delivering high-quality work and a dedication to continuous improvement.

Excited? Join us and transform the future of commerce experiences.

More things you'll like about Bloomreach:

Culture:

  • A great deal of freedom and trust. At Bloomreach we don’t clock in and out, and we have neither corporate rules nor long approval processes. This freedom goes hand in hand with responsibility. We are interested in results from day one. 

  • We have defined our 5 values and the 10 underlying key behaviors that we strongly believe in. We can only succeed if everyone lives these behaviors day to day. We've embedded them in our processes like recruitment, onboarding, feedback, personal development, performance review and internal communication. 

  • We believe in flexible working hours to accommodate your working style.

  • We work remote-first with several Bloomreach Hubs available across three continents.

  • We organize company events to experience the global spirit of the company and get excited about what's ahead.

  • We encourage and support our employees to engage in volunteering activities - every Bloomreacher can take 5 paid days off to volunteer*.
  • The Bloomreach Glassdoor page elaborates on our stellar 4.6/5 rating. The Bloomreach Comparably page culture score is even higher, at 4.9/5.

Personal Development:

  • We have a People Development Program -- participating in personal development workshops on various topics run by experts from inside the company. We are continuously developing & updating competency maps for select functions.

  • Our resident communication coach Ivo Večeřa is available to help navigate work-related communications & decision-making challenges.*
  • Our managers are strongly encouraged to participate in the Leader Development Program to develop in the areas we consider essential for any leader. The program includes regular comprehensive feedback, consultations with a coach and follow-up check-ins.

  • Bloomreachers utilize the $1,500 professional education budget on an annual basis to purchase education products (books, courses, certifications, etc.)*

Well-being:

  • The Employee Assistance Program -- with counselors -- is available for non-work-related challenges.*

  • Subscription to Calm - sleep and meditation app.*

  • We organize ‘DisConnect’ days where Bloomreachers globally enjoy one additional day off each quarter, allowing us to unwind together and focus on activities away from the screen with our loved ones.

  • We facilitate sports, yoga, and meditation opportunities for each other.

  • Extended parental leave up to 26 calendar weeks for Primary Caregivers.*

Compensation:

  • Restricted Stock Units or Stock Options are granted depending on a team member’s role, seniority, and location.*

  • Everyone gets to participate in the company's success through the company performance bonus.*

  • We offer an employee referral bonus of up to $3,000 paid out immediately after the new hire starts.

  • We reward & celebrate work anniversaries -- Bloomversaries!*

(*Subject to employment type. Interns are exempt from marked benefits, usually for the first 6 months.)

Excited? Join us and transform the future of commerce experiences!

If this position doesn't suit you, but you know someone who might be a great fit, share it - we will be very grateful!


Any unsolicited resumes/candidate profiles submitted through our website or to personal email accounts of employees of Bloomreach are considered property of Bloomreach and are not subject to payment of agency fees.

 #LI-Remote

See more jobs at Bloomreach

Apply for this job

+30d

Data Engineer - AWS

Tiger Analytics | Jersey City, New Jersey, United States, Remote
S3, Lambda, airflow, sql, Design, AWS

Tiger Analytics is hiring a Remote Data Engineer - AWS

Tiger Analytics is a fast-growing advanced analytics consulting firm. Our consultants bring deep expertise in Data Science, Machine Learning and AI. We are the trusted analytics partner for multiple Fortune 500 companies, enabling them to generate business value from data. Our business value and leadership has been recognized by various market research firms, including Forrester and Gartner. We are looking for top-notch talent as we continue to build the best global analytics consulting team in the world.

As an AWS Data Engineer, you will be responsible for designing, building, and maintaining scalable data pipelines on AWS cloud infrastructure. You will work closely with cross-functional teams to support data analytics, machine learning, and business intelligence initiatives. The ideal candidate will have strong experience with AWS services, Databricks, and Apache Airflow.

Key Responsibilities:

  • Design, develop, and deploy end-to-end data pipelines on AWS cloud infrastructure using services such as Amazon S3, AWS Glue, AWS Lambda, Amazon Redshift, etc.
  • Implement data processing and transformation workflows using Databricks, Apache Spark, and SQL to support analytics and reporting requirements.
  • Build and maintain orchestration workflows using Apache Airflow to automate data pipeline execution, scheduling, and monitoring.
  • Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver scalable data solutions.
  • Optimize data pipelines for performance, reliability, and cost-effectiveness, leveraging AWS best practices and cloud-native technologies.

Requirements:

  • 8+ years of experience building and deploying large-scale data processing pipelines in a production environment.
  • Hands-on experience in designing and building data pipelines on AWS cloud infrastructure.
  • Strong proficiency in AWS services such as Amazon S3, AWS Glue, AWS Lambda, Amazon Redshift, etc.
  • Strong experience with Databricks and Apache Spark for data processing and analytics.
  • Hands-on experience with Apache Airflow for orchestrating and scheduling data pipelines.
  • Solid understanding of data modeling, database design principles, and SQL.
  • Experience with version control systems (e.g., Git) and CI/CD pipelines.
  • Excellent communication skills and the ability to collaborate effectively with cross-functional teams.
  • Strong problem-solving skills and attention to detail.
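The Airflow orchestration this role calls for boils down to running tasks in dependency order, retrying and monitoring along the way. As a minimal, stdlib-only sketch of that core idea (the task names and toy payloads below are illustrative, not from this posting or from Airflow's API):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on,
# mirroring how an Airflow DAG wires extract >> transform >> load >> report.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

def run_pipeline(dag, tasks):
    """Execute callables in dependency order, like a (serial) DAG run."""
    order = list(TopologicalSorter(dag).static_order())
    results = {}
    for name in order:
        # Each task receives upstream results, much like Airflow XComs.
        results[name] = tasks[name](results)
    return order, results

# Toy task bodies standing in for S3 reads, Spark jobs, and Redshift loads.
tasks = {
    "extract":   lambda r: [1, 2, 3],
    "transform": lambda r: [x * 10 for x in r["extract"]],
    "load":      lambda r: sum(r["transform"]),
    "report":    lambda r: f"loaded total={r['load']}",
}

order, results = run_pipeline(dag, tasks)
print(order)    # tasks in a dependency-respecting order
print(results["report"])
```

A real Airflow DAG adds scheduling, retries, and monitoring on top of exactly this dependency-ordered execution.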

This position offers an excellent opportunity for significant career development in a fast-growing and challenging entrepreneurial environment with a high degree of individual responsibility.

See more jobs at Tiger Analytics

Apply for this job

+30d

Senior Data Engineer

Alt | US Remote
airflow, postgres, Design, c++, python, AWS

Alt is hiring a Remote Senior Data Engineer

At Alt, we’re on a mission to unlock the value of alternative assets, and looking for talented people who share our vision. Our platform enables users to exchange, invest, value, securely store, and authenticate their collectible cards. And we envision a world where anything is an investable asset. 

To date, we’ve raised over $100 million from thought leaders at the intersection of culture, community, and capital. Some of our investors include Alexis Ohanian’s fund Seven Seven Six, the founders of Stripe, Coinbase co-founder Fred Ehrsam, BlackRock co-founder Sue Wagner, the co-founders of AngelList, First Round Capital, and BoxGroup. We’re also backed by professional athletes including Tom Brady, Candace Parker, Giannis Antetokounmpo, Alex Morgan, Kevin Durant, and Marlon Humphrey.

Alt is a dedicated equal opportunity employer committed to creating a diverse workforce. We celebrate our differences and strive to create an inclusive environment for all. We are focused on fostering a culture of empowerment which starts with providing our employees with the resources needed to reach their full potential.

What we are looking for:

We are seeking a Senior Data Engineer who is eager to make a significant impact. In this role, you'll get the opportunity to leverage your technical expertise and problem-solving skills to solve some of the hardest data problems in the hobby. Your primary focus will be on enhancing and optimizing our pricing engine to support strategic business goals. Our ideal candidate is passionate about trading cards, has a strong sense of ownership, and enjoys challenges. At Alt, data is core to everything we do and is a differentiator for our customers. The team’s scope covers data pipeline development, search infrastructure, web scraping, detection algorithms, internal tooling, and data quality. We give our engineers a lot of individual responsibility and autonomy, so your ability to make good trade-offs and exercise good judgment is essential.

The impact you will make:

  • Partner with engineers, and cross-functional stakeholders to contribute to all phases of algorithm development including: ideation, prototyping, design, and production
  • Build, iterate, productionize, and own Alt's valuation models
  • Leverage background in pricing strategies and models to develop innovative pricing solutions
  • Design and implement scalable, reliable, and maintainable machine learning systems
  • Partner with product to understand customer requirements and prioritize model features

What you bring to the table:

  • Experience: 5+ years of experience in software development, with a proven track record of developing and deploying models in production. Experience with pricing models preferred.
  • Technical Skills: Proficiency in programming languages and tools such as Python, AWS, Postgres, Airflow, Datadog, and JavaScript.
  • Problem-Solving: A knack for solving tough problems and a drive to take ownership of your work.
  • Communication: Effective communication skills with the ability to ship solutions quickly.
  • Product Focus: Excellent product instincts, with a user-first approach when designing technical solutions.
  • Team Player: A collaborative mindset that helps elevate the performance of those around you.
  • Industry Knowledge: Knowledge of the sports/trading card industry is a plus.

What you will get from us:

  • Ground floor opportunity as an early member of the Alt team; you’ll directly shape the direction of our company. The opportunities for growth are truly limitless.
  • An inclusive company culture that is being built intentionally to foster an environment that supports and engages talent in their current and future endeavors.
  • $100/month work-from-home stipend
  • $200/month wellness stipend
  • WeWork office Stipend
  • 401(k) retirement benefits
  • Flexible vacation policy
  • Generous paid parental leave
  • Competitive healthcare benefits, including HSA, for you and your dependent(s)

Alt's compensation package includes a competitive base salary benchmarked against real-time market data, as well as equity for all full-time roles. We want all full-time employees to be invested in Alt and to be able to take advantage of that investment, so our equity grants include a 10-year exercise window. The base salary range for this role is: $194,000 - $210,000. Offers may vary from the amount listed based on geography, candidate experience and expertise, and other factors.

See more jobs at Alt

Apply for this job

+30d

Cloud Platform Architect

Signifyd | United States (Remote)
ML, DevOps, SQS, Lambda, agile, Bachelor's degree, terraform, airflow, Design, docker, kubernetes, AWS

Signifyd is hiring a Remote Cloud Platform Architect

Who Are You

We seek a highly skilled and experienced Cloud Platform Architect to join our dynamic and growing Cloud Platform team. As a Cloud Platform Architect, you will play a crucial role in strengthening and expanding the core of our cloud infrastructure. We want you to lead the way in scaling our cloud infrastructure for our customers, engineers, and data science teams. You will work alongside talented cloud platform and software engineers and architects to envision how all cloud infrastructure will evolve to support the expansion of Signifyd’s core products. The ideal candidate must: 

 

  • Effectively communicate complex problems by tailoring the message to the audience and presenting it clearly and concisely. 
  • Balance multiple perspectives, disagree, and commit when necessary to move key company decisions and critical priorities forward.
  • Understand the inner workings of Cloud Service Providers (CSPs) such as AWS, GCP, and Azure, including the networking and security concepts most relevant in the space.
  • Work independently in a dynamic environment and take a proactive approach to problem-solving.
  • Be committed to achieving positive business outcomes via automation and enablement efforts, reducing costs, and improving operational excellence.
  • Be an example for fellow engineers by showcasing customer empathy, creativity, curiosity, and tenacity.
  • Have strong analytical and problem-solving skills, with the ability to innovate and adapt to fast-paced environments.
  • Design and build clear, understandable, simple, clean, and scalable solutions.
  • Champion an Agile and ‘DevOps’ mindset across the organization.

What You'll Do

  • Modernize Signifyd’s Cloud Platform to scale for security, cost, operational excellence, reliability, and performance, working closely with Engineering and Data Science teams across Signifyd’s R&D group.
  • Create and deliver a technology roadmap focused on advancing our cloud performance capabilities, supporting our real-time fraud protection and prevention via our core products.
  • Work alongside Architects, Software Engineers, ML Engineers, and Data Scientists to develop innovative big data processing solutions for scaling our core product for eCommerce fraud prevention.
  • Take full ownership of our Cloud Platform evolution to support low-latency, high-quality, high-scale decisioning for Signifyd’s flagship product.

 

  • Architect, deploy, and optimize Cloud Solutions to evolve our technology stack, including Multi-Account strategy, best practices around data access, IAM and security rules, and the best approaches for optimized and secure access to our infrastructure.
  • Implement Engineering Enablement automation and best-of-breed solutions for Developer Tooling to support Elite DORA metrics measurements and optimal Engineering Experience.
  • Mentor and coach fellow engineers on the team, fostering an environment of growth and continuous improvement.
  • Identify and address gaps in team capabilities and processes to enhance team efficiency and success.

What You'll Need

  • Ideally has 10-15+ years in cloud infrastructure engineering and automation, including at least five years as a cloud engineering architect or lead, and has successfully navigated the challenges of large-scale cloud environments encompassing millions of dollars of computing costs and many petabytes of data storage and processing.
  • Deep understanding of best practices and current trends across cloud providers; comfortable working with multi-terabyte datasets and skilled in high-scale data ingestion, transformation, and distributed processing; experience with Apache Spark or Databricks is a plus.
  • Deep understanding of Container-based systems such as Kubernetes (k8s), Docker, ECS, EKS, GKE, and others.
  • Deep understanding of Networking concepts such as DNS / Route53, ELB/ALB, Networking load balancing, IAM rules, VPC peering and data connectivity, NAT gateways, Network bridge technology such as Megaport, and others.
  • Experience converting existing Cloud infrastructure to serverless architecture patterns (AWS Lambda, Kinesis, Aurora, etc.), deploying via Terraform, Pulumi, or AWS Cloud Formation / CDK.
  • Hands-on expertise in data technologies with proficiency in technologies such as Spark, Airflow, Databricks, AWS services (SQS, Kinesis, etc.), and Kafka. Understand the trade-offs of various architectural approaches and recommend solutions suited to our needs.
  • Experience planning product and infrastructure software releases.
  • Experience in developing, deploying, and managing CI/CD developer tooling like AWS Code Commit, Code Build, Code Deploy, Code Pipeline, JetBrains TeamCity, and GitHub Enterprise.
  • Understanding how to appropriately deploy, integrate, and maintain Developer build and scanning tools such as Develocity Gradle Enterprise, Sonarqube, Maven, Snyk, CyCode, and others.
  • Deep knowledge of best practices around Logging, Monitoring, and Observability tools such as AWS Cloudwatch, Datadog, Loggly, and others.
  • Demonstrable ability to lead and mentor engineers, fostering their growth and development. 
  • You have successfully partnered with Product and Engineering teams to lead through strategic initiatives.
  • Commitment to quality: you take pride in delivering work that excels in accuracy, performance, and reliability, setting a high standard for the team and the organization.

 

#LI-Remote

Benefits in our US offices:

  • Discretionary Time Off Policy (Unlimited!)
  • 401K Match
  • Stock Options
  • Annual Performance Bonus or Commissions
  • Paid Parental Leave (12 weeks)
  • On-Demand Therapy for all employees & their dependents
  • Dedicated learning budget through Learnerbly
  • Health Insurance
  • Dental Insurance
  • Vision Insurance
  • Flexible Spending Account (FSA)
  • Short Term and Long Term Disability Insurance
  • Life Insurance
  • Company Social Events
  • Signifyd Swag

We want to provide an inclusive interview experience for all, including people with disabilities. We are happy to provide reasonable accommodations to candidates in need of individualized support during the hiring process.

Signifyd provides a base salary, bonus, equity and benefits to all its employees. Our posted job may span more than one career level, and offered level and salary will be determined by the applicant’s specific experience, knowledge, skills, and abilities, as well as internal equity and alignment with market data.

USA Base Salary Pay Range
$220,000 - $240,000 USD

See more jobs at Signifyd

Apply for this job

+30d

AWS MLOps Engineer

Devoteam | Nantes, France, Remote
ML, DevOps, OpenAI, Lambda, agile, terraform, scala, airflow, ansible, scrum, git, c++, docker, kubernetes, jenkins, python, AWS

Devoteam is hiring a Remote AWS MLOps Engineer

Job Description

Responsibilities

  • Support the end-to-end machine learning development process to design, build, and manage reproducible, testable, and scalable software.

  • Work on setting up and using ML/AI/MLOps platforms (such as AWS SageMaker, Kubeflow, AWS Bedrock, AWS Titan).

  • Bring our clients best practices in organization, development, automation, monitoring, and security.

  • Explain and apply best practices for automation, testing, versioning, reproducibility, and monitoring of the deployed AI solution.

  • Mentor and supervise junior consultants, e.g. through peer code reviews and application of best practices.

  • Support our sales team in writing proposals and in pre-sales meetings.

  • Contribute to the development of our internal community (REX sessions, workshops, articles, hackerspaces).

  • Take part in recruiting our future talent.

Qualifications

Technical skills

REQUIRED:

  • Fluency in Python, PySpark, or Scala Spark, and in libraries such as Scikit-learn, MLlib, TensorFlow, Keras, PyTorch, LightGBM, XGBoost, and Spark (to name a few).

  • Ability to implement containerization architectures (Docker / containerd) and serverless and microservice environments using Lambda, ECS, and Kubernetes.

  • Fully operational in setting up DevOps and Infrastructure-as-Code environments and in using MLOps tools.

  • A good share of the following tools should be part of your daily work: Git, GitLab CI, Jenkins, Ansible, Terraform, Kubernetes, MLflow, Airflow, or their cloud-environment equivalents.

  • AWS Cloud (AWS Bedrock, AWS Titan, OpenAI, AWS SageMaker / Kubeflow)

  • Agile / Scrum methodology

  • Feature Store (any provider)

 

NICE TO HAVE:

  • Apache Airflow

  • AWS SageMaker / Kubeflow

  • Apache Spark

  • Agile / Scrum methodology

Growing within the community

Growing within the Data Tribe means playing an active part in building a stimulating environment where consultants keep pushing one another upward, in technical skills as well as soft skills. But that's not all: it also means regular events and dedicated Slack channels that let you call on the communities (data, AI/ML, DevOps, security, ...) as a whole!

Alongside this, you will have the opportunity to be a driving force in the development of our various internal communities (REX sessions, workshops, articles, podcasts, ...).

Compensation

The fixed salary offered for this position depends on your experience, within a range of €46.5k to €52.5k.

See more jobs at Devoteam

Apply for this job

+30d

Lead Data Engineer

Devoteam | Tunis, Tunisia, Remote
airflow, sql, scrum

Devoteam is hiring a Remote Lead Data Engineer

Description du poste

Au sein de la direction « Plateforme Data », le consultant intégrera une équipe SCRUM et se concentrera sur un périmètre fonctionnel spécifique.

Votre rôle consistera à contribuer à des projets data en apportant votre expertise sur les tâches suivantes :

  • Design, build, and maintain robust, scalable data pipelines on Google Cloud Platform (GCP), using tools such as BigQuery, Airflow, Looker, and DBT.
  • Work with business teams to understand data requirements and design appropriate solutions.
  • Optimize data-processing and ELT performance using Airflow, DBT, and BigQuery.
  • Implement data-quality processes to guarantee data integrity and consistency.
  • Work closely with engineering teams to integrate data pipelines into existing applications and services.
  • Stay up to date with new technologies and best practices in data processing and analytics.
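Workflow managers such as Airflow reduce the pipeline work above to a graph of dependency-ordered tasks. A minimal, tool-agnostic sketch of how such a DAG is resolved into an execution order (task names are illustrative, not from the listing):

```python
from graphlib import TopologicalSorter

# Hypothetical ELT task graph, mirroring an Airflow DAG:
# each key lists the tasks that must finish before it can run.
dag = {
    "load_bigquery": {"extract_source"},
    "dbt_transform": {"load_bigquery"},
    "looker_refresh": {"dbt_transform"},
}

# Resolve the graph into a valid execution order.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

For this linear chain the order is forced: extraction runs first and the Looker refresh last; real DAGs additionally run independent branches in parallel.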

 

    Qualifications

    • Master's-level degree (Bac+5) from an engineering school or university equivalent, with a specialization in computer science.
    • At least 3 years of data engineering experience, including significant experience in a GCP cloud environment.
    • Advanced SQL proficiency for data processing and optimization.
    • Google Professional Data Engineer certification is a plus.
    • Excellent written and verbal communication (high-quality deliverables and reporting).

    See more jobs at Devoteam

    Apply for this job

    +30d

    Data Infrastructure Software Engineer II

    DevOps, remote-first, terraform, airflow, sql, Design, graphql, c++, docker, kubernetes, python

    Khan Academy is hiring a Remote Data Infrastructure Software Engineer II

    ABOUT KHAN ACADEMY

    Khan Academy is a nonprofit with the mission to deliver a free, world-class education to anyone, anywhere. Our proven learning platform offers free, high-quality supplemental learning content and practice that cover Pre-K - 12th grade and early college core academic subjects, focusing on math and science. We have over 155 million registered learners globally and are committed to improving learning outcomes for students worldwide, focusing on learners in historically under-resourced communities.

    OUR COMMUNITY 

    Our students, teachers, and parents come from all walks of life, and so do we. Our team includes people from academia, traditional/non-traditional education, big tech companies, and tiny startups. We hire great people from diverse backgrounds and experiences because it makes our company stronger. We value diversity, equity, inclusion, and belonging as necessary to achieve our mission and impact the communities we serve. We know that transforming education starts in-house with learning about ourselves and our colleagues. We strive to be world-class in investing in our people and commit to developing you as a professional.

    THE ROLE

    We have some of the richest educational data in the world, and we want to leverage that data to develop a clearer picture of who our users are, how they are using the site, and how we could better serve them on their educational journey. Your work will enable answering critical and meaningful questions like "how do students learn most effectively?" and "how can we improve our content and product?"

    The ideal candidate will have a strong background in software development and DevOps, with a focus on data engineering. You will be responsible for designing, developing, and maintaining scalable systems and applications. 

    What you’ll work on:

    • Design and manage data pipelines and workflows using SQL, BigQuery, Airflow, and DBT.
    • Develop and maintain data visualization and analytical tools using Streamlit.
    • Design, develop, and maintain data infrastructure software using Python and Go (familiarity with GraphQL is a plus)
    • Implement and manage DevOps processes for Data Engineering tools
    • Use Docker and Kubernetes to build and deploy containerized applications.
    • Utilize Terraform for infrastructure as code to manage and provision cloud resources.
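A concrete flavor of the data-quality monitoring this list implies: the sketch below runs a null-rate check, with sqlite standing in for BigQuery and an invented table and threshold, so treat it as an illustration rather than the team's actual tooling.

```python
import sqlite3

# In-memory stand-in warehouse; in a real stack this query would target BigQuery.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, score REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, 0.9), (2, None), (3, 0.7)],
)

# Completeness monitor: alert if the share of NULL scores crosses a threshold.
null_rate = conn.execute(
    "SELECT AVG(CASE WHEN score IS NULL THEN 1.0 ELSE 0.0 END) FROM events"
).fetchone()[0]

THRESHOLD = 0.5  # invented tolerance for this example
ok = null_rate < THRESHOLD
print(f"null rate {null_rate:.2f}: {'PASS' if ok else 'FAIL'}")
```

In production the same query would run on a schedule inside the workflow manager, with a failing check paging the on-call or blocking downstream tasks.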

    You can read about our latest work on our Engineering Blog. A few highlights:

    WHAT YOU BRING

    • 4+ years experience in a software engineer role with a focus on tool design, creation, and maintenance for infrastructure or data engineering
    • Strong collaborative development experience including PR reviews and documentation writing
    • Proficient in SQL
    • Proficiency in writing and maintaining data pipelines and data quality monitors in a workflow management tool for productionized solutions, with source control and code review.
    • Proficiency in computer science and software engineering fundamentals, including the ability to program in Python and/or Go.
    • Proficient in Docker and Kubernetes.
    • Experience with Terraform.
    • Experience with BigQuery and data pipeline tools such as Airflow and DBT is a big plus, as is familiarity with Streamlit for data visualization.

    Note: We welcome candidates with experience in any and all technologies. We don’t require experience in any particular language or tool. Our commitment to on-boarding and mentorship means you won’t be left in the dark as you learn new technologies.

    PERKS AND BENEFITS

    We may be a non-profit, but we reward our talented team extremely well! We offer:

    • Competitive salaries
    • Ample paid time off as needed – Your well-being is a priority.
    • Remote-first culture that caters to your time zone, with open flexibility as needed
    • Generous parental leave
    • An exceptional team that trusts you and gives you the freedom to do your best
    • The chance to put your talents towards a deeply meaningful mission and the opportunity to work on high-impact products that are already defining the future of education
    • Opportunities to connect through affinity, ally, and social groups
    • And we offer all those other typical benefits as well: 401(k) + 4% matching & comprehensive insurance, including medical, dental, vision, and life

    At Khan Academy we are committed to fair and equitable compensation practices, the well-being of our employees, and our Khan community. This belief is why we have built out a robust Total Rewards package that includes competitive base salaries, and extensive benefits and perks to support physical, mental, and financial well-being.

    The target salary range for this position (LEC IC 1.5) is $165,500 - $201,250 USD / 206,875 - 251,562 CAD. The pay range for this position is a general guideline only. The salary offered will depend on internal pay equity and the candidate’s relevant skills, experience, qualifications, and job market data. Exceptional performers in this role who make an outsized contribution can make well in excess of this range. Additional incentives are provided as part of the complete total rewards package in addition to comprehensive medical and other benefits.

    MORE ABOUT US

    OUR COMPANY VALUES

    Live & breathe learners

    We deeply understand and empathize with our users. We leverage user insights, research, and experience to build content, products, services, and experiences that our users trust and love. Our success is defined by the success of our learners and educators.

    Take a stand

    As a company, we have conviction in our aspirational point of view of how education will evolve. The work we do is in service to moving towards that point of view. However, we also listen, learn and flex in the face of new data, and commit to evolving this point of view as the industry and our users evolve.

    Embrace diverse perspectives

    We are a diverse community. We seek out and embrace a diversity of voices, perspectives and life experiences leading to stronger, more inclusive teams and better outcomes. As individuals, we are committed to bringing up tough topics and leaning into different points of view with curiosity. We actively listen, learn and collaborate to gain a shared understanding. When a decision is made, we commit to moving forward as a united team.

    Work responsibly and sustainably

    We understand that achieving our audacious mission is a marathon, so we set realistic timelines and we focus on delivery that also links to the bigger picture. As a non-profit, we are supported by the generosity of donors as well as strategic partners, and understand our responsibility to our finite resources. We spend every dollar as though it were our own. We are responsible for the impact we have on the world and to each other. We ensure our team and company stay healthy and financially sustainable.

    Bring out the joy

    We are committed to making learning a joyful process. This informs what we build for our users and the culture we co-create with our teammates, partners and donors.

    Cultivate learning mindset

    We believe in the power of growth for learners and for ourselves. We constantly learn and teach to improve our offerings, ourselves, and our organization. We learn from our mistakes and aren’t afraid to fail. We don't let past failures or successes stop us from taking future bold action and achieving our goals.

    Deliver wow

    We insist on high standards and deliver delightful, effective end-to-end experiences that our users can rely on. We choose to focus on fewer things — each of which aligns to our ambitious vision — so we can deliver high-quality experiences that accelerate positive measurable learning with our strategic partners.

    We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, gender, gender identity or expression, national origin, sexual orientation, age, citizenship, marital status, disability, or Veteran status. We value diversity, equity, and inclusion, and we encourage candidates from historically underrepresented groups to apply.

    See more jobs at Khan Academy

    Apply for this job

    +30d

    Staff Full-Stack Engineer

    BetterUpAnywhere in the U.S. (Remote)
    Salesagileairflowrubyc++AWS

    BetterUp is hiring a Remote Staff Full-Stack Engineer

    Let’s face it, a company whose mission is human transformation better have some fresh thinking about the employer/employee relationship.

    We do. We can’t cram it all in here, but you’ll start noticing it from the first interview.

    Even our candidate experience is different. And when you get an offer from us (and accept it), you get way more than a paycheck. You get a personal BetterUp Coach, a development plan, a trained and coached manager, the most amazing team you’ve ever met (yes, each with their own personal BetterUp Coach), and most importantly, work that matters.

    This makes for a remarkably focused and fulfilling work experience. Frankly, it’s not for everyone. But for people with fire in their belly, it’s a game-changing, career-defining, soul-lifting move.

    Join us and we promise you the most intense and fulfilling years of your career, doing life-changing work in a fun, inventive, soulful culture.

    If that sounds exciting—and the job description below feels like a fit—we really should start talking. 

    As a Staff engineer here at BetterUp you will have the opportunity to leverage your expertise, passion, and drive to make the world better by crafting iconic, groundbreaking solutions via your deep ability to architect and deliver robust, scalable, foundational systems. You will help lead our technical strategy, mentor engineers, and drive innovation at the intersection of platform technology, data, and AI. 

    You will collaborate across functions to drive our mission to transform lives worldwide. You’ll also help us chart a course to elevate our platform, ensuring it is robust, flexible, and composable; you will help us create the bedrock necessary to unlock new possibilities for users across the globe. Your core mission will be to lead the way in building our infrastructure for innovation. You will lead by example, driving significant advancements while growing personally and professionally. Using tried-and-true technologies like Ruby on Rails running in AWS, paired with the latest data and GenAI technologies, your team and platforms will be pivotal in positively impacting millions of people via our Human Transformation Platform.

    Your leadership will extend to mentoring engineers and spearheading cross-functional collaborations that nurture a culture of innovation and continuous learning. You'll tackle exciting projects and complex problems, pushing the boundaries of technology to support personal and professional growth. 

    Your career at BetterUp will accelerate by delivering innovative projects, extensive leadership opportunities, and the profound impact of your work on global well-being. You'll have the freedom to innovate and the opportunity to grow, all while contributing significantly to our mission and having fun along the way. 

    This role won’t be easy, our course is steep, and the road ahead will be tough, so we are looking for someone who is comfortable in the rapidly changing world of hyper growth startups, who has the experience and wisdom to help us mature into a high growth, iconic world impacting organization. 

    At BetterUp we delight in supporting and pushing each other to bring out the best in our colleagues, and would love someone to join the team who shares our passion for customer empathy, engineering excellence, and continuous improvement. We also deeply understand that a key to peak performance is balance, and our culture is focused on providing the support our people need to bring their whole selves to bear in service of our mission. We recognize that transforming lives starts with your own: our organization is at the forefront of human transformation, and we aim to start with our own team, so we give you and a friend a coach to help you grow and thrive. We have all the usual benefits (see details below), plus we close the company for two full weeks a year to rest and recharge, and we have a thriving remote-first culture.

    What you'll do:

    • Spearhead delivery of scalable, robust systems for BetterUp's platform, ensuring they are flexible and adaptable for future growth.
    • Align technical efforts with business goals, innovating solutions that enhance user experience and operational efficiency.
    • Elevate our team through continuous learning, fostering an environment of engineering excellence, creating leverage via your experience, brilliance and leadership. 
    • Ensure technical solutions are cohesive and support company-wide objectives.
    • Apply deep product development expertise and a strong sense of customer empathy to guide the creation of user-centered solutions, ensuring the delivery of high-impact capabilities that drive value for members and our customers. 

    Attributes we look for: 

    • Bachelor's or Master's in Computer Science/Engineering or equivalent experience - you are relentlessly curious and an expert in multiple tried and true tech stacks, plus you are comfortable jumping into new areas and thrive on learning. 
    • Deep experience in startups with demonstrated success building scalable system architectures, especially for multi-sided marketplace platforms like BetterUp's Human Transformation Platform.
    • Exceptional problem-solving skills, with a strategic mindset able to navigate complex technical challenges, ensuring robust, flexible solutions that align with BetterUp’s long-term vision.
    • Extensive full-stack/data engineering experience, across industries, and domains. 
    • Expertise in building enterprise applications using tech like Ruby on Rails, Ember.js, AWS, etc plus the ability to leverage and reason about modern data technologies (DBT, Airflow, Snowflake) to drive data-driven, data powered systems.
    • Agile and Lean startup veteran who is able to deliver incredible customer-centric product innovations, by their direct engineering ability, but also by shaping and guiding product roadmaps in the face of ambiguity, volatility, and risk. 
    • Strong communicators who possess a passion for BetterUp’s mission, and have a demonstrated ability to mentor and lead with empathy, passion, and wisdom. 

    What will make you successful in your time at BetterUp: 

    • You have radical curiosity and you love to learn new things
    • You seek feedback and turn it into action
    • You can work autonomously while being great at collaboration
    • You mentor and empower others around you

    Benefits:

    At BetterUp, we are committed to living out our mission every day and that starts with providing benefits that allow our employees to care for themselves, support their families, and give back to their community. 

    • Access to BetterUp coaching; one for you and one for a friend or family member 
    • A competitive compensation plan with opportunity for advancement
    • Medical, dental and vision insurance
    • Flexible paid time off
    • Per year: 
      • All federal/statutory holidays observed
      • 4 BetterUp Inner Work days (https://www.betterup.co/inner-work)
      • 5 Volunteer Days to give back
      • Learning and Development stipend
      • Company wide Summer & Winter breaks 
    • Year-round charitable contribution of your choice on behalf of BetterUp
    • 401(k) self contribution

    We are dedicated to building diverse teams that fuel an authentic workplace and sense of belonging for each and every employee. We know applying for a job can be intimidating, please don’t hesitate to reach out — we encourage everyone interested in joining us to apply.

    BetterUp Inc. provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, disability, genetics, gender, sexual orientation, age, marital status, veteran status. In addition to federal law requirements, BetterUp Inc. complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.

    At BetterUp, we compensate our employees fairly for their work. Base salary is determined by job-related experience, education/training, residence location, as well as market indicators. The range below is representative of base salary only and does not include equity, sales bonus plans (when applicable) and benefits. This range may be modified in the future.

    The base salary range for this role is $147,000 – $245,000.

    Protecting your privacy and treating your personal information with care is very important to us, and central to the entire BetterUp family. By submitting your application, you acknowledge that your personal information will be processed in accordance with our Applicant Privacy Notice. If you have any questions about the privacy of your personal information or your rights with regards to your personal information, please reach out to support@betterup.co

    #LI-Remote

    See more jobs at BetterUp

    Apply for this job

    +30d

    Sr. Site Reliability Engineer

    Signify HealthDallas TX, Remote
    terraform, airflow, mobile, azure, c++, kubernetes, python, AWS

    Signify Health is hiring a Remote Sr. Site Reliability Engineer

    How will this role have an impact?

    Signify Health is looking for a passionate Site Reliability Engineer (SRE) to enhance our dynamic SRE team. Reporting to the Sr Director of Cloud Operations and SRE, we welcome individuals from different technical backgrounds, especially software engineers aspiring to transition into SRE/DevOps roles. 

    At Signify Health, we appreciate and respect the unique experiences and perspectives that each team member brings. We are committed to providing an environment where everyone feels welcomed, respected, and empowered. So, no matter what your background is, we invite you to join us and help shape the future of healthcare while refining your skills in the SRE domain.

    Diversity and Inclusion are core values at Signify Health, and fostering a workplace culture reflective of that is critical to our continued success as an organization.

    What will you do?

    • Develop and implement strategies that improve the stability, scalability, and availability of our products
    • Maintain and deploy observability solutions for infrastructure and applications to ensure optimal performance
    • Participate in real-time service management, including crafting monitoring systems, alerts, playbooks, and runbooks in collaboration with our development teams
    • Utilize your on-call rotation to proactively prevent incidents and maintain uninterrupted operations
    • Work alongside colleagues from various disciplines to optimize operational processes
    • This is a remote role with some occasional travel required to Dallas, TX


    Basic Requirements

    • Minimum of 4 years of relevant technical experience, with an emphasis on SRE/DevOps
    • Experience creating python scripts to solve operational challenges
    • Experience with Pipeline orchestration tooling such as Airflow, Dagster, etc.
    • Experience with ELT tooling such as Azure Data Factory
    • Experience with Databricks interface/tools
    • Practical experience with Azure or AWS, and Terraform
    • Working knowledge of Kubernetes (AKS/EKS preferred)
    • Familiarity with the deployment of CI/CD systems and practices
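As an illustration of the "Python scripts to solve operational challenges" requirement, here is a small, hedged sketch of a classic SRE pattern, retry with exponential backoff, around a simulated flaky dependency (the failing service is invented for the example):

```python
import time

def retry(fn, attempts=4, base_delay=0.01):
    """Call fn, retrying with exponential backoff; re-raise after the last try."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** i))  # 0.01s, 0.02s, 0.04s, ...

calls = {"n": 0}

def flaky_dependency():
    # Simulated service that fails twice with a transient error, then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

result = retry(flaky_dependency)
print(result, "after", calls["n"], "attempts")
```

In real incident-prevention work the same shape appears with jitter added to the delay and with alerting hooked into the final failure path.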


    About Us:

    Signify Health is helping build the healthcare system we all want to experience by transforming the home into the healthcare hub. We coordinate care holistically across individuals’ clinical, social, and behavioral needs so they can enjoy more healthy days at home. By building strong connections to primary care providers and community resources, we’re able to close critical care and social gaps, as well as manage risk for individuals who need help the most. This leads to better outcomes and a better experience for everyone involved.

    Our high-performance networks are powered by more than 9,000 mobile doctors and nurses covering every county in the U.S., 3,500 healthcare providers and facilities in value-based arrangements, and hundreds of community-based organizations. Signify’s intelligent technology and decision-support services enable these resources to radically simplify care coordination for more than 1.5 million individuals each year while helping payers and providers more effectively implement value-based care programs.

    We are committed to equal employment opportunities for employees and job applicants in compliance with applicable law and to an environment where employees are valued for their differences.

    To learn more about how we’re driving outcomes and making healthcare work better, please visit us at www.signifyhealth.com

    See more jobs at Signify Health

    Apply for this job

    +30d

    Data Engineer

    SonderMindDenver, CO or Remote
    S3, scala, airflow, sql, Design, java, c++, python, AWS

    SonderMind is hiring a Remote Data Engineer

    About SonderMind

    At SonderMind, we know that therapy works. SonderMind provides accessible, personalized mental healthcare that produces high-quality outcomes for patients. SonderMind's individualized approach to care starts with using innovative technology to help people not just find a therapist, but find the right, in-network therapist for them, should they choose to use their insurance. From there, SonderMind's clinicians are committed to delivering best-in-class care to all patients by focusing on high-quality clinical outcomes. To enable our clinicians to thrive, SonderMind defines care expectations while providing tools such as clinical note-taking, secure telehealth capabilities, outcome measurement, messaging, and direct booking.

    To follow the latest SonderMind news, get to know our clients, and learn about what it’s like to work at SonderMind, you can follow us on Instagram, Linkedin, and Twitter. 

    About the Role

    In this role, you will be responsible for designing, building, and managing the information infrastructure systems used to collect, store, process, and distribute data. You will also be tasked with transforming data into a format that can be easily analyzed. You will work closely with data engineers on data architectures and with data scientists and business analysts to ensure they have the data necessary to complete their analyses.

    Essential Functions

    • Strategically design, construct, install, test, and maintain highly scalable data management systems
    • Develop and maintain databases, data processing procedures, and pipelines
    • Integrate new data management technologies and software engineering tools into existing structures
    • Develop processes for data mining, data modeling, and data production
    • Translate complex functional and technical requirements into detailed architecture, design, and high-performing software and applications
    • Create custom software components and analytics applications
    • Troubleshoot data-related issues and perform root cause analysis to resolve them
    • Manage overall pipeline orchestration
    • Optimize data warehouse performance
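Much of the pipeline and transformation work above reduces to deterministic record reshaping. A minimal sketch with an invented schema (in practice this aggregation would live in a dbt model or warehouse SQL rather than application code):

```python
from collections import defaultdict

# Hypothetical raw usage events; schema invented for illustration.
raw_events = [
    {"user": "a", "day": "2024-01-01", "ms": 120},
    {"user": "a", "day": "2024-01-01", "ms": 80},
    {"user": "b", "day": "2024-01-01", "ms": 40},
]

# Roll raw events up into a per-user, per-day total, the typical shape
# of an analytics-ready table.
rollup = defaultdict(int)
for row in raw_events:
    rollup[(row["user"], row["day"])] += row["ms"]

print(dict(rollup))
```

Keeping transforms pure and keyed like this is what makes them easy to test, backfill, and re-run idempotently inside an orchestrator.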

     

    What does success look like?

    Success in this role will be measured by the seamless and efficient operation of data infrastructure. This includes minimal downtime, accurate and timely data delivery, and the successful implementation of new technologies and tools. The individual will have demonstrated their ability to collaborate effectively to define solutions with both technical and non-technical team members across data science, engineering, product, and our core business functions. They will have made significant contributions to improving our data systems, whether through optimizing existing processes or developing innovative new solutions. Ultimately, their work will enable more informed and effective decision-making across the organization.

     

    Who You Are 

    • Bachelor’s degree in Computer Science, Engineering, or a related field
    • Minimum three years experience as a Data Engineer or in a similar role
    • Experience with data science and analytics engineering is a plus
    • Experience with AI/ML in GenAI or data software - including vector databases - is a plus
    • Proficient with scripting and programming languages (Python, Java, Scala, etc.)
    • In-depth knowledge of SQL and other database related technologies
    • Experience with Snowflake, DBT, BigQuery, Fivetran, Segment, etc.
    • Experience with AWS cloud services (S3, RDS, Redshift, etc.)
    • Experience with data pipeline and workflow management tools such as Airflow
    • Strong negotiation and interpersonal skills: written, verbal, analytical
    • Motivated and influential – proactive with the ability to adhere to deadlines; work to “get the job done” in a fast-paced environment
    • Self-starter with the ability to multi-task

    Our Benefits 

    The anticipated salary rate for this role is between $130,000-160,000 per year.

    As a leader in redesigning behavioral health, we are walking the walk with our employee benefits. We want the experience of working at SonderMind to accelerate people’s careers and enrich their lives, so we focus on meeting SonderMinders wherever they are and supporting them in all facets of their life and work.

    Our benefits include:

    • A commitment to fostering flexible hybrid work
    • A generous PTO policy with a minimum of three weeks off per year
    • Free therapy coverage benefits to ensure our employees have access to the care they need (must be enrolled in our medical plans to participate)
    • Competitive Medical, Dental, and Vision coverage with plans to meet every need, including HSA ($1,100 company contribution) and FSA options
    • Employer-paid short-term, long-term disability, life & AD&D to cover life's unexpected events. Not only that, we also cover the difference in salary for up to seven (7) weeks of short-term disability leave (after the required waiting period) should you need to use it.
    • Eight weeks of paid Parental Leave (if the parent also qualifies for short-term disability, this benefit is additional, allowing between 8 and 16 weeks of paid leave)
    • 401(k) retirement plan with 100% matching on up to 4% of base salary, which vests immediately
    • Travel to Denver 1x a year for annual Shift gathering
    • Fourteen (14) company holidays
    • Company Shutdown between Christmas and New Years
    • Supplemental life insurance, pet insurance coverage, commuter benefits and more!

    Application Deadline

    This position will be an ongoing recruitment process and will be open until filled.

     

    Equal Opportunity 
    SonderMind does not discriminate in employment opportunities or practices based on race, color, creed, sex, gender, gender identity or expression, pregnancy, childbirth or related medical conditions, religion, veteran and military status, marital status, registered domestic partner status, age, national origin or ancestry, physical or mental disability, medical condition (including genetic information or characteristics), sexual orientation, or any other characteristic protected by applicable federal, state, or local laws.

    Apply for this job

    +30d

    Director of Finance Data

    MonzoCardiff, London or Remote (UK)
    airflow, oracle, c++, python

    Monzo is hiring a Remote Director of Finance Data

    We’re on a mission to make money work for everyone.

    We’re waving goodbye to the complicated and confusing ways of traditional banking. 

    With our hot coral cards and get-paid-early feature, combined with financial education on social media and our award winning customer service, we have a long history of creating magical moments for our customers!

    We’re not about selling products - we want to solve problems and change lives through Monzo ❤️

    Hear from our team about what it's like working at Monzo


     

    London/Cardiff/UK Remote | £155,000 - £175,000 + Benefits

    About our Data Discipline:

    We have a strong culture of data-driven decision making across the whole company. And we're great believers in powerful, real-time analytics and empowerment of the wider business. All our data lives in one place and is super easy to use. 

    We work in cross-functional squads where every data practitioner is a member of a central data discipline and fully embedded into a product squad alongside Engineers, Designers, Marketers, Product Managers, Finance Analysts etc.

    What you’ll be working on:

    You’ll work at the intersection of finance and data at Monzo, a role that will be critical to reaching our ambitious goals in this area and vital to future-proofing Monzo in support of our mission to make money work for everyone.

    You’ll be collaborating closely with the finance leadership team to build out a modern, data driven finance function. You will contribute towards both defining and executing our finance strategy which currently includes significant systems and automation investment. This includes scoping and delivering the data work required to support our finance systems (accounting, financial forecasting, treasury and regulatory reporting) as well as continuing to transform our key processes and controls. You’ll also be acting as an interface to Monzo’s engineering discipline across these transformation projects.

    You will lead a data team responsible for building and maintaining reporting datasets required to embed control readiness and operational efficiency to allow Monzo to scale. This work will also be crucial to place finance on the right path in our journey to a potential IPO.

    You’ll be leading a talented team building and analysing data models for commercial insight into our business performance. This includes building relationships within our Financial Planning & Analysis function and across the business to ensure we have the right analytics capabilities for successful business partnering.

    Your day-to-day 

    • Establish yourself as a trusted partner to the finance leadership team with a reputation for getting things done
    • Work closely with finance leadership to define data’s priorities and support our finance strategy
    • Ensure we have the right data capabilities, infrastructure and talent in place to make data-driven commercial decisions and enable robust, reliable financial/regulatory reporting
    • Bring data leadership and rigour to our finance function, enabling us to efficiently manage the bank’s performance and risk
    • Help hire, develop and retain top tier data analysts and analytics engineers
    • Promote a culture of proactive, high quality data insights that can impact the direction of Monzo

    You should apply if:

    What we’re doing here at Monzo excites you!

    • You have multiple years of experience in data, leading analytics teams and building data infrastructure
    • You consider yourself an empathetic leader, have experience managing multiple data individual contributors and data managers, and genuinely enjoy that part of the job
    • You are comfortable getting hands-on as well as taking a step back and thinking strategically and proactively identifying opportunities
    • You have experience working together and collaborating with senior business stakeholders and finance teams
    • Working knowledge of Python, Airflow, dbt, BigQuery and Looker
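To make the stack above concrete, here is a toy sketch of the kind of reporting-dataset build the role describes, using Python's built-in sqlite3 as a stand-in for BigQuery (table and column names are invented for illustration; a real pipeline would typically live in dbt models orchestrated by Airflow):

```python
import sqlite3

# In-memory database standing in for a warehouse like BigQuery.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (
        txn_id INTEGER PRIMARY KEY,
        account_id TEXT,
        amount_pence INTEGER,   -- money as integers to avoid float error
        booked_on TEXT          -- ISO date
    );
    INSERT INTO transactions VALUES
        (1, 'acc_1',  5000, '2024-01-03'),
        (2, 'acc_1', -1200, '2024-01-15'),
        (3, 'acc_2',  8000, '2024-02-01');
""")

# A dbt-style reporting model: monthly net movement per account.
rows = conn.execute("""
    SELECT account_id,
           substr(booked_on, 1, 7) AS month,
           SUM(amount_pence)       AS net_pence
    FROM transactions
    GROUP BY account_id, month
    ORDER BY account_id, month
""").fetchall()

print(rows)
# [('acc_1', '2024-01', 3800), ('acc_2', '2024-02', 8000)]
```

In a real warehouse this query would be a versioned, tested model rather than inline SQL, but the shape of the work (raw events in, governed reporting tables out) is the same.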

     

    Nice to haves

    • You have experience in one or more Finance domains, for example: Financial Reporting, Treasury, Regulatory Reporting or Financial Planning & Analysis, Financial Risk and Investor Relations
    • Experience with financial systems such as NetSuite, SAP, Oracle, Anaplan, Blackline, Vermeg

    The Interview Process:

    Our interview process involves 3 main stages. We promise not to ask you any brain teasers or trick questions!

    • 30 minute recruiter call 
    • 45 minute call with hiring manager 
    • 4 x 1-hour video calls with various team members

    Our average process takes around 3-4 weeks but we will always work around your availability. You will have the chance to speak to our recruitment team at various points during your process but if you do have any specific questions ahead of this please contact us on tech-hiring@monzo.com

    What’s in it for you:

    ✈️ We can help you relocate to the UK 

    ✅ We can sponsor visas

    This role can be based in our London office, but we're open to distributed working within the UK (with ad hoc meetings in London).

    ⏰ We offer flexible working hours and trust you to work enough hours to do your job well, at times that suit you and your team.

    Learning budget of £1,000 a year for books, training courses and conferences

    ➕And much more, see our full list of benefits here



    Equal opportunities for everyone

    Diversity and inclusion are a priority for us and we’re making sure we have lots of support for all of our people to grow at Monzo. At Monzo, we’re embracing diversity by fostering an inclusive environment for all people to do the best work of their lives with us. This is integral to our mission of making money work for everyone. You can read more in our blog, 2022 Diversity and Inclusion Report and 2023 Gender Pay Gap Report.

    We’re an equal opportunity employer. All applicants will be considered for employment without attention to age, ethnicity, religion, sex, sexual orientation, gender identity, family or parental status, national origin, or veteran, neurodiversity or disability status.

    See more jobs at Monzo

    Apply for this job

    +30d

    Senior Business Intelligence Analyst

    InstacartUnited States - Remote
    tableaujiraairflowsqlDesign

    Instacart is hiring a Remote Senior Business Intelligence Analyst

    We're transforming the grocery industry

    At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.

    Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.

    Instacart is a Flex First team

    There’s no one-size-fits-all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work—whether it’s from home, an office, or a favorite coffee shop—while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.

    Overview

    About the Role

    We are looking for an exceptional Senior Business Intelligence Analyst to help build and manage robust data models, leverage data visualization tools to create dashboards, and partner closely with cross-functional teams to design the data solutions needed to enable financial reporting and operations.

     

    About the Team

    You will be joining the Financial Systems Analytics team, which sits within the Finance department at Instacart. This team is responsible for ensuring financial data is accessible, complete, accurate, and timely for downstream consumers. As part of the team, you will be a key contributor in enabling financial data reporting, analysis, and other critical business operations within Instacart.  

     

    About the Job

    • Build and regularly maintain data pipelines and models critical to Instacart’s business operation, including those used for financial reporting and analysis 
    • Partner closely with Accounting, Strategic Finance, Data Science, and other teams across the company to understand their most complex problems and develop effective data solutions, including definition and development of supporting data models and architecture
    • Contribute to the optimization, documentation, testing, and tooling efforts aimed at improving data quality and empowering data consumers across the organization
    • Regularly communicate progress, risks, and completion of projects with stakeholders, teammates, and management
    • Work closely with the Product, Data Engineering, and Business Development teams to stay current on the latest product rollouts and their data and financial impacts
    • Promote and drive a self-service data culture by developing self-service data models, building easy-to-use tools and dashboards, and teaching business users how to use them

     

    About You


    Minimum Requirements:

    • 5+ years of hands-on experience in BI, Data Science, or Data engineering
    • Bachelor’s Degree or equivalent
    • Advanced SQL experience and dashboard building
    • Highly effective written and verbal communication skills
    • Proven ability to prioritize work and deliver finished products on tight deadlines
    • Ability to communicate and coordinate with cross-functional teams, gather information, perform root cause analysis, and recommend solutions to business problems
    • Positive attitude and enthusiasm for Instacart, your team, partners, and stakeholders

     

    Preferred Requirements: 

    • Familiarity with: Snowflake/Databricks/BigQuery or similar data warehouses, DBT/Apache Airflow or similar orchestration tools, Github, and Jira 
    • Familiarity with Visualization Tools: Mode, Tableau, or similar
    • Understanding of financial concepts, common accounting practices, and system solutions
    • Exposure to SOX compliance best practices, including practical applications and experience with ITGCs
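As a hedged illustration of the control-minded data work described above (table and column names are invented; in practice such checks would often be dbt tests or scheduled Airflow tasks), a reconciliation between a source ledger and a derived reporting table might look like:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_ledger (order_id INTEGER, amount REAL);
    CREATE TABLE reporting_orders (order_id INTEGER, amount REAL);
    INSERT INTO source_ledger VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO reporting_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

def reconcile(conn) -> dict:
    """Compare row counts and totals between source and reporting tables."""
    src_n, src_sum = conn.execute(
        "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM source_ledger"
    ).fetchone()
    rpt_n, rpt_sum = conn.execute(
        "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM reporting_orders"
    ).fetchone()
    return {
        "row_count_match": src_n == rpt_n,
        "total_match": abs(src_sum - rpt_sum) < 0.005,  # penny tolerance
    }

checks = reconcile(conn)
print(checks)  # {'row_count_match': True, 'total_match': True}
```

Checks like these, logged and reviewed, are one common way teams demonstrate the completeness and accuracy controls that SOX/ITGC audits look for.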

     

    Instacart provides highly market-competitive compensation and benefits in each location where our employees work. This role is remote and the base pay range for a successful candidate is dependent on their permanent work location. Please review our Flex First remote work policy here.

    Offers may vary based on many factors, such as candidate experience and skills required for the role. Additionally, this role is eligible for a new hire equity grant as well as annual refresh grants. Please read more about our benefits offerings here.

    For US based candidates, the base pay ranges for a successful candidate are listed below.

    CA, NY, CT, NJ
    $147,000–$163,000 USD
    WA
    $140,000–$156,000 USD
    OR, DE, ME, MA, MD, NH, RI, VT, DC, PA, VA, CO, TX, IL, HI
    $135,000–$150,000 USD
    All other states
    $122,000–$135,000 USD

    See more jobs at Instacart

    Apply for this job

    +30d

    Senior Data Scientist (Argentina)

    KaratRemote (Argentina)
    airflowazurec++pythonAWS

    Karat is hiring a Remote Senior Data Scientist (Argentina)

    We're Karat, the world's largest interviewing company.

    Karat helps companies hire top engineering talent with confidence. As an end-to-end hiring solution, we work with organizations to improve the quality, efficiency, and equity of their technical hiring process. Global leaders like Walmart, Atlassian, and Citi rely on Karat to conduct hundreds of assessments and interviews every day through a powerful combination of human expertise and innovative technology. Our mission is to make every interview predictive, fair and enjoyable so we can unlock opportunity -- for everyone. We’re a passionate, focused, human-centric team, and we want you to join us!

    Come join our Artificial Intelligence team

    Our AI team will work across product boundaries to rapidly prototype the AI products and features that will define the future of technical interviewing for Karat, create a competitive moat for our business, and allow our customers to source, hire, and retain the right talent with increased predictivity, less bias, and previously unimaginable efficiency.

    What you will do

    As a Senior Data Scientist on the AI team at Karat, you will drive innovation by extracting key insights from our data and providing actionable recommendations for innovative products that can be built into the Karat platform as we scale. Collaborating with a small team of Engineers, you will innovate on top of the Company’s internal and external data, leverage cutting-edge AI technology like LLMs, carry out complex analyses, build proofs-of-concept and prototypes, generate insights, and build models and products that serve a variety of key end-users in the Interviewing Cloud space.

    • Lead projects in close partnership with engineering teams to increase the signal captured during technical interviews, maximizing the product value provided to clients.
    • Work hands-on in coding, data preprocessing, feature engineering, model development, and exploratory data analysis.
    • Lead projects and mentor teammates working in the Data Science and AI realms across all teams and products.
    • Partner with business stakeholders to understand the Company’s business challenges, develop hypotheses, and propose data-driven solutions to those challenges.
    • Present complex findings and insights to non-technical audiences, influencing decision-makers across the organization.
    • Stay abreast of the latest advancements in Data Science, Machine Learning, and AI, and proactively identify opportunities to apply emerging technologies to Karat’s business needs.
    • Lead the dialogue within Karat about AI, Data Science and Machine Learning.

    The experience you will bring

    • 5+ years of tech industry experience in Data Science, Machine Learning, AI, or other related roles
    • Strong expertise in programming languages such as Python, and proficiency in using relevant data science libraries and frameworks
    • Experience with cloud platforms (e.g., AWS, Azure, GCP), data engineering pipelines and tools such as dbt and Airflow, and model deployment processes
    • Proven track record leading Data Science, Machine Learning or AI projects that have delivered significant business impact
    • Exceptional problem-solving skills with the ability to translate business challenges into data-driven solutions
    • Deep understanding of statistical analysis, data manipulation, and data visualization techniques
    • Excellent communication skills, with the ability to communicate complex technical concepts to both technical and non-technical audiences
    • Strong fluency in written and spoken English
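The model-development loop the role describes can be sketched in miniature with stdlib-only Python: a toy logistic regression trained by gradient descent on invented one-dimensional data (production work would of course use libraries like scikit-learn or PyTorch, and real features rather than these made-up ones):

```python
import math
import random

random.seed(0)

# Invented toy data: label is 1 when the feature exceeds 0.5.
xs = [random.random() for _ in range(200)]
ys = [1 if x > 0.5 else 0 for x in xs]

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

# One-feature logistic regression fitted with plain gradient descent.
w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    gw = gb = 0.0
    for x, y in zip(xs, ys):
        p = sigmoid(w * x + b)
        gw += (p - y) * x
        gb += (p - y)
    w -= lr * gw / len(xs)
    b -= lr * gb / len(xs)

preds = [1 if sigmoid(w * x + b) >= 0.5 else 0 for x in xs]
accuracy = sum(p == y for p, y in zip(preds, ys)) / len(ys)
print(f"train accuracy: {accuracy:.2f}")
```

The loop (prepare data, fit, evaluate, iterate) is the same regardless of scale; the senior part of the job is deciding which problems deserve it and shipping the result into a product.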

    Applicants, please note: submissions not 100% in English will not be considered.


    Legal Employment Statement 

    Karat is a U.S. company. In order to work with individuals outside of the United States, we partner with a Professional Employer Organization (PEO). If hired for this position, your legal employer will be the PEO. This means your payroll, benefits offered, time off, etc., will be offered and managed by them.


    Statement of Inclusivity:

    In keeping with our beliefs and goals, no employee or applicant will face discrimination or harassment based on: race, color, ancestry, national origin, religion, age, gender, marital/domestic partner status, sexual orientation, gender identity or expression, disability status, or veteran status. Above and beyond discrimination and harassment based on “protected categories,” we also strive to prevent other subtler forms of inappropriate behavior (i.e., stereotyping) from ever gaining a foothold in our office. Whether blatant or hidden, barriers to success have no place at Karat.

    We value a diverse workforce: people of color, womxn, and LGBTQIA+ individuals are strongly encouraged to apply.

    If you have a disability or special need that requires accommodation, please let us know at accommodation@karat.com.

    See more jobs at Karat

    Apply for this job

    +30d

    Senior Data Scientist (Mexico)

    KaratRemote (Mexico)
    airflowazurec++pythonAWS

    Karat is hiring a Remote Senior Data Scientist (Mexico)

    We're Karat, the world's largest interviewing company.

    Karat helps companies hire top engineering talent with confidence. As an end-to-end hiring solution, we work with organizations to improve the quality, efficiency, and equity of their technical hiring process. Global leaders like Walmart, Atlassian, and Citi rely on Karat to conduct hundreds of assessments and interviews every day through a powerful combination of human expertise and innovative technology. Our mission is to make every interview predictive, fair and enjoyable so we can unlock opportunity -- for everyone. We’re a passionate, focused, human-centric team, and we want you to join us!

    Come join our Artificial Intelligence team

    Our AI team will work across product boundaries to rapidly prototype the AI products and features that will define the future of technical interviewing for Karat, create a competitive moat for our business, and allow our customers to source, hire, and retain the right talent with increased predictivity, less bias, and previously unimaginable efficiency.

    What you will do

    As a Senior Data Scientist on the AI team at Karat, you will drive innovation by extracting key insights from our data and providing actionable recommendations for innovative products that can be built into the Karat platform as we scale. Collaborating with a small team of Engineers, you will innovate on top of the Company’s internal and external data, leverage cutting-edge AI technology like LLMs, carry out complex analyses, build proofs-of-concept and prototypes, generate insights, and build models and products that serve a variety of key end-users in the Interviewing Cloud space.

    • Lead projects in close partnership with engineering teams to increase the signal captured during technical interviews, maximizing the product value provided to clients.
    • Work hands-on in coding, data preprocessing, feature engineering, model development, and exploratory data analysis.
    • Lead projects and mentor teammates working in the Data Science and AI realms across all teams and products.
    • Partner with business stakeholders to understand the Company’s business challenges, develop hypotheses, and propose data-driven solutions to those challenges.
    • Present complex findings and insights to non-technical audiences, influencing decision-makers across the organization.
    • Stay abreast of the latest advancements in Data Science, Machine Learning, and AI, and proactively identify opportunities to apply emerging technologies to Karat’s business needs.
    • Lead the dialogue within Karat about AI, Data Science and Machine Learning.

    The experience you will bring

    • 5+ years of tech industry experience in Data Science, Machine Learning, AI, or other related roles
    • Strong expertise in programming languages such as Python, and proficiency in using relevant data science libraries and frameworks
    • Experience with cloud platforms (e.g., AWS, Azure, GCP), data engineering pipelines and tools such as dbt and Airflow, and model deployment processes
    • Proven track record leading Data Science, Machine Learning or AI projects that have delivered significant business impact
    • Exceptional problem-solving skills with the ability to translate business challenges into data-driven solutions
    • Deep understanding of statistical analysis, data manipulation, and data visualization techniques
    • Excellent communication skills, with the ability to communicate complex technical concepts to both technical and non-technical audiences
    • Strong fluency in written and spoken English

    Applicants, please note: submissions not 100% in English will not be considered.


    Legal Employment Statement 

    Karat is a U.S. company. In order to work with individuals outside of the United States, we partner with a Professional Employer Organization (PEO). If hired for this position, your legal employer will be the PEO. This means your payroll, benefits offered, time off, etc., will be offered and managed by them.


    Statement of Inclusivity:

    In keeping with our beliefs and goals, no employee or applicant will face discrimination or harassment based on: race, color, ancestry, national origin, religion, age, gender, marital/domestic partner status, sexual orientation, gender identity or expression, disability status, or veteran status. Above and beyond discrimination and harassment based on “protected categories,” we also strive to prevent other subtler forms of inappropriate behavior (i.e., stereotyping) from ever gaining a foothold in our office. Whether blatant or hidden, barriers to success have no place at Karat.

    We value a diverse workforce: people of color, womxn, and LGBTQIA+ individuals are strongly encouraged to apply.

    If you have a disability or special need that requires accommodation, please let us know at accommodation@karat.com.

    See more jobs at Karat

    Apply for this job

    +30d

    Senior Data Scientist

    KaratRemote (United States)
    airflowazurec++pythonAWS

    Karat is hiring a Remote Senior Data Scientist

    We're Karat, the world's largest interviewing company.

    Karat helps companies hire top engineering talent with confidence. As an end-to-end hiring solution, we work with organizations to improve the quality, efficiency, and equity of their technical hiring process. Global leaders like Walmart, Atlassian, and Citi rely on Karat to conduct hundreds of assessments and interviews every day through a powerful combination of human expertise and innovative technology. Our mission is to make every interview predictive, fair and enjoyable so we can unlock opportunity -- for everyone. We’re a passionate, focused, human-centric team, and we want you to join us!

    Come join our Artificial Intelligence team

    Our AI team will work across product boundaries to rapidly prototype the AI products and features that will define the future of technical interviewing for Karat, create a competitive moat for our business, and allow our customers to source, hire, and retain the right talent with increased predictivity, less bias, and previously unimaginable efficiency.

    What you will do

    As a Senior Data Scientist on the AI team at Karat, you will drive innovation by extracting key insights from our data and providing actionable recommendations for innovative products that can be built into the Karat platform as we scale. Collaborating with a small team of Engineers, you will innovate on top of the Company’s internal and external data, leverage cutting-edge AI technology like LLMs, carry out complex analyses, build proofs-of-concept and prototypes, generate insights, and build models and products that serve a variety of key end-users in the Interviewing Cloud space.

    This is a salaried, exempt position. Immigration sponsorship is not available.

    • Lead projects in close partnership with engineering teams to increase the signal captured during technical interviews, maximizing the product value provided to clients.
    • Work hands-on in coding, data preprocessing, feature engineering, model development, and exploratory data analysis.
    • Lead projects and mentor teammates working in the Data Science and AI realms across all teams and products.
    • Partner with business stakeholders to understand the Company’s business challenges, develop hypotheses, and propose data-driven solutions to those challenges.
    • Present complex findings and insights to non-technical audiences, influencing decision-makers across the organization.
    • Stay abreast of the latest advancements in Data Science, Machine Learning, and AI, and proactively identify opportunities to apply emerging technologies to Karat’s business needs.
    • Lead the dialogue within Karat about AI, Data Science and Machine Learning.

    The experience you will bring

    • 5+ years of tech industry experience in Data Science, Machine Learning, AI, or other related roles
    • Strong expertise in programming languages such as Python, and proficiency in using relevant data science libraries and frameworks
    • Experience with cloud platforms (e.g., AWS, Azure, GCP), data engineering pipelines and tools such as dbt and Airflow, and model deployment processes
    • Proven track record leading Data Science, Machine Learning or AI projects that have delivered significant business impact
    • Exceptional problem-solving skills with the ability to translate business challenges into data-driven solutions
    • Deep understanding of statistical analysis, data manipulation, and data visualization techniques
    • Excellent communication skills, with the ability to communicate complex technical concepts to both technical and non-technical audiences
    Individual base pay depends on various factors, in addition to primary work location, such as complexity and responsibility of role, job duties/requirements, and relevant experience and skills. This role may be eligible for additional rewards, including commissions, bonuses, and equity.
    The base salary pay range across the United States for this role is:
    $117,490.80–$173,500 USD

    Statement of Inclusivity:

    In keeping with our beliefs and goals, no employee or applicant will face discrimination or harassment based on: race, color, ancestry, national origin, religion, age, gender, marital/domestic partner status, sexual orientation, gender identity or expression, disability status, or veteran status. Above and beyond discrimination and harassment based on “protected categories,” we also strive to prevent other subtler forms of inappropriate behavior (i.e., stereotyping) from ever gaining a foothold in our office. Whether blatant or hidden, barriers to success have no place at Karat.

    We value a diverse workforce: people of color, womxn, and LGBTQIA+ individuals are strongly encouraged to apply.

    If you have a disability or special need that requires accommodation, please let us know at accommodation@karat.com.

    See more jobs at Karat

    Apply for this job

    +30d

    Data Analyst

    Paramo TechnologiesBuenos Aires, AR Remote
    airflowsqlpython

    Paramo Technologies is hiring a Remote Data Analyst

    We are

    a cutting-edge e-commerce company developing products for our own technological platform. Our creative, smart and dedicated teams pool their knowledge and experience to find the best solutions to meet project needs, while maintaining sustainable and long-lasting results. How? By making sure that our teams thrive and develop professionally. Strong advocates of hiring top talent and letting them do what they do best, we strive to create a workplace that allows for an open, collaborative and respectful culture.

    What you will be doing

    As a Data Quality Analyst, your primary responsibility is to ensure that the data used by the company is accurate, complete, and consistent. High-quality data elevates business decision-making, improves operational efficiency, and enhances customer satisfaction. The role also involves continuous automated and manual testing of data sets used in internal data systems and delivered from those systems.

    As part of your essential functions you will have to:

    • Identify data quality issues and work with other teams to resolve them.
    • Establish data quality standards and metrics: The analyst/engineer would work with stakeholders to define the quality standards that data must meet, and establish metrics to measure data quality.
    • Monitor data quality: The analyst/engineer would monitor data quality on an ongoing basis, using automated tools and manual checks to identify issues.
    • Investigate data quality issues: When data quality issues are identified, the analyst/engineer would investigate them to determine their root cause and work with other teams to resolve them.
    • Develop data quality processes: The analyst/engineer would develop processes to ensure that data is checked for quality as it is collected and processed, and that data quality issues are addressed promptly. This stage includes the use of tools and technology in order to promote efficiency in the data check activities.
    • Train stakeholders: The analyst/engineer would train other stakeholders in the company on data quality best practices, to ensure that everyone is working towards the same quality goals.
    • Promote improvements in the development process in order to ensure the integrity and availability of the data.

    Some other responsibilities are:

    • Provide technical directions and mentor other engineers about data quality.
    • Perform data validity, accuracy, and integrity test across different components of the Data Platform.
    • Build automated test framework and tools to automate the Data Platform services and applications.
    • Automate regression tests and perform functional, integration, and load testing.
    • Articulate issues to developers during meetings and particularly in the daily standups.
    • Triage production-level issues by analyzing the production logs and working with the development team until the issues are resolved.
    • Proactively solve problems and suggest process improvements.
    • Provide test case coverage and defect metrics to substantiate release decisions.
    • Work with the product managers and the team leads to understand the requirements and come up with the relevant test cases and execute them.

    Knowledge and skills you need to have

    • Bachelor’s degree in Computer Science or Information Systems, or 3+ years’ experience with corporate data management systems in high-compliance contexts.
    • 3+ years of experience writing complex SQL on large customer data sets (complex queries).
    • High proficiency in relational or non-relational databases.
    • Knowledgeable about industry data compliance strategies and practices, such as continuous integration, regression testing, and versioning.
    • Familiarity with Big Data environments, dealing with large diverse data sets.
    • Experience with BI projects.
    • Strong scripting experience with any of the scripting languages.
    • Accountability and receptiveness to challenges
    • Excellent communication skills, with the ability to drive & collaborate with cross teams.
    • Fluent in Spanish & English
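For the kind of complex SQL on customer data mentioned above, here is a small example (schema and names are invented) that uses a window function to find each customer's latest order, run through Python's sqlite3 as a stand-in for a production warehouse:

```python
import sqlite3

# Window functions require SQLite >= 3.25, bundled with modern Python.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (
        order_id INTEGER, customer_id TEXT, placed_on TEXT, total REAL
    );
    INSERT INTO orders VALUES
        (1, 'c1', '2024-01-01', 10.0),
        (2, 'c1', '2024-03-01', 30.0),
        (3, 'c2', '2024-02-15', 20.0);
""")

# Latest order per customer via ROW_NUMBER() over a per-customer window.
rows = conn.execute("""
    SELECT customer_id, order_id, total FROM (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id ORDER BY placed_on DESC
               ) AS rn
        FROM orders
    )
    WHERE rn = 1
    ORDER BY customer_id
""").fetchall()

print(rows)  # [('c1', 2, 30.0), ('c2', 3, 20.0)]
```

Queries like this one (dedup by window, then filter) are a common building block in data-quality checks, since they expose duplicates and stale rows directly.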

    To apply for this position, you must be based in Latin America. Applications from other locations will be rejected from this particular selection process.

    Bonus points for the following

    Additional requirements, not essential but "nice to have".

    • Python experience (for data analysis - Airflow)
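Since the bonus item pairs Python with Airflow, here is a minimal, library-free sketch of the core idea behind an Airflow DAG: tasks with declared upstream dependencies, executed in topological order (task names are invented; a real DAG would use `airflow.DAG` and operators):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Airflow-style declared dependencies: each task maps to its upstream tasks.
dag = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load": {"transform"},
    "report": {"load"},
}

def run(name: str, log: list) -> None:
    log.append(name)  # a real task would call an operator's execute()

log: list = []
for task in TopologicalSorter(dag).static_order():
    run(task, log)

print(log)  # ['extract', 'validate', 'transform', 'load', 'report']
```

Airflow adds scheduling, retries, and monitoring on top, but the mental model of a dependency graph executed in order is exactly this.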

    Why choose us?

    We provide the opportunity to be the best version of yourself, develop professionally, and create strong working relationships, whether working remotely or on-site. While offering a competitive salary, we also invest in our people's professional development and want to see you grow and love what you do. We are dedicated to listening to our team's needs and are constantly working on creating an environment in which you can feel at home.

    We offer a range of benefits to support your personal and professional development:

    Benefits:

    • 22 days of annual leave
    • 10 days of public/national holidays
    • Health insurance options
    • Access to online learning platforms
    • On-site English classes in some countries, and many more.

    Join our team and enjoy an environment that values and supports your well-being. If this sounds like the place for you, contact us now!

    See more jobs at Paramo Technologies

    Apply for this job

    +30d

    Senior Software Engineer - Data

    JW PlayerUnited States - Remote
    Full TimeS3EC2agileairflowjavadockerelasticsearchkubernetespythonAWSbackend

    JW Player is hiring a Remote Senior Software Engineer - Data

    About JWP:

    JWP is transforming the Digital Video Economy as a trusted partner for over 40,000 broadcasters, publishers, and video-driven brands through our cutting-edge video software and data insights platform. JWP empowers customers with unprecedented independence and control over their digital video content. Established in 2004 as an open-source video player, JWP has evolved into the premier force driving digital video for businesses worldwide. With a rich legacy of pioneering video technology, JWP customers currently generate 8 billion video impressions/month and 5 billion minutes of videos watched/month. At JWP, everyone shares a passion for revolutionizing the digital video landscape. If you are ready to be a part of a dynamic and collaborative team then join us in shaping the future of video! 

    The Data Engineering Team: 

    At JWP, our data team is a dynamic and innovative team, managing the data lifecycle, from ingestion to processing and analysis, touching every corner of our thriving business ecosystem. Engineers on the team play a pivotal role in shaping the company's direction by making key decisions about our infrastructure, technology stack, and implementation strategies. 

    The Opportunity: 

    We are looking to bring on a Senior Software Engineer to join our Data Engineering team. As an Engineer on the team, you will be diving into the forefront of cutting-edge big data tools and technology. In this role, you will have the opportunity to partner closely with various teams to tackle crucial challenges for one of the world's largest and rapidly expanding video companies. Join us and make an impact at the forefront of digital innovation.

    As a Senior Data Engineer, you will:

    • Contribute to the development of distributed batch and real-time data infrastructure.
    • Mentor and work closely with junior engineers on the team. 
    • Perform code reviews with peers. 
    • Lead small- to medium-sized projects, including writing documentation and tickets for them.
    • Collaborate closely with Product Managers, Analysts, and cross-functional teams to gather insights and drive innovation in data products. 

    Requirements for the role:

    • 5+ years of backend engineering experience and a passion for big data.
    • Expertise with Python or Java and SQL.
    • Familiarity with Kafka.
    • Experience with a range of datastores, from relational to key-value to document.
    • Humility, empathy, and a collaborative spirit that fuels team success.

    Bonus Points:

    • Data engineering experience, specifically with data modeling, warehousing, and building ETL pipelines
    • Familiarity with AWS - in particular, EC2, S3, RDS, and EMR
    • Familiarity with Snowflake
    • Familiarity with Elasticsearch
    • Familiarity with data processing tools like Hadoop, Spark, Kafka, and Flink
    • Experience with Docker, Kubernetes, and application monitoring tools
    • Experience and/or training with agile methodologies
    • Familiarity with Airflow for task and dependency management
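The last bullet mentions Airflow for task and dependency management. As purely illustrative context (not part of the posting, and not a real Airflow DAG, which would use `airflow.DAG` and operator classes), the underlying model can be sketched with the standard library: a DAG maps each task to its upstream dependencies, and tasks run in a dependency-respecting order. The task names here are hypothetical.

```python
from graphlib import TopologicalSorter

# Toy sketch of the task-and-dependency model a scheduler like Airflow
# manages: each task lists the tasks it depends on.
etl_dag = {
    "extract": set(),          # no upstream dependencies
    "transform": {"extract"},  # runs after extract
    "load": {"transform"},     # runs after transform
    "report": {"load"},        # runs after load
}

# A valid execution order that respects every dependency edge.
run_order = list(TopologicalSorter(etl_dag).static_order())
```

For this linear chain the only valid order is extract, transform, load, report; Airflow adds scheduling, retries, and monitoring on top of the same model.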

    Perks of being at JWP, United States

    Our goal is to take care of you and ensure you will be successful in your new role. Your success is our success! 

    As a full time employee, you will qualify for:

    • Private Medical, Vision and Dental Coverage for you and your family
    • Unlimited Paid Time Off
    • Stock Options Purchase Program
    • Quarterly and Annual Team Events
    • Professional Career Development Program and Career Development Progression
    • New Employee Home Office Setup Stipend
    • Monthly Connectivity Stipend
    • Free and discounted perks through JWP's benefit partners
    • Bi-Annual Hack Weeks for those who are interested in using their coding knowledge
    • Fireside chats with individuals throughout JWP

    *Benefits are subject to location and can change at the discretion of the Company. 


    We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

    See more jobs at JW Player

    Apply for this job

    +30d

    Senior Data Engineer, Finance

    InstacartUnited States - Remote
    airflow, sql, Design

    Instacart is hiring a Remote Senior Data Engineer, Finance

    We're transforming the grocery industry

    At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.

    Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.

    Instacart is a Flex First team

    There’s no one-size fits all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work—whether it’s from home, an office, or your favorite coffee shop—while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.

    Overview

     

    At Instacart, our mission is to create a world where everyone has access to the food they love and more time to enjoy it together. Millions of customers every year use Instacart to buy their groceries online, and the Data Engineering team is building the critical data pipelines that underpin all of the myriad of ways that data is used across Instacart to support our customers and partners.

    About the Role 

     

    The Finance data engineering team plays a critical role in defining how financial data is modeled and standardized for uniform, reliable, timely and accurate reporting. This is a high impact, high visibility role owning critical data integration pipelines and models across all of Instacart’s products. This role is an exciting opportunity to join a key team shaping the post-IPO financial data vision and roadmap for the company.

     

    About the Team 

     

    Finance data engineering is part of the Infrastructure Engineering pillar, working closely with accounting, billing & revenue teams to support the monthly/quarterly book close, retailer invoicing and internal/external financial reporting. Our team collaborates closely with product teams to capture critical data needed for financial use cases.

     

    About the Job 

    • You will be part of a team with a large amount of ownership and autonomy.
    • Large scope for company-level impact working on financial data.
    • You will work closely with engineers and both internal and external stakeholders, owning a large part of the process from problem understanding to shipping the solution.
    • You will ship high quality, scalable and robust solutions with a sense of urgency.
    • You will have the freedom to suggest and drive organization-wide initiatives.

     

    About You

    Minimum Qualifications

    • 8+ years of working experience in a Data/Software Engineering role, with a focus on building data pipelines.
    • Expertise with SQL and knowledge of Python.
    • Experience building high quality ETL/ELT pipelines.
    • Past experience with data immutability, auditability, slowly changing dimensions or similar concepts.
    • Experience building data pipelines for accounting/billing purposes.
    • Experience with cloud-based data technologies such as Snowflake, Databricks, Trino/Presto, or similar.
    • Adept at fluently communicating with many cross-functional stakeholders to drive requirements and design shared datasets.
    • A strong sense of ownership, and an ability to balance a sense of urgency with shipping high quality and pragmatic solutions.
    • Experience working with a large codebase on a cross functional team.
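One qualification above names slowly changing dimensions. As hedged, illustrative context only (a hypothetical Type 2 update in plain Python; a real pipeline at this scale would do this in SQL, dbt, or Snowflake, and the function and field names here are invented), the idea is to preserve history by closing out the old row rather than overwriting it:

```python
from datetime import date

def scd2_upsert(dim_rows, key, new_attrs, as_of):
    """Type 2 SCD sketch: if attributes for `key` changed, close the
    current row (set valid_to) and append a new current row."""
    for row in dim_rows:
        if row["key"] == key and row["valid_to"] is None:
            if row["attrs"] == new_attrs:
                return dim_rows      # no change: keep history as-is
            row["valid_to"] = as_of  # close out the old version
    dim_rows.append({"key": key, "attrs": new_attrs,
                     "valid_from": as_of, "valid_to": None})
    return dim_rows

dim = []
scd2_upsert(dim, "store-1", {"region": "NY"}, date(2024, 1, 1))
scd2_upsert(dim, "store-1", {"region": "NJ"}, date(2024, 6, 1))
# Both versions survive: the NY row is closed, the NJ row is current.
```

This immutability is what makes financial reporting auditable: a quarter's books can be reproduced from the rows that were valid at the time.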

     

    Preferred Qualifications

    • Bachelor’s degree in Computer Science, Computer Engineering, Electrical Engineering, or equivalent work experience.
    • Experience with Snowflake, dbt (data build tool), and Airflow.
    • Experience with data quality monitoring/observability, using either custom frameworks or tools like Great Expectations or Monte Carlo.

     

    #LI-Remote

    Instacart provides highly market-competitive compensation and benefits in each location where our employees work. This role is remote and the base pay range for a successful candidate is dependent on their permanent work location. Please review our Flex First remote work policy here.

    Offers may vary based on many factors, such as candidate experience and skills required for the role. Additionally, this role is eligible for a new hire equity grant as well as annual refresh grants. Please read more about our benefits offerings here.

    For US based candidates, the base pay ranges for a successful candidate are listed below.

    CA, NY, CT, NJ
    $221,000 – $245,000 USD
    WA
    $212,000 – $235,000 USD
    OR, DE, ME, MA, MD, NH, RI, VT, DC, PA, VA, CO, TX, IL, HI
    $203,000 – $225,000 USD
    All other states
    $183,000 – $203,000 USD

    See more jobs at Instacart

    Apply for this job