Company Name: Shippeo
Company Url:
Short Pitch: The European leader in real-time transportation visibility
Description:
Headquarter Location: Paris, France
Tags:


Job Url:

Shippeo


Shippeo, the European leader in real-time transportation visibility, helps major shippers and logistics service providers leverage transportation to deliver exceptional customer service and achieve operational excellence. Its Multimodal Visibility Network connects FTL, LTL, parcel, and container transport and integrates 700+ TMS, telematics, and ELD systems through a single API. The Shippeo platform provides instant access to real-time delivery tracking, automates customer processes, and offers unmatched ETA accuracy thanks to a proprietary, industry-leading algorithm developed in-house. Global brands like Carrefour, Schneider Electric, Faurecia, Saint-Gobain, and Eckes Granini trust Shippeo to track more than 10 million shipments per year across 70 countries.

Shippeo is hiring a Remote Senior Communication and Content Manager

Job Description


As a Senior Communication & Content Manager at Shippeo, you will drive and coordinate all aspects of public relations and content creation—strategizing, planning, and producing impactful materials to raise market awareness and empower our teams with targeted, high-quality content.

Your mission is to craft and execute content and PR strategies that educate and guide prospects and clients throughout their buying journey. You’ll develop a deep understanding of our industry’s pain points, value drivers, the buyer’s persona, market trends, and the triggers influencing purchasing decisions.

This job is open to candidates based in the UK, France, or Germany.

 

Key responsibilities

  • Define and maintain Shippeo’s global communication and content strategy, roadmap, and calendar

  • Develop and write a range of content, including articles, press releases, case studies, whitepapers, SEO-optimized website copy, and presentations

  • Coordinate press releases with customers, partners, investors, and agencies, distributing via platforms like Business Wire

  • Align content initiatives with the product marketing strategy, collaborating closely with the Head of Marketing and Product team

  • Manage and publish company social media posts, ensuring alignment with brand guidelines in design, tone, and quality

  • Stay updated on industry trends and thought leadership by monitoring market dynamics, identifying key opinion leaders, and conducting interviews

  • Measure content performance using analytics tools in collaboration with the Marketing Operations team

Qualifications

 

What you bring

  • Native-level English proficiency

  • A bachelor’s degree in Journalism, Public Relations, Marketing, Economics, or a related field

  • 5+ years of experience managing content production or communications

  • Exceptional writing and verbal communication skills with proven storytelling expertise across long- and short-form formats 

  • Strong ownership and a hands-on approach, with a keen ability to deliver beyond expectations while maintaining high standards of quality

  • Highly organized with great time management and prioritization skills, ensuring efficient handling of multiple projects in a fast-paced environment

  • Curious and proactive team player, excelling in clear communication and fostering collaboration to achieve shared goals

  • Experience in B2B software, supply chain, or transportation industries is a plus

See more jobs at Shippeo

Apply for this job


Senior Analytics Engineer

Shippeo, Paris, France (Remote)
agile, tableau, sql, Design, git, python, AWS

Shippeo is hiring a Remote Senior Analytics Engineer

Job Description

Your mission
As a Senior Analytics Engineer, you will drive major analytics initiatives with autonomy and vision. You’ll transform raw data into actionable insights, empowering teams across the organization. With your expertise in data storytelling, you’ll be the go-to person for creating impactful visualizations and presenting findings that shape strategic decisions.

What you'll do

  • Data mastery:

    • Uncover complex relationships within datasets and reveal insights that others might miss

    • Design and manipulate large datasets to build actionable and insightful reports that align with business needs

    • Create elegant, on-trend visualizations that simplify complex data, making it accessible to technical and non-technical stakeholders alike

  • Strategic impact:

    • Lead analytics projects critical to business success, ensuring seamless communication and collaboration across teams

    • Develop production-ready code for large-scale projects, maintaining best practices in documentation and quality

    • Mentor junior analysts, elevating their technical and analytical skills

  • Cross-functional leadership:

    • Act as a trusted advisor by engaging with stakeholders to understand their needs and expectations

    • Keep teams informed and aligned on progress, timelines, and changes with clear and proactive communication

    • Foster innovation by creating opportunities for the team to grow and thrive, all while ensuring top-notch results

Qualifications

Your profile

  • Must-haves:

    • MSc degree (or equivalent) in Data Science, Applied Mathematics, Statistics, or a related field

    • 4+ years of hands-on experience in building and deploying data products in an Agile environment.

    • Strong proficiency in SQL and relational databases.

    • Strong experience with cloud data warehouses (ideally Snowflake)

    • Expertise in a data visualization tool like Tableau or Looker

    • Solid experience with data preparation scripts and creating data marts

    • Proficiency with DBT

    • Knowledge of algorithms, data structures, and version control (Git)

  • Bonus points:

    • Familiarity with cloud platforms (GCP or AWS)

    • Python proficiency and an understanding of programming best practices
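The must-have skills above center on SQL, cloud warehouses, and building data marts for reporting. As a rough illustration only (not Shippeo code), here is a tiny aggregate "mart" built with Python's stdlib sqlite3 standing in for a warehouse such as Snowflake; the table and column names are invented for the example:

```python
import sqlite3

# Hypothetical illustration of the SQL/data-mart skills above, using
# stdlib sqlite3 as a stand-in for a warehouse such as Snowflake.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE shipments (id INTEGER, carrier TEXT, delayed INTEGER);
INSERT INTO shipments VALUES
  (1, 'CarrierA', 0), (2, 'CarrierA', 1), (3, 'CarrierB', 0);
""")

# A tiny "mart": per-carrier delay rate, the kind of aggregate a
# Tableau or Looker dashboard would sit on top of.
conn.execute("""
CREATE TABLE mart_carrier_delays AS
SELECT carrier,
       COUNT(*) AS shipments,
       AVG(delayed) AS delay_rate
FROM shipments
GROUP BY carrier
""")
rows = conn.execute(
    "SELECT * FROM mart_carrier_delays ORDER BY carrier"
).fetchall()
print(rows)  # [('CarrierA', 2, 0.5), ('CarrierB', 1, 0.0)]
```

In a real stack the mart would be a DBT model materialized in the warehouse; the shape of the work (raw table in, documented aggregate out) is the same.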

See more jobs at Shippeo

Apply for this job

Shippeo is hiring a Remote Software Engineer Backend (PHP Symfony)

Job Description

About Us:
Our product is composed of a mission-critical SaaS web platform (API everywhere) with high-traffic inbound/outbound integrations. Our platform handles over 30 million incoming API calls and generates 100+ million external API calls, making scalability our biggest challenge. We enable customers to proactively manage exceptions by analyzing real-time data from various stakeholders.

Team Structure:
Our technical team is divided into three feature teams:

  • Platform: Focuses on building high-availability integrations to fetch and send orders, events, and positions.

  • Solution: Automatically tracks road and maritime orders, detects exceptions, and alerts users so they can respond effectively.

  • Analyze to Improve: Leverages data to offer better insights and improvements for users.

 

The Role:
We’re looking for a Senior Backend Software Engineer to join our team, where you’ll tackle the complexities of scalability and build high-availability integrations. Reporting to the Senior Engineering Manager, you'll focus on improving our technical architecture, developing new features, and ensuring production-readiness at scale.

 

Key Responsibilities:

  • Lead the design, development, testing, deployment, and maintenance of scalable solutions

  • Develop server-side application logic using PHP Symfony in an event-driven environment

  • Mentor peers and guide technical decisions, with a focus on scalability and performance

  • Collaborate closely with product managers and engineers to design performant, scalable APIs

  • Address scalability challenges, handling tens of millions of API calls

  • Implement modern architectural patterns: DDD, CQRS, Event Sourcing, Microservices

  • Document APIs, processes, and database schemas for high-performance systems

  • Ensure optimal performance and availability across all services
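The responsibilities above name CQRS and Event Sourcing in an event-driven environment. As a language-agnostic sketch only (the stack described here is PHP Symfony, not Python), the pattern separates the write side, an append-only event log, from the read side, a projection folded from that log; all names below are invented for illustration:

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Minimal CQRS / Event Sourcing sketch (illustrative only; the actual
# stack described above is PHP Symfony in an event-driven environment).

@dataclass
class Event:
    order_id: str
    kind: str  # e.g. "created", "delivered"

@dataclass
class EventStore:
    events: List[Event] = field(default_factory=list)

    def append(self, event: Event) -> None:
        # Write side: commands only ever append events, never mutate state.
        self.events.append(event)

def project_status(store: EventStore) -> Dict[str, str]:
    """Read side: fold the event log into a current-status view."""
    status: Dict[str, str] = {}
    for e in store.events:
        status[e.order_id] = e.kind  # last event per order wins
    return status

store = EventStore()
store.append(Event("order-1", "created"))
store.append(Event("order-1", "delivered"))
store.append(Event("order-2", "created"))
print(project_status(store))  # {'order-1': 'delivered', 'order-2': 'created'}
```

The appeal at this scale is that reads and writes can be optimized and scaled independently, which is exactly the tension the role's "tens of millions of API calls" implies.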

Qualifications

  • 5+ years of experience in software engineering in a high-paced, large-scale environment

  • Expert in PHP Symfony and in relational and NoSQL databases

  • Experience with message brokers (e.g., RabbitMQ), CI tools, and monitoring stacks like Prometheus, Grafana, Kibana

  • Strong understanding of CQRS, Event Sourcing, and DDD principles

  • Proven experience managing high-scalability challenges

  • Pragmatic problem solver with a "build it, run it, monitor it" mindset

  • Fluent in French and English

 

See more jobs at Shippeo

Apply for this job


(Senior) Data Engineer - France

Shippeo, Paris, France (Remote)
ML, airflow, sql, RabbitMQ, docker, kubernetes, python

Shippeo is hiring a Remote (Senior) Data Engineer - France

Job Description

The Data Intelligence Tribe is responsible for leveraging Shippeo’s data from our large shipper and carrier base to build data products and ML models that provide predictive insights. The tribe typically ensures that users (shippers and carriers alike) can:

  • be accurately alerted in advance of any potential delays or anomalies on their multimodal flows, so that they can proactively anticipate any resulting disruptions

  • extract the data they need, access it directly, or analyze it on the platform to gain actionable insights that increase their operational performance and the quality and compliance of their tracking

  • rely on best-in-class data quality, backed by advanced cleansing and enhancement rules

As a Data Engineer at Shippeo, your objective is to ensure that data is available to and usable by our Data Scientists and Analysts across our data platforms. You will contribute to the construction and maintenance of Shippeo’s modern data stack, which is composed of several technology blocks:

  • Data Acquisition (Kafka, KafkaConnect, RabbitMQ),

  • Batch data transformation (Airflow, DBT),

  • Cloud Data Warehousing (Snowflake, BigQuery),

  • Stream/event data processing (Python, Docker, Kubernetes), plus all the underlying infrastructure that supports these use cases.
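The batch-transformation block above (Airflow, DBT) can be pictured as a transform step that validates and normalizes raw events before loading them into the warehouse. The sketch below is a hypothetical plain-Python stand-in, not Shippeo's pipeline; the field names and cleansing rule are invented for the example:

```python
from datetime import datetime, timezone

# Hypothetical batch-transform step, sketching the Airflow/DBT layer
# above in plain Python: validate and normalize raw tracking events
# before loading them into the warehouse.
def transform(raw_events):
    cleaned = []
    for e in raw_events:
        if not e.get("shipment_id"):  # cleansing rule: drop rows missing an ID
            continue
        cleaned.append({
            # normalize identifiers to a canonical form
            "shipment_id": e["shipment_id"].strip().upper(),
            # convert epoch seconds to an ISO 8601 UTC timestamp
            "ts": datetime.fromtimestamp(e["ts"], tz=timezone.utc).isoformat(),
        })
    return cleaned

raw = [
    {"shipment_id": " shp-1 ", "ts": 1_700_000_000},
    {"shipment_id": "", "ts": 1_700_000_100},  # rejected by the cleansing rule
]
print(transform(raw))
```

In production this logic would live in a DBT model or an Airflow task rather than a free-standing function, but the shape of the work is the same: raw events in, validated and normalized rows out.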

 

Qualifications

Required:

  • You have a degree (MSc or equivalent) in Computer Science.

  • 3+ years of experience as a Data Engineer.

  • Experience building, maintaining, testing and optimizing data pipelines and architectures

  • Programming skills in Python 

  • Advanced working knowledge of SQL, experience working with relational databases and familiarity with a variety of databases.

  • Working knowledge of message queuing and stream processing.

  • Advanced knowledge of Docker and Kubernetes.

  • Advanced knowledge of a cloud platform (preferably GCP).

  • Advanced knowledge of a cloud based data warehouse solution (preferably Snowflake).

  • Experience with Infrastructure as code (Terraform/Terragrunt)

  • Experience building and evolving CI/CD pipelines (GitHub Actions).

Desired: 

  • Experience with Kafka and KafkaConnect (Debezium).

  • Monitoring and alerting with Grafana/Prometheus.

  • Experience working with Apache NiFi.

  • Experience working with workflow management systems such as Airflow.
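"Working knowledge of message queuing and stream processing" can be illustrated with a toy producer/consumer loop. Here Python's stdlib queue.Queue stands in for a broker such as the RabbitMQ or Kafka named in the stack above, and the message shape is invented for the example:

```python
import queue
import threading

# Toy message-queue sketch: queue.Queue stands in for a broker like
# RabbitMQ or Kafka from the stack described above.
broker: "queue.Queue[dict]" = queue.Queue()
positions = []

def consumer() -> None:
    # Stream-processing side: drain messages until a shutdown sentinel.
    while True:
        msg = broker.get()
        if msg is None:  # sentinel: shut the consumer down
            break
        positions.append(msg["position"])
        broker.task_done()

t = threading.Thread(target=consumer)
t.start()
for pos in ["48.85,2.35", "50.11,8.68"]:
    broker.put({"position": pos})  # producer side: publish position updates
broker.put(None)
t.join()
print(positions)  # ['48.85,2.35', '50.11,8.68']
```

A real broker adds durability, acknowledgements, and consumer groups on top of this, but the producer/consumer decoupling shown here is the core idea the qualification asks for.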

See more jobs at Shippeo

Apply for this job

Shippeo is hiring a Remote (Senior) Data Engineer - France (F/M/D)

See more jobs at Shippeo

Apply for this job


(Senior) Data Engineer (F/M/D)

Shippeo, Paris, France (Remote)
ML, airflow, sql, RabbitMQ, docker, kubernetes, python

Shippeo is hiring a Remote (Senior) Data Engineer (F/M/D)


See more jobs at Shippeo

Apply for this job