Data Engineer Remote Jobs

108 Results

1d

Data Engineer

DevOps, SQL, Design, Azure, Python

Sunscrapers Sp. Openings is hiring a Remote Data Engineer

Are you ready to take the challenge?

We’re looking for a Data Engineer to join our team in Warsaw or remotely.

Advance your career with Sunscrapers, a leading force in software development, now expanding its presence in a data-centric environment. Join us in our mission to help clients grow and innovate through a comprehensive tech stack and robust data-related projects. Enjoy a competitive compensation package that reflects your skills and expertise while working in a company that values ambition, technical excellence, trust-based partnerships, and actively supports contributions to R&D initiatives.

As a Data Engineer you’ll play a pivotal role in building a robust data platform that drives engagement solutions for leading healthcare brands. Your work will directly support the development of omnichannel digital health experiences, empowering consumers to access the programs, benefits, and care they need.

In this role, you’ll leverage cutting-edge technologies such as Apache Spark, Databricks, and Delta Tables to design and implement scalable data solutions. You’ll enable actionable insights by integrating diverse data sources, building efficient pipelines, and supporting data-driven decision-making.

Your responsibilities will include:

  • Design and optimize data infrastructure using Python, PySpark, Apache Spark, and Delta Spark,
  • Implement strong data governance frameworks to ensure quality, security, and compliance,
  • Connect Delta Tables to a SQL engine (like Databricks SQL) for efficient querying and analytics,
  • Leverage strong DevOps expertise to deploy and maintain data systems in Azure,
  • Create batch and streaming pipelines for data processing.
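
The batch-pipeline responsibility above follows a familiar read–transform–aggregate–write shape. The sketch below shows that shape in plain Python with no Spark dependency; in the role itself this would be expressed with PySpark DataFrames reading from and writing to Delta Tables, and the column names here are invented for illustration:

```python
import csv
import io
from collections import defaultdict

def run_batch_aggregation(source: io.TextIOBase, sink: io.TextIOBase) -> None:
    """Minimal batch step: read raw events, aggregate per user, write a summary.

    Illustrative only -- a production pipeline on Databricks would use
    PySpark DataFrames over Delta Tables rather than in-memory dicts.
    """
    totals = defaultdict(int)
    for row in csv.DictReader(source):                 # extract
        totals[row["user_id"]] += int(row["events"])   # transform / aggregate
    writer = csv.writer(sink)
    writer.writerow(["user_id", "total_events"])       # load
    for user_id, total in sorted(totals.items()):
        writer.writerow([user_id, total])

# Example run against an in-memory "source":
raw = io.StringIO("user_id,events\nu1,3\nu2,5\nu1,2\n")
out = io.StringIO()
run_batch_aggregation(raw, out)
print(out.getvalue())
```

The same structure scales up: swap the dict for a `groupBy().sum()` on a distributed DataFrame and the file handles for Delta Table paths.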

What's important for us?

  • At least 3 years of professional experience as a data engineer,
  • Undergraduate or graduate degree in Computer Science, Engineering, Mathematics, or similar,
  • Excellent command of spoken and written English (at least C1),
  • Experience in designing data infrastructure using Python, PySpark, Apache Spark, and Delta Spark,
  • Experience in managing production Spark clusters, e.g. in Databricks,
  • Proficiency in SQL and experience with Delta Lake architectures,
  • Great analytical skills and attention to detail - asking questions and proactively searching for answers,
  • Creative problem-solving skills,
  • Great customer service and troubleshooting skills.

You'll score extra points for:

  • Familiarity with CI/CD pipelines and containerization (Docker, Kubernetes),
  • Experience with real-time data tools like Kafka or Azure Event Grid,
  • Experience with BigQuery,
  • Experience in managing data governance in the healthcare space.

What do we offer?

  • Working alongside a talented team that’s changing the image of Poland abroad.
  • Flexible working hours and remote work possibility.
  • Comfortable office in a penthouse in central Warsaw equipped with all the necessary tools to conquer the universe (Macbook, external screen, ergonomic chairs).
  • Fully equipped kitchen with fruit, hot and cold drinks.
  • Multisport card & Private medical care.
  • Culture of good feedback: evaluation meetings, mentoring.
  • We value and appreciate our engineers’ eagerness to learn and improve, so we strongly encourage and support their growth!

Sounds like a perfect place for you? Don’t hesitate to click apply and submit your application today!

See more jobs at Sunscrapers Sp. Openings

Apply for this job

1d

Senior Data Engineer

Sunscrapers Sp. Openings - Warsaw, Masovian Voivodeship, Poland, Remote
Design

Sunscrapers Sp. Openings is hiring a Remote Senior Data Engineer

Are you ready to take the challenge?

We’re looking for a Senior Data Engineer to join our team in Warsaw or remotely.

Advance your career with Sunscrapers, a leading force in software development, now expanding its presence in a data-centric environment. Join us in our mission to help clients grow and innovate through a comprehensive tech stack and robust data-related projects. Enjoy a competitive compensation package that reflects your skills and expertise while working in a company that values ambition, technical excellence, trust-based partnerships, and actively supports contributions to R&D initiatives.

As a Senior Data Engineer you’ll design and implement a system supporting the decision process for a US-based private investment firm. You’ll need to integrate data from multiple systems and sources to enable data insights, machine learning and data-driven decision processes. You’ll build integrated data models, a data warehouse, and data pipelines.

The ideal candidate will be well organized, eager to constantly improve and learn, driven and, most of all - a team player!

Your responsibilities will include:

  • Modeling datasets and schemas for consistency and easy access,
  • Designing and implementing data transformations and data marts,
  • Integrating third-party systems and external data sources into the data warehouse,
  • Building data flows for fetching, aggregation and data modeling using batch pipelines.

What's important for us?

  • At least 5 years of professional experience as a data engineer,
  • Undergraduate or graduate degree in Computer Science, Engineering, Mathematics, or similar,
  • Excellent command of spoken and written English (at least C1),
  • Strong professional experience with Python and SQL,
  • Hands-on experience with dbt and Snowflake,
  • Experience in building data pipelines with Airflow or alternative solutions,
  • Strong understanding of various data modeling techniques, such as the Kimball star schema,
  • Great analytical skills and attention to detail - asking questions and proactively searching for answers,
  • Creative problem-solving skills,
  • Great customer service and troubleshooting skills.
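
The Kimball star schema named in the requirements centers a fact table on surrogate keys into dimension tables, so analytics reduce to fact-to-dimension joins plus aggregation. A minimal, self-contained illustration using sqlite3 (all table and column names are invented for the example; in this role it would be Snowflake tables modeled via dbt):

```python
import sqlite3

# A toy star schema: one fact table keyed to two dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, month TEXT);
    CREATE TABLE fact_sales   (customer_key INTEGER, date_key INTEGER, amount REAL);
    INSERT INTO dim_customer VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO dim_date     VALUES (20240101, '2024-01'), (20240201, '2024-02');
    INSERT INTO fact_sales   VALUES (1, 20240101, 100.0),
                                    (1, 20240201, 50.0),
                                    (2, 20240101, 75.0);
""")

# The canonical star-schema query: join the fact to its dimensions, aggregate.
rows = conn.execute("""
    SELECT c.name, d.month, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    JOIN dim_date d     ON d.date_key = f.date_key
    GROUP BY c.name, d.month
    ORDER BY c.name, d.month
""").fetchall()
print(rows)
# → [('Acme', '2024-01', 100.0), ('Acme', '2024-02', 50.0), ('Globex', '2024-01', 75.0)]
```

The design choice the schema encodes: facts stay narrow and additive, while descriptive attributes live in the dimensions, so new analytical questions need new joins, not new tables.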

You will score extra points for:

  • Expertise in AWS or Azure stack,
  • Experience with infrastructure-as-code tools, like Terraform,
  • DevOps skills to automate deployment and streamline development,
  • Good understanding of Docker, Kubernetes and AWS EKS.

What do we offer?

  • Working alongside a talented team that’s changing the image of Poland abroad.
  • Flexible working hours and remote work possibility.
  • Comfortable office in a penthouse in central Warsaw equipped with all the necessary tools to conquer the universe (Macbook, external screen, ergonomic chairs).
  • Fully equipped kitchen with fruit, hot and cold drinks.
  • Multisport card & Private medical care.
  • Culture of good feedback: evaluation meetings, mentoring.
  • We value and appreciate our engineers’ eagerness to learn and improve, so we strongly encourage and support their growth!


Sounds like a perfect place for you? Don’t hesitate to click apply and submit your application today!

See more jobs at Sunscrapers Sp. Openings

Apply for this job

2d

Senior Data Engineer

Braze - Remote, Ontario
Sales, Bachelor's degree, Airflow, SQL, Design, Kubernetes

Braze is hiring a Remote Senior Data Engineer

At Braze, we have found our people. We’re a genuinely approachable, exceptionally kind, and intensely passionate crew.

We seek to ignite that passion by setting high standards, championing teamwork, and creating work-life harmony as we collectively navigate rapid growth on a global scale while striving for greater equity and opportunity – inside and outside our organization.

To flourish here, you must be prepared to set a high bar for yourself and those around you. There is always a way to contribute: Acting with autonomy, having accountability and being open to new perspectives are essential to our continued success. Our deep curiosity to learn and our eagerness to share diverse passions with others gives us balance and injects a one-of-a-kind vibrancy into our culture.

If you are driven to solve exhilarating challenges and have a bias toward action in the face of change, you will be empowered to make a real impact here, with a sharp and passionate team at your back. If Braze sounds like a place where you can thrive, we can’t wait to meet you.

WHAT YOU’LL DO

Join our dynamic team dedicated to revolutionizing data infrastructure and products for impactful decision-making at Braze. We collaboratively shape data engineering strategies, optimizing data pipelines and architecture to drive business growth and enhance customer experiences.

Responsibilities:

  • Lead the design, implementation, and monitoring of scalable data pipelines and architectures using tools like Snowflake and dbt
  • Develop and maintain robust ETL processes to ensure high-quality data ingestion, transformation, and storage
  • Collaborate closely with data scientists, analysts, and other engineers to design and implement data solutions that drive customer engagement and retention
  • Optimize and manage data flows and integrations across various platforms and applications
  • Ensure data quality, consistency, and governance by implementing best practices and monitoring systems
  • Work extensively with large-scale event-level data, aggregating and processing it to support business intelligence and analytics
  • Implement and maintain data products using advanced techniques and tools
  • Collaborate with cross-functional teams including engineering, product management, sales, marketing, and customer success to deliver valuable data solutions
  • Continuously evaluate and integrate new data technologies and tools to enhance our data infrastructure and capabilities

WHO YOU ARE

The ideal candidate for this role possesses:

  • 5+ years of hands-on experience in data engineering, cloud data warehouses, and ETL development, preferably in a customer-facing environment
  • Proven expertise in designing and optimizing data pipelines and architectures
  • Strong proficiency in advanced SQL and data modeling techniques
  • A track record of leading impactful data projects from conception to deployment
  • Effective collaboration skills with cross-functional teams and stakeholders
  • In-depth understanding of technical architecture and data flow in a cloud-based environment
  • Ability to mentor and guide junior team members on best practices for data engineering and development
  • Passion for building scalable data solutions that enhance customer experiences and drive business growth
  • Strong analytical and problem-solving skills, with a keen eye for detail and accuracy
  • Extensive experience working with and aggregating large event-level data
  • Familiarity with data governance principles and ensuring compliance with industry regulations
  • Preferred, but not required: experience with Kubernetes for container orchestration and Airflow for workflow management

 #LI-Remote

WHAT WE OFFER

Details of these benefit plans will be provided if a candidate receives an offer of employment. Benefits may vary by location.

From offering comprehensive benefits to fostering flexible environments, we’ve got you covered so you can prioritize work-life harmony.

  • Competitive compensation that may include equity
  • Retirement and Employee Stock Purchase Plans
  • Flexible paid time off
  • Comprehensive benefit plans covering medical, dental, vision, life, and disability
  • Family services that include fertility benefits and equal paid parental leave
  • Professional development supported by formal career pathing, learning platforms, and tuition reimbursement
  • Community engagement opportunities throughout the year, including an annual company wide Volunteer Week
  • Employee Resource Groups that provide supportive communities within Braze
  • Collaborative, transparent, and fun culture recognized as a Great Place to Work®

ABOUT BRAZE

Braze is a leading customer engagement platform that powers lasting connections between consumers and brands they love. Braze allows any marketer to collect and take action on any amount of data from any source, so they can creatively engage with customers in real time, across channels from one platform. From cross-channel messaging and journey orchestration to AI-powered experimentation and optimization, Braze enables companies to build and maintain absolutely engaging relationships with their customers that foster growth and loyalty.

Braze is proudly certified as a Great Place to Work® in the U.S., the UK and Singapore. We ranked #3 on Great Place to Work UK’s 2024 Best Workplaces (Large), #3 on Great Place to Work UK’s 2023 Best Workplaces for Wellbeing (Medium), #4 on Great Place to Work’s 2023 Best Workplaces in Europe (Medium), #10 on Great Place to Work UK’s 2023 Best Workplaces for Women (Large), #19 on Fortune’s 2023 Best Workplaces in New York (Large). We were also featured in Built In's 2024 Best Places to Work, U.S. News Best Technology Companies to Work For, and Great Place to Work UK’s 2023 Best Workplaces in Tech.

You’ll find many of us at headquarters in New York City or around the world in Austin, Berlin, Chicago, Jakarta, London, Paris, San Francisco, Singapore, Sydney and Tokyo – not to mention our employees in nearly 50 remote locations.

BRAZE IS AN EQUAL OPPORTUNITY EMPLOYER

At Braze, we strive to create equitable growth and opportunities inside and outside the organization.

Building meaningful connections is at the heart of everything we do, and that includes our recruiting practices. We're committed to offering all candidates a fair, accessible, and inclusive experience – regardless of age, color, disability, gender identity, marital status, maternity, national origin, pregnancy, race, religion, sex, sexual orientation, or status as a protected veteran. When applying and interviewing with Braze, we want you to feel comfortable showcasing what makes you you.

We know that sometimes different circumstances can lead talented people to hesitate to apply for a role unless they meet 100% of the criteria. If this sounds familiar, we encourage you to apply, as we’d love to meet you.

Please see our Candidate Privacy Policy for more information on how Braze processes your personal information during the recruitment process and, if applicable based on your location, how you can exercise any privacy rights.

See more jobs at Braze

Apply for this job

2d

Senior Data Engineer

SOPHiA GENETICS - Rolle, Vaud, Switzerland, Remote Hybrid
Design

SOPHiA GENETICS is hiring a Remote Senior Data Engineer

SOPHiA GENETICS (NASDAQ: SOPH) combines Data-Driven Medicine, Genomics and Radiomics, to ensure that the data used to help patients today will also benefit the patients of tomorrow. To help us achieve our ambitious mission, we are now searching for a Senior Data Engineer with Big Data experience to join our team in Rolle, Switzerland.

Why us:

We believe there is a smarter, more data-driven way to make decisions in healthcare and our AI SaaS Platform enables that. Our platform is a one-of-a-kind globally distributed information system that brings together hospitals and labs to provide data ingestion and processing, analysis and modeling, reporting and intelligence, distribution and sharing of a multitude of complex sources of structured and unstructured data, including genomics, imaging, and clinical data, delivered as a multi-tenant SaaS platform on the cloud. 

As a Senior Data Engineer, you will be part of a team of engineers focused on developing and maintaining our core internal data platform and microservices connecting it to all corners of our business. 

Your mission:

Reporting directly to the Head of Data, the Senior Data Engineer will be responsible for the development of the core data platform, its evolution, as well as designing individual components and services, while collaborating daily with senior technical staff. You will have the opportunity to recommend and drive new initiatives and support our fast-growing organization.

The value add:

  • You will have a key role in the development and evolution of our next-gen multimodal data platform, aided by your manager and your team members. This will include design, implementation, testing, documentation, deployment, maintenance and support of the services and other projects owned by your team.
  • You will be responsible for designing and building individual components and services, as well as contributing to the overall platform architecture. You will collaborate daily with senior technical staff inside and outside of the team.
  • You will be expected to participate in Level 3 Support activities.
  • You will actively participate in code and design reviews with other members of the team 
  • As needed for your projects, you will participate in estimations and risk assessments, and exchange with stakeholders in Product and Project Management and other departments.
  • You will participate in the team’s processes and recurring activities while helping to organize them. You will share your knowledge of best practices in the team and mentor junior team members.

You have demonstrated experience in developing reliable and performant data platforms and services while having a firm grasp on the underlying challenges of releasing a distributed data and software solution to production. You have a basic understanding of the domain of genomics and digital healthcare and care for the impact you can have in this field. You know modern data and software engineering processes, have good knowledge of tools, technologies, and best practices. You seek to exchange regularly and communicate effectively with other members of your team. 

The experience you bring:

  • Master’s degree in Computer Science or Engineering or equivalent professional experience
  • 4-6 years of experience working with distributed data, data lakes, microservice-oriented architectures, and APIs, ideally in the healthcare field.
  • Expertise with Python ETLs in a data processing environment, ideally Databricks
  • Expertise with distributed big data architectures (schemas, transfers, storage, partitioning, performance monitoring and optimization)
  • Solid knowledge of modern scalable database and data lake technologies, especially Spark & SQL
  • Experience with containerization and orchestration technologies, as well as basic DevOps processes and tooling
  • Experience with software engineering best-practices, Agile, CI/CD, Unit & integration testing
  • Good interpersonal and communication skills with a growth mindset
  • Tooling: Azure data services ecosystem, Databricks & Unity Catalog, Terraform, Gitlab
  • Experience with multimodal data spanning digital healthcare, clinical, radiomics and genomics data (a plus)
  • Excellent level of English, French is a plus
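
The requirements above call out unit and integration testing as a core engineering practice. A minimal sketch of what that looks like for a pipeline transform, using only the standard library; the transform, its field names, and the expected shapes are all hypothetical, chosen just to show the pattern:

```python
import unittest

def normalize_record(raw: dict) -> dict:
    """Hypothetical transform under test: normalize a raw sample record
    into the shape a downstream pipeline step expects."""
    return {
        "sample_id": raw["sample_id"].strip().upper(),
        "assay": raw.get("assay", "unknown").lower(),
    }

class NormalizeRecordTest(unittest.TestCase):
    def test_strips_and_uppercases_sample_id(self):
        out = normalize_record({"sample_id": " abc123 ", "assay": "WES"})
        self.assertEqual(out["sample_id"], "ABC123")

    def test_defaults_missing_assay(self):
        out = normalize_record({"sample_id": "X1"})
        self.assertEqual(out["assay"], "unknown")

# Run the suite programmatically (in CI this would be `python -m unittest`).
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(NormalizeRecordTest)
)
print("tests passed:", result.wasSuccessful())
```

Keeping transforms as pure functions like this is what makes them unit-testable before they are wired into Databricks jobs or CI/CD.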

You will be joining an organization with the patient at the heart of every decision and action, driven by purpose as we drive exponential growth. 

  • Opportunity to work on cutting-edge research projects with an immediate global impact 
  • A flexible, friendly and international working environment with a collaborative atmosphere 
  • An exciting company mission that brings together science and technology to directly impact the lives of patients with life threatening illness
  • A fast-growing company with plenty of opportunity for personal growth and development 
  • A hard technical challenge to solve with exciting modern technology - cloud computing, Big Data, DevOps, machine learning 

If you’re a dynamic, self-motivated professional who believes nothing is impossible, love to learn and be curious, we’d love to have you as part of our team!

The Process 

Apply now with your CV and any supporting information. All resumes MUST be in English for a successful review. 

Start Date: ASAP 

Location: Rolle, Switzerland (3 days in office)

Contract: Full-Time, Permanent 

See more jobs at SOPHiA GENETICS

Apply for this job

3d

Senior ETL Data Engineer

AETOS - Remote
Agile, Bachelor's degree, SQL, Salesforce, Oracle, Linux

AETOS is hiring a Remote Senior ETL Data Engineer

Job Description

Aetos LLC is seeking a Senior ETL Data Engineer to join an existing team providing Extract, Transform and Load (ETL) solutions to a government client. The ideal candidate will have 5+ years of experience with Informatica PowerCenter and will be responsible for successful technical delivery and support of Data Warehousing, Data Migration and Transformation, and Business Intelligence projects using an Agile project management methodology. The duties of this role include all aspects of data processing, storage, and ingestion, as well as data analysis and visualization of relative multi-program data.

Qualifications

Responsibilities:  

ETL/Data Warehouse: 

  • Create, maintain, and reverse engineer the Extract, Transform, and Load (ETL) procedures for the Data Warehouse (DW) environment using the Informatica PowerCenter suite. 
  • Perform analysis of RDBMS tables and PowerCenter objects to answer questions pertaining to the data warehouse and the data transformations. 
  • Create and maintain scripts and files that perform various functions on the Informatica integration servers. Use PuTTY (or another SSH client) and Unix text editors to maintain the Linux environment.
  • Maintain data model documentation (ERwin) if changes to the ETL require database changes, and develop, test, and deploy associated DDL.  
  • Manage releases of changes to ETL, scripts, DDL, and scheduling components from Development to Test to Production.   
  • Provide support for the Test, Certification, and Production DW environments.  
  • Maintain Consolidated Data Model (CDM).  
  • Any knowledge of Informatica Cloud Integration Services is a plus.
  • Provide ongoing development and maintenance of financial data marts and enterprise data warehouse using BI best practices, relational structures, dimensional data, structured query language skills, data warehouse and reporting techniques.  
  • Collaborate with end users to identify needs and opportunities for improved delivery of data supporting agency financial operations and mission.   
  • Convert business requirements and high-level data collection needs into well-specified ETL, analyses, reporting and visualizations.  
  • Define and log work using JIRA.  
  • Participate in recurring team meetings (Agile). 

Education & Qualifications Required:   

  • Bachelor's degree in Computer Science, Software Engineering, or commensurate experience in a related field.
  • 5+ years of experience using Informatica PowerCenter at the development level (creating mappings, workflows, etc.).
  • 7+ years of relevant experience in ETL development support and maintenance. 
  • Strong SQL (Oracle) abilities. 
  • Proficiency in shell scripting 
  • Experience in an ETL environment where Salesforce is a source is a plus.
  • Experience in an ETL environment where Control-M is used is a plus.
  • 2+ years of Informatica PowerCenter administration; administration in a Linux environment is a plus.
  • Knowledge or usage of Informatica IICS, EDC, and/or Axon is a plus.
  • Excellent analytical, organizational, verbal, and written communication skills.  
  • Experience in gathering requirements and formulating business metrics for reporting.  
  • Familiarity with Erwin data modeling tool.  
  • Experience working in a Microsoft SharePoint environment.  
  • Experience with AGILE and writing User Stories.  
  • Must be able to present diagnostic, troubleshooting steps and conclusions to varied audiences.  
  • Experience monitoring and maintaining enterprise Data Warehouse platforms and BI reporting services.  
  • Banking and lending domain experience a plus.   

See more jobs at AETOS

Apply for this job

3d

Senior Data Engineer

Gemini - Remote (USA)
Remote-first, Airflow, SQL, Design, CSS, Kubernetes, Python, JavaScript

Gemini is hiring a Remote Senior Data Engineer

About the Company

Gemini is a global crypto and Web3 platform founded by Tyler Winklevoss and Cameron Winklevoss in 2014. Gemini offers a wide range of crypto products and services for individuals and institutions in over 70 countries.

Crypto is about giving you greater choice, independence, and opportunity. We are here to help you on your journey. We build crypto products that are simple, elegant, and secure. Whether you are an individual or an institution, we help you buy, sell, and store your bitcoin and cryptocurrency. 

At Gemini, our mission is to unlock the next era of financial, creative, and personal freedom.

In the United States, we have a flexible hybrid work policy for employees who live within 30 miles of our office headquartered in New York City and our office in Seattle. Employees within the New York and Seattle metropolitan areas are expected to work from the designated office twice a week, unless there is a job-specific requirement to be in the office every workday. Employees outside of these areas are considered part of our remote-first workforce. We believe our hybrid approach for those near our NYC and Seattle offices increases productivity through more in-person collaboration where possible.

The Department: Data

The Role: Senior Data Engineer

As a member of our data engineering team, you'll deliver high-quality work while solving challenges that impact all or part of the team's data architecture. You'll stay current with recent advances in the big data space and provide solutions for large-scale applications that align with the team's long-term goals. Your work will help resolve complex problems by identifying root causes, documenting solutions, and implementing operational excellence (data auditing, validation, automation, maintainability). Communicating your insights with leaders across the organization is paramount to success.

Responsibilities:

  • Design, architect and implement best-in-class Data Warehousing and reporting solutions
  • Lead and participate in design discussions and meetings
  • Mentor data engineers and analysts
  • Design, automate, build, and launch scalable, efficient and reliable data pipelines into production using Python
  • Build real-time data and reporting solutions
  • Design, build and enhance dimensional models for Data Warehouse and BI solutions
  • Research new tools and technologies to improve existing processes
  • Develop new systems and tools to enable the teams to consume and understand data more intuitively
  • Partner with engineers, project managers, and analysts to deliver insights to the business
  • Perform root cause analysis and resolve production and data issues
  • Create test plans, test scripts and perform data validation
  • Tune SQL queries, reports and ETL pipelines
  • Build and maintain data dictionary and process documentation
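
Among the responsibilities above is creating test plans and performing data validation after loads. A minimal sketch of such a post-load check in plain Python (dataset shapes and field names are invented for the example; a production version would run against warehouse tables):

```python
def validate_load(source_rows, target_rows, required_fields):
    """Return a list of human-readable validation failures (empty = pass).

    Two illustrative checks: source/target row-count parity, and
    non-null required fields in the target.
    """
    failures = []
    if len(source_rows) != len(target_rows):
        failures.append(
            f"row count mismatch: {len(source_rows)} vs {len(target_rows)}"
        )
    for i, row in enumerate(target_rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                failures.append(f"row {i}: missing required field '{field}'")
    return failures

source = [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}]
target = [{"id": 1, "amount": 10}, {"id": 2, "amount": None}]
print(validate_load(source, target, ["id", "amount"]))
# → ["row 1: missing required field 'amount'"]
```

Checks like these are typically wired into the pipeline itself so a failed validation blocks the load rather than surfacing later in reports.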

Minimum Qualifications:

  • 5+ years experience in data engineering with data warehouse technologies
  • 5+ years experience in custom ETL design, implementation and maintenance
  • 5+ years experience with schema design and dimensional data modeling
  • Experience building real-time data solutions and processes
  • Advanced skills with Python and SQL are a must
  • Experience with one or more MPP databases (Redshift, BigQuery, Snowflake, etc.)
  • Experience with one or more ETL tools (Informatica, Pentaho, SSIS, Alooma, etc.)
  • Strong computer science fundamentals including data structures and algorithms
  • Strong software engineering skills in any server-side language, preferably Python
  • Experienced in working collaboratively across different teams and departments
  • Strong technical and business communication

Preferred Qualifications:

  • Kafka, HDFS, Hive, Cloud computing, machine learning, text analysis, NLP & Web development experience is a plus
  • Experience with Continuous integration and deployment
  • Knowledge and experience of financial markets, banking or exchanges
  • Web development skills with HTML, CSS, or JavaScript
It Pays to Work Here

The compensation & benefits package for this role includes:
  • Competitive starting salary
  • A discretionary annual bonus
  • Long-term incentive in the form of a new hire equity grant
  • Comprehensive health plans
  • 401K with company matching
  • Paid Parental Leave
  • Flexible time off

Salary Range: The base salary range for this role is between $136,000 and $170,000 in the State of New York, the State of California and the State of Washington. This range is not inclusive of our discretionary bonus or equity package. When determining a candidate’s compensation, we consider a number of factors including skillset, experience, job scope, and current market data.

At Gemini, we strive to build diverse teams that reflect the people we want to empower through our products, and we are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. Equal Opportunity is the Law, and Gemini is proud to be an equal opportunity workplace. If you have a specific need that requires accommodation, please let a member of the People Team know.

#LI-AA1

Apply for this job

3d

Sr Data Engineer

BeyondTrust - Remote, United States

BeyondTrust is hiring a Remote Sr Data Engineer

BeyondTrust is a place where you can bring your purpose to life through the work that you do, creating a safer world through our cyber security SaaS portfolio.

Our culture of flexibility, trust, and continual learning means you will be recognized for your growth, and for the impact you make on our success. You will be surrounded by people who challenge, support, and inspire you to be the best version of yourself.

The Role

As a Senior Data Engineer at BeyondTrust, you will help build and enhance our state-of-the-art data lakehouse, which is responsible for consuming billions of events each day. With security and computational efficiency top of mind, you will help cut through the noise and create valuable, actionable insights from our vast quantity of data to deliver immediate value to our customers. Our engineers are problem solvers at heart and tackle both business problems and technical engineering challenges alike, with a focus on how & why before solving.

What You’ll Do

  • Optimize data workloads at a software level by improving processing efficiency.
  • Develop new data processing routes to remove redundancy or reduce transformation overhead.
  • Monitor and maintain existing data workflows.
  • Use observability best practices to ensure pipeline performance.
  • Perform complex transformations on both real-time and batch data assets.
  • Create new ML/engineering solutions to tackle existing issues in the cybersecurity space.
  • Leverage CI/CD best practices to effectively develop and release source code.

What You’ll Bring

  • Strong programming and technology knowledge in cloud data processing.
  • Previous experience working in matured data lakes.
  • Strong data modelling skills for analytical workloads.
  • Spark (or equivalent parallel processing framework) experience is needed; existing Databricks knowledge is a plus.
  • Interest and aptitude for cybersecurity; interest in identity security is highly preferred.
  • Technical understanding of underlying systems and computation minutiae.
  • Experience working with distributed systems and data processing on object stores.
  • Ability to work autonomously.

See more jobs at BeyondTrust

Apply for this job

Databricks is hiring a Remote Big Data engineer


See more jobs at Databricks

Apply for this job

Tiger Analytics is hiring a Remote Senior Data Engineer

Tiger Analytics is a fast-growing advanced analytics consulting firm. Our consultants bring deep expertise in Data Science, Machine Learning and AI. We are the trusted analytics partner for multiple Fortune 500 companies, enabling them to generate business value from data. Our business value and leadership have been recognized by various market research firms, including Forrester and Gartner. We are looking for top-notch talent as we continue to build the best global analytics consulting team in the world.

As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data pipelines on cloud infrastructure. You will work closely with cross-functional teams to support data analytics, machine learning, and business intelligence initiatives.

  • Bachelor’s degree in Computer Science or similar field
  • 8+ years of experience in a Data Engineer role
  • Experience with relational SQL and NoSQL databases like MySQL, Postgres
  • Strong analytical skills and advanced SQL knowledge
  • Development of ETL pipelines using Python & SQL
  • Good experience with Customer Data Platforms (CDP)
  • Experience in SQL optimization and performance tuning
  • Experience with data modeling and building high-volume ETL pipelines.
  • Working experience with any cloud platform
  • Experience with Google Tag Manager and Power BI is a plus
  • Experience with object-oriented/object function scripting languages: Python, Java, Scala, etc.
  • Experience extracting/querying/joining large data sets at scale
  • A desire to work in a collaborative, intellectually curious environment
  • Strong communication and organizational skills
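The SQL optimization and performance-tuning experience asked for above can be shown in miniature. This is an illustrative sketch using Python's built-in sqlite3; the table, data, and index names are hypothetical, and production engines print different plan text:

```python
import sqlite3

# Hypothetical orders table used to illustrate index-driven tuning.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

# Without an index, filtering on customer_id scans the whole table.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()
print(plan[0][3])  # e.g. 'SCAN orders'

# Adding an index turns the full scan into an index search.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()
print(plan[0][3])  # e.g. 'SEARCH orders USING INDEX idx_orders_customer (customer_id=?)'
```

The workflow carries over to larger engines: inspect the plan, add or adjust an index, and confirm the scan becomes an index search.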

This position offers an excellent opportunity for significant career development in a fast-growing and challenging entrepreneurial environment with a high degree of individual responsibility.

See more jobs at Tiger Analytics

Apply for this job

6d

Data Platform Engineer - Remote

Two95 International, Pennsylvania, United States, Remote
3 years of experience, 10 years of experience, sql

Two95 International is hiring a Remote Data Platform Engineer - Remote

Title – Data Platform Engineer

Position – 6+ months contract to hire

Location – Remote

Rate – $Open (best possible)

  • 8-10 years of experience with IBM i server administration and DB2 for i platform (DB2/400) required.
  • At least 5 years of hands-on experience with DB2 for i as a database administrator (DBA).
  • Minimum of 3 years of experience with relational database management systems (RDBMS) and OLTP/OLAP concepts.
  • Proficiency in IBM i navigation, including ACS, Schemas, Run SQL, plan cache, stored procedures, and creating user-defined functions (UDFs).
  • Strong skills in writing complex SQL queries, including joins, sub-selects, and with statements.
  • Proven experience managing high-volume, high-velocity application/data environments.
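The "complex SQL queries, including joins, sub-selects, and with statements" requirement above can be sketched in one query. sqlite3 stands in for DB2 for i here, since the WITH clause, join, and sub-select shown are portable ANSI SQL; the tables and data are hypothetical:

```python
import sqlite3

# Illustrative only: DB2 for i isn't available here, so sqlite3 stands in.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL);
INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO orders VALUES (1, 1, 50.0), (2, 1, 150.0), (3, 2, 75.0);
""")

rows = conn.execute("""
WITH customer_totals AS (          -- common table expression ("with statement")
    SELECT customer_id, SUM(total) AS spend
    FROM orders
    GROUP BY customer_id
)
SELECT c.name, t.spend
FROM customers AS c
JOIN customer_totals AS t ON t.customer_id = c.id   -- join
WHERE t.spend > (SELECT AVG(total) FROM orders)     -- sub-select
ORDER BY t.spend DESC
""").fetchall()
print(rows)  # → [('Acme', 200.0)]
```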

Note: If interested, please send your updated resume with your rate requirement, your contact details, and a suitable time to reach you. If you know anyone in your network who would be a perfect match for this job, we would appreciate you forwarding this posting to them, with a copy to us.

We look forward to hearing from you at the earliest!

See more jobs at Two95 International

Apply for this job

8d

Lead Data Engineer

Extreme Reach, London, England, United Kingdom, Remote Hybrid
DevOPS, agile, Design

Extreme Reach is hiring a Remote Lead Data Engineer

XR is a global technology platform powering the creative economy. Its unified platform moves creative and productions forward, simplifying the fragmentation and delivering global insights that drive increased business value. XR operates in 130 countries and 45 languages, serving the top global advertisers and enabling $150 billion in video ad spend around the world. More than half a billion creative brand assets are managed in XR’s enterprise platform. 

Above all, we are a supportive and collaborative culture dedicated to DEI. We are caring, dedicated, positive, genuine, trustworthy, experienced, passionate and fun people with loyalty to our customers and our fellow teammates. It is our belief that the better we work together to help our clients achieve their goals, the more successful XR will be.  

The Opportunity 

We are looking for a motivated and results-driven Lead Data Engineer to join our Development Team, responsible for designing and managing the infrastructure and data systems that power analytics and business intelligence within the organization, including, but not limited to, Lake House architecture and solution development, performance optimization, data feeds development, and opportunities to contribute to Machine Learning & AI initiatives. This role blends advanced technical skills with leadership capabilities to drive the development and integration of solutions at scale. You will contribute to bringing the product up to a modern cloud and tool stack. You will play a crucial role in collaborating with and managing cross-functional relationships to ensure seamless integration and alignment of data initiatives, and in translating business requirements into technical solutions. 

Job Responsibilities: 

  • Lead the design and implementation of data lake architecture based on a variety of technologies such as Databricks, Exasol, and S3. 
  • Take accountability and ownership for deploying technical frameworks, processes and best practices which allow engineers of all levels to build extensible, performant and maintainable solutions. 
  • Manage cross-team and stakeholder relationships to drive collaboration and meet shared goals. 
  • Design and implement scalable, reliable, and high-performance data architectures to support large-scale data processing and machine learning workflows. 
  • Architect and develop end-to-end data pipelines, including data extraction, transformation, and loading (ETL) processes. 
  • Optimize data pipelines and storage solutions for performance, scalability, and cost efficiency.  
  • Design the process for monitoring and troubleshooting of data infrastructure issues, identifying performance bottlenecks and ensuring high uptime. 
  • Utilize containerized, serverless architecture patterns in system design; 
  • Promote and drive automated testing, DevOps & CI/CD methodologies to work successfully within an agile environment. 
  • Ensure that data governance, privacy, and security policies are adhered to, in compliance with industry standards and regulations (e.g., GDPR, etc). 
  • Lead, mentor, and support a team of data engineers, providing guidance and support for their technical development. 
  • Collaborate with global cross-functional teams including DevOps, security teams and business stakeholders. 
  • Collaborate with data scientists and machine learning engineers to ensure seamless integration with AI/ML projects. 
  • Stay current with emerging data technologies and trends, evaluating and implementing new tools, frameworks, and platforms to improve the data engineering workflows. 
  • Foster a culture of continuous improvement, encouraging innovation and the adoption of modern tools and best practices in data engineering. 
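A small building block behind the monitoring and fault-tolerance responsibilities above, sketched in stdlib Python. The retry policy, logger name, and `load_batch` function are illustrative, not part of XR's actual stack:

```python
import functools
import logging
import time

logging.basicConfig(level=logging.WARNING)
log = logging.getLogger("pipeline")

def with_retries(attempts=3, delay=0.1):
    """Retry a flaky pipeline step, logging each failure for observability."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception as exc:
                    log.warning("%s failed (attempt %d/%d): %s",
                                fn.__name__, attempt, attempts, exc)
                    if attempt == attempts:
                        raise  # surface the final failure to alerting
                    time.sleep(delay)
        return wrapper
    return decorator

calls = {"n": 0}

@with_retries(attempts=3, delay=0)
def load_batch():
    """Hypothetical load step that fails twice before succeeding."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient store error")
    return "loaded"

print(load_batch())  # → 'loaded', after two logged retries
```

In a real platform the logged warnings would feed a metrics/alerting system rather than stderr, but the shape (bounded retries, every failure observable, final failure escalated) is the same.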

  • MS/BS in Computer Science or related background is essential; 
  • Significant hands-on experience (7+ years) in data engineering, with 2+ years in lead or senior technical role; 
  • Proficiency with Python and SQL is essential; 
  • Proficiency with Spark is essential;  
  • Proven track record of successfully managing large-scale data architectures; 
  • Strong expertise in designing and managing data lakes, data warehouses, data modelling, ETL processes, and database design; 
  • Strong leadership and mentoring skills to guide and develop junior team members; 
  • Experience with shell scripting, system diagnostic and automation tooling; 
  • Experience with various database technologies (MS SQL, Postgres, MySQL) including database performance optimization (e.g., indexing, query optimization); 
  • Experience with No-SQL technologies; 
  • Experience with cloud services (AWS); 
  • Proven experience in implementing DevOps practices; 
  • Experience implementing data quality and code quality practices; 
  • Experience with various programming languages (Java, Scala, Javascript, etc) is beneficial; 
  • Proficiency with infrastructure as code, code automation, CI/CD is beneficial; 
  • Experience in data governance and compliance is beneficial; 
  • Experience with Docker and containers is desirable; 
  • Experience in visualization tools such as Power BI is desirable; 
  • Excellent interpersonal skills with the ability to collaborate and communicate effectively across diverse teams; 
  • Strong problem solving, organization and analytical skills; 
  • Ability to manage competing priorities, handle complexity, and drive projects to completion; 
  • Keen eye for detail. 

See more jobs at Extreme Reach

Apply for this job

11d

Senior Data Engineer

Nile Bits, Cairo, Egypt, Remote
agile, airflow, sql, Design, docker, linux, python, AWS

Nile Bits is hiring a Remote Senior Data Engineer

Job Description

  • Designing and implementing core functionality within our data pipeline in order to support key business processes
  • Shaping the technical direction of the data engineering team
  • Supporting our Data Warehousing approach and strategy
  • Maintaining our data infrastructure so that our jobs run reliably and at scale
  • Taking responsibility for all parts of the data ecosystem, including data governance, monitoring and alerting, data validation, and documentation
  • Mentoring and upskilling other members of the team

Qualifications

  • Experience building data pipelines and/or ETL processes
  • Experience working in a Data Engineering role
  • Confident writing performant and readable code in Python, building upon the rich Python ecosystem wherever it makes sense to do so.
  • Good software engineering knowledge & skills: OO programming, design patterns, SOLID design principles and clean code
  • Confident writing SQL and good understanding of database design.
  • Experience working with web APIs.
  • Experience leading projects from a technical perspective
  • Knowledge of Docker, shell scripting, working with Linux
  • Experience with a cloud data warehouse
  • Experience in managing deployments and implementing observability and fault tolerance in cloud-based infrastructure (e.g. CI/CD, Infrastructure as Code, container-based infrastructure, auto-scaling, monitoring and alerting)
  • Pro-active with a self-starter mindset; able to identify elegant solutions to difficult problems and able to suggest new and creative approaches.
  • Analytical, problem-solving and an effective communicator; leveraging technology and subject matter expertise in the business to accelerate our roadmap.
  • Able to lead technical discussions, shape the direction of the team, identify opportunities for innovation and improvement
  • Able to lead and deliver projects, ensuring stakeholders are kept up-to-date through regular communication
  • Willing to support the rest of the team when necessary, sharing knowledge and best practices, documenting design decisions, etc.
  • Willing to step outside your comfort zone to broaden your skills and learn new technologies.
  • Experience working with open source orchestration frameworks like Airflow or data analytics tools such as dbt
  • Experience with AWS services or those of another cloud provider
  • Experience with Snowflake
  • Good understanding of Agile

See more jobs at Nile Bits

Apply for this job

11d

ML Data Engineer

ML, Mid Level, Full Time, agile, sql, Design, azure

Titan Healthcare Management Solutions is hiring a Remote ML Data Engineer


See more jobs at Titan Healthcare Management Solutions

Apply for this job

12d

Sr Big Data Engineer

Ingenia Agency, Mexico - Remote
Bachelor's degree, sql, oracle, python

Ingenia Agency is hiring a Remote Sr Big Data Engineer

At Ingenia Agency we’re looking for a Data Engineer to join our team.

Responsible for creating and sustaining pipelines that allow for the analysis of data.

What will you be doing?

  • Conceptualizing and generating infrastructure that allows data to be accessed and analyzed in a global setting.
  • Loading raw data from our SQL Servers, manipulating it, and saving it into Google Cloud databases.
  • Detecting and correcting errors in data and writing scripts to clean such data up.
  • Working with scientists and clients in the business to gather requirements and ensure easy flow of data.

What are we looking for?

  • Age indifferent.
  • Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or related field.
  • Master's degree in a relevant field is advantageous.
  • Proven experience as a Data Engineer.
  • Expert proficiency in Python, ETL and SQL.
  • Familiarity with Google Cloud/ AWS/Azure or suitable equivalent.
  • Excellent analytical and problem-solving skills.
  • A knack for independent and group work.
  • Knowledge of Oracle and MDM Hub.
  • Capacity to successfully manage a pipeline of duties with minimal supervision.
  • Advanced English.
  • Be Extraordinary!

What are we offering?

  • Competitive salary
  • Contract for specific period of time
  • Law benefits:
    • 10 days of vacation upon completing the first year
    • IMSS
  • Additional benefits:
    • Contigo Membership (Insurance of minor medical expenses)
      • Personal accident policy.
      • Funeral assistance.
      • Dental and visual health assistance.
      • Emotional wellness.
      • Benefits & discounts.
      • Network of medical services and providers with a discount.
      • Medical network with preferential prices.
      • Roadside assistance with preferential price, among others.
    • 3 special half-day permits per year for personal errands or appointments
    • Half day off for birthdays
    • 5 days of additional vacations in case of marriage
    • 50% scholarship for language courses at the Anglo
    • Partial scholarship for graduate or master's studies at the Tec. de Mty.
    • Agreement with a ticketing company for preferential rates on entertainment events.



See more jobs at Ingenia Agency

Apply for this job

12d

Sr Data Engineer GCP

Ingenia Agency, Mexico - Remote
Bachelor's degree, 5 years of experience, 3 years of experience, airflow, sql, api, python

Ingenia Agency is hiring a Remote Sr Data Engineer GCP


At Ingenia Agency we’re looking for a Sr Data Engineer to join our team.

Responsible for creating and sustaining pipelines that allow for the analysis of data.

What will you be doing?

  • Sound understanding of Google Cloud Platform.
  • Should have worked with BigQuery and Workflow or Composer.
  • Should know how to reduce BigQuery costs by reducing the amount of data processed by queries.
  • Should be able to speed up queries by using denormalized data structures, with or without nested repeated fields.
  • Exploring and preparing data using BigQuery.
  • Experience delivering artifacts: Python scripts, Dataflow components, SQL, Airflow, and Bash/Unix scripting.
  • Building and productionizing data pipelines using Dataflow.
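The denormalization point above is easiest to see with a small mock of BigQuery's nested, repeated fields in plain Python. The field names are hypothetical; BigQuery itself would express this with STRUCT/ARRAY columns and UNNEST:

```python
# BigQuery favors denormalized tables with nested, repeated fields so that
# related records live in one row and no join is needed at query time.
# A plain-Python mimic of that shape:
orders = [
    {"order_id": 1, "customer": "Acme",
     "items": [{"sku": "A1", "qty": 2, "price": 10.0},
               {"sku": "B2", "qty": 1, "price": 5.0}]},
    {"order_id": 2, "customer": "Globex",
     "items": [{"sku": "A1", "qty": 3, "price": 10.0}]},
]

# Equivalent of UNNEST(items) + aggregate: each order's total comes from its
# own repeated field, touching no other table and scanning no extra columns.
totals = {o["customer"]: sum(i["qty"] * i["price"] for i in o["items"])
          for o in orders}
print(totals)  # → {'Acme': 25.0, 'Globex': 30.0}
```

The same data modeled relationally would need an `order_items` table and a join; the nested form trades storage redundancy for fewer bytes processed per query, which is exactly the cost lever the bullet points describe.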

What are we looking for?

  • Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or related field.
  • Age indifferent.
  • 3 to 5 years of experience in GCP is required.
  • Must have excellent GCP, BigQuery and SQL skills.
  • Should have at least 3 years of experience with BigQuery and Dataflow, and experience with Python and Google Cloud SDK/API scripting to create reusable frameworks.
  • Candidate should have strong hands-on experience in PowerCenter.
  • In depth understanding of architecture, table partitioning, clustering, type of tables, best practices.
  • Proven experience as a Data Engineer, Software Developer, or similar.
  • Expert proficiency in Python, R, and SQL.
  • Candidates with Google Cloud certification will be preferred
  • Excellent analytical and problem-solving skills.
  • A knack for independent and group work.
  • Capacity to successfully manage a pipeline of duties with minimal supervision.
  • Advanced English.
  • Be Extraordinary!

What are we offering?

  • Competitive salary
  • Law benefits:
    • 10 days of vacation upon completing the first year
    • IMSS
  • Additional benefits:
    • Contigo Membership (Insurance of minor medical expenses)
      • Personal accident policy.
      • Funeral assistance.
      • Dental and visual health assistance.
      • Emotional wellness.
      • Benefits & discounts.
      • Network of medical services and providers with a discount.
      • Medical network with preferential prices.
      • Roadside assistance with preferential price, among others.
    • 3 special half-day permits per year for personal errands or appointments
    • Half day off for birthdays
    • 5 days of additional vacations in case of marriage
    • 50% scholarship for language courses at the Anglo
    • Partial scholarship for graduate or master's studies at the Tec. de Mty.
    • Agreement with a ticketing company for preferential rates on entertainment events.

See more jobs at Ingenia Agency

Apply for this job

12d

Data Engineer

In All Media Inc, Argentina - Remote
DevOPS, mongodb, api, docker, kubernetes, python, AWS, backend

In All Media Inc is hiring a Remote Data Engineer

Data Analyst Engineer

The objective of this role is to deliver the solutions your business partners need to grow the business, e.g. an application, an API, a rules engine, or a data pipeline. You know what it takes to deliver the best possible result within the given deadline.

Deliverables:

  • Conversion tool: deliver recommendations through the tool, resolve technical debt, and maintain and add net-new recommendations. These recommendations can be measured directly by count and impact (for example, how many more features were adopted).
  • CS: simplify data on active points and deliver the best recommendations, with more net-new ones.


    Requirements:

    Technologies:

      • Backend: Python, Flask, SQLAlchemy, PyMySQL, MongoDB, internal SOA libraries, healthcheck tools, tracing tools
      • DevOps: GitLab CI/CD, Docker, Kubernetes, AWS

    We're seeking a talented Software Engineer Level 2 to join our dynamic team responsible for developing a suite of innovative tools. These tools are essential in automating and streamlining communication processes with our clients. If you are passionate about solving complex problems and improving user experiences, we want you on our team.

    See more jobs at In All Media Inc

    Apply for this job

    13d

    Data Engineer

    Charlotte Tilbury, London, England, United Kingdom, Remote Hybrid
    terraform, airflow, sql, Design, git, python, AWS, javascript

    Charlotte Tilbury is hiring a Remote Data Engineer

    About Charlotte Tilbury Beauty

    Founded by British makeup artist and beauty entrepreneur Charlotte Tilbury MBE in 2013, Charlotte Tilbury Beauty has revolutionised the face of the global beauty industry by de-coding makeup applications for everyone, everywhere, with an easy-to-use, easy-to-choose, easy-to-gift range. Today, Charlotte Tilbury Beauty continues to break records across countries, channels, and categories and to scale at pace.

    Over the last 10 years, Charlotte Tilbury Beauty has experienced exceptional growth and is one of the most talked about brands in the beauty industry and beyond. It has become a global sensation across 50 markets (and growing), with over 2,300 employees globally who are part of the Dream Team making the magic happen.

    Today, Charlotte Tilbury Beauty is a truly global business, delivering market-leading growth, innovative retail and product launches fuelled by industry-leading tech — all with an internal culture of embracing challenges, disruptive thinking, winning together, and sharing the magic. The energy behind the brand is infectious, and as we grow, we are always looking for extraordinary talent who want to be part of our success and help drive our limitless ambitions.

    The Role

     

    Data is at the heart of our strategy to engage and delight our customers, and we are determined to harness its power to go as far as we can to deliver a euphoric, personalised experience that they'll love. 

     

    We're seeking a skilled and experienced Data Engineer to join our Data function and our team of data engineers in the design, build, and maintenance of the pipelines that support this ambition. The ideal candidate will not only be able to see many different routes to engineering success, but also work collaboratively with Engineers, Analysts, Scientists & stakeholders to design & build robust data products that meet business requirements.

     

    Our stack is primarily GCP, with Fivetran handling change data capture, Google Cloud Functions for file ingestion, Dataform & Composer (Airflow) for orchestration, GA & Snowplow for event tracking and Looker as our BI Platform. We use Terraform Cloud to manage our infrastructure programmatically as code.

     

    Reporting Relationships

     

    This role will report into the Lead Data Engineer

     

    About you and attributes we're looking for



    • Extensive experience with cloud data warehouses and analytics query engines such as BigQuery, Redshift or Snowflake, and a good understanding of cloud technologies in general. 
    • Proficient in SQL, Python and Git 
    • Prior experience with HCL (Terraform configuration language), YAML, JavaScript, CLIs and Bash.
    • Prior experience with serverless tooling e.g. Google Cloud Functions, AWS Lambdas, etc.
    • Familiarity with tools such as Fivetran and Dataform/DBT 
    • Bachelor's or Master's degree in Computer Science, Data Science, or related field 
    • Collaborative mindset and a passion for sharing ideas & knowledge
    • Demonstrable experience developing high quality code in the retail sector is a bonus

    At Charlotte Tilbury Beauty, our mission is to empower everybody in the world to be the most beautiful version of themselves. We celebrate and support this by encouraging and hiring people with diverse backgrounds, cultures, voices, beliefs, and perspectives into our growing global workforce. By doing so, we better serve our communities, customers, employees - and the candidates that take part in our recruitment process.

    If you want to learn more about life at Charlotte Tilbury Beauty please follow our LinkedIn page!

    See more jobs at Charlotte Tilbury

    Apply for this job

    13d

    Data Engineer

    Legalist, Remote
    agile, nosql, sql, Design, c++, docker, kubernetes, AWS

    Legalist is hiring a Remote Data Engineer


    Legalist is an institutional alternative asset management firm. Founded in 2016 and incubated at Y Combinator, the firm uses data-driven technology to invest in credit assets at scale. We are always looking for talented people to join our team.

    As a highly collaborative organization, our data engineers work cross-functionally with software engineering, data science, and product management to optimize growth and strategy of our data pipeline. In this position, you will be joining the data engineering team in an effort to take our data pipeline to the next level.

    Where you come in:

    • Design and develop scalable data pipelines to collect, process, and analyze large volumes of data efficiently.
    • Collaborate with cross-functional teams including data scientists, software engineers, and product managers to understand data requirements and deliver solutions that meet business needs.
    • Develop ELT processes to transform raw data into actionable insights, leveraging tools and frameworks such as Airbyte, BigQuery, Dagster, DBT or similar technologies.
    • Participate in agile development processes, including sprint planning, daily stand-ups, and retrospective meetings, to deliver iterative improvements and drive continuous innovation.
    • Apply best practices in data modeling and schema design to ensure data integrity, consistency, and efficiency.
    • Continuously monitor and optimize data pipelines and systems for performance, availability, scalability, and cost-effectiveness.

    What you’ll be bringing to the team:

    • Bachelor’s degree (BA or BS) or equivalent.
    • A minimum of 2 years of work experience in data engineering or similar role.
    • Advanced SQL knowledge and experience working with a variety of databases (SQL, NoSQL, Graph, Multi-model).
    • A minimum of 2 years professional experience with ETL/ELT, data modeling and Python.
    • Familiarity with cloud environments like GCP, AWS, as well as cloud solutions like Kubernetes, Docker, BigQuery, etc.
    • You have a pragmatic, data-driven mindset and are not dogmatic or overly idealistic about technology choices and trade-offs.
    • You have an aptitude for learning new things quickly and have the confidence and humility to ask clarifying questions.

    Even better if you have, but not necessary:

    • Experience with one or more of the following: data processing automation, data quality, data warehousing, data governance, business intelligence, data visualization.
    • Experience working with TB scale data.

    See more jobs at Legalist

    Apply for this job

    16d

    Senior ML & Data Engineer

    Xe, Brazil, Remote
    ML, DevOPS, Design, mobile, api, docker, python, AWS, backend

    Xe is hiring a Remote Senior ML & Data Engineer

    At XE, we live currencies. We provide a comprehensive range of currency services and products, including our Currency Converter, Market Analysis, Currency Data API and quick, easy, secure Money Transfers for individuals and businesses. We leverage technology to deliver these services through our website, mobile apps and by phone. Last year, we helped nearly 300 million people access information about the currencies that matter to them, and over 150,000 people used us to send money overseas. Thousands of businesses relied on us for information about the currency markets, advice on managing their foreign exchange risk or trusted us with their business-critical international payments. At XE, we share the belief that behind every currency exchange, query or transaction is a person or business trying to accomplish something important, so we work together to develop new and better currency services that put our customers first. We are proud to be part of Euronet Worldwide (Nasdaq: EEFT), a global leader in processing secure electronic financial transactions. Under Euronet, we have brought together our key brands – XE, HiFX and Currency Online– to become the business that XE is today.

    The Senior ML and Data Engineer will be responsible for designing, building, and maintaining the infrastructure, platform, and processes required to successfully deploy and manage machine learning models in a production environment. This includes tasks such as developing Entity Resolution Solutions, building production features, and integrating ML solutions into production systems. 

    This role will work closely with data scientists and software engineers to ensure that machine learning models can seamlessly integrate into existing systems and processes. The role will also be responsible for identifying and implementing best practices for managing and optimizing machine learning models in production. 

    The ideal candidate for this role will have extensive experience in both software engineering and machine learning, as well as a deep understanding of the challenges and best practices involved in deploying machine learning models in production. Experience working with cloud computing platforms such as AWS or GCP is a plus. 

     

    What You'll Do

    • Build and maintain production-level real-time and batch MLOps pipelines. 
    • Deploy backend and real-time machine learning features and models. 
    • Design and develop multiple ML microservices and APIs. 
    • Monitor and optimize the performance of machine learning systems in production. 
    • Work closely with data scientists and software engineers to ensure the successful integration of machine learning microservices into existing systems and processes. 
    • Mentor junior engineers and provide technical leadership within the team. 
    • Stay updated with the latest advancements in machine learning and data engineering technologies and methodologies. 
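A recurring concern behind the real-time and batch pipeline work above is keeping the online and batch computations of a feature consistent. A stdlib-only sketch, where the rolling-mean feature, window size, and event values are all illustrative:

```python
from collections import deque

# The same rolling-mean feature computed two ways: recomputed over full
# history (as a batch job would) and maintained incrementally (as a
# streaming consumer would). Serving and training must agree.

def batch_rolling_mean(values, window=3):
    """Recompute the feature over the full history, batch-style."""
    out = []
    for i in range(len(values)):
        lo = max(0, i - window + 1)
        chunk = values[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

class OnlineRollingMean:
    """Incremental version: keeps only the last `window` values."""
    def __init__(self, window=3):
        self.buf = deque(maxlen=window)
    def update(self, x):
        self.buf.append(x)
        return sum(self.buf) / len(self.buf)

events = [10.0, 12.0, 11.0, 20.0, 8.0]
online = OnlineRollingMean()
streamed = [online.update(x) for x in events]
assert streamed == batch_rolling_mean(events)  # both paths must agree
print(streamed)
```

When the two paths drift (train/serve skew), model quality silently degrades, which is why MLOps pipelines typically test this parity explicitly.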

    Who You Are

    • Degree in Computer Science, Software Engineering, or a related discipline. 
    • Extensive experience in developing and maintaining API services in a cloud environment. 
    • Strong object and service-oriented programming skills in Python to write efficient, scalable code. 
    • Knowledge of modern containerization techniques - Docker, Docker Compose. 
    • Experience with relational and unstructured databases and data lakes. 
    • An understanding of business goals and how data policies can affect them. 
    • Effective communication and collaboration skills. 
    • A strong understanding of the concepts associated with privacy and data security. 
    • Proven experience in mentoring and leading engineering teams. 
    • Familiarity with CI/CD pipelines and DevOps practices. 

    Perks & Benefits

    • Annual salary increase review
    • End of the year bonus (Christmas bonus)
    • ESPP (Employee Stock Purchase Plan)
    • 30 days vacation per year
    • Insurance guaranteed for employees (Health, Oncological, Dental, Life Insurance)
    • No fee when using RIA service/wire transfers

    We want Xe to be a great place to work and to ensure that our communities are represented across our workforce.  A vital part of this is ensuring we are a truly inclusive organisation that encourages diversity in all respects. 

    At Xe we are committed to making our recruitment practices barrier-free and as accessible as possible for everyone.  This includes making adjustments or changes for disabled people, neurodiverse people or people with long-term health conditions. If you would like us to do anything differently during the application, interview or assessment process, including providing information in an alternative format, please contact us on recruitment@xe.com 

    See more jobs at Xe

    Apply for this job

    16d

    Senior ML & Data Engineer

    Xe, Chile, Remote
    ML, DevOPS, Design, mobile, api, docker, python, AWS, backend

    Xe is hiring a Remote Senior ML & Data Engineer

    At XE, we live currencies. We provide a comprehensive range of currency services and products, including our Currency Converter, Market Analysis, Currency Data API and quick, easy, secure Money Transfers for individuals and businesses. We leverage technology to deliver these services through our website, mobile apps and by phone. Last year, we helped nearly 300 million people access information about the currencies that matter to them, and over 150,000 people used us to send money overseas. Thousands of businesses relied on us for information about the currency markets, advice on managing their foreign exchange risk or trusted us with their business-critical international payments. At XE, we share the belief that behind every currency exchange, query or transaction is a person or business trying to accomplish something important, so we work together to develop new and better currency services that put our customers first. We are proud to be part of Euronet Worldwide (Nasdaq: EEFT), a global leader in processing secure electronic financial transactions. Under Euronet, we have brought together our key brands – XE, HiFX and Currency Online– to become the business that XE is today.

    The Senior ML and Data Engineer will be responsible for designing, building, and maintaining the infrastructure, platform, and processes required to successfully deploy and manage machine learning models in a production environment. This includes tasks such as developing entity resolution solutions, building production features, and integrating ML solutions into production systems.

    This role will work closely with data scientists and software engineers to ensure that machine learning models can seamlessly integrate into existing systems and processes. The role will also be responsible for identifying and implementing best practices for managing and optimizing machine learning models in production. 

    The ideal candidate for this role will have extensive experience in both software engineering and machine learning, as well as a deep understanding of the challenges and best practices involved in deploying machine learning models in production. Experience working with cloud computing platforms such as AWS or GCP is a plus.
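    For context, "entity resolution" here means deciding when two records refer to the same real-world person or business (for example, matching payment records against a customer list despite spelling variations). A minimal sketch of the idea in Python, using only the standard library's difflib for fuzzy matching, is shown below; the record shapes, field names, and threshold are illustrative assumptions, not Xe's actual implementation:

    ```python
    from difflib import SequenceMatcher

    def name_similarity(a: str, b: str) -> float:
        """Normalized edit-based similarity between two names (0.0 to 1.0)."""
        return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

    def resolve(record: dict, candidates: list[dict], threshold: float = 0.85):
        """Return the best-matching candidate record, or None if nothing clears the threshold."""
        best, best_score = None, 0.0
        for cand in candidates:
            score = name_similarity(record["name"], cand["name"])
            if score > best_score:
                best, best_score = cand, score
        return best if best_score >= threshold else None

    # Hypothetical customer records for illustration only.
    customers = [
        {"id": 1, "name": "Jonathan A. Smith"},
        {"id": 2, "name": "Maria Garcia"},
    ]
    match = resolve({"name": "jonathan a smith"}, customers)
    ```

    Production systems typically go further than this sketch: blocking to avoid comparing every pair, multiple field comparisons, and learned match scores rather than a single string-similarity threshold.
    
    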


    What You'll Do

    • Build and maintain production-level real-time and batch MLOps pipelines. 
    • Deploy backend and real-time machine learning features and models. 
    • Design and develop multiple ML microservices and APIs. 
    • Monitor and optimize the performance of machine learning systems in production. 
    • Work closely with data scientists and software engineers to ensure the successful integration of machine learning microservices into existing systems and processes. 
    • Mentor junior engineers and provide technical leadership within the team. 
    • Stay updated with the latest advancements in machine learning and data engineering technologies and methodologies. 

    Who You Are

    • Degree in Computer Science, Software Engineering, or a related discipline. 
    • Extensive experience in developing and maintaining API services in a cloud environment. 
    • Strong object-oriented and service-oriented programming skills in Python, with the ability to write efficient, scalable code. 
    • Knowledge of modern containerization techniques (Docker, Docker Compose). 
    • Experience with relational and unstructured databases and data lakes. 
    • An understanding of business goals and how data policies can affect them. 
    • Effective communication and collaboration skills. 
    • A strong understanding of the concepts associated with privacy and data security. 
    • Proven experience in mentoring and leading engineering teams. 
    • Familiarity with CI/CD pipelines and DevOps practices. 

    Perks & Benefits

    • Annual salary increase review
    • End of the year bonus (Christmas bonus)
    • ESPP (Employee Stock Purchase Plan)
    • Paid day off for birthday
    • 15 days vacation per year
    • Insurance guaranteed for employees (Health, Oncological, Dental, Life)
    • No fee when using Ria service/wire transfers

    We want Xe to be a great place to work and to ensure that our communities are represented across our workforce.  A vital part of this is ensuring we are a truly inclusive organisation that encourages diversity in all respects. 

    At Xe we are committed to making our recruitment practices barrier-free and as accessible as possible for everyone. This includes making adjustments or changes for disabled people, neurodiverse people or people with long-term health conditions. If you would like us to do anything differently during the application, interview or assessment process, including providing information in an alternative format, please contact us at recruitment@xe.com.

    See more jobs at Xe

    Apply for this job