Data Engineer Remote Jobs

101 Results

Libertex Group is hiring a Remote Data Engineer

Established in 1997, the Libertex Group has helped shape the online trading industry by merging innovative technology, market movements and digital trends. 

The award-winning online trading platform, Libertex, enables traders to access the market and invest in stocks or trade CFDs whose underlying assets include commodities, Forex, ETFs, cryptocurrencies, and others.

Libertex is also the Official Online Trading Partner of FC Bayern, bringing the exciting worlds of football and trading together.

We build innovative fintech so people can #TradeForMore with Libertex.

Job Overview

We are responsible for designing and implementing ETL processes using modern dbt technology, managing DWH, data marts, and dashboards, as well as modeling, transforming, testing, and deploying data.

Requirements:

  • Strong SQL Skills (T-SQL preferred) - Expertise in writing complex queries, optimizing database performance, and ensuring data integrity. Ability to design and develop data models, ETL/ELT pipelines, and transformations.
  • Experience with MSSQL Server - Hands-on experience in database design, query optimization, and performance tuning on MSSQL.
  • Familiarity with dbt (Data Build Tool) - Experience in developing, managing, and optimizing data models and transformations in dbt. Ability to design and implement robust data pipelines using dbt, ensuring data accuracy and reliability.
  • Proficiency in Python - Ability to write clean, efficient, and reusable Python scripts for automating data processes. Experience in writing Python code to handle ETL tasks, data manipulation, and API integrations (a minimal sketch follows this list).
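
To illustrate the kind of scripting the last bullet describes, here is a minimal, hedged sketch of a reusable Python ETL step: pull records from a REST endpoint, reshape them with pandas, and load them into MSSQL via SQLAlchemy. The endpoint, table name, and connection string are illustrative assumptions, not details from the posting.

    # Minimal ETL sketch -- endpoint, table, and DSN are illustrative assumptions.
    import pandas as pd
    import requests
    from sqlalchemy import create_engine

    def run_etl(api_url: str, table: str, conn_str: str) -> int:
        # Extract: fetch raw JSON records from the (hypothetical) API.
        records = requests.get(api_url, timeout=30).json()
        # Transform: flatten into a DataFrame and drop exact duplicates.
        df = pd.json_normalize(records).drop_duplicates()
        # Load: append into the target MSSQL table, e.g. via the pyodbc driver.
        engine = create_engine(conn_str)  # e.g. "mssql+pyodbc://..." (illustrative)
        df.to_sql(table, engine, if_exists="append", index=False)
        return len(df)

    # Usage (illustrative): run_etl("https://api.example.com/trades", "stg_trades", "mssql+pyodbc://...")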

Nice to have:

  • Experience with Apache Airflow will be a big plus
  • Experience with Docker will be a plus
  • Experience with GitLab CI/CD will be a plus
  • Strong communication skills to collaborate with data engineers, analysts, and business stakeholders.
  • Proactive problem-solving attitude and a continuous improvement mindset.
  • Excel (MS Office) - advanced level
  • Intermediate (B1) or higher level of English

Responsibilities:

  • Interacting with all team members and participating in development at all stages.
  • Building data integrations; developing and transforming data via dbt models.
  • Creating auto-tests and documentation for models and tests.
  • Creating data pipelines on a regular basis.
  • Optimizing data warehouse performance; applying data engineering best practices.

What we offer:

  • Work in a pleasant and enjoyable environment near the Montenegrin sea or mountains
  • Quarterly bonuses based on Company performance
  • Generous relocation package for the employee and their immediate family/partner 
  • Medical Insurance Plan with coverage for the employee and their immediate family from day one
  • 24 working days of annual leave 
  • Yearly reimbursement of travel expenses for the employee and family's flight home
  • Corporate events and team building activities
  • Udemy Business unlimited membership & language training courses 
  • Professional and personal development opportunities in a fast-growing environment 

See more jobs at Libertex Group

Apply for this job

1d

Senior Data Engineer

DailyPay Inc - Remote, United States
Sales, tableau, sql, Design, c++, python

DailyPay Inc is hiring a Remote Senior Data Engineer

About Us:

DailyPay, Inc. is transforming the way people get paid. As the industry’s leading on-demand pay solution, DailyPay uses an award-winning technology platform to help America’s top employers build stronger relationships with their employees. This voluntary employee benefit enables workers everywhere to feel more motivated to work harder and stay longer on the job, while supporting their financial well-being outside of the workplace.

DailyPay is headquartered in New York City, with operations throughout the United States as well as in Belfast. For more information, visit DailyPay's Press Center.

The Role:

DailyPay is looking for a Senior Data Engineer to join our Data Engineering Team. The Data Engineering Team is responsible for building the data infrastructure that underpins our data analytics and data products that are used cross-functionally inside the company (sales, marketing, operations, engineering, etc.) as well as by DailyPay partner companies. The team also ingests internal and external data to help provide insights about the payroll industry in general, as well as about personal finance and financial wellbeing.

If this opportunity excites you, we encourage you to apply even if you do not meet all of the qualifications.

How You Will Make an Impact:

  • Build and maintain the company’s ETL/ELT infrastructure and data pipelines
  • Design and implement data testing / scaling capabilities for data pipelines
  • Maintain and optimize monitoring and alerting for the company’s ELT, data pipelines, data warehouse, and analytics infrastructure (a minimal check is sketched after this list)
  • Optimize database performance while reducing warehouse costs and development times
  • Participate in code approvals and PR review process for company-wide analytics engineering efforts
  • Architect data lakehouse for DailyPay to use as the single source of truth for internal and external client reporting/analytics
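
As one concrete flavour of the monitoring bullet above, here is a minimal, hedged sketch of a table-freshness check over a standard Python DB-API connection. The table, timestamp column, and threshold are illustrative assumptions; a real alerting stack would page on-call instead of printing.

    # Freshness-check sketch -- table and column names are illustrative assumptions.
    from datetime import datetime, timedelta, timezone

    def check_freshness(conn, table: str, max_lag: timedelta) -> bool:
        # Ask the warehouse for the newest load timestamp in the table.
        cur = conn.cursor()
        cur.execute(f"SELECT MAX(loaded_at) FROM {table}")  # 'loaded_at' assumed
        (latest,) = cur.fetchone()
        fresh = latest is not None and datetime.now(timezone.utc) - latest <= max_lag
        if not fresh:
            # In production this would raise an alert (PagerDuty, Slack, etc.).
            print(f"ALERT: {table} is stale; latest load was {latest}")
        return fresh

    # Usage (illustrative): check_freshness(conn, "analytics.orders", timedelta(hours=2))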

What You Bring to The Team:

  • 7+ years SQL experience; expert SQL capability
  • Familiarity with BI tools such as Tableau, Looker, Metabase or similar
  • Experience in Data Architecture for Dimensional Models, Data Lakes and Data Lakehouses
  • Excellent communication skills including experience speaking to technical and business audiences and working globally
  • 2+ years Python experience
  • 1+ years of dbt experience
  • Experience with Snowflake, Redshift, and ETL tools like Fivetran, Qlik Replicate or Stitch is a plus

What We Offer:

  • Exceptional health, vision, and dental care
  • Opportunity for equity ownership
  • Life and AD&D, short- and long-term disability
  • Employee Assistance Program
  • Employee Resource Groups
  • Fun company outings and events
  • Unlimited PTO
  • 401K with company match

#BI-Remote #LI-Remote

Pay Transparency.  DailyPay takes a market-based approach to compensation, which may vary depending on your location. United States locations are categorized into three tiers based on a cost of labor index for that geographic area. The salary ranges are listed by geographic tier. Additionally, this role may be eligible for variable incentive compensation and stock options. Where a candidate fits within the compensation range for a role is based on their demonstrated experience, qualifications, skills, and internal equity. 

New York City
$145,000 - $194,000 USD
Remote, Premium (California, Connecticut, Washington D.C., New Jersey, New York, Massachusetts, Washington)
$133,000 - $178,000 USD
Remote, Standard
$126,000 - $169,000 USD

DailyPay is committed to fostering an inclusive, equitable culture of belonging, grounded in empathy and respect, which values openness to opinions, awareness of lived experiences, fair treatment and access for all. We strive to build and develop diverse teams to create an organization where innovation thrives, where the full potential of each person is engaged, and their views, beliefs and values are integrated into our ways of working. 

We encourage people of all backgrounds to join us on our mission. If you require reasonable accommodation for any aspect of the recruitment process, please send a request to peopleops@dailypay.com. All requests for accommodation will be addressed as confidentially as practicable.

DailyPay is an equal opportunity employer. All qualified applicants will receive consideration without regard to race, color, religion or creed, alienage or citizenship status, political affiliation, marital or partnership status, age, national origin, ancestry, physical or mental disability, medical condition, veteran status, gender, gender identity, pregnancy, childbirth (or related medical conditions), sex, sexual orientation, sexual and other reproductive health decisions, genetic disorder, genetic predisposition, carrier status, military status, familial status, or domestic violence victim status and any other basis protected under federal, state, or local laws.

See more jobs at DailyPay Inc

Apply for this job

5d

Junior Data Engineer

Accesa - Ratiodata - Employee can work remotely, Romania, Remote
agile, airflow, postgres, sql, Design, java, python, backend

Accesa - Ratiodata is hiring a Remote Junior Data Engineer

Job Description

One of our clients operates prominently in the financial sector, where we enhance operations across their extensive network of 150,000 workstations and support a workforce of 4,500 employees. As part of our commitment to optimizing data management strategies, we are migrating data warehouse (DWH) models into data products within the Data Integration Platform (DIP). 

Responsibilities: 

  • Drive Data Efficiency: Create and maintain optimal data transformation pipelines.

  • Master Complex Data Handling: Work with large, complex financial data sets to generate outputs that meet functional and non-functional business requirements.

  • Lead Innovation and Process Optimization: Identify, design, and implement process improvements such as automating manual processes, optimizing data delivery, and re-designing infrastructure for higher scalability.

  • Architect Scalable Data Infrastructure: Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using open-source technologies.

  • Unlock Actionable Insights: Build and use analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.

  • Collaborate with Cross-Functional Teams: Work with stakeholders, including Senior Management, Department Heads, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.

Qualifications

Must have: 

  • 1+ years of experience in a similar role, preferably within Agile teams. 

  • Skilled in SQL and relational databases for data manipulation. 

  • Experience in building and optimizing Big Data pipelines and architectures. 

  • Knowledge of Big Data tools such as Spark, and object-oriented programming in Java; experience with Spark using Python is a plus.  

  • Proven experience in performing data analysis and root cause analysis on diverse datasets to identify opportunities for improvement. 

  • Strong analytical skills in working with both structured and unstructured data. 

 

Nice to have: 

  • Experience in engaging with customer stakeholders  

  • Expertise in manipulating and processing large, disconnected datasets to extract actionable insights

  • Familiarity with innovative technologies in message queuing, stream processing, and scalable big data storage solutions  

  • Technical skills in the following areas are a plus: Relational Databases (e.g., Postgres), Big Data Tools (e.g., Databricks), Workflow Management (e.g., Airflow), and Backend Development using Spring Boot.

  • Experience with ETL processes, including scheduling and orchestration using tools like Apache Airflow (or similar); a minimal DAG sketch follows this list.
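
To make the Airflow bullet concrete, here is a minimal, hedged DAG sketch with three chained tasks, written in the Airflow 2.x style. The DAG id, schedule, and task bodies are illustrative assumptions, not the client's actual pipelines.

    # Minimal Airflow 2.x DAG sketch -- all names are illustrative assumptions.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():    # pull raw data from a source system (stubbed out here)
        ...

    def transform():  # apply business transformations (stubbed out here)
        ...

    def load():       # write results to the data platform (stubbed out here)
        ...

    with DAG(
        dag_id="dwh_daily_load",           # illustrative name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                 # Airflow >= 2.4; older versions use schedule_interval
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)
        t_extract >> t_transform >> t_load  # linear dependency chain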

Apply for this job

5d

Senior Data Engineering

NielsenIQ - Illinois, IL, Remote
DevOPS, agile, Design, azure, jenkins, python, AWS

NielsenIQ is hiring a Remote Senior Data Engineering

Job Description

Position Description 

  • Meet with stakeholders to understand the big picture and their asks. 
  • Recommend architecture aligned with the goals and objectives of the product/organization. 
  • Recommend standard ETL design patterns and best practices. 
  • Drive the detail design and architectural discussions as well as customer requirements sessions to support the implementation of code and procedures for our big data product.
  • Design and develop proof of concept/prototype to demonstrate architecture feasibility. 
  • Collaborate with developers on the team to meet product deliverables. 
  • Must have familiarity with the data science tech stack and at least one of the following languages: SAS, SPSS, or R. 
  • Work independently and collaboratively on a multi-disciplined project team in an Agile development environment. 
  • Identify and solve for code/design optimizations. 
  • Learn and integrate with a variety of systems, APIs, and platforms. 
  • Interact with a multi-disciplined team to clarify, analyze, and assess requirements. 
  • Be actively involved in the design, development, and testing activities in big data applications. 

Qualifications

  • Hands-on experience with Python, PySpark, and Jupyter Notebooks. 
  • Familiarity with Databricks. Azure Databricks is a plus. 
  • Familiarity with data cleansing, transformation, and validation (a minimal PySpark sketch follows this list). 
  • Proven architecture skills on Big Data projects. 
  • Hands-on experience with a code versioning tool such as GitHub, Bitbucket, etc. 
  • Hands-on experience building CI/CD pipelines in GitHub Actions (or Azure DevOps, Jenkins, etc.). 
  • Hands-on experience with Spark. 
  • Strong written and verbal communication skills. 
  • Self-motivated and ability to work well in a team. 
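
To ground the cleansing and validation bullet, here is a minimal, hedged PySpark sketch that deduplicates, casts, and filters a dataset. The file path and column names are illustrative assumptions.

    # PySpark cleansing sketch -- file path and columns are illustrative assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("cleanse").getOrCreate()

    raw = spark.read.csv("sales.csv", header=True, inferSchema=True)
    clean = (
        raw.dropDuplicates(["order_id"])                        # one row per order
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount") > 0)                         # drop invalid amounts
    )
    clean.write.mode("overwrite").parquet("sales_clean/")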

Any mix of the following skills is also valuable: 

  • Experience with data visualization tools such as Power BI or Tableau. 
  • Experience with DevOps CI/CD tools and automation processes (e.g., Azure DevOps, GitHub, Bitbucket). 
  • Experience with Azure Cloud Services and Azure Data Factory. 
  • Azure or AWS Cloud certification preferred. 

Education:

  • Bachelor of Science degree from an accredited university 

See more jobs at NielsenIQ

Apply for this job

7d

Senior Data Engineer

Mozilla - Remote
sql, Design, c++, python

Mozilla is hiring a Remote Senior Data Engineer

To learn the Hiring Ranges for this position, please select your location from the Apply Now dropdown menu.

To learn more about our Hiring Range System, please click this link.

Why Mozilla?

Mozilla Corporation is the non-profit-backed technology company that has shaped the internet for the better over the last 25 years. We make pioneering brands like Firefox, the privacy-minded web browser, and Pocket, a service for keeping up with the best content online. Now, with more than 225 million people around the world using our products each month, we’re shaping the next 25 years of technology and helping to reclaim an internet built for people, not companies. Our work focuses on diverse areas including AI, social media, security and more. And we’re doing this while never losing our focus on our core mission – to make the internet better for people. 

The Mozilla Corporation is wholly owned by the non-profit 501(c) Mozilla Foundation. This means we aren’t beholden to any shareholders — only to our mission. Along with thousands of volunteer contributors and collaborators all over the world, Mozillians design, build and distribute open-source software that enables people to enjoy the internet on their terms. 

About this team and role:

As a Senior Data Engineer at Mozilla, your primary area of focus will be on our Analytics Engineering team. This team focuses on modeling our data so that the rest of Mozilla has access to it, in the appropriate format, when they need it, to help them make data-informed decisions. This team is also tasked with helping to maintain and make improvements to our data platform. Some recent improvements include introducing a data catalog and building in data quality checks, among others. Check out the Data@Mozilla blog for more details on some of our work.

What you’ll do: 

  • Work with data scientists to design data models, answer questions and guide product decisions
  • Work with other data engineers to design and maintain scalable data models and ETL pipelines
  • Help improve the infrastructure for ingesting, storing and transforming data at a scale of tens of terabytes per day
  • Help design and build systems to monitor and analyze data from Mozilla’s products
  • Establish best practices for governing data containing sensitive information, ensuring compliance and security

What you’ll bring: 

  • At a minimum 3 years of professional experience in data engineering
  • Proficiency with the programming languages used by our teams (SQL and Python)
  • Demonstrated experience designing data models used to represent specific business activities to power analysis
  • Strong software engineering fundamentals: modularity, abstraction, data structures, and algorithms
  • Ability to work collaboratively with a distributed team, leveraging strong communication skills to ensure alignment and effective teamwork across different time zones
  • Our team requires skills in a variety of domains. You should have proficiency in one or more of the areas listed below, and be interested in learning about the others:
    • You have used data to answer specific questions and guide company decisions.
    • You are opinionated about data models and how they should be implemented; you partner with others to map out a business process, profile available data, design and build flexible data models for analysis.
    • You have experience recommending / implementing new data collection to help improve the quality of data models.
    • You have experience with data infrastructure: databases, message queues, batch and stream processing
    • You have experience building modular and reusable ETL/ELT pipelines in distributed databases
    • You have experience with highly scalable distributed systems hosted on cloud providers (e.g. Google Cloud Platform)
  • Commitment to our values:
    • Welcoming differences
    • Being relationship-minded
    • Practicing responsible participation
    • Having grit

What you’ll get:

  • Generous performance-based bonus plans to all regular employees - we share in our success as one team
  • Rich medical, dental, and vision coverage
  • Generous retirement contributions with 100% immediate vesting (regardless of whether you contribute)
  • Quarterly all-company wellness days where everyone takes a pause together
  • Country specific holidays plus a day off for your birthday
  • One-time home office stipend
  • Annual professional development budget
  • Quarterly well-being stipend
  • Considerable paid parental leave
  • Employee referral bonus program
  • Other benefits (life/AD&D, disability, EAP, etc. - varies by country)

About Mozilla 

Mozilla exists to build the Internet as a public resource accessible to all because we believe that open and free is better than closed and controlled. When you work at Mozilla, you give yourself a chance to make a difference in the lives of Web users everywhere. And you give us a chance to make a difference in your life every single day. Join us to work on the Web as the platform and help create more opportunity and innovation for everyone online.

Commitment to diversity, equity, inclusion, and belonging

Mozilla understands that valuing diverse creative practices and forms of knowledge is crucial to and enriches the company’s core mission. We encourage applications from everyone, including members of all equity-seeking communities, such as (but certainly not limited to) women, racialized and Indigenous persons, persons with disabilities, and persons of all sexual orientations, gender identities, and expressions.

We will ensure that qualified individuals with disabilities are provided reasonable accommodations to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment, as appropriate. Please contact us at hiringaccommodation@mozilla.com to request accommodation.

We are an equal opportunity employer. We do not discriminate on the basis of race (including hairstyle and texture), religion (including religious grooming and dress practices), gender, gender identity, gender expression, color, national origin, pregnancy, ancestry, domestic partner status, disability, sexual orientation, age, genetic predisposition, medical condition, marital status, citizenship status, military or veteran status, or any other basis covered by applicable laws.  Mozilla will not tolerate discrimination or harassment based on any of these characteristics or any other unlawful behavior, conduct, or purpose.

Group: D

#LI-DNI

Req ID: R2679

See more jobs at Mozilla

Apply for this job

7d

Senior Data Engineer

Plentific - London, England, United Kingdom, Remote Hybrid
B2B

Plentific is hiring a Remote Senior Data Engineer

We're Plentific, the world’s leading real-time property solution, and we're looking for top talent to join our ambitious team. We’re a global company, headquartered in London, and operating across the United Kingdom, Germany and North America.

As a B2B company, we're dedicated to helping landlords, letting agents and property managers streamline operations, unlock revenue, increase tenant satisfaction, and remain compliant through our award-winning SaaS technology platform. We also work with SMEs and large service providers, helping them access more work and grow their businesses.

We're not just any proptech - we're backed by some of the biggest names in the business, including A/O PropTech, Highland Europe, Mubadala, RXR Digital Ventures and Target Global and work with some of the world’s most prominent real estate players.

But we're not just about business - we're also building stronger communities where people can thrive by ensuring the quality and safety of buildings, supporting decarbonisation through our ESG Retrofit Centre of Excellence and championing diversity across the sector through the Women’s Trade Network. We're committed to creating exceptional experiences for our team members, too. Our culture is open and empowering, and we're always looking for passionate, driven individuals to join us on our mission.

So, what's in it for you?

  • A fast-paced, friendly, collaborative and hybrid/flexible working environment
  • Ample opportunities for career growth and progression
  • A multicultural workplace with over 20 nationalities that value diversity, equity, and inclusion
  • Prioritisation of well-being with social events, digital learning, career development programs and much more

If you're ready to join a dynamic and innovative team that’s pioneering change in real estate, we'd love to hear from you.

The Role

We’re looking for a proactive and energetic individual with extensive experience in Data Engineering and Machine Learning to join our growing business. You’ll be working alongside highly technical and motivated teams and report to the Head of Data Engineering. You would be expected to contribute to the growth of the data/ML/AI products both internally and for our customers. You’ll be working on the cutting edge of technology and will thrive if you have a desire to learn and keep up to date with the latest trends in Data Infrastructure, Machine Learning and Generative AI. For people with the right mindset, this provides a very intellectually stimulating environment.

Responsibilities

  • Be one of the architects for our data model defined in dbt.
  • Take ownership and refine our existing real time data pipelines.
  • Create and maintain analytics dashboards that are defined as-code in Looker
  • Create and productize Machine Learning and LLM-based features
  • Be a mentor for the more junior data engineers in the team

Requirements

  • Proficient in SQL and Python. A live coding interview is part of the hiring process.
  • Experience in data modelling with dbt
  • Experience organising the data governance across a company, including the matrix of access permissions for a data warehouse.
  • Experience with BI tools as code. Looker experience is a nice to have.
  • Experience building ETL/ELT data ingestion and transformation pipelines
  • Experience training Machine Learning Algorithms
  • Experience productizing Machine Learning from the infrastructure perspective (MLOps)
  • Nice to have: experience productizing multimodal (text, images, audio, video) GenAI products with frameworks such as LangChain

As you can see, we are quickly progressing with our ambitious plans and are eager to grow our team of doers to achieve our vision of managing over 2 million properties through our platform across various countries. You can help us shape the future of property management across the globe. Here’s what we offer:

  • A competitive compensation package
  • 25 days annual holiday
  • Flexible working environment including the option to work abroad
  • Private health care for you and immediate family members with discounted gym membership, optical, dental and private GP
  • Enhanced parental leave
  • Life insurance (4x salary)
  • Employee assistance program
  • Company volunteering day and charity salary sacrifice scheme
  • Learning management system powered by Udemy
  • Referral bonus and charity donation if someone you introduce joins the company
  • Season ticket loan, Cycle to work, Electric vehicle and Techscheme programs
  • Pension scheme
  • Work abroad scheme
  • Company-sponsored lunches, dinners and social gatherings
  • Fully stocked kitchen with drinks, snacks, fruit, breakfast cereal etc.

See more jobs at Plentific

Apply for this job

7d

Data Engineer

Blend36 - Edinburgh, United Kingdom, Remote
terraform, sql, Design, azure, python

Blend36 is hiring a Remote Data Engineer

Job Description

Life as a Data Engineer at Blend

We are looking for someone who is ready for the next step in their career and is excited by the idea of solving problems and designing best-in-class solutions. 

However, they also need to be aware of the practicalities of making a difference in the real world – whilst we love innovative advanced solutions, we also believe that sometimes a simple solution can have the most impact.   

Our Data Engineer is someone who feels the most comfortable around solving problems, answering questions and proposing solutions. We place a high value on the ability to communicate and translate complex analytical thinking into non-technical and commercially oriented concepts, and experience working on difficult projects and/or with demanding stakeholders is always appreciated. 

Reporting to a Senior Data Engineer and working closely with the Data Science and Business Development teams, this role will be responsible for driving high delivery standards and innovation in the company. Typically, this involves delivering data solutions to support the provision of actionable insights for stakeholders. 

What can you expect from the role? 

  • Prepare and present data-driven solutions to stakeholders.
  • Analyse and organise raw data.
  • Design, develop, deploy and maintain ingestion, transformation and storage solutions.
  • Use a variety of Data Engineering tools and methods to deliver.
  • Own projects end-to-end.
  • Contribute to solution design and proposal submissions.
  • Support the development of the data engineering team within Blend.
  • Maintain in-depth knowledge of data ecosystems and trends. 
  • Mentor junior colleagues.

Qualifications

What you need to have: 

  • Proven track record of building analytical production pipelines using Python and SQL programming.
  • Working knowledge of large-scale data such as data warehouses and their best practices and principles in managing them.
  • Experience with development, test and production environments and knowledge and experience of using CI/CD.
  • ETL technical design, development and support.
  • Knowledge of Data Warehousing and database operation, management & design.

Nice to have 

  • Knowledge in container deployment.
  • Experience with ARM template design and production (or other IaC, e.g., CloudFormation, Terraform).
  • Experience in cloud infrastructure management.
  • Experience of Machine Learning deployment.
  • Experience in Azure tools and services such as Azure ADFv2, Azure Databricks, Storage, Azure SQL, Synapse and Azure IoT.
  • Experience of leveraging data out of SAP or S/4HANA.

See more jobs at Blend36

Apply for this job

8d

Staff Data Security Engineer

Gemini - Remote (USA)
remote-first, Design, kubernetes, linux, python, AWS

Gemini is hiring a Remote Staff Data Security Engineer

About the Company

Gemini is a global crypto and Web3 platform founded by Tyler Winklevoss and Cameron Winklevoss in 2014. Gemini offers a wide range of crypto products and services for individuals and institutions in over 70 countries.

Crypto is about giving you greater choice, independence, and opportunity. We are here to help you on your journey. We build crypto products that are simple, elegant, and secure. Whether you are an individual or an institution, we help you buy, sell, and store your bitcoin and cryptocurrency. 

At Gemini, our mission is to unlock the next era of financial, creative, and personal freedom.

In the United States, we have a flexible hybrid work policy for employees who live within 30 miles of our office headquartered in New York City and our office in Seattle. Employees within the New York and Seattle metropolitan areas are expected to work from the designated office twice a week, unless there is a job-specific requirement to be in the office every workday. Employees outside of these areas are considered part of our remote-first workforce. We believe our hybrid approach for those near our NYC and Seattle offices increases productivity through more in-person collaboration where possible.

The Department: Platform Security

The Role: Staff Data Security Engineer

The Platform Security team secures Gemini’s infrastructure through service hardening and by developing and supporting a suite of foundational tools. We provide secure-by-default infrastructure, consumable security services, and expert consultation to engineering teams for secure cloud and non-cloud infrastructure.

The Platform Security team covers a broad problem space that includes all areas of Gemini’s platform infrastructure. In the past, this team has focused specifically on cloud security and we continue to invest heavily in this area. This role will bring additional depth and specialization in database design and security. We also value expertise in neighboring areas of infrastructure and platform security engineering, including PKI, core cryptography, identity management, network security, etc.

Responsibilities:

  • Design, deploy, and maintain databases and relevant security controls for security and engineering teams.
  • Build and improve security controls protecting data in transit and at rest. 
  • Partner with engineering teams on security architecture and implementation decisions.
  • Own our database security roadmap and act as the relevant SME within Gemini.
  • Collaborate with AppSec, Threat Detection, Incident Response, GRC and similar security functions to identify, understand, and reduce security risk.

Minimum Qualifications:

  • 6+ years of experience in the field.
  • Extensive knowledge of database architecture and security principles.
  • Significant experience with container orchestration technologies and relevant security considerations. We often use Kubernetes and EKS.
  • Experience in SRE, systems engineering, or network engineering.
  • Experience with distributed systems or cloud computing. We often use AWS.
  • Significant software development experience. We often use Python or Go.
  • Experience building and owning high-availability critical systems or cloud-based services
  • Able to self-scope, define, and manage short- and long-term technical goals.

Preferred Qualifications:

  • Proven track record securing databases and ensuring data integrity.
  • Experience securing AWS and Linux environments, both native and third-party.
  • Experience designing and implementing cryptographic infrastructure such as PKI, secrets management, authentication, or secure data storage/transmission.
  • Experience designing and implementing systems for identity and access management.
  • Experience with configuration management and infrastructure as code. We often use Terraform.

It Pays to Work Here

The compensation & benefits package for this role includes:
  • Competitive starting salary
  • A discretionary annual bonus
  • Long-term incentive in the form of a new hire equity grant
  • Comprehensive health plans
  • 401K with company matching
  • Paid Parental Leave
  • Flexible time off

Salary Range: The base salary range for this role is between $172,000 - $215,000 in the State of New York, the State of California and the State of Washington. This range is not inclusive of our discretionary bonus or equity package. When determining a candidate’s compensation, we consider a number of factors including skillset, experience, job scope, and current market data.

At Gemini, we strive to build diverse teams that reflect the people we want to empower through our products, and we are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. Equal Opportunity is the Law, and Gemini is proud to be an equal opportunity workplace. If you have a specific need that requires accommodation, please let a member of the People Team know.

#LI-AH1

Apply for this job

14d

Data Engineer

Zone IT - Sydney, New South Wales, Australia, Remote Hybrid

Zone IT is hiring a Remote Data Engineer

We are currently seeking a highly motivated and experienced Data Engineer for a full-time position. You will be responsible for designing and implementing data architectures, integrating data from various sources, and optimizing data pipelines to ensure efficient and accurate data processing.

Key responsibilities:

  • Design and implement data architectures, including databases and processing systems
  • Integrate data from various sources and ensure data quality and reliability
  • Optimize data pipelines for scalability and performance
  • Develop and maintain ETL processes and data transformation solutions
  • Apply data security measures and ensure compliance with data privacy regulations
  • Create and maintain documentation related to data systems design and maintenance
  • Collaborate with cross-functional teams to understand data requirements and provide effective data solutions

Key skills and qualifications:

  • Bachelor's degree or higher in Computer Science, Data Science, or a related field
  • Strong proficiency in SQL, Python, and/or Java
  • Experience with ETL processes and data integration
  • Working knowledge of data modeling and database design principles
  • Familiarity with big data technologies such as Hadoop, Spark, or Kafka is a plus
  • Experience with cloud platforms such as AWS, Azure, or GCP is a plus
  • Strong problem-solving and analytical skills
  • Excellent communication and collaboration abilities

About Us

Zone IT Solutions is an Australia-based recruitment company. We specialize in Digital, ERP and larger IT Services. We offer flexible, efficient and collaborative solutions to any organization that requires IT experts. Our agile, agnostic and flexible solutions will help you source the IT expertise you need. Our delivery offices are in Melbourne, Sydney and India. If you are looking for new opportunities, please share your profile at Careers@zoneitsolutions.com or contact us at 0434189909

Also follow our LinkedIn page for new job opportunities and more.

Zone IT Solutions is an equal opportunity employer and our recruitment process focuses on essential skills and abilities. We welcome applicants from a diverse range of backgrounds, including Aboriginal and Torres Strait Islander peoples, people from culturally and linguistically diverse (CALD) backgrounds and people with disabilities.

See more jobs at Zone IT

Apply for this job

14d

Data Engineer

Tech9 - Remote
ML, Full Time, DevOPS, agile, terraform, sql, Design, azure, python

Tech9 is hiring a Remote Data Engineer

Data Engineer - Tech9 - Career Page: Work with skil

See more jobs at Tech9

Apply for this job

18d

Sr. Data Engineer - Remote

Trace3 - Remote
DevOPS, agile, nosql, sql, Design, azure, graphql, api, java, c++, c#, python, backend

Trace3 is hiring a Remote Sr. Data Engineer - Remote


Who is Trace3?

Trace3 is a leading Transformative IT Authority, providing unique technology solutions and consulting services to our clients. Equipped with elite engineering and dynamic innovation, we empower IT executives and their organizations to achieve competitive advantage through a process of Integrate, Automate, Innovate.

Our culture at Trace3 embodies the spirit of a startup with the advantage of a scalable business. Employees can grow their career and have fun while doing it!

Trace3 is headquartered in Irvine, California. We employ more than 1,200 people all over the United States. Our major field office locations include Denver, Indianapolis, Grand Rapids, Lexington, Los Angeles, Louisville, Texas, San Francisco.  

Ready to discover the possibilities that live in technology?

 

Come Join Us!

Street-Smart Thriving in Dynamic Times

We are flexible and resilient in a fast-changing environment. We continuously innovate and drive constructive change while keeping a focus on the “big picture.” We exercise sound business judgment in making high-quality decisions in a timely and cost-effective manner. We are highly creative and can dig deep within ourselves to find positive solutions to different problems.

Juice - The “Stuff” it takes to be a Needle Mover

We get things done and drive results. We lead without a title, empowering others through a can-do attitude. We look forward to the goal, mentally mapping out every checkpoint on the pathway to success, and visualizing what the final destination looks and feels like.

Teamwork - Humble, Hungry and Smart

We are humble individuals who understand how our job impacts the company's mission. We treat others with respect, admit mistakes, give credit where it’s due and demonstrate transparency. We “bring the weather” by exhibiting positive leadership and solution-focused thinking. We hug people in their trials, struggles, and failures – not just their success. We appreciate the individuality of the people around us.


 

Who We’re Looking For:

We’re looking to add a Senior Data Integration Engineer with a strong background in data engineering and development.  You will work with a team of software and data engineers to build client-facing data-first solutions utilizing data technologies such as SQL Server and MongoDB. You will develop data pipelines to transform/wrangle/integrate the data into different data zones.

To be successful in this role, you will need to hold extensive knowledge of SQL, relational databases, ETL pipelines, and big data fundamentals.  You will also need to possess strong experience in the development and consumption of RESTful APIs.  The ideal candidate will also be a strong independent worker and learner.

 

What You’ll Be Doing

  • Develop processes and data models for consuming large quantities of 3rd party vendor data via RESTful APIs (a paginated-ingestion sketch follows this list).
  • Develop data processing pipelines to analyze, transform, and migrate data between applications and systems.
  • Analyze data from multiple sources and negotiate differences in storage schema using the ETL process.
  • Develop APIs for external consumption by partners and customers.
  • Develop and support our ETL environment by recommending improvements, monitoring, and deploying quality and validation processes to ensure accuracy and integrity of data.
  • Design, develop, test, deploy, maintain, and improve data integration pipelines.
  • Create technical solutions that solve business problems and are well engineered, operable, and maintainable.
  • Design and implement tools to detect data anomalies (observability). Ensure that data is accurate, complete, and high quality across all platforms.
  • Troubleshoot data issues and perform root cause analysis to proactively resolve product and operational issues.
  • Assemble large and complex data sets; develop data models based on specifications using structured data sets.
  • Develop familiarity with emerging and complex automations and technologies that support business processes.
  • Develop scalable and re-usable frameworks for ingestion and transformation of large datasets.
  • Work within an Agile delivery / DevOps methodology to deliver product increments in iterative sprints.
  • Work with stakeholders including the Executive, Product, Data and Design teams to support their data infrastructure needs while assisting with data-related technical issues.
  • Develop data models and mappings and build new data assets required by users. Perform exploratory data analysis on existing products and datasets.
  • Identify, design, and implement internal process improvements including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
  • Engage in logical and physical design of databases, table creation, script creation, views, procedures, packages, and other database objects.
  • Create documentation for solutions and processes implemented or updated to ensure team members and stakeholders can correctly interpret it.
  • Design and implement processes and/or process improvements to help the development of technology solutions.
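
As a flavour of the RESTful ingestion described in the first bullet, here is a minimal, hedged sketch of consuming a page-numbered vendor API with requests. The endpoint, pagination parameters, and response shape are illustrative assumptions; real vendor APIs vary (cursor tokens, auth, rate limits).

    # Paginated REST ingestion sketch -- endpoint and response shape are assumptions.
    import requests

    def fetch_all(base_url: str, page_size: int = 500) -> list:
        records = []
        page = 1
        while True:
            resp = requests.get(
                base_url,
                params={"page": page, "per_page": page_size},
                timeout=30,
            )
            resp.raise_for_status()  # fail fast on HTTP errors
            batch = resp.json()      # assumed: a JSON array of records per page
            if not batch:
                break                # an empty page signals the end
            records.extend(batch)
            page += 1
        return records

    # Usage (illustrative): rows = fetch_all("https://vendor.example.com/api/v1/assets")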

 

Your Skills and Experience (In Order of Importance):

  • 5+ years of relational database development experience; including SQL query generation and tuning, database design, and data concepts.
  • 5+ years of backend and Restful API development experience in NodeJS (experience with GraphQL a plus).
  • 5+ years of development experience with the following languages Python, Java, C#/ .NET.
  • 5+ years of experience with SQL and NoSQL databases; including MS SQL and MongoDB.
  • 5+ years consuming RESTful APIs with data ingestion and storage.
  • 5+ years developing RESTful APIs for use by customers and 3rd parties.
  • 3+ years of professional work experience designing and implementing data pipelines in a cloud environment.
  • 3+ years of experience working within Azure cloud.
  • Experience in integrating and ingesting data from external data sources.
  • Strong diagnostic skills and ability to research, troubleshoot, and logically determine solutions.
  • Ability to effectively prioritize tasks in a fast-paced, high-volume, and evolving work environment.
  • Comfortable managing multiple and changing priorities, and meeting deadlines.
  • Highly organized, detail-oriented, excellent time management skills.
  • Excellent written and verbal communication skills.

Actual salary will be based on a variety of factors, including location, experience, skill set, performance, licensure and certification, and business needs. The range for this position in other geographic locations may differ. Certain positions may also be eligible for variable incentive compensation, such as bonuses or commissions, that is not included in the base salary.

Estimated Pay Range
$142,500 - $168,700 USD

The Perks:

  • Comprehensive medical, dental and vision plans for you and your dependents
  • 401(k) Retirement Plan with Employer Match, 529 College Savings Plan, Health Savings Account, Life Insurance, and Long-Term Disability
  • Competitive Compensation
  • Training and development programs
  • Stocked kitchen with snacks and beverages
  • Collaborative and cool culture
  • Work-life balance and generous paid time off

 

***To all recruitment agencies: Trace3 does not accept unsolicited agency resumes/CVs. Please do not forward resumes/CVs to our careers email addresses, Trace3 employees or any other company location. Trace3 is not responsible for any fees related to unsolicited resumes/CVs.

See more jobs at Trace3

Apply for this job

19d

Process Engineer

Tessenderlo Group - Phoenix, AZ, Remote
Design, api

Tessenderlo Group is hiring a Remote Process Engineer

Job Description

Are you an experienced Chemical Engineer passionate about process optimization and hands-on work? Do you thrive in environments where you're given the autonomy to lead, innovate, and solve complex problems? If so, we have an exciting opportunity for you!

As a Process Engineer III with Tessenderlo Kerley, Inc., you will be pivotal in troubleshooting, designing, and implementing different processes at multiple sites. You will collaborate closely with plant operations, HS&E, and project teams to achieve company production, quality control, and compliance goals. In addition, you will work with the Process Engineering Manager and other engineers to learn company tools and standard practices. Tessenderlo Kerley has multiple facilities in the U.S. and abroad, offering countless opportunities for professional growth and development.

The ideal candidate for this role will have a sharp eye for detail, strong organizational skills and the ability to balance multiple projects. You’ll also need a solid technical background in chemical plant operations, an interest in analyzing process data, and the drive to find practical solutions for engineering challenges.

Key Responsibilities:

  • Chemical Engineering – Understanding piping and instrumentation diagrams, mass and energy balances, chemical compatibility, and product quality controls.
  • Process Safety Management – Participation in or leadership of PHA/HAZOPs, assisting with change management.
  • Design – P&ID redlines, equipment/instrument specifications, and calculations (line sizing, PSV sizing per API codes, rotating equipment sizing).
  • Project Execution – Scope of work development, gathering and review of vendor bids, and collaboration with other engineering disciplines.
  • Field Work – Provide technical support for troubleshooting, turnarounds and project commissioning efforts at 2-4 sites, with approximately 30-40% travel.

    Qualifications

    What We’re Looking For:

    • A Bachelor of Science degree in Chemical Engineering.
    • At least five years of hands-on process engineering experience, ideally with some exposure to Sulfur Recovery Units.
    • Strong, independent decision-making skills to drive projects with minimal oversight.
    • Technical skills such as P&ID design, equipment/instrument sizing and selection, review of procedures and operating manuals.
    • A knack for balancing multiple projects and sites while maintaining safety and productivity standards.
    • A motivated, safety-conscious individual who inspires others through professionalism and effective communication.

    What we can offer you:

    • Independence: You will have the freedom to make impactful decisions and optimize processes with minimal supervision.
    • Continuous Learning: You will participate in seminars and gain exposure to various subjects, processes and cutting-edge technology.
    • Diverse Experiences: With both office and fieldwork, you'll collaborate with cross-functional teams, travel to multiple sites (domestic and minimal international), and tackle unique challenges.
    • Flexibility: Tessenderlo Kerley values professional growth and allows engineers to explore their interests related to company projects and assignments.
    • Safety First: You will join a company with an outstanding safety record and where your well-being is a top priority.

    Physical Requirements:

    • Ability to lift 50 pounds, climb stairs and use a variety of safety equipment, including respirators and SCBAs.

    If you’re a problem solver, project executor, and passionate about pushing the boundaries of process engineering, this is the role for you!

    Join our team and take your career to the next level by applying your skills to real-world challenges in a dynamic and rewarding environment.

    See more jobs at Tessenderlo Group

    Apply for this job

    21d

    Data Engineer

    Phocas Software - Christchurch, Canterbury, New Zealand, Remote Hybrid
    sql, postgresql, python

    Phocas Software is hiring a Remote Data Engineer

    We're a business planning and analytics company on a mission to make people feel good about data. Since 2001, we’ve helped thousands of companies turn complex business data into performance boosting results. Despite our now global status of 300 world-class humans, we’ve held on to our start-up roots. The result is a workplace that’s fast, exciting and designed for fun.

    As long as you’re happy, the rest falls into place. Think less stress, higher performance, more energy and all-round nicer human. Your friends and family will be delighted.

    As the Internal Data Specialist, you'll ensure the business can leverage our internal data sources, allowing us to make better decisions, react faster to changes and build confidence in our data and decisions. Your work will be split between support and project deliverables working with the Phocas IT and Finance teams and the wider business.

    What will you be doing?

    • Supporting internal reporting systems and data transformation processes.
    • Implementing new dashboards, reports, data sources and improvements to existing data sources.
    • Creating scalable and robust pipelines to ingest data from multiple structured and unstructured sources (APIs, databases, flat files, etc.) into our data platform (a minimal sketch follows this list).
    • Generating answers to the business’ questions by working with our internal data assets.
    • Improving business understanding of our data including where it comes from, how it fits together, and the meaning of the data.
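
    As a small illustration of the ingestion bullet above, here is a minimal, hedged sketch that loads a flat file into PostgreSQL with pandas and SQLAlchemy. The file, table, and connection string are illustrative assumptions.

        # Flat-file ingestion sketch -- file, table, and DSN are illustrative.
        import pandas as pd
        from sqlalchemy import create_engine

        def load_csv_to_postgres(path: str, table: str, dsn: str) -> int:
            df = pd.read_csv(path)                                # read the flat file
            df.columns = [c.strip().lower() for c in df.columns]  # normalise headers
            engine = create_engine(dsn)  # e.g. "postgresql+psycopg2://..." (illustrative)
            df.to_sql(table, engine, if_exists="append", index=False)
            return len(df)

        # Usage (illustrative):
        # load_csv_to_postgres("invoices.csv", "stg_invoices", "postgresql+psycopg2://...")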

    What are we looking for?

    • A degree in data science/computer science or similar, and solid (5+ years) experience in similar roles, working with data analytics products (Phocas, Power BI, etc.). 
    • Strong database experience (SQL Server, PostgreSQL) and experience with scripting languages (Python).
    • A general understanding of finance basics: terms, systems, processes, and best practices. 
    • Strong experience designing, developing, and supporting complex data import and transformation processes.
    • Experience creating technical and non-technical documentation and user guides, and a natural tendency to produce strong documentation (both comments within the code, and externally).
    • Proven critical thinking skills; able to proactively problem solve and develop out of the box solutions. 
    • A growth mindset: a willingness to embrace new challenges and opportunities to grow.
    • Someone who can develop strong relationships and work collaboratively and supportively with a diverse global team
    • Bonus points for experience building financial reporting solutions, working with third-party APIs to extract data in an automated manner, and/or experience working in internal customer facing support roles.

    Why work at Phocas? 

    • People – when we ask what people like about working here, 'the people’ is the single most common answer 
    • Social/fun stuff – opportunities to get together, sometimes (optional) silly games, & food. We all really like food. 
    • Our office – spacious, conveniently located in sunny Sydenham, plenty of parking for four-, two- or even single-wheeled vehicles.  
    • Southern Cross, Life, TPD and Income Protection Insurance 
    • Extra paid parental leave 
    • Flexible/hybrid working policy  

    Phocas is an Accredited Employer and typically we are strong supporters of international talent, but due to current visa settings and processing times, we can only consider applicants with current NZ working rights.

    We are a 2024 Circle Back Initiative Employer – we commit to respond to every applicant.

    To all recruitment agencies: Phocas does not accept agency resumes. Please do not forward resumes to our jobs alias, Phocas employees or any other company location. Phocas will not be responsible for any fees related to unsolicited resumes.

    Phocas is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans status or any other characteristic protected by law.

    #LI-NG1 #LI-HYBRID

    See more jobs at Phocas Software

    Apply for this job

    25d

    Senior Data Engineer

    Bloomreach - Remote CEE, Czechia, Slovakia
    redis, remote-first, c++, kubernetes, python

    Bloomreach is hiring a Remote Senior Data Engineer

    Bloomreach is the world’s #1 Commerce Experience Cloud, empowering brands to deliver customer journeys so personalized, they feel like magic. It offers a suite of products that drive true personalization and digital commerce growth, including:

    • Discovery, offering AI-driven search and merchandising
    • Content, offering a headless CMS
    • Engagement, offering a leading CDP and marketing automation solutions

    Together, these solutions combine the power of unified customer and product data with the speed and scale of AI optimization, enabling revenue-driving digital commerce experiences that convert on any channel and every journey. Bloomreach serves over 850 global brands including Albertsons, Bosch, Puma, FC Bayern München, and Marks & Spencer. Bloomreach recently raised $175 million in a Series F funding round, bringing its total valuation to $2.2 billion. The investment was led by Goldman Sachs Asset Management with participation from Bain Capital Ventures and Sixth Street Growth. For more information, visit Bloomreach.com.

     

    We want you to join us as a full-time Senior Data Engineer in our Data Pipeline team. We work remote-first, but we are more than happy to meet you in our nice office in Bratislava or Brno. And if you are interested in who will be your engineering manager, check out Vaclav's LinkedIn.

    Intrigued? Read on…

    Your responsibilities

    • You will develop and maintain a Data LakeHouse on top of the GCP platform using Apache Iceberg, BigQuery, BigLake tables, DataPlex and DataProc in the form of Apache Spark/Flink with open file formats like Avro and Parquet
    • You will help to maintain the streaming mechanism by which data from Apache Kafka gets into the Data LakeHouse (a minimal sketch follows this list)
    • You will optimise the Data LakeHouse for near-real-time and non-real-time analytical use-cases primarily for customer activation and scenarios/campaign evaluation 
    • You should help with areas like data discovery and managed access to data through the data governance layer and data catalog using DataPlex, so our engineering teams can leverage this unified Data LakeHouse
    • You will feel responsible for data modeling and schema evolution
    • You should help us adopt the concepts from Data Fabrics and Data Mesh to run data as a product and unlock the potential the data can unleash for our clients
    • You should bring expertise into the team from similar previous projects to influence how we adopt and evolve the concepts mentioned above, and additionally topics like zero-copy or reverse ETL, to increase the ease of integration with clients' platforms
    • You will also help to maintain the existing data exports to Google’s BigQuery using Google’s Dataflow and Apache Beam 
    • You will help us run and support our services in production handling high-volume traffic using Google Cloud Platform and Kubernetes.
    • You will review the code of your peers and they'll review yours. We have high code quality standards and the four-eyes principle is a must!
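
    Because the responsibilities centre on streaming Kafka data into the LakeHouse, here is a minimal, hedged PySpark Structured Streaming sketch that writes Kafka events out as Parquet files. Broker, topic, and paths are illustrative assumptions; the actual pipeline (Iceberg tables, schema handling, Beam/Dataflow exports) is considerably richer.

        # Kafka-to-Parquet streaming sketch -- broker, topic, and paths are illustrative.
        # Requires the spark-sql-kafka connector on the classpath.
        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("kafka_to_lake").getOrCreate()

        events = (
            spark.readStream.format("kafka")
                 .option("kafka.bootstrap.servers", "broker:9092")  # illustrative
                 .option("subscribe", "events")                     # illustrative topic
                 .load()
                 .select(F.col("value").cast("string").alias("payload"))
        )

        query = (
            events.writeStream.format("parquet")
                  .option("path", "/lake/events/")                  # illustrative sink
                  .option("checkpointLocation", "/chk/events/")
                  .start()
        )
        query.awaitTermination()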

    Your qualifications

    • You have production experience with building and operating a Data Lake, Data Warehouse or Data LakeHouse
    • You have a taste for big data streaming, storage and processing using open source technologies
    • You can demonstrate your understanding of what it means to treat data as a product
    • You know what Data Meshes and Data Fabrics are and what is critical to ensure they bring value
    • You are able to learn and adapt. It'll be handy while exploring new tech, navigating our not-so-small code base, or when iterating on our team processes.
    • You know data structures, and you know Python and (optionally) Go.

    Our tech stack

    • Google Cloud Platform, DataFlow, Apache Beam, BigQuery, BigLake Table
    • Open formats IceBerg, Avro, Parquet
    • DataProc, Spark, Flink, Presto
    • Python, GO
    • Apache Kafka, Kubernetes, GitLab
    • BigTable, Mongo, Redis
    • … and much more

    Compensations

    • Salary range starting from 4300 EUR gross per month,going up depending on your experience and skills
    • There's a bonus based on company performance and your salary.
    • You will be entitled to restricted stock options ????that will truly make you a part of Bloomreach.
    • You can spend 1500 USD per year on the education of your choice (books, conferences, courses, ...).
    • You can count on free access to Udemy courses.
    • We have 4 company-wide disconnect days throughout the year during which you will be encouraged not to work and spend a day with your friends and family "disconnected".
    • You will have an extra 5 days of paid vacation. Extra days off for extra work-life balance.
    • Food allowance!
    • Sweet referral bonus up to 3000 USD based on the position.

    Your success story

    • During the first 30 days, you will get to know the team, the company, and the most important processes. You'll work on your first tasks. We will help you get familiar with our codebase and our product.
    • During the first 90 days, you will participate in your first, more complex projects. You will help the team find solutions to various problems, break the solutions down into smaller tasks, and participate in implementation. You will learn how we identify problems, how we prioritize our efforts, and how we deliver value to our customers.
    • During the first 180 days, you'll become an integral part of the team. You will achieve the first goals we set together to help you grow and explore new and interesting things. You will help us deliver multi-milestone projects bringing great value to our customers. You will help us mitigate your first incidents and eventually even join the on-call rotation. You will get a sense of where the team is heading and you'll help us shape our future.
    • Finally, you'll find out that our values are truly lived by us. We are dreamers and builders. Join us!

     

    More things you'll like about Bloomreach:

    Culture:

    • A great deal of freedom and trust. At Bloomreach we don’t clock in and out, and we have neither corporate rules nor long approval processes. This freedom goes hand in hand with responsibility. We are interested in results from day one. 

    • We have defined our 5 values and the 10 underlying key behaviors that we strongly believe in. We can only succeed if everyone lives these behaviors day to day. We've embedded them in our processes like recruitment, onboarding, feedback, personal development, performance review and internal communication.

    • We believe in flexible working hours to accommodate your working style.

    • We work remote-first with several Bloomreach Hubs available across three continents.

    • We organize company events to experience the global spirit of the company and get excited about what's ahead.

    • We encourage and support our employees to engage in volunteering activities - every Bloomreacher can take 5 paid days off to volunteer*.
    • The Bloomreach Glassdoor page elaborates on our stellar 4.6/5 rating. The Bloomreach Comparably page culture score is even higher, at 4.9/5.

    Personal Development:

    • We have a People Development Program -- participating in personal development workshops on various topics run by experts from inside the company. We are continuously developing & updating competency maps for select functions.

    • Our resident communication coach Ivo Večeřa is available to help navigate work-related communications & decision-making challenges.*
    • Our managers are strongly encouraged to participate in the Leader Development Program to develop in the areas we consider essential for any leader. The program includes regular comprehensive feedback, consultations with a coach and follow-up check-ins.

    • Bloomreachers utilize the $1,500 professional education budget on an annual basis to purchase education products (books, courses, certifications, etc.)*

    Well-being:

    • The Employee Assistance Program -- with counselors -- is available for non-work-related challenges.*

    • Subscription to Calm - sleep and meditation app.*

    • We organize ‘DisConnect’ days where Bloomreachers globally enjoy one additional day off each quarter, allowing us to unwind together and focus on activities away from the screen with our loved ones.

    • We facilitate sports, yoga, and meditation opportunities for each other.

    • Extended parental leave up to 26 calendar weeks for Primary Caregivers.*

    Compensation:

    • Restricted Stock Units or Stock Options are granted depending on a team member’s role, seniority, and location.*

    • Everyone gets to participate in the company's success through the company performance bonus.*

    • We offer an employee referral bonus of up to $3,000 paid out immediately after the new hire starts.

    • We reward & celebrate work anniversaries -- Bloomversaries!*

    (*Subject to employment type. Interns are exempt from marked benefits, usually for the first 6 months.)

    Excited? Join us and transform the future of commerce experiences!

    If this position doesn't suit you, but you know someone who might be a great fit, share it - we will be very grateful!


    Any unsolicited resumes/candidate profiles submitted through our website or to personal email accounts of employees of Bloomreach are considered property of Bloomreach and are not subject to payment of agency fees.

     #LI-Remote

    See more jobs at Bloomreach

    Apply for this job

    25d

    Data Engineer II - (Remote - US)

    Mediavine, Atlanta, Georgia, United States, Remote
    sql, Design, python, AWS

    Mediavine is hiring a Remote Data Engineer II - (Remote - US)

    Mediavine is seeking an experienced Data Engineer to join our engineering team. We are looking for someone who enjoys solving interesting problems and wants to work with a small team of talented engineers on a product used by thousands of publishers. Applicants must be based in the United States.

    About Mediavine

    Mediavine is a fast-growing advertising management company representing over 10,000 websites in the food, lifestyle, DIY, and entertainment space. Founded by content creators, for content creators, Mediavine is a Top 20 Comscore property, exclusively reaching over 125 million monthly unique visitors. With best-in-class technology and a commitment to traffic quality and brand safety, we ensure optimal performance for our creators.

    Mission & Culture

    We are striving to build an inclusive and diverse team of highly talented individuals that reflects the industries we serve and the world we live in. The unique experiences and perspectives of our team members are encouraged and valued. If you are talented, driven, and enjoy the pace of a start-up-like environment, let's talk!

    Position Title & Overview:

    The Data & Analytics team consists of data analysts, data engineers and analytics engineers working to build the most effective platform and tools to help uncover opportunities and make decisions with data here at Mediavine. We partner with Product, Support, Ad Operations and other teams within the Engineering department to understand behavior, develop accurate predictors and build solutions that provide the best internal and external experience possible.

    A Data Engineer at Mediavine will help build and maintain our data infrastructure: building scalable data pipelines, managing transformation processes, and ensuring data quality and security at every step along the way. This will include writing and maintaining code in Python and SQL, developing on AWS, and selecting and using third-party tools like Rundeck, Metabase, and others to round out the environment. You will be involved in decisions around tool selection and coding standards.

     Our current data engineering toolkit consists of custom Python data pipelines, AWS infrastructure including Kinesis pipelines, Rundeck scheduling, dbt for transformation and Snowflake as our data warehouse platform. We are open to new tools and expect this position to be a part of deciding the direction we take. 

    Essential Responsibilities:

    • Create data pipelines that make data available for analytic and application use cases
    • Develop self-healing, resilient processes that do not require constant care and feeding to run smoothly
    • Create meaningful data quality notifications with clear actions for interested parties, including other internal teams and other members of the data and analytics team (see the sketch after this list)
    • Lead projects from a technical standpoint, creating project Technical Design Documents
    • Support data analysts' and analytics engineers' ability to meet the needs of the organization
    • Participate in code reviews, understanding coding standards, ensuring test coverage and being aware of best practices
    • Build or implement tooling around data quality, governance and lineage, in the dbt framework and Snowflake but external to that as needed
    • Provide next level support when data issues are discovered and communicated by the data analysts
    • Work with data analysts and analytics engineers to standardize transformation logic in the dbt layer for consistency and ease of exploration by end users
    • Enable analytics engineers and data analysts by providing data modeling guidance, query optimization and aggregation advice
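    As one hypothetical illustration of the data quality notification bullet above, the sketch below runs a row-count check against a warehouse table and posts an actionable message to a Slack webhook. The connection object, table name, threshold, and webhook URL are all assumptions for the example, not Mediavine's actual tooling.

        # A minimal sketch: a volume check that produces an actionable alert.
        # `conn` is any DB-API-style connection (e.g., snowflake-connector-python);
        # the webhook URL is hypothetical.
        import json
        import urllib.request

        def check_row_count(conn, table: str, min_rows: int) -> str | None:
            """Return an alert message if the table looks under-populated, else None."""
            with conn.cursor() as cur:
                cur.execute(f"SELECT COUNT(*) FROM {table}")
                (count,) = cur.fetchone()
            if count < min_rows:
                return f"{table}: expected at least {min_rows} rows, found {count}"
            return None

        def notify_slack(webhook_url: str, message: str) -> None:
            # Keep the message specific so the owning team knows what action to take.
            body = json.dumps({"text": message}).encode()
            req = urllib.request.Request(
                webhook_url, data=body, headers={"Content-Type": "application/json"}
            )
            urllib.request.urlopen(req)

    In practice a check like this would be scheduled (e.g., via Rundeck or a DAG-based scheduler) and fan out to whichever teams own the affected tables.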

    Location: 

    • Applicants must be based in the United States

    You Have: 

    • 3+ years of experience in a data engineering role
    • Strong Python skills (Understands tradeoffs, optimization, etc)
    • Strong SQL skills (CTEs, window functions, optimization)
    • Experience working in cloud environments (AWS preferred; GCP, Azure)
    • An understanding of how to best structure data to enable internal and external facing analytics
    • Familiarity with calling APIs to retrieve data (authentication flows, filters, limits, pagination; see the sketch after this list)
    • Experience working with DevOps to deploy, scale and monitor data infrastructure
    • Scheduler experience, either traditional or DAG-based
    • Comfortable working with multi-TB cloud data warehouses (Snowflake preferred; Redshift, BigQuery)
    • Experience with other DBMSs (Postgres in particular)
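    For the API bullet above, here is a minimal sketch of token-authenticated, page-based extraction. The endpoint, parameter names, and pagination style are hypothetical; real APIs vary (cursor tokens, Link headers, rate limits), which is exactly the familiarity the requirement is asking for.

        # A minimal sketch of paginated API extraction with bearer-token auth
        # (hypothetical endpoint and query parameters; `requests` assumed installed).
        import requests

        def fetch_all(base_url: str, token: str, page_size: int = 100) -> list[dict]:
            """Follow page-based pagination until the API returns an empty page."""
            session = requests.Session()
            session.headers["Authorization"] = f"Bearer {token}"

            records, page = [], 1
            while True:
                resp = session.get(base_url, params={"page": page, "limit": page_size})
                resp.raise_for_status()
                batch = resp.json()
                if not batch:
                    break
                records.extend(batch)
                page += 1
            return records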

    Nice to haves:

    • Experience with web analysis such as creating data structure that support product funnels, user behavior, and decision path analysis 
    • Understanding of Snowflake external stages, file formats and Snowpipe
    • Experience with orchestration tools, particularly across different technologies and stacks
    • Experience with dbt
    • Knowledge of Ad Tech, Google Ad Manager and all of its fun quirks (so fun)
    • The ability to make your teammates laugh (it wouldn’t hurt if you were fun to work with is what I’m saying)
    • Familiarity with event tracking systems (NewRelic, Snowplow, etc)
    • Experience with one or more major BI tools (Domo, Looker, PowerBI, etc.)

    Benefits:

    • 100% remote
    • Comprehensive benefits including Health, Dental, Vision and 401k match
    • Generous paid time off 
    • Wellness and Home Office Perks 
    • Up to 12 weeks of paid Parental Leave 
    • Inclusive Family Forming Benefits 
    • Professional development opportunities 
    • Travel opportunities for teams, our annual All Hands retreat as well as industry events

    Mediavine provides equal employment opportunities to applicants and employees. All aspects of employment will be based on merit, competence, performance, and business needs. We do not discriminate on the basis of race, color, religion, marital status, age, national origin, ancestry, physical or mental disability, medical condition, pregnancy, genetic information, gender, sexual orientation, gender identity or expression, veteran status, or any other status protected under federal, state, or local law.

    We strongly encourage minorities and individuals from underrepresented groups in technology to apply for this position.

    At Mediavine, base salary is one part of our competitive total compensation and benefits package and is determined using a salary range.  Individual compensation varies based on job-related factors, including business needs, experience, level of responsibility and qualifications. The base salary range for this role at the time of posting is $115,000 - $130,000 USD/yr.

    See more jobs at Mediavine

    Apply for this job

    26d

    Data Engineer

    golang, Master’s Degree, tableau, terraform, scala, sql, Design, azure, api, java, c++, kubernetes, python, AWS

    Cloudflare is hiring a Remote Data Engineer

    About Us

    At Cloudflare, we are on a mission to help build a better Internet. Today the company runs one of the world’s largest networks that powers millions of websites and other Internet properties for customers ranging from individual bloggers to SMBs to Fortune 500 companies. Cloudflare protects and accelerates any Internet application online without adding hardware, installing software, or changing a line of code. Internet properties powered by Cloudflare all have web traffic routed through its intelligent global network, which gets smarter with every request. As a result, they see significant improvement in performance and a decrease in spam and other attacks. Cloudflare was named to Entrepreneur Magazine’s Top Company Cultures list and ranked among the World’s Most Innovative Companies by Fast Company. 

    We realize people do not fit into neat boxes. We are looking for curious and empathetic individuals who are committed to developing themselves and learning new skills, and we are ready to help you do that. We cannot complete our mission without building a diverse and inclusive team. We hire the best people based on an evaluation of their potential and support them throughout their time at Cloudflare. Come join us! 

     

    Locations: Austin highly preferred; must be willing to relocate.


    About the team  

    The Business Intelligence team at Cloudflare is responsible for building a centralized cloud data lake and an analytics platform that enables our internal Business Partners and Machine Learning teams with actionable insights and also provides a 360 view of our business. Our goal is to democratize data, support Cloudflare’s critical business needs, provide reporting and analytics via self-service tools to fuel existing and new business critical initiatives.

    About the role

    We are looking for an experienced Data Engineer to join our Business Intelligence Data Science team. The role involves designing, building, and maintaining data pipelines and infrastructure to support data science and machine learning initiatives. You will work closely with data scientists and other stakeholders to ensure data accessibility and quality, enabling the creation and operation of machine learning models and analytics solutions. You will help support our data science team to build solutions using Large Language Models, AI services, and machine learning solutions following industry AI standards and best practices. 

    What you will do

    • Collaborate with data scientists and business stakeholders to design datasets and engineer features for advanced analytical models (see the sketch after this list).
    • Develop, test, and manage robust and scalable data pipelines and systems.
    • Collect, explore, validate, and prepare data for comprehensive analysis.
    • Monitor, optimize, and support development, testing, and production environments to maintain efficiency.
    • Ensure adherence to data governance and compliance standards and processes.
    • Design application components and evolve architecture, including APIs, services, data access, and integration.
    • Implement automation tools and frameworks, including CI/CD pipelines, to streamline processes.
    • Develop tools to automate workload monitoring and take proactive measures to scale the platform or resolve issues.
    • Mentor and guide junior data engineers, fostering a culture of continuous learning and development.
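    As a toy illustration of the dataset and feature design bullet above, the sketch below derives per-account features from raw event rows using pandas. The column names and the activity-style framing are hypothetical, not Cloudflare's actual data.

        # A minimal sketch: turning raw per-request events into model-ready features.
        import pandas as pd

        events = pd.DataFrame({
            "account_id": [1, 1, 2],
            "ts": pd.to_datetime(["2024-01-01", "2024-01-08", "2024-01-03"]),
            "requests": [120, 80, 45],
        })

        # Aggregate activity into one row per account for downstream models.
        features = events.groupby("account_id").agg(
            total_requests=("requests", "sum"),
            last_seen=("ts", "max"),
            active_days=("ts", "nunique"),
        )
        print(features)

    At production scale the same aggregation logic would typically live in Spark or the warehouse rather than in-memory pandas.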

     

    Desired skills, knowledge, and experience 

    • Bachelor’s or Master’s degree in Computer Science, Engineering, Information Technology or related field
    • 3+ years of experience designing data solutions, modeling data, and developing ETL/ELT pipelines at large scale.
    • 3+ years of experience in a programming language (e.g., Python, Java, Scala, Golang).
    • Experience in developing large-scale data solutions using cloud platforms (e.g., AWS, Azure, Google Cloud) and big data technologies (e.g., Apache Spark).
    • Experience in data cataloging, classification, data quality, metadata management, and data lifecycle.
    • Experience with SQL and data visualization/analytics tools (e.g., Tableau, Looker Studio).
    • Experience with CI/CD pipelines and source control.
    • Experience with Infrastructure as Code tools like Terraform
    • Proficiency in API design and development of RESTful web services or GraphQL.
    • Working knowledge of containerization technologies like Kubernetes and Docker.
    • Experience with machine learning techniques and tools is a plus

    What Makes Cloudflare Special?

    We’re not just a highly ambitious, large-scale technology company. We’re a highly ambitious, large-scale technology company with a soul. Fundamental to our mission to help build a better Internet is protecting the free and open Internet.

    Project Galileo: We equip politically and artistically important organizations and journalists with powerful tools to defend themselves against attacks that would otherwise censor their work, technology already used by Cloudflare’s enterprise customers--at no cost.

    Athenian Project: We created Athenian Project to ensure that state and local governments have the highest level of protection and reliability for free, so that their constituents have access to election information and voter registration.

    1.1.1.1: We released 1.1.1.1 to help fix the foundation of the Internet by building a faster, more secure and privacy-centric public DNS resolver. This is available publicly for everyone to use - it is the first consumer-focused service Cloudflare has ever released. Here’s the deal - we don’t store client IP addresses. Never, ever. We will continue to abide by our privacy commitment and ensure that no user data is sold to advertisers or used to target consumers.

    Sound like something you’d like to be a part of? We’d love to hear from you!

    This position may require access to information protected under U.S. export control laws, including the U.S. Export Administration Regulations. Please note that any offer of employment may be conditioned on your authorization to receive software or technology controlled under these U.S. export laws without sponsorship for an export license.

    Cloudflare is proud to be an equal opportunity employer. We are committed to providing equal employment opportunity for all people and place great value in both diversity and inclusiveness. All qualified applicants will be considered for employment without regard to their, or any other person's, perceived or actual race, color, religion, sex, gender, gender identity, gender expression, sexual orientation, national origin, ancestry, citizenship, age, physical or mental disability, medical condition, family care status, or any other basis protected by law. We are an AA/Veterans/Disabled Employer.

    Cloudflare provides reasonable accommodations to qualified individuals with disabilities. Please tell us if you require a reasonable accommodation to apply for a job. Examples of reasonable accommodations include, but are not limited to, changing the application process, providing documents in an alternate format, using a sign language interpreter, or using specialized equipment. If you require a reasonable accommodation to apply for a job, please contact us via e-mail at hr@cloudflare.com or via mail at 101 Townsend St. San Francisco, CA 94107.

    See more jobs at Cloudflare

    Apply for this job

    28d

    Data and Analytics Engineer

    airflow, sql, Design, python

    Cloudflare is hiring a Remote Data and Analytics Engineer

    About Us

    At Cloudflare, we are on a mission to help build a better Internet. Today the company runs one of the world’s largest networks that powers millions of websites and other Internet properties for customers ranging from individual bloggers to SMBs to Fortune 500 companies. Cloudflare protects and accelerates any Internet application online without adding hardware, installing software, or changing a line of code. Internet properties powered by Cloudflare all have web traffic routed through its intelligent global network, which gets smarter with every request. As a result, they see significant improvement in performance and a decrease in spam and other attacks. Cloudflare was named to Entrepreneur Magazine’s Top Company Cultures list and ranked among the World’s Most Innovative Companies by Fast Company. 

    We realize people do not fit into neat boxes. We are looking for curious and empathetic individuals who are committed to developing themselves and learning new skills, and we are ready to help you do that. We cannot complete our mission without building a diverse and inclusive team. We hire the best people based on an evaluation of their potential and support them throughout their time at Cloudflare. Come join us! 

    Available Locations: Lisbon or Remote Portugal

    About the team

    You will be part of the Network Strategy team within Cloudflare’s Infrastructure Engineering department. The Network Strategy team focuses on building both external and internal relationships that allow for Cloudflare to scale and reach user populations around the world. Our group takes a long term and technical approach to forging mutually beneficial and sustainable relationships with all of our network partners. 

    About the role

    We are looking for an experienced Data and Analytics Engineer to join our team to scale our data insights initiatives. You will work with a wide array of data sources about network traffic, performance, and cost. You’ll be responsible for building data pipelines, doing ad-hoc analytics based on the data, and automating our analysis. Important projects include understanding the resource consumption and cost of Cloudflare’s broad product portfolio.

    A candidate will be successful in this role if they're flexible and able to match the right solution to the right problem; Cloudflare is a fast-paced environment and requirements change frequently.

    What you'll do

    • Design and implement data pipelines that take unprocessed data and make it usable for advanced analytics
    • Work closely with other product and engineering teams to ensure our products and services collect the right data for our analytics
    • Work closely with a cross functional team of data scientists and analysts and internal stakeholders on strategic initiatives 
    • Build tooling, automation, and visualizations around our analytics for consumption by other Cloudflare teams

    Examples of desirable skills, knowledge and experience

    • Excellent Python and SQL (one of the interviews will be a code review)
    • B.S. or M.S. in Computer Science, Statistics, Mathematics, or other quantitative fields, or equivalent experience
    • Minimum 3 years of industry experience in software engineering, data engineering, data science or related field with a track record of extracting, transforming and loading large datasets 
    • Knowledge of data management fundamentals and data storage/computing principles
    • Excellent communication & problem solving skills 
    • Ability to collaborate with cross functional teams and work through ambiguous business requirements

    Bonus Points

    • Familiarity with Airflow (see the DAG sketch after this list)
    • Familiarity with Google Cloud Platform or other analytics databases
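    For readers unfamiliar with Airflow, here is a minimal sketch of a daily pipeline using the Airflow 2.x TaskFlow API. The DAG name, record shapes, and the stubbed extract/load steps are hypothetical placeholders, not part of Cloudflare's stack.

        # A minimal sketch of a daily ETL DAG (the `schedule` kwarg needs Airflow 2.4+).
        from datetime import datetime

        from airflow.decorators import dag, task

        @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
        def network_traffic_etl():
            @task
            def extract() -> list[dict]:
                # Stand-in for pulling raw traffic/cost records from a source system.
                return [{"colo": "LIS", "bytes": 123_456_789}]

            @task
            def transform(records: list[dict]) -> list[dict]:
                # Normalize units so downstream analytics work in GB.
                return [{**r, "gb": r["bytes"] / 1e9} for r in records]

            @task
            def load(records: list[dict]) -> None:
                # Stand-in for writing to an analytics database.
                print(f"loading {len(records)} rows")

            load(transform(extract()))

        network_traffic_etl()

    Passing task outputs as arguments lets Airflow infer the extract → transform → load dependency chain automatically.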

    What Makes Cloudflare Special?

    We’re not just a highly ambitious, large-scale technology company. We’re a highly ambitious, large-scale technology company with a soul. Fundamental to our mission to help build a better Internet is protecting the free and open Internet.

    Project Galileo: We equip politically and artistically important organizations and journalists with powerful tools to defend themselves against attacks that would otherwise censor their work, technology already used by Cloudflare’s enterprise customers--at no cost.

    Athenian Project: We created Athenian Project to ensure that state and local governments have the highest level of protection and reliability for free, so that their constituents have access to election information and voter registration.

    1.1.1.1: We released 1.1.1.1 to help fix the foundation of the Internet by building a faster, more secure and privacy-centric public DNS resolver. This is available publicly for everyone to use - it is the first consumer-focused service Cloudflare has ever released. Here’s the deal - we don’t store client IP addresses. Never, ever. We will continue to abide by our privacy commitment and ensure that no user data is sold to advertisers or used to target consumers.

    Sound like something you’d like to be a part of? We’d love to hear from you!

    This position may require access to information protected under U.S. export control laws, including the U.S. Export Administration Regulations. Please note that any offer of employment may be conditioned on your authorization to receive software or technology controlled under these U.S. export laws without sponsorship for an export license.

    Cloudflare is proud to be an equal opportunity employer. We are committed to providing equal employment opportunity for all people and place great value in both diversity and inclusiveness. All qualified applicants will be considered for employment without regard to their, or any other person's, perceived or actual race, color, religion, sex, gender, gender identity, gender expression, sexual orientation, national origin, ancestry, citizenship, age, physical or mental disability, medical condition, family care status, or any other basis protected by law. We are an AA/Veterans/Disabled Employer.

    Cloudflare provides reasonable accommodations to qualified individuals with disabilities. Please tell us if you require a reasonable accommodation to apply for a job. Examples of reasonable accommodations include, but are not limited to, changing the application process, providing documents in an alternate format, using a sign language interpreter, or using specialized equipment. If you require a reasonable accommodation to apply for a job, please contact us via e-mail at hr@cloudflare.com or via mail at 101 Townsend St. San Francisco, CA 94107.

    See more jobs at Cloudflare

    Apply for this job

    +30d

    Azure Data Engineer

    ProArch, Hyderabad, Telangana, India, Remote
    Design, azure

    ProArch is hiring a Remote Azure Data Engineer

    ProArch is hiring a skilled Azure Data Engineer to join our team. As an Azure Data Engineer, you will be responsible for designing, implementing, and maintaining data processing systems using Azure technologies. Additionally, you will collaborate with cross-functional teams to understand business requirements, identify opportunities for data-driven improvements, and deliver high-quality solutions. If you have a strong background in Azure data tools and technologies, excellent problem-solving skills, and a passion for data engineering, we want to hear from you!

    Responsibilities:

    • Design, develop, and implement data engineering solutions on the Azure platform
    • Create and maintain data pipelines and ETL processes
    • Optimize data storage and retrieval for performance and scalability
    • Collaborate with data scientists and analysts to build data models and enable data-driven insights
    • Ensure data quality and integrity through data validation and cleansing
    • Monitor and troubleshoot data pipelines and resolve any issues
    • Stay up-to-date with the latest Azure data engineering best practices and technologies

    Requirements:

    • Excellent communication skills
    • Strong experience in Python/PySpark (see the sketch after this list)
    • The ability to understand business concepts and work with customers to process data accurately.
    • A solid understanding of Azure Data Lake, Spark for Synapse (or Azure Databricks), Synapse Pipelines (or Azure Data Factory), Mapping Data Flows, SQL Server, and Synapse Serverless/Pools (or SQL Data Warehouse).
    • Experience with source control, version control and moving data artifacts from Dev to Test to Prod.
    • A proactive self-starter who likes to deliver value, solve challenges and make progress.
    • Comfortable working in a team or as an individual contributor
    • Good data modelling skills (e.g., relationships, entities, facts, and dimensions)
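    As a small illustration of the Python/PySpark requirement referenced in the list above, here is a sketch of a batch cleanse step between lake zones on ADLS Gen2. The storage paths, column names, and filters are hypothetical, and it assumes a Synapse or Databricks Spark session is already configured.

        # A minimal sketch of a PySpark cleanse step from a raw to a curated zone.
        from pyspark.sql import SparkSession
        from pyspark.sql.functions import col, to_date

        spark = SparkSession.builder.getOrCreate()

        # Hypothetical ADLS Gen2 paths (raw -> curated).
        raw_path = "abfss://raw@yourlake.dfs.core.windows.net/orders/"
        curated_path = "abfss://curated@yourlake.dfs.core.windows.net/orders/"

        orders = spark.read.parquet(raw_path)

        clean = (
            orders.dropDuplicates(["order_id"])                  # de-duplicate on key
            .withColumn("order_date", to_date(col("order_ts")))  # derive a date column
            .filter(col("amount") > 0)                           # drop invalid rows
        )

        clean.write.mode("overwrite").parquet(curated_path)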

    See more jobs at ProArch

    Apply for this job

    +30d

    Senior Data Engineer

    CLEAR - Corporate, New York, New York, United States (Hybrid)
    tableau, airflow, sql, Design, jenkins, python, AWS

    CLEAR - Corporate is hiring a Remote Senior Data Engineer

    Today, CLEAR is well-known as a leader in digital and biometric identification, reducing friction for our members wherever an ID check is needed. We’re looking for an experienced Senior Data Engineer to help us build the next generation of products which will go beyond just ID and enable our members to leverage the power of a networked digital identity. As a Senior Data Engineer at CLEAR, you will participate in the design, implementation, testing, and deployment of applications to build and enhance our platform, one that interconnects dozens of attributes and qualifications while keeping member privacy and security at the core.


    A brief highlight of our tech stack:

    • SQL / Python / Looker / Snowflake / Airflow / Databricks / Spark / dbt

    What you'll do:

    • Build a scalable data system in which Analysts and Engineers can self-service changes in an automated, tested, secure, and high-quality manner 
    • Build processes supporting data transformation, data structures, metadata, dependency and workload management
    • Develop and maintain data pipelines to collect, clean, and transform data, owning the end-to-end data product from ingestion to visualization (see the sketch after this list)
    • Develop and implement data analytics models
    • Partner with product and other stakeholders to uncover requirements, to innovate, and to solve complex problems
    • Have a strong sense of ownership, responsible for architectural decision-making and striving for continuous improvement in technology and processes at CLEAR
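    To ground the pipeline bullet above, here is a minimal sketch of a load step using snowflake-connector-python's write_pandas helper, which matches the Snowflake piece of the stack listed earlier. The account, credentials, table, and data are hypothetical placeholders, not CLEAR's systems.

        # A minimal sketch: bulk-loading a cleaned DataFrame into Snowflake.
        import pandas as pd
        import snowflake.connector
        from snowflake.connector.pandas_tools import write_pandas

        df = pd.DataFrame({"MEMBER_ID": [1, 2], "VERIFIED": [True, False]})

        conn = snowflake.connector.connect(
            account="xy12345",       # hypothetical account locator
            user="etl_user",         # hypothetical service user
            password="change-me",    # use a secrets manager in practice
            warehouse="ETL_WH",
            database="ANALYTICS",
            schema="RAW",
        )
        try:
            # write_pandas stages the frame and COPYs it in; auto_create_table
            # (available in recent connector versions) creates the table if needed.
            write_pandas(conn, df, "MEMBER_FLAGS", auto_create_table=True)
        finally:
            conn.close()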

     What you're great at:

    • 6+ years of data engineering experience
    • Working with cloud-based application development, being fluent in at least a few of: 
      • Cloud services providers like AWS
      • Data pipeline orchestration tools like Airflow, Dagster, Luigi, etc
      • Big data tools like Spark, Kafka, Snowflake, Databricks, etc
      • Collaboration, integration, and deployment tools like Github, Argo, and Jenkins 
      • Data visualization tools like Looker, Tableau, etc
    • Articulating technical concepts to a mixed audience of technical and non-technical stakeholders
    • Collaborating and mentoring less experienced members of the team
    • Comfort with ambiguity 
    • Curiosity about technology, belief in constant learning, and the autonomy to figure out what's important

    How You'll be Rewarded:

    At CLEAR we help YOU move forward - because when you’re at your best, we’re at our best. You’ll work with talented team members who are motivated by our mission of making experiences safer and easier. Our hybrid work environment provides flexibility. In our offices, you’ll enjoy benefits like meals and snacks. We invest in your well-being and learning & development with our stipend and reimbursement programs. 

    We offer holistic total rewards, including comprehensive healthcare plans, family building benefits (fertility and adoption/surrogacy support), flexible time off, free OneMedical memberships for you and your dependents, and a 401(k) retirement plan with employer match. The base salary range for this role is $175,000 - $215,000, depending on levels of skills and experience.

    The base salary range represents the low and high end of CLEAR’s salary range for this position. Salaries will vary depending on various factors which include, but are not limited to location, education, skills, experience and performance. The range listed is just one component of CLEAR’s total compensation package for employees and other rewards may include annual bonuses, commission, Restricted Stock Units.

    About CLEAR

    Have you ever had that green-light feeling? When you hit every green light and the day just feels like magic. CLEAR's mission is to create frictionless experiences where every day has that feeling. With more than 25 million passionate members and hundreds of partners around the world, CLEAR’s identity platform is transforming the way people live, work, and travel. Whether it’s at the airport, stadium, or right on your phone, CLEAR connects you to the things that make you, you - unlocking easier, more secure, and more seamless experiences - making them all feel like magic.

    CLEAR provides reasonable accommodation to qualified individuals with disabilities or protected needs. Please let us know if you require a reasonable accommodation to apply for a job or perform your job. Examples of reasonable accommodation include, but are not limited to, time off, extra breaks, making a change to the application process or work procedures, policy exceptions, providing documents in an alternative format, live captioning or using a sign language interpreter, or using specialized equipment.

    See more jobs at CLEAR - Corporate

    Apply for this job

    +30d

    Data Engineer with Databricks (Remote)

    Loginsoft Consulting LLC, Richardson, TX - Remote
    10 years of experience, sql, Design, azure, python, AWS

    Loginsoft Consulting LLC is hiring a Remote Data Engineer with Databricks (Remote)

    NOTE: THIS POSITION IS TO JOIN AS W2 ONLY.

    Data Engineer with Databricks

    Location: Remote

    Duration: 6+ Months

    Daily Responsibilities:

    • Design, develop, test, deploy, maintain, and improve software applications and services.
    • Implement data pipelines and ETL processes using Databricks and Snowflake.
    • Collaborate with other engineers to understand requirements and translate them into technical solutions.
    • Optimize and fine-tune performance of data pipelines and database queries.
    • Ensure code quality through code reviews, unit testing, and continuous integration.
    • Contribute to architecture and technical design discussions.

    Technology requirements:

    • SQL
    • GCP
    • AWS

    Degree or certifications required:

    • No degrees required

    Years experience:

    • 10 years of experience as a Software Engineer or in a related role.

    Required background/ Skillsets:

    • Prior working experience in Databricks
    • Snowflake
    • Python
    • Data science background

    • Proficiency in Python for software development and scripting.
    • Hands-on experience with Databricks and Snowflake for data engineering and analytics.
    • Strong understanding of database design, SQL, and data modeling principles.
    • Experience with cloud platforms such as AWS, Azure, or GCP.
    • Familiarity with machine learning concepts and frameworks is a plus.
    • Excellent problem-solving skills and ability to work independently and as part of a team.
    • Strong communication skills and ability to collaborate effectively across teams.

    See more jobs at Loginsoft Consulting LLC

    Apply for this job