Data Engineer Remote Jobs

519 Results

5d

Associate Data Engineer

RTS Labs | Glen Allen, VA (Remote)
tableau, sql, Design, mobile, azure

RTS Labs is hiring a Remote Associate Data Engineer

We’ll cut to the chase - do you have:

  • Strong SQL skills
  • At least 1-3 years of experience developing business intelligence solutions
  • Experience with ETL development
  • Excellent communication and facilitation skills
  • MS SQL (T-SQL, stored procedures, functions), SSRS, SSIS, Microsoft Power BI, Tableau
  • Exposure to scripting in a variety of languages
  • Consulting or client experience (a plus)
  • Azure Data Factory and Azure Functions (a plus)

Job Requirements

  • Deep technical knowledge of ETL systems and visualization tools
  • Design, develop, and present Business Intelligence solutions
  • Ability to present to business and technical stakeholders
  • Maintain data dictionaries and data lineage documents for integration components of data warehouse projects

Are you interested in joining a solid, fast-growing software development company specializing in Web, Mobile, Cloud, and Data Analytics?

Here’s what we’re looking for:

  • You solve challenges with out-of-the-box thinking (status quo is not our game)
  • You love to learn and share your insights (nobody is perfect and you have to be ready to share)
  • You know what it means to be a team player (egos are left at the door)
  • You build relationships with clients based on trust (we are straightforward and honest at all times)
  • You reliably manage yourself (you build your own role and know when things need to get done)
  • You write beautiful code you are proud of and that is showcase-worthy (modern, sleek)

Who are We? RTS & Our Culture:

We are a solid, fast-growing software development company specializing in Web, Mobile, Data Analytics, and Salesforce. We have plenty of room for magic, enthusiasm and personal growth. Your ideas will be encouraged, and you will have the power to shape the direction of RTS Labs.

At RTS, we offer a hard-working, but casual workplace with few meetings, sincere camaraderie and functional creativity. There are no closed doors, no rigid office hours and no space for big egos.

We develop great solutions to solve each of our client’s problems, but we also build great relationships with our clients. We are straightforward and honest at all times. We work hard as a team, but we also play hard as a team. We offer competitive salaries based on experience and benefits including 401(k), health, vision, and life insurance, and short-term disability.

You’ll find yourself in a great place to work with the resources and support you need to find a long term career fit.

  • Do you prefer Mac or PC? Either way, you get to choose with us
  • 40 hours a year for professional development to use how you choose
  • Flexible PTO
  • Company values that we actually implement (https://rtslabs.com/culture/)
  • Family first approach, no micro-management

RTS Labs is committed to providing a safe and inclusive environment for all employees, contractors, vendors, and clients; where all people are honored and respected, and differences are celebrated. Proud to be an Equal Opportunity Employer, RTS Labs does not discriminate based upon race, religion, color, national origin, gender, gender identity, gender expression, sexual orientation, age, status as a protected veteran, or status as an individual with a disability.

See more jobs at RTS Labs

Apply for this job

12d

Senior Data Engineer (h/f)

api.video | Remote
sql, Design, ansible, api, git, java, docker, elasticsearch, postgresql, linux, python

api.video is hiring a Remote Senior Data Engineer (h/f)

api.video is an API-first platform that enables developers to build, scale and operate on-demand and live video streaming in their own apps and platforms in minutes, with just a few lines of code. The service handles the end-to-end workflow, from video ingestion to worldwide video delivery.

As a Senior Data Engineer, you will join our brand new Data Team to support the whole company on the Observability (Engineering) and Insight (Product) topics.

We are at the beginning of a challenging project to ingest and process data from various sources in a scalable, consistent and efficient manner. You will design and develop pipelines that will help shape the future of our Data Platform.


What will you be doing?

  • Define ingestion processes depending on the specificities of different data sources: PostgreSQL, Elasticsearch, internal and external services
  • Bootstrap, implement and test processing pipelines (batch and stream) based on industry-standard tools like Kafka and others
  • Store computed data in our Snowflake data warehouse (a minimal sketch of such a pipeline follows this list)
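
For illustration, here is a minimal sketch of the kind of Kafka-to-Snowflake step this role describes. It assumes the kafka-python and snowflake-connector-python packages; the topic, table, and credentials are placeholders, not api.video's actual configuration.

    import json

    import snowflake.connector
    from kafka import KafkaConsumer

    # Read raw events from a stream; deserialize JSON payloads.
    consumer = KafkaConsumer(
        "video-events",                      # hypothetical topic name
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        auto_offset_reset="earliest",
    )

    # Land the events in a raw Snowflake table (all identifiers are placeholders).
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="...",
        warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
    )
    cur = conn.cursor()

    for message in consumer:
        event = message.value
        # A production pipeline would batch these inserts or stage files;
        # row-by-row keeps the sketch short.
        cur.execute(
            "INSERT INTO video_events (event_id, payload) SELECT %s, PARSE_JSON(%s)",
            (event["id"], json.dumps(event)),
        )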


What can you expect at api.video?

  • Global presence with an international working environment
  • 100% remote possible (we have an HQ in Bordeaux and rely on many coworking spaces; CET timezones)
  • We offer competitive salaries
  • Flexible timetable - we value results over presence
  • Work in your preferred System and OS (Mac, Linux, Microsoft)
  • Your voice is valued and will count in our decision making
  • Personal Growth. We invest in your career development; do you need books or to attend conferences? We got you covered!

See more jobs at api.video

Apply for this job

15d

Senior Data Engineer

Leap | Remote
tableau, airflow, sql, Design, python

Leap is hiring a Remote Senior Data Engineer

About Leap:

Leap is building the world's largest network of branded retail stores – powered by data, systems, and scale.  The Leap Platform enables brands to deploy stores that work in concert with ecommerce more rapidly and at significantly reduced cost and risk.  Brand stores powered by Leap bring modern brands to life with compelling, immersive customer experience and data driven operations. 

At Leap, our diverse, growing team is excited by the opportunity to power the next generation of leading consumer brands with a vibrant presence in local communities throughout the country.  We're one of the fastest growing companies in the retail/ecommerce space - since launch we've powered stores for dozens of brands, and we're adding more brands and stores each week.

Our brand customers are modern brands who lead or aspire to lead their categories today and tomorrow, and *outstanding* people are literally at the core of our product.  Our organization is composed of a diverse range of talented individuals and teams.  With functions like Real Estate, Store Design & Development, Retail Operations, Marketing, Engineering, Product Management and Data Science, we're a truly unique company and our shared ambitions and core values tightly align and drive us to succeed.

Our staff are what make our organization so special, and honoring our culture and values as we hire, onboard, engage, develop, and support our teams is paramount.

Come take this leap with us. Your ideas, thinking, and voice are wanted.

Senior Data Engineer

Mission for the Position:

The Analytics Engineering team effectively and sustainably provides data solutions and tools across the organization. We integrate, transform, and improve volumes of data at the project or enterprise level for streamlined processes, greater efficiencies, and smarter, more informed decision-making. This team is high-energy, dynamic and in a business-critical domain space. This role is an opportunity to make a real difference in the data space, and we need confident, experienced people eager to bring in solutions, with a demonstrated ability to learn fast and make that happen.

Key Responsibilities:

You'll work closely with our business teams to make sure the most awesome, high-potential brands join the Leap platform. You'll also help analyze our existing brands and store operations to make sure we're maximizing and optimizing their performance. You'll pair regularly with analytics and engineering to make sure that common analyses get automated, so that we can spend our collective brainpower on the new and unusual.

What You'll Do:

  • Support the data needs of Analytics, Machine Learning, and Business teams
  • Lead technical initiatives by architecting solutions and collaborating with team members and peers to execute them
  • Architect, design, implement, and maintain multi-layered SQL and Python processes
  • Design flexible and scalable data models
  • Enhance tooling and frameworks to support complex Extract, Transform, Load (ETL) processes
  • Troubleshoot discrepancies in existing databases, data pipelines, warehouses, and reporting
  • Function as a mentor and adviser for team members
  • Advise on best practices and innovative designs/solutions

Our Ideal Candidate Has:

  • A Bachelor’s or Master’s degree in Computer Science or Engineering, or equivalent experience.
  • 5+ years of previous experience in data engineering with a focus on database related technologies
  • Expert technical knowledge of SQL and database related technologies.
  • 2+ years of experience working with Cloud Data Warehouse Technologies such as BigQuery, Snowflake, or Redshift.
  • 1+ years of experience working with Python, dbt, Airflow, or other workflow orchestration frameworks (see the sketch after this list).
  • Deep understanding of relational database modeling principles and techniques.
  • Experience architecting, designing, and implementing ETL solutions with peers and stakeholders.
  • Experience with data streaming technologies (Kafka, Kinesis, Apache Flink, Apache Beam, etc).
  • Experience with data visualization technologies such as Looker, Tableau, or Microstrategy.
  • Experience supporting organizations using Machine Learning.
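
As a rough illustration of the Python/dbt/Airflow stack named above, here is a minimal sketch of an Airflow DAG that runs and then tests dbt models on a schedule. The DAG id, schedule, and project path are hypothetical placeholders, not Leap's actual setup.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_dbt_run",              # hypothetical DAG name
        start_date=datetime(2022, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Build the warehouse models, then run dbt's data tests against them.
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt/analytics",
        )
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/dbt/analytics",
        )
        dbt_run >> dbt_test  # run models first, then validate them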

In your first month, you will:

  • Go to our stores, ask questions, try on products, buy products, return products, and experience being a customer first-hand
  • Plunge into our existing databases to answer straightforward analytics questions
  • Pair with our engineers to understand our existing ETLs and data 
  • Communicate constantly with an encouraging, supportive team

Leap EEO Statement

However you identify, and whatever your path to get here, Leap celebrates diversity and is committed to maintaining a safe, rewarding, and inclusive environment where Leapers thrive individually and as a team. To achieve our mission of building the world's largest network of branded retail stores, powered by data, systems, and scale, we need to work hard to foster a diverse community to support the brands and customers we serve. These aren't just words; this is who we are. We know that our differences are what make our organization special and are paramount to our culture. Your age, skin color, beliefs, sexual orientation, nationality, disability, parental status, vet status, and gender identity are valued.

See more jobs at Leap

Apply for this job

15d

Data Engineer

Formstack | Remote
remote-first, tableau, nosql, airflow, sql, Design, mysql

Formstack is hiring a Remote Data Engineer

Formstack improves people’s lives with practical solutions to their everyday work.

We are looking for a Data Engineer to help us accomplish this mission. 

 

Formstack is a remote-first company with team members who live and work across the U.S., Canada, and the globe. We offer more than just a job; we provide a community where you can learn, grow, and thrive your way. Join a dynamic and diverse team that values relationships as much as results. Come build what matters with Formstack.

 

Ozan Akcin, Data Engineer at Formstack, is looking to hire someone who will complement and strengthen the team.

 

 

Who You Are:

 

  • You are a great communicator with excellent documentation skills
  • You are a self-guided person with the ability to correctly prioritize projects given the evolving research needs of the firm's analysts
  • You are a hands-on individual who is constantly in touch with data stakeholders rather than designing solutions in isolation
  • You are a proactive thinker of data end-use scenarios and engineering the solutions required for these

 

What You’ll Do:

 

A typical day in this role would likely involve tracking usage of the Data Team's third-party vendor data sets by the firm's analysts and offering design improvements to schemas to improve reads. On some days you'd be reaching out to the analysts to get feedback on their use of these datasets, while on others you would be working on scalable design improvements to the Data Team's throughput processes to meet these requirements. This position will report to the head of Data Engineering.

 

How You’ll Succeed:

 

  • Be functionally responsible for one or more of the Data Team's vendor specific batch and streaming ETL pipelines
  • Optimize data transformations for data end use via Looker
  • Keep abreast of analyst requirements for the company's vendor-specific data sources and make adjustments to these as needed

 

What We’re Looking For:

 

  • 2+ years of experience building enterprise-quality (shippable) code in flavors of PHP/Python/Go or similar
  • 2+ years of experience designing ETL (especially streaming) pipelines from scratch, not just working with out-of-the-box ETL solutions (see the sketch after this list)
  • Experience with *nix systems, including shell scripting
  • Expert SQL proficiency (MySQL)
  • Thorough understanding of scalable data throughput architectures with a view to optimizing performance as balanced against costs
  • Experience working with a wide variety of APIs and authentication protocols
  • Familiarity with data infrastructure <=> BI Platform optimization concepts (in particular Looker)
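
To make the streaming requirement concrete, here is a minimal sketch of the producing side of a from-scratch streaming pipeline using boto3 and Kinesis (one of the technologies named in the bonus list below). The stream name and event shape are hypothetical placeholders.

    import json

    import boto3

    kinesis = boto3.client("kinesis", region_name="us-east-1")

    def publish_event(event: dict) -> None:
        """Serialize one event and push it onto the stream."""
        kinesis.put_record(
            StreamName="form-submissions",        # hypothetical stream name
            Data=json.dumps(event).encode("utf-8"),
            PartitionKey=str(event["form_id"]),   # keeps one form's events ordered
        )

    # A downstream consumer (e.g. a Lambda) would read, transform, and load these.
    publish_event({"form_id": 42, "field": "email", "value": "a@example.com"})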

 

Bonus Points:

 

  • Prior experience working with data infrastructures feeding BI tools, e.g. Spotfire, Panopticon, Tableau, Looker, etc. (Note: we are a Looker shop)
  • Prior experience working with marketing automation vendors (Pardot, Pendo, Google Analytics, etc.)
  • Experience with Airflow
  • Knowledge of NoSQL (DynamoDb) and/or GCP architectures a plus
  • Knowledge of Kinesis firehose, lambda functions, messaging/queueing systems a plus

 

Salary Range:

 

$110,000 - $120,000 per year (USD)

 

***This is a remote position***

 

What Formstack Offers for Full Time Employees in the US and Canada (excluding Quebec):

  • Free health plans and company-paid Dental, Vision, Disability, and Life Insurance Benefits for US and Canadian full-time employees.
  • Monthly Health & Wellness and Technology stipends
  • Half-day Fridays
  • Unlimited PTO for all employees.
  • 401k & Roth w/ safe harbor match (the US and Canada)
  • The most up-to-date technology, including company-issued Macs, the latest software, and other tools needed to excel at your job
  • Company-paid conferences and extended learning opportunities
  • Yearly company and team gatherings

Want to learn more about who we are and what we value? CLICK HERE to hear from some current Formstackers about what matters most!

Formstack is proud and dedicated to providing Equal Employment Opportunities.

Formstack maintains a policy that Equal Employment Opportunities be available to all persons without regard to race, gender, age, color, religion, national origin, ancestry, citizenship status, disability, sexual orientation, gender identity, genetic information, union affiliation, veteran status or any other characteristic protected by law. This means we do not discriminate in any aspect of employment based on any of these characteristics. This policy applies to all applicants and employees through all phases of employment, including but not limited to hiring, promotion, treatment during employment, demotion, and termination.

Salary ranges are determined by industry research and trends. Individual salaries are based on skills, experience, and geographical location. Compensation is reviewed on a regular basis and adjustments are made accordingly.

All data collected in our application process from resume collection to application questions is used for recruitment purposes only. We will store it in our applicant tracking system, JazzHR, and will not share this data with anyone else. We will keep your data until the role is filled and only continue to store it if we feel you may fit future roles.

See more jobs at Formstack

Apply for this job

17d

Sr. Data Engineer - Databricks

Theta | Baltimore, MD (Remote)
Design

Theta is hiring a Remote Sr. Data Engineer - Databricks

About the Sr. Data Engineer - Databricks position

theta. is looking for a passionate, different-thinking Sr. Data Engineer with strong Databricks skills who wants to be of service to the public good and who will be responsible for developing, testing, improving, and maintaining new and existing data systems and applications to help users retrieve data effectively. You will have to ensure these data systems run effectively and securely on a daily basis.

We expect you to be a tech-savvy professional apt for productive collaboration with the development teams, administrators, and clients to ensure system consistency, provide technical support, and identify new requirements. You should also be organized and communicative.

Besides that, we expect you to be a good team player and find optimal ways to solve problems.

U.S. Citizenship, Green Card, or EAD required

Sr. Data Engineer responsibilities are:

  • Take part in the entire data lifecycle, focusing on coding and debugging.
  • Optimize and maintain legacy systems
  • Write quality code to develop functional data applications.
  • Troubleshoot data usage issues and malfunctions
  • Update data sets according to requests and perform tests
  • Get user requirements and identify new features
  • Collaborate with developers to improve applications and establish best practices
  • Explore new data products, services, and protocols and make suggestions for their usage
  • Ensure all data applications and systems meet the client's performance requirements

Sr. Data Engineer requirements are:

  • 7+ years of experience working in a Data Engineer position
  • Strong experience with Databricks.
  • Solid experience with data management
  • Good knowledge of software development and user interface web applications
  • Strong analytical and organization skills with a problem-solving attitude
  • Excellent understanding of the entire web development process (design, development, and deployment) and application lifecycle
  • Strong analytical and time management skills
  • Good teamwork skills

Salary Range: $110,000 - $161,000

theta. is an SBA-Certified HUBZone digital & management firm based in Baltimore, MD, working to create a world where tech works for everybody. We take pride in being at the intersection of innovation & technology and the intersection of technology and its impact on the world. Our position at these intersections allows us to intimately understand these worlds' limitations and develop innovative organic solutions to unique problems.

See more jobs at Theta

Apply for this job

17d

Data Engineer

Lighthouse Labs | Remote, Ontario, Canada
1 year of experience, sql, git, python

Lighthouse Labs is hiring a Remote Data Engineer

Lighthouse Labs is looking to add a new Data Engineer to help scale our next stage of growth as we expand into new markets, increase our offerings, and diversify our education programs! This role will focus on building and maintaining our Data Lake and other existing data and machine learning pipelines. Reporting directly to the Head of Data, you will work within our Data Team to build new data processing pipelines and data lake tables using dbt, and assist in deploying machine learning models effectively to support our internal processes and to improve and personalize our existing learning products and experience. The ideal candidate will be passionate about solving data problems using the right data engineering skills and tools.


What you’ll be doing:

  • Use specific tools and APIs to extract data from various sources and store it in our data lake (see the sketch after this list)
  • Work with dbt to transform the data inside the data lake
  • Clean and prepare the data to make it easily accessible to other teams at Lighthouse Labs
  • Assist in selecting the best data engineering tools to support our growth
  • When needed, deploy and maintain machine learning models to improve the company’s decision making
  • Collaborate with tech and product teams to resolve data issues and ensure delivery and compliance
  • Prepare appropriate infrastructure so reports and dashboards can be easily delivered to stakeholders
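
A minimal sketch of that extract-and-load step, assuming the requests and google-cloud-bigquery packages; the API URL and table id are hypothetical placeholders, and dbt would transform the landed rows afterwards.

    import requests
    from google.cloud import bigquery

    client = bigquery.Client()

    # Pull rows from a source API (placeholder URL).
    rows = requests.get("https://api.example.com/enrollments", timeout=30).json()

    # Append the raw rows to a data lake table; dbt models transform them later.
    job = client.load_table_from_json(
        rows,                                  # list of dicts from the API
        "my-project.raw.enrollments",          # hypothetical table id
        job_config=bigquery.LoadJobConfig(write_disposition="WRITE_APPEND"),
    )
    job.result()  # block until the load job finishes
    print(f"Loaded {job.output_rows} rows")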


What we need from you:

  • At least 1 year of experience working as a data engineer, or 2 years as a data scientist
  • Strong database knowledge in order to analyze and process data stored in data warehouses. 
  • Advanced SQL knowledge
  • Advanced knowledge of Python and Jupyter Notebooks
  • Understanding of command line and basic knowledge of Bash programming language
  • Good understanding of modern code development practices including DevOps/DataOps
  • Familiarity with dbt is an asset
  • Familiarity with BigQuery and other Google Cloud solutions is an asset
  • Ability to work with version control tools (git)
  • Critical thinking and proven problem-solving abilities

 

Why you’ll like the job:  

What we offer:

  • A fast-paced culture focused on continuous learning and growth
  • 4 WEEKS PTO! (15 vacation days, 5 personal days)
  • Unlimited sick days
  • A remote working budget to get your home office up and running
  • A learning fund to support professional development
  • Flexible working hours
  • 100% employer-paid health benefits


About us: 

Lighthouse Labs was founded in 2013 with the mission to effectively and efficiently prepare the workforce with the analytical and technical skills necessary to succeed in a world of automation. With an initial focus on our open-enrolment developer bootcamp, we have grown into a leading provider of professional education services, delivering outstanding educational outcomes for our students. Our secret? Innovative curriculum, proprietary edtech, unique mentorship and career services and partnerships with government and industry leading organizations. We’re a bunch of quirky, inclusive and smart people who are changing lives by reimagining education - join us!


Lighthouse Labs is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All positions at this time are remote, and we welcome all applicants. Talk to us to find out about our learning fund and other perks!

See more jobs at Lighthouse Labs

Apply for this job

29d

Data Engineer

Unbounce | Remote
terraform, airflow, sql, docker, kubernetes, python, AWS

Unbounce is hiring a Remote Data Engineer

Join Unbounce and help the world experience better marketing. We’re a people-first, customer-obsessed company focused on helping employees do their best work. Our landing page and conversion platform empower digital marketing teams and agencies to launch campaigns, increase conversions and get significantly better ROI on their marketing spend in a way that nobody else does today.

The Data Engineering team enables other teams by creating the ecosystem upon which data processing is built. It is a small team with lots of opportunities to create high-impact solutions and collaborate with a large part of the company.

 

A little bit about you:

  • You understand the data lifecycle – ultimately delivering high-quality data that is able to evolve and grow
  • You work as part of a team – your success is the success of the team
  • You are not afraid of taking initiative; taking on new work, improving existing documentation or exploring better ways to achieve a goal – these things start with you
  • You own your work – from idea to delivery, you’ll take your projects all the way

We believe the ideal candidate will have experience in:

  • Delivering production-grade Python
  • Writing reasonably optimized SQL
  • Deploying on containerized infrastructure, such as Docker and Kubernetes
  • Building event stream architecture and using tooling such as Kafka
  • Building batch architecture and using tooling such as Airflow
  • Writing Infrastructure as Code, using tools such as CloudFormation and Terraform
  • Creating and managing databases; designing warehouse usage for distributed systems
  • Developing and consuming synchronous and asynchronous APIs (see the sketch after this list)
  • Scripting using unix-based shells and Make
  • Writing and maintaining permission systems such as RBAC within tools such as IAM or K8s
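
For the API point above, a minimal sketch of consuming an asynchronous API with Python and aiohttp; the URLs and payload shape are hypothetical placeholders.

    import asyncio

    import aiohttp

    async def fetch_page(session: aiohttp.ClientSession, url: str) -> dict:
        """GET one endpoint and decode its JSON body."""
        async with session.get(url) as resp:
            resp.raise_for_status()
            return await resp.json()

    async def main() -> None:
        async with aiohttp.ClientSession() as session:
            # Fan several requests out concurrently instead of serially.
            urls = [f"https://api.example.com/events?page={n}" for n in range(3)]
            pages = await asyncio.gather(*(fetch_page(session, u) for u in urls))
            print(len(pages), "pages fetched")

    asyncio.run(main())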

 

What you’ll be doing:

  • Enable our Data Analysts and Data Scientists by developing pipelines, tooling, infrastructure, training and pioneering best practices
  • Develop and improve existing data solutions on our warehouses and event streams
  • Develop ETL pipelines and supporting infrastructure for data-driven insights
  • Write IaC for AWS services, including RDS, S3, EKS (Kubernetes manifests), Kinesis and MSK
  • Coordinate with other teams to deliver high-impact solutions

 

What’s in it for you:

  • A remote-friendly office with flexible hours – for this role we will consider all applications from those based in Canada with the option to work from our Vancouver office
  • 4 weeks vacation plus Christmas holiday closure – you're entitled to the week of Christmas off with pay through to and including Jan 1st
  • Vacation bonus - $1,000.00
  • 12 personal wellness days (this includes: personal day, moving day, sick day, etc)
  • Health and wellness budget - $500.00
  • WFH budget - $500.00
  • A paid day off for your birthday
  • One paid volunteer day per year
  • All Unbouncers are encouraged to dedicate 10% of their time to professional development (Pro-D)

Please note that we currently do not have a legal entity set up to operate as an employer of record in Quebec. We thank you for your consideration but we are unable to accept candidates from Quebec at this time.

 

Share our values:

  • Courage
  • Ambition
  • Being Real
  • Empathy
  • Diversity

 

Unbounce Welcomes You to be YOU!

At Unbounce, we want every employee to be excited to bring their full, authentic self to work. When you do this – when you bring your unique experiences, background, knowledge, perspective, and self-expression while embracing the same from others – we learn from each other, we innovate, and we co-create an environment where Unbouncers can do the best work of their careers. We’re bolder and more brilliant together.

We’re dedicated to ensuring each Unbouncer feels a sense of belonging, feels safe, cared for, respected and valued for who they are, and trusts that their unique voice is heard, embraced, and meaningfully contributes to decision-making. We’re committed to equitable employee experience, opportunity, pay and support for every employee regardless of gender identity or expression, race, ethnicity, family or marital status, religion, socio-economic status, veteran status, national origin, age, sexual orientation, education, disability, or any other characteristic that makes you unique. 

We have no tolerance for sexism, racism, xenophobia, homophobia, transphobia, ableism, ageism, or any other forms of hateful/harmful discrimination and we’re taking action against unequal pay in our community through leading the #PayUpforProgress movement.

Please let us know if you require any accommodations or support during the recruitment process.

See more jobs at Unbounce

Apply for this job

30d

Data Engineer (m/w/d)

Catenate | Remote
nosql, sql, azure, AWS

Catenate is hiring a Remote Data Engineer (m/w/d)

To build up our innovative Business Intelligence and Data Analytics practice within the Catenate Group, we are looking for our young start-up Datalytics, at the earliest possible date, for a reliable



Data Engineer (m/w/d)



Datalytics is a Munich-based IT consultancy focused on solutions and services in the areas of Business Intelligence, Data Analytics, and Robotic Process Automation. We stand for implementing data-based solutions together with our customers, generating new insights from data, and developing sound foundations for sustainable business decisions and optimizations.

Where you gained your experience is not important to us. Whether you are a university graduate, have vocational training and relevant professional experience, or are a career changer, you are warmly welcome here.



Responsibilities

You will build up and further develop cloud-based DWH/BI solutions. This includes:

  • Connecting new data sources

  • Building ETL pipelines and scripts, including data preparation

  • Defining and extending a data model using ETL and SQL methods

  • Presenting the resulting impacts and measures

  • Presenting and representing your work to the customer



+30d

Senior Data Engineer - Data Insights

LBMC, PC | Remote
sql, Design, swift, azure, c++, python

LBMC, PC is hiring a Remote Senior Data Engineer - Data Insights

LBMC TECHNOLOGY OVERVIEW

Over the past year, LBMC has been in growth mode, receiving accolades including being named one of Accounting Today's 2021 Best Firms for Technology and being recognized as one of the 2021 Best Workplaces by Consulting & Professional Services (#35 nationally). Accounting Today also named LBMC a Top 5 Pacesetter for Growth after we added more than 25% to our workforce during COVID. LBMC values hiring individuals with a growth mindset, and we are looking to add to our industry-leading technology consulting practice, so if you have an innate curiosity for solving problems and creating solutions, LBMC is the place for you!

LBMC is based in Nashville and we have industry-leading benefits including both remote and in-person work options, dynamic technology solutions, financial incentives for training and certifications, and curated professional growth organizations such as the Women’s Initiative Network, Lending Hands community initiative, Young Professionals group, and robust Talent Development offerings.

OPPORTUNITY

The LBMC Digital Transformation Team Senior Data Engineer role is a great opportunity for anyone passionate about technology, innovation, and solving complex issues for enterprise organizations. LBMC Data Engineers will be tasked with delivering high-end modern data platforms and analytics solutions to LBMC clients, and new hires will collaborate with a multi-disciplinary team of engineers, developers, and architects on a wide range of client data projects.

SCOPE OF WORK

  • Demonstrate a strong passion for delivering data solutions to a diverse set of LBMC clients through the building of data pipelines
  • Work with large, complex data sets to design and build end-to-end data pipelines and ETL solutions using knowledge of SQL, Azure, and Python
  • Design and build solutions using SQL database technologies, Azure Common Data Services, and SQL Server Analysis Services (Tabular model)
  • Effectively communicate with technical team members and non-technical LBMC stakeholders to ensure project deliverables are completed effectively and on time
  • Provide documentation to clients for custom solutions built or technologies being deployed
  • Work to find innovative solutions to enterprise problems, which sometimes requires external research and a willingness to work independently and "outside the box"
  • Provide thought leadership and LBMC brand awareness to the marketplace through participation in forums and user groups, and by speaking at data platform and analytics-related conferences as comfortable

IDEAL CANDIDATE PROFILE

  • Formal education or certification program
  • 3+ years’ experience in a data engineer position
  • Strong technical and functional knowledge of SQL and Python for specialized data projects 
  • Microsoft Azure Data Factory work experience
  • Understanding of any of the following areas: relational databases, data structures, data modeling, data quality initiatives (scrubbing/prep)
  • Exposure to Cloud Services (MS Azure, Azure Analytics Services, PowerBI, etc.)
  • Desire for learning, problem solving, collaboration, technical advancement, etc. This role has the opportunity to grow into an Architect or Data Manager position.
  • Ability to establish effective working relationships with LBMC employees, technical team members, project stakeholders, leadership, etc.

 

Diversity and Inclusion at LBMC

Commitment to our team members, clients, and the communities in which we work. At LBMC, our mission of delivering the best to our clients and each other every day is rooted in our unique differences.  Our engagement, growth and success are at their best when team members have equal opportunity and are included. Diversity brings value to LBMC by connecting us with our community and driving innovation.

*LBMC provides equal opportunities to all employees and applicants for employment. We recruit, employ, train, compensate and promote without regard to race, religion, color, age, sex, national origin, sexual orientation, gender identity, genetic disposition, neurodiversity, disability, veteran status or any other protected category under federal, state and local law.*

 

See more jobs at LBMC, PC

Apply for this job

+30d

Data Engineer | Eagle Genomics

Global Talent | Ukraine (Remote)
4 years of experience, 2 years of experience, agile, scala, Design, azure, scrum, docker, python

Global Talent is hiring a Remote Data Engineer | Eagle Genomics

On behalf of Eagle Genomics, GT is looking for an inspired Data Engineer to join their engineering team building products in the fields of microbiology, sustainable agriculture, food, personal care, and cosmetics, using the latest technologies to enable data-driven discovery.

Digitally reinventing life sciences to solve some of the grand challenges of our age.

Eagle Genomics is a pioneering company working at the intersection of two exciting areas: life sciences and data science. Their AI-augmented platform is revolutionizing how scientists conduct life sciences research and is bridging the gap between data and new insights in a rapid, systematic, and traceable way. They are putting data science at the fingertips of biologists to drastically reduce the time and cost of research, enabling customers to achieve radical productivity improvements and true data-driven discovery. They are also partnering with research leaders such as the Earlham Institute to bring together open science and commercial R&D to benefit society.

If you're passionate about data, algorithms, ML/AI, graph technology, and innovations, join the team and help us change what it means to do real science and drive innovations.

What you will love about Eagle Genomics:

  • Working for impact. Teaming up with smart and engaging people united by a shared purpose.
  • Collaborating with industry-shaping clients to tackle complex, real-world challenges, sparking discoveries that impact the world around us.
  • Be an active part of our growth story with the freedom to test, learn, experiment, and explore new ideas.
  • Openness. Our monthly All Hands meetings bring our global teams together and encourage discussion.
  • The opportunity to lead, learn and grow with continuous encouragement, professional development, and support.

Flexible working:

Our roles offer flexible working practices that support your, and the team’s, best delivery. We have co-working spaces in London, Cambridge, Berlin, Paris, Kyiv, Manhattan, and Hyderabad.

Diversity, equity & inclusion:

We strive to create and foster a working environment where everyone is free to be themselves. A commitment to diversity, equity and inclusion is in our DNA and spans everything we do, including our recruitment process.

You will need:

  • 8+ years of overall experience.
  • A minimum of 2-4 years of experience with ETL/ELT tools.
  • A minimum of 1-2 years of experience with Azure Data Factory, Data Lake, Databricks, Azure Functions, or equivalent cloud concepts.
  • Hands-on experience with various data management libraries in related languages such as Python, Scala, R, or other scripting languages.
  • Knowledge of design concepts, design patterns, and data modelling.
  • Experience ingesting, transforming, and onboarding identified data sources with data pipelines.
  • Experience in the delivery and deployment of data pipelines.
  • Experience with database technologies such as graph or relational databases to deliver schemas and data retrieval.
  • Strong communication, analytical, and interpersonal skills.
  • Experience writing automated tests and testable code.
  • Experienced user of Git/GitHub repositories.

A bonus if you have:

  • Knowledge of Docker or Kubernetes.
  • Knowledge of Microservices architecture.
  • Experience of working in an Agile environment, particularly Scrum, and JIRA.

We go beyond usual perks… By working with us, you will get:

  • Monthly education allowance for courses, training, books, and events.
  • Best-in-class IT equipment.
  • Lunch coverage.
  • Mentorship club from GT executives.
  • Internal team-building events every month.
  • Lectures from experts.
  • Vacation (21 working days a year), and sick leaves.
  • Health insurance.
  • Mindfulness sessions.

Working Model:

GT builds remote engineering teams for outstanding product companies. Our future mate will work directly with Eagle Genomics. We call this the ‘Extended Team model’, and it means that each team member is integrated as deeply as possible into the client's team. You will work with the same tools and technologies as they do and communicate directly with a client without any intermediary in between. We also encourage trips to a client and joint teambuilding and after-work activities.

See more jobs at Global Talent

Apply for this job

+30d

Lead Data Engineer

causaLens | London, United Kingdom (Remote)
agile, nosql, python

causaLens is hiring a Remote Lead Data Engineer

causaLens are the pioneers of Causal AI — a giant leap in machine intelligence.

We build Causal AI-powered products that are trusted by leading organizations across a wide range of industries. Our No-Code Causal AI Platform empowers all types of users to make superior decisions through an intuitive user interface. We are creating a world in which humans can trust machines with the greatest challenges in the economy, society, and healthcare.


Summary

We are looking for a Lead Data Engineer based in London to join us in building our data platform, automatically discovering and enhancing the highest-value data in our customers’ organisations. This is a full-time placement with significant opportunities for personal development in a rapidly expanding team.


Roles and Responsibilities

We are looking for an exceptional and ambitious Lead Data Engineer to help our team of world class engineers, data scientists and commercial executives develop our Causal AI platform. You’ll be an engineer first, working daily with Python, focused on data. You will take the lead in designing and building a new, proprietary data platform that exposes the causal drivers hidden in data, allowing users to discover the data that matters to their problem. Some of your responsibilities will include:

  • Designing and Engineering a data platform integrated with our causal AI techniques

  • Managing the data loading, processing and delivery process

  • Leading and guiding the team of data engineers

  • Enabling data partners and vendors to deliver their data through our platform

  • Working with platform and delivery teams to meet their and their customer’s data needs

What You’ll Be Working On

You will be leading a team of data engineers, reporting to the Director of Platform and Data Engineering. You’ll have lots of experience with ETL pipelines and be brimming with ideas on how to build things better and make data pipelines more fit for modern data science. You’ll predominantly work in Python and will be comfortable building services around data consumption and processing. You’ll help us define what our data processing needs to be and how it needs to run, and you’ll take the lead on delivering it.

See more jobs at causaLens

Apply for this job

+30d

Data Engineer

Monolith Nebraska LLC | (Multiple states)
agile, Master’s Degree, tableau, azure, c++, AWS

Monolith Nebraska LLC is hiring a Remote Data Engineer

Monolith, headquartered in Lincoln, NE, is excited to announce its search for a Data Engineer.

At Monolith we apply scientific principles, engineering practices and a lot of hard work to solve real problems that have a global impact. We use sophisticated analysis methods, advanced manufacturing techniques, and often even our hands to build first-of-its-kind technologies. We do not compromise on safety, quality or performance. If you want to solve tough problems, build real things, and have a big impact, then you should join us.

 

Your Role:

At Monolith, we rely on powerfully insightful data to inform our systems and solutions—and we’re seeking an experienced pipeline-centric Data Engineer to put it to use. Our ideal hire will have the mathematical and statistical expertise you’d expect, combined with a rare curiosity and creativity. You’ll wear many hats in this role, but much of your focus will be building out our plans to manage large volumes of engineering and business data. Beyond technical prowess, you’ll need the soft skills it takes to communicate highly complex data trends to organizational leaders in a way that’s easy to understand. We’re looking for someone willing to jump right in to help the company get the most out of our data.

 

You Will:

  • Work with data to solve business problems, building and maintaining the infrastructure to answer questions and improve processes
  • Select and operate data management and analysis tools, including databases, cloud data platforms, and analysis tools
  • Help initiate our data science workflows, adding value to our product offering and building out our customer lifecycle and retention models
  • Be an advocate for best practices and continued learning
  • Work closely with our engineers to help build complex algorithms that provide unique insights into our data
  • Use agile software development processes to iteratively make improvements to our back end systems
  • Model front end and back end data sources to help draw a more comprehensive picture of user flows throughout our system and enable powerful data analysis
  • Build data pipelines that clean, transform, and aggregate data from disparate sources (see the sketch after this list)
  • Develop models that can be used to make predictions and answer questions for the overall business
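
As one possible illustration of that clean/transform/aggregate step, a minimal pandas sketch; the file and column names are hypothetical placeholders, not Monolith's actual data.

    import pandas as pd

    # Load a raw extract and coerce bad readings to NaN, then drop them.
    raw = pd.read_csv("sensor_readings.csv")       # hypothetical source file
    raw["reading"] = pd.to_numeric(raw["reading"], errors="coerce")
    clean = raw.dropna(subset=["reading"])

    # Aggregate per unit per day for downstream dashboards and models.
    daily = (
        clean.assign(day=pd.to_datetime(clean["timestamp"]).dt.date)
             .groupby(["unit_id", "day"], as_index=False)["reading"]
             .mean()
    )
    daily.to_csv("daily_readings.csv", index=False)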

 

You Have:

  • Bachelor’s degree in computer science, information technology, engineering or equivalent work experience
  • Experience with SQL/database and data visualization/exploration tools (PowerBI, Tableau, etc.)
  • Experience in several of the following areas: database architecture, ETL, business intelligence, big data, machine learning, advanced analytics
  • Familiarity with data lake and big data platforms (Azure, Teradata, SAS, Hadoop, etc.)
  • Familiarity with the Microsoft or AWS ecosystem
  • Proven ability to collaborate with multi-disciplinary teams of business analysts, developers, data scientists, and subject matter experts
  • Communication skills, especially explaining technical concepts to non-technical business leaders
  • Comfort working in a dynamic, research-oriented team with concurrent projects

Advantageous

  • Master’s degree in stats, applied math, or related discipline
  • Experience building or maintaining ETL processes
  • Professional certifications

 

You Are:

  • Committed to our values
    • Safety matters most
    • Think like a team
    • Solve the impossible, embrace reality
    • People make the difference
    • Decisions drive results
    • Generosity of spirit
    • Enjoy the ride

Work Environment

The majority of the tasks involved with this position are performed from remote locations or in an open office environment.

Travel

Travel will be required up to 25% of the time.

See more jobs at Monolith Nebraska LLC

Apply for this job

+30d

Data Engineer (MS / Azure)

Default Portal | London, GB (Remote)
agile, sql, Design, azure, .net

Default Portal is hiring a Remote Data Engineer (MS / Azure)

Azure Data Engineer

Location: Remote working

Work Pattern: Contract

Clearance: Eligible for SC

Rate: £450-£500 day rate, outside IR35

The Company

We are a specialist Data Engineering, Cloud and Analytics consultancy focused on supporting our clients in successfully delivering on their digital transformation programmes. Our aim is to ensure we deliver value from our clients’ data using innovative approaches that improve their data capabilities, analytics and data governance.

With demand for our services from our clients at an all-time high and continuous growth and success within our market sector, we are embarking on a major recruitment drive and keen to recruit talented Azure Data Engineers to join our project delivery team.

The Role

We are working for a high-profile client, analysing their existing processes, identifying and implementing opportunities to optimise these processes, and developing solutions to deliver service improvements to them.

Key Responsibilities:

  • You will provide technical guidance and advice to help in the design and development of Azure data solutions for data modelling and warehousing, data integration, and analytics;
  • You will implement Azure data services;
  • You will interact with various stakeholders to help define needs and translate into custom solutions;
  • You will design and develop scalable data ingestion frameworks to transform a wide variety of datasets (a minimal sketch of one ingestion step follows this list);
  • You will research, analyze, and help implement technical approaches for solving challenging and complex development and integration problems, providing a strategy and roadmap.
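
To give a flavour of one ingestion step on Azure, a minimal sketch that lands a raw extract in Blob Storage for downstream processing, assuming the azure-storage-blob package; the connection string, container, and paths are placeholders.

    from azure.storage.blob import BlobServiceClient

    # Connect to the storage account (placeholder connection string).
    service = BlobServiceClient.from_connection_string("<connection-string>")
    blob = service.get_blob_client(container="raw", blob="sales/2022-01-01.csv")

    # Upload the day's extract; overwrite=True makes re-runs idempotent.
    with open("sales_extract.csv", "rb") as f:
        blob.upload_blob(f, overwrite=True)
    print("Uploaded", blob.blob_name)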

Requirements:

  • Proven experience of working on relevant Azure data projects
  • Proven experience with data engineering/data warehousing
  • Experience implementing Azure analytics platforms for client projects
  • Experience in MS Access
  • Experience in SQL
  • Experience in the .Net framework
  • Working experience with version control platforms
  • Working knowledge of agile development including DevOps concepts
  • Experience in gathering and analysing system requirements
  • Good to have familiarity with data visualization tools (Tableau/Power BI)
  • Good to have exposure to the Cloud Data ecosystem
  • Experience of working within large, complex and geographically dispersed programmes

Interested?

Then please get in touch by applying with the most recent copy of your CV, including a contact number, and we will contact you directly to discuss further.

We welcome applications from all suitably qualified people regardless of gender, race, disability, age or sexual orientation. All applications are assessed purely on merit, against the capabilities and competencies required to fulfil the position.

See more jobs at Default Portal

Apply for this job

+30d

Senior Data Engineer (w/m/d) - Language Data

DeepL sucht Mitarbeiter | Remote
ansible, c++, python, AWS

DeepL sucht Mitarbeiter is hiring a Remote Senior Data Engineer (w/m/d) - Language Data

DeepL is the best-known AI company in Germany. We develop neural networks that help people work with language. With the DeepL Translator we have brought the world's best machine translation to market and make it available to everyone free of charge on the internet. Over the coming years we want to grow DeepL into the world's leading company for language technology.

Our goal is to overcome language barriers and bring cultures closer together.
 

What sets us apart from other companies?

DeepL (formerly Linguee) was founded by developers and researchers. Developing exciting new products is our priority, which is why we devote a lot of time to active research on the latest topics. We understand the challenges of developing new products and try to meet them with an agile and dynamic way of working. Our working culture is very open, because we want our employees to feel comfortable. In our daily work we use modern technologies, not only to translate texts, but also to create the world's best dictionaries and to solve other language-related problems.

When we tell people about DeepL or Linguee as an employer, many react very positively, because they have often been delighted by our open, free services and apps. And we are happy to help shrink language barriers.


Work wherever you like

You can decide whether you would like to work from home or in the office. The way we work is designed so that you become an integral part of the team no matter where you work. That is why we are looking for outstanding employees all across Germany.


What will you be doing at DeepL?

For our translation technology to learn the subtleties of human language, we need enormous amounts of linguistic data. You will join a small team that takes care of sourcing, filtering, preparing and assessing the quality of this data. In doing so, you will use, improve and extend our pipeline and orchestrate hundreds of servers, both on dedicated hardware and in the cloud. (A minimal sketch of one such filtering step follows.)
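
As a minimal illustration of the kind of filtering such a pipeline performs (the thresholds and data are invented, not DeepL's actual rules): drop parallel sentence pairs whose lengths diverge implausibly.

    def keep_pair(src: str, tgt: str, max_ratio: float = 2.0) -> bool:
        """Heuristic filter: reject pairs whose token counts diverge too much."""
        n_src, n_tgt = len(src.split()), len(tgt.split())
        if min(n_src, n_tgt) == 0:
            return False
        return max(n_src, n_tgt) / min(n_src, n_tgt) <= max_ratio

    pairs = [
        ("Guten Morgen", "Good morning"),
        ("Hallo", "Hello there, my friend, how are you doing today?"),
    ]
    filtered = [p for p in pairs if keep_pair(*p)]
    print(filtered)  # the second pair is dropped as a likely misalignment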

See more jobs at DeepL sucht Mitarbeiter

Apply for this job

+30d

Data Governance Engineer

agilesqlDesignazure

Veracity Consulting Group is hiring a Remote Data Governance Engineer

Veracity is a digital consulting company headquartered in Richmond, Virginia. What started in 2015 as a small group of consultant trailblazers has quickly transformed into the fast-growing firm innovating Fortune 500s that you see today.

Our team is made up of technologists, strategists, and creative problem solvers who have one goal in common: creating fluid solutions that support business growth. With substantial experience in more than 10 industries, we come together as one team to deliver transformative results. While we take our work seriously, we never lose our playful spirit and we pride ourselves on our fun and energetic culture.

Today we are ready to add a Data Governance Engineer to our team!  As a Data Governance Engineer you will be responsible for the following:

  • The mission is to implement an enterprise-wide model for the governance of data at this company to ensure consistency of usage and a high level of confidence in data driven analytics and decision-making capabilities
  • Work with the Data Governance team and business partners to research, evaluate, document, and maintain standards, best practices, design patterns around project requirements, and various other aspects of existing and emerging ETL technologies in support of the on cloud and Big data implementation
  • Collate requirements for projects from the perspective of Data Governance team and participate in overall project implementation
  • Perform consultative role in the assessment of policy and standards alignment, data inventories for scoped application, data content management, DQ rules, access management, data security and data risk
  • Support data governance tooling project, business metadata management and documentation
  • Appetite to learn new technologies and be ready to work on new cutting-edge cloud technologies
  • Interact with business analysts and functional analysts in requirements gathering and implementation
  • Collaborate and work with the extended project team on compiling proposals, including the high-level technical solution, estimate, and project plan
  • Provide technical support to project teams throughout the project lifecycle around technology platforms, solution design, security, debugging, profiling, performance tuning, etc.
  • Provide governance over project teams to ensure standards and best practices are being followed

Qualifications

  • Solid foundation in hands-on Data Governance (DG) and Data Quality (DQ) practices, understand complexities in managing end to end data pipelines and in-depth knowledge of data governance and data management concepts
  • To be successful in this role, the candidate requires a balance of product-centric technical expertise and a deep knowledge of DG practices, which include but are not limited to metadata management, metamodeling, lineage documentation, data stewardship, and DQ processes
  • Experience in enterprise-wide implementation of Data Governance, Data Quality or Technical Data Management programs
  • Hands-on Data Engineering experience working with data analytics projects
  • Experience working with various Data Governance tools
  • Experience working enterprise complex large scale, multi-team, programs in agile development methodologies
  • Demonstrated experience in cutting-edge database technologies and cloud services such as Azure, GCP, Databricks, or Snowflake; deep experience in technologies such as Spark (Scala/Python/Java), ADLS, Kafka, SQL, Synapse SQL, Cosmos DB, and graph DBs
  • Strong analytic skills related to working with unstructured datasets
  • Strong SQL knowledge and data analysis skills for data anomaly detection and data quality assurance (see the sketch after this list)
  • Eagerness to learn new technologies on the fly and ship to production
  • Demonstrated experience in Configuration Management, DevOps automation
  • Excellent communication skills: demonstrated ability to explain complex technical content to both technical and non-technical audiences
  • Experience in Data Governance Program deliverables, solution design, data modeling, and performance testing and tuning; ADLS, Synapse SQL database, or GCP BigQuery and GCS management; performance tuning (partitioning/bucketing)
  • Bachelor's degree in computer science or a related field
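
To make the data quality requirement concrete, a minimal sketch of a null-rate rule in Python; sqlite3 stands in for any DB-API-compatible warehouse connection, and the table, column, and threshold are hypothetical.

    import sqlite3  # stand-in for any DB-API-compatible warehouse connection

    def null_rate(conn, table: str, column: str) -> float:
        """Fraction of rows where `column` is NULL."""
        cur = conn.execute(
            f"SELECT AVG(CASE WHEN {column} IS NULL THEN 1.0 ELSE 0.0 END) "
            f"FROM {table}"
        )
        return cur.fetchone()[0] or 0.0

    # Tiny in-memory example: one of three customer emails is missing.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
    conn.executemany("INSERT INTO customers VALUES (?, ?)",
                     [(1, "a@x.com"), (2, None), (3, "c@x.com")])

    rate = null_rate(conn, "customers", "email")
    if rate > 0.10:  # hypothetical 10% threshold from a governance policy
        print(f"DQ violation: customers.email null rate is {rate:.0%}")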

See more jobs at Veracity Consulting Group

Apply for this job

+30d

Data Engineer

Decent | Seattle, WA (Remote)
sql, salesforce, Design, azure, java, python, AWS

Decent is hiring a Remote Data Engineer

Who We Are

Decent envisions a world where everyone has the freedom to do the work they want without sacrificing access to affordable and comprehensive healthcare. We offer more affordable health insurance options for start-up to midsize businesses by aligning incentives to improve health, reduce costs, and put members at the center of their care.

We offer an amazing compensation package consisting of a combination of competitive base salary + stock options, medical and dental benefits, career advancement opportunities in a small but quickly growing company and unlimited PTO.

Job Summary

We are looking for a Data Engineer who will lead data strategy at Decent. This role will work with various product and development teams to ensure data integrity and quality, build data models, pipelines, and backups.

You have a passion for data and data management and work well across teams to help shape the way that Decent engages with data engineering. You really enjoy building solutions, writing code, running code, and working with complex data systems.

You will report to the Engineering Manager.

What You Will Be Doing

  • Work with our engineering manager to create the strategy of data management within an established information architecture that supports the development and secure operation of our information and digital services.
  • Ensure data integrity between platforms and products here at Decent
  • Establish technical requirements and implementation details for data solutions
  • Develop and improve performant databases, data models, integrations and ETL pipelines
  • Work with product managers to establish and measure success of Decent products
  • Become a subject matter expert for your data domain and help teams across the company to access and understand your data




What We’re Looking For

  • 1+ years of experience with Salesforce
  • 3+ years of experience building and maintaining production data pipelines, databases or web applications
  • 2+ years of experience with cloud development and technologies, with a focus on Google Cloud; AWS or Azure knowledge is helpful
  • Strong SQL skills and proficient in at least one programming language (Java or Python)
  • Experience building ETLs against various sources, including REST endpoints
  • Knowledgeable about data architecture, data warehousing, and ETL design patterns
  • Able to identify and develop data solutions for ever-changing product requirements
  • Prioritize across multiple stakeholders and communicate your plans effectively

See more jobs at Decent

Apply for this job

+30d

Senior Data Engineer

JazzHR | Remote
5 years of experience, sql, Design, elasticsearch, mysql, AWS

JazzHR is hiring a Remote Senior Data Engineer

JazzHR's engineering team builds cutting-edge products designed to provide a better hiring experience for both applicants and hiring teams.

We tackle big technical challenges for systems at scale every day.

Our team is built on mutual respect and trust. We emphatically embrace a culture of continued learning and growth and are empowered to innovate, explore, and utilize our expertise to craft the right solutions for our customers.

Our team values kindness and curiosity, driven by the desire to try, reflect, and adapt. This is a fully remote role within the continental United States.

We’re seeking an experienced Senior Data Engineer who will be crucial to JazzHR’s continued growth and success. The Senior Data Engineer will work closely with engineers to ensure optimal data delivery and consistent architecture across all projects. The right candidate will be self-directed and comfortable supporting the data needs of multiple teams, systems, and products across millions of records and thousands of organizations. Are you excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives? We’d love to hear from you!

What you'll do:

  • Create and maintain data pipeline architecture
  • Create data structures that meet functional / non-functional business requirements
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Support infrastructure for messaging, extraction, transformation, and loading of data to/from a wide variety of data sources using SQL and AWS databases.
  • Oversee retirement or replacement of legacy databases that may require migration.
  • Leverage analytics tools that utilize the data pipeline to provide actionable insights to our customers and our internal business partners.
  • Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secured, both for our customers' multi-tenant environments and for regulatory compliance for international customers.
  • Work closely with and support our sysops team to ensure proper monitoring, management, and maintenance of the database technologies.

What you'll bring:

  • 3-5 years of SQL experience working with relational databases such as MySQL, including query authoring, debugging, and optimization.
  • Experience with AWS cloud services such as Redshift.
  • Experience with document/search technologies such as Solr and Elasticsearch, including how best to troubleshoot and optimize those tools.
  • Experience building and optimizing big data pipelines, architectures, and data sets with tools such as Kafka and Airflow (see the sketch after this list).
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Experience building processes supporting data transformation, data structures, metadata, dependency management, and workload management.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
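
As a rough sketch of the Airflow side of that pipeline work, here is a minimal DAG; the DAG name, task bodies, and schedule are hypothetical placeholders, not JazzHR's actual pipelines.

    # A minimal Airflow 2.x DAG sketch. Names, schedule, and task
    # bodies are hypothetical stand-ins for real pipeline steps.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull records from the source system")

    def transform():
        print("clean and reshape the records")

    def load():
        print("write the records to the warehouse")

    with DAG(
        dag_id="example_etl",          # hypothetical DAG name
        start_date=datetime(2022, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        t1 = PythonOperator(task_id="extract", python_callable=extract)
        t2 = PythonOperator(task_id="transform", python_callable=transform)
        t3 = PythonOperator(task_id="load", python_callable=load)
        t1 >> t2 >> t3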

About JazzHR:

JazzHR has raised the bar in the recruiting software industry, with many of our innovations becoming industry-standard. We're the first company to put powerful, yet easy-to-use recruiting software in the hands of startups, growing companies, and small businesses from all industries. We're proud of our accomplishments!

About Employ:

Employ empowers organizations of all sizes to overcome their greatest recruiting and talent acquisition challenges. Offering a combination of purpose-built intelligent software technologies, services, and industry expertise, Employ provides businesses of all sizes with powerful solutions for recruiting a diverse workforce. Through its JazzHR and Jobvite technologies, and NXTThing RPO services, Employ serves more than 12,000 customers across all industries. For more information, visit www.employinc.com.

JazzHR and Employ are equal opportunity employers. All employment decisions are solely based on business needs, job requirements and individual qualifications without regard to race, gender, religion, ethnicity, age or any other status protected by the laws and regulations where we operate.

See more jobs at JazzHR

Apply for this job

+30d

Data Engineer

OpenPhoneSan Francisco, Remote
5 years of experiencekotlintableaupostgressqlB2BDesignswiftmobileiosjavaandroidpythonAWSjavascriptbackendfrontend

OpenPhone is hiring a Remote Data Engineer

Data Engineer at OpenPhone (S18)
The new phone for business.
San Francisco, Remote / Remote
Full-time
About OpenPhone

OpenPhone is on a mission to build the world's best calling and messaging app for professionals and businesses. With over 10,000 paying customers already, we are aiming to be the #1 communications app for the 130 million professionals in North America.

Our founders are previous engineers and product managers at companies that have built software for over half a million businesses. We are backed by the industry’s best venture firms including Y Combinator, Slow Ventures, and Garage Capital, with an amazing list of advisors from Asana, Facebook, Google, and more.

About the role

Are you passionate about Data? Are you excited by the opportunity to build a beloved B2B brand? Do you want to have a significant impact at a high-growth startup?

At OpenPhone, building simple and delightful experiences is not only our competitive advantage but a value we hold dear. This philosophy applies to everything we do; from the way we work to the look and feel of our product, from the infrastructure it’s running on to our content, customer interactions, and everything else.

The Data Engineer at OpenPhone will build high-quality data pipelines driving analytic solutions. The role requires a deep understanding of data architecture, data engineering, data analysis, and reporting, plus a basic understanding of data science techniques and workflows. The ideal candidate is a skilled data/software engineer with experience creating data products supporting analytic solutions.

Here Are Some Things You'll Do:

  • Solve complex data problems to deliver insights that help our business achieve its goals
  • Create data products for analytics and data science team members to improve their productivity
  • Foster a culture of sharing, re-use, design for scale and stability, and operational efficiency of data and analytical solutions
  • Lead the evaluation, implementation, and deployment of emerging tools and processes for analytic data engineering in order to improve our productivity as a team
  • Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes
  • Code, test, and document new or modified data systems to create robust and scalable applications for data analytics
  • Implement security and recovery tools and techniques as required
  • Work with data scientists to make sure that all data solutions are consistent
  • Ensure all automated processes preserve data by managing the alignment of data availability and integration processes

About You:

  • Bachelor of Science in Computer Science, Engineering, Mathematics, Statistics or related subject
  • 2-5 years of experience in developing modern data pipelines and applications for analytics (e.g., BI, reporting, dashboards) and advanced analytics (e.g., machine learning, deep learning) use cases
  • Expertise designing and implementing data pipelines using modern data engineering approaches and tools: SQL, Python, Delta Lake, Databricks, Spark, Glue, NiFi, StreamSets, cloud-native DWHs (BigQuery, Snowflake, Redshift), Kafka/Confluent, Presto/Dremio/Athena (see the sketch after this list)
  • Experience with developing solutions on cloud computing services and infrastructure with AWS
  • Experience with database development
  • Worked with BI tools such as Tableau, Qlik, or Power BI, or cloud-native ones such as Looker or QuickSight
  • Conceptual knowledge of data and analytics, such as dimensional modeling, ETL, reporting tools, data governance, DWH, etc.
  • Exposure to machine learning, data science, computer vision, artificial intelligence, statistics, and/or applied mathematics
  • Excellent communication, listening, and influencing skills
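
For a rough feel of the Delta Lake / Spark portion of that stack, here is a minimal PySpark sketch; it assumes a Spark session already configured with the Delta extensions (as on a Databricks cluster), and the paths and column names are hypothetical, not OpenPhone's actual schema.

    # A minimal Delta Lake sketch, assuming a Spark session with the
    # Delta extensions configured (e.g., a Databricks cluster). Paths
    # and columns (started_at, workspace_id, ...) are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("delta-demo").getOrCreate()

    calls = spark.read.json("/raw/call_events")  # hypothetical landing zone

    daily = (calls
        .withColumn("call_date", F.to_date("started_at"))
        .groupBy("call_date", "workspace_id")
        .agg(F.count("*").alias("calls"),
             F.sum("duration_sec").alias("total_duration_sec")))

    # Delta adds ACID writes and time travel on top of plain Parquet.
    (daily.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("call_date")
        .save("/analytics/daily_call_stats"))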

There's no such thing as a 'perfect' candidate. We're looking for an optimist with grit and determination, who is excited to face the challenges of a growing startup. OpenPhone is the type of company where you can grow, and we encourage you to apply to us even if you don't 100% match the exact candidate description.

About OpenPhone

OpenPhone is a new type of business phone. Our mission is to help people communicate better and be more productive.

We’re backed by Y Combinator and the best venture firms including Slow Ventures, Kindred Ventures, and others. We're serving thousands of businesses around the world and growing quickly. We take a lot of pride in providing an exceptional customer experience and a product people love. Our customers rated us #1 on all possible categories on G2 Crowd.

We're a distributed team working from San Francisco, Seattle, Ottawa, Moscow, Manila, Phoenix, and Sydney.

We are committed to creating an inclusive workplace that values diversity. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

Technology

Backend: JavaScript, Node, Serverless, AWS, Postgres

Front-end: React, React Native, Swift, Java, Kotlin

Apply Now

See more jobs at OpenPhone

Apply for this job

+30d

Data Engineer(Boosters)

GenesisUkraine
scalaairflowsqlapidockerjenkinspythonAWS

Genesis is hiring a Remote Data Engineer(Boosters)

✨✨✨Hello!✨✨✨

Shall we get acquainted? :)

We are the Boosters product team, and we create products that improve people's lives and deliver real value. We currently have four products; here they are in more detail:

  • Words Booster – an app for learning foreign languages (a top-10 language app worldwide)
  • Avrora – an app for improving sleep (a top-5 Health & Fitness app in more than 82 countries)
  • Manifest – an affirmations app (our affirmations have been reposted more than 22,000 times)
  • RiseSpace – a platform with life coaches, our newest direction (released in December 2021)

Our main advantage is our people - people who aim to be better than they were yesterday and to win together. There are already 60 people on the team, and we do not plan to stop there.

We currently have an open Data Engineer position, responsible for setting up the infrastructure for ingesting, storing, and processing data, supplying it to data analysts, and ultimately extracting from the data the information that drives decision-making and product development.

Your tasks will include:

  • supplying and transforming data by working with the APIs of Facebook, Google Ads, AppsFlyer, and other ad networks
  • building the data warehouse architecture and setting up data delivery into it
  • optimizing existing ETL processes
  • configuring cloud services (AWS)
  • processing data, maintaining its correctness, and documenting data models

What you need to join us:

  • 1+ year of experience in a similar position
  • solid Python knowledge and experience building data pipelines and working with dataframes (using Pandas, PySpark) - see the sketch after this list
  • ability to work with APIs
  • excellent knowledge of SQL and database architecture fundamentals, plus experience with various DBMSs
  • experience with data warehouses and cloud services (primarily Amazon Web Services - S3, Athena, Glue, RDS, Lambda, etc.)
  • ability to work with Docker
  • experience with a VCS and an understanding of CI/CD principles
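
For illustration, here is a minimal sketch of one such pipeline step: pulling a report from an ad-network API with Pandas and landing it in S3 as Parquet for Athena/Glue; the endpoint, bucket, and field names are hypothetical.

    # A minimal sketch: fetch an ad-network report, shape it with Pandas,
    # and land it in S3 as Parquet. Endpoint, bucket, and fields are
    # hypothetical placeholders.
    import io

    import boto3
    import pandas as pd
    import requests

    REPORT_URL = "https://ads.example.com/v2/report"  # hypothetical endpoint
    BUCKET = "boosters-data-lake"                     # hypothetical bucket

    def fetch_report(date: str) -> pd.DataFrame:
        resp = requests.get(REPORT_URL, params={"date": date}, timeout=60)
        resp.raise_for_status()
        df = pd.DataFrame(resp.json()["rows"])
        df["spend"] = df["spend"].astype("float64")  # normalize types up front
        return df

    def land_in_s3(df: pd.DataFrame, date: str) -> None:
        buf = io.BytesIO()
        df.to_parquet(buf, index=False)  # requires pyarrow or fastparquet
        boto3.client("s3").put_object(
            Bucket=BUCKET,
            Key=f"raw/ad_spend/date={date}/report.parquet",  # Hive-style partition key
            Body=buf.getvalue(),
        )

    if __name__ == "__main__":
        land_in_s3(fetch_report("2022-01-01"), "2022-01-01")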

It will be a plus if you:

  • have experience with Apache Airflow and Jenkins
  • have development experience in GoLang or Scala
  • have worked on performance analysis, load testing, and module/system optimization

What do we offer?

  • Work in a team of professionals, on products with an audience of more than one million per month;
  • A philosophy and conditions for your continuous growth and development;
  • Plenty of room to implement your own ideas and influence the product.

We also offer the following benefits:

  • A corporate doctor and medical insurance;
  • Relocation assistance for the employee and their family;
  • Reimbursement for additional external trainings and seminars, plus Business and Management School for employees;
  • A large e-library and access to paid online courses and conferences, internal talks and workshops, and English classes.

Send us your resume and join Boosters!

See more jobs at Genesis

Apply for this job

+30d

Lead Data Engineer / Data Engineer

Bachelor's degreeairflowsqlB2BdockerkubernetespythonAWS

Job Offers .NET, Java, DevOps, QA and more · MOTIFE is hiring a Remote Lead Data Engineer / Data Engineer

It's INNOVATION! It's REVOLUTION! It's UNICORN!
And it's time for YOU to join this amazing $1 billion project.


We are seeking Lead Data Engineers and Senior Data Engineers who will play an important part in developing a startup from Dubai. If you've ever wanted to know what the dark kitchen concept is - this is an opportunity to find out more!


The Lead Data Engineer will be an integral part of the tech teams, working within the Data department and leading a team of 7-8 data engineers from the same tribe.
Data Engineers are the data-providing partner for the organization's Product and Business teams. These engineers focus on helping the Product teams build best-in-class products to support growth, operations, and customer experience.


Key takeaways
Stack: 
Python, SQL/NoSQL, ETL, Kafka, AWS, MPP
Salary: 
Depending on the level of expertise, our client offers:

Lead Data Engineer salary: 28 000 - 32 000 net B2B (or up to 26 500 gross UoP) + ESOP + 26 paid holiday days,

Data Engineer salary: 18 000 - 26 000 net B2B (or up to 22 000 gross UoP) + ESOP + 26 paid holiday days,
Location:
 Remote or hybrid (1 or 2 days a week from the office in Cracow), with the possibility to work from Dubai,

Recruitment process: a 3-step online process - technical interview + live coding + bar-raiser interview.

Responsibilities 

  • Building Data Warehouses / Data Lakes, ideally based on cloud solutions,
  • Building data pipelines based on Kafka events, working with Snowflake, DBT, and GitLab (see the sketch after this list).
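
As a rough sketch of consuming Kafka events for a downstream warehouse load, here is a minimal example using the confluent-kafka client; the broker, topic, consumer group, and batch size are hypothetical, and the load step is left as a stub.

    # A minimal Kafka-consumer sketch for a warehouse-load step.
    # Broker, topic, and group are hypothetical; load_batch is a stub.
    import json

    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "broker:9092",   # hypothetical broker
        "group.id": "orders-to-warehouse",    # hypothetical consumer group
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["orders"])            # hypothetical topic

    def load_batch(rows):
        print(f"would load {len(rows)} rows into the warehouse")

    try:
        batch = []
        while True:
            msg = consumer.poll(timeout=1.0)
            if msg is None:
                continue
            if msg.error():
                print("consumer error:", msg.error())
                continue
            batch.append(json.loads(msg.value()))
            if len(batch) >= 500:             # micro-batch the load
                load_batch(batch)
                consumer.commit()             # commit only after a durable load
                batch = []
    finally:
        consumer.close()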

See more jobs at Job Offers .NET, Java, DevOps, QA and more · MOTIFE

Apply for this job