Data Engineer Remote Jobs

379 Results


Associate Data Engineer

Privia Health - Arlington, VA, USA, Remote
agile, sql, slack, python, AWS

Privia Health is hiring a Remote Associate Data Engineer

Company Description

Privia Health™ is a national physician platform transforming the healthcare delivery experience. We provide tailored solutions for physicians and providers, creating value and securing their future. Through high-performance physician groups, accountable care organizations, and population health management programs, Privia works in partnership with health plans, health systems, and employers to better align reimbursements to quality and outcomes.

Job Description

The Associate Data Engineer works on a team of Data Engineers and supports a data platform built on a diverse set of tools, including Google Cloud, SQL Server, and other solutions. This platform holds clinical, financial, and population health data for our 6M+ patients and serves business-critical needs for our product and analytics teams. Our warehouse is an integral part of Privia’s results-oriented culture: leveraging the platform, we have achieved industry-best revenue cycle awards and earned ongoing benefits from one-sided and two-sided risk performance.

Primary Job Duties: 

  • Contribute to the development of high-volume, low-latency applications for mission-critical systems, delivering high availability and performance

  • Follow best practices established by the team and contribute new ideas

  • Collaborate with team members, support the maintenance of existing data processes and solutions, and support other engineers

  • Contribute in all phases of the development lifecycle

  • Deliver timely, well-written, well-documented, well-designed, testable, and efficient code

  • Write code in compliance with specifications

  • Develop and design with a DevOps mindset

  • Support continuous improvement

  • Perform other duties as assigned

Minimum Qualifications:

  • Some knowledge of data warehouse architectures and a basic understanding of data modeling

  • Some experience in Python or another similar scripting language

  • Experience working with APIs

  • Experience with cloud technologies (AWS, Google Cloud, Kafka, Spark, etc.) preferred

  • Some experience with ETL jobs and SQL, including stored procedures

  • Experience completing development projects with high-quality results

  • A strong desire to learn new technologies

  • A strong belief in automated testing, and experience with version control

  • Experience with Agile SDLC

  • Must comply with HIPAA rules and regulations

  • Must be willing and able to communicate with the team via webcam, web conferencing, and phone.

  • Must have access to a private, quiet workspace with high-speed internet to effectively work remotely

Interpersonal Skills & Attributes:

  • Ability to work collaboratively in a multi-location, cross-functional team with a wide range of experience levels

  • Excellent communication skills (verbal and written) necessary to effectively interact with data engineering staff, product owners, and stakeholders

  • Able to support and contribute to multiple competing projects 

  • Excellent analytical and problem solving skills

  • Strong attention to detail and problem-solving skills

  • Adaptable and flexible

Communication Methods Used:

  • Slack

  • Video calls

  • Emails

 

Physical Demands:

  • Works constantly at a computer or other workstation

  • Ability to constantly remain in a stationary position

  • Ability to constantly operate a computer and other office productivity machinery, such as a printer

  • Ability to read and use close vision, including the ability to do so on a computer screen

  • Ability to frequently communicate and exchange information

  • Ability to frequently adjust focus

Additional Information

All your information will be kept confidential according to EEO guidelines.

Technical Requirements (for remote workers):

In order to work remotely and successfully support our patients and providers, we require a minimum of 5 Mbps download speed and 3 Mbps upload speed. This should be in place prior to the start of your employment. The best way to measure your internet speed is an online speed test such as https://www.speedtest.net/, which reports how fast data transfers over your connection and whether it meets the minimum requirements. Work with your internet provider if you have questions about your connection. Employees who regularly work from home offices are eligible for expense reimbursement to offset this cost.
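
For illustration, this check can also be scripted. Below is a minimal sketch that assumes the third-party speedtest-cli Python package (imported as speedtest) is installed; the thresholds mirror the minimums above:

    # speed_check.py - verify a connection meets the remote-work minimums above.
    # Assumes: pip install speedtest-cli (third-party package, module name "speedtest").
    import speedtest

    MIN_DOWNLOAD_MBPS = 5
    MIN_UPLOAD_MBPS = 3

    st = speedtest.Speedtest()
    st.get_best_server()                   # pick the nearest test server
    down_mbps = st.download() / 1_000_000  # results come back in bits per second
    up_mbps = st.upload() / 1_000_000

    print(f"Download: {down_mbps:.1f} Mbps, Upload: {up_mbps:.1f} Mbps")
    if down_mbps < MIN_DOWNLOAD_MBPS or up_mbps < MIN_UPLOAD_MBPS:
        print("Connection is below the required minimums.")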

See more jobs at Privia Health

Apply for this job


Data Engineer with Secret Clearance - REMOTE

4 years of experience, 2 years of experience, agile, sql, java, python

Maania Consultancy Services is hiring a Remote Data Engineer with Secret Clearance - REMOTE

What we’re looking for:
Someone with a solid background in developing solutions for high-volume, low-latency applications who can operate in a fast-paced, highly collaborative environment.
A candidate with an understanding of distributed computing and experience with SQL, Spark, and ETL (see the sketch below the qualifications).

Clearance: Secret Clearance or higher

Basic Qualifications:

  • 4+ years of experience with SQL
  • 4+ years of experience developing data pipelines using modern Big Data ETL technologies like NiFi or StreamSets. 
  • 4+ years of experience with a modern programming language such as Python or Java
  • 4 years of experience working in a big data and cloud environment

Additional Qualifications:
2 years of experience working in an agile development environment
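
For a rough flavor of the SQL-plus-Spark ETL work these qualifications describe, here is a minimal PySpark sketch; the bucket paths, view name, and columns are hypothetical, not taken from the posting:

    # Minimal batch ETL: read raw CSV, clean it with Spark SQL, write Parquet.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

    raw = spark.read.csv("s3://example-bucket/raw/events/", header=True, inferSchema=True)
    raw.createOrReplaceTempView("raw_events")

    # Deduplicate and filter with plain SQL, the core skill the posting asks for.
    cleaned = spark.sql("""
        SELECT DISTINCT event_id, user_id, CAST(event_ts AS timestamp) AS event_ts
        FROM raw_events
        WHERE event_id IS NOT NULL
    """)

    cleaned.write.mode("overwrite").parquet("s3://example-bucket/curated/events/")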

See more jobs at Maania Consultancy Services

Apply for this job


Senior Data Engineer

Avaloq - Ayala Ave, Makati, Metro Manila, Philippines, Remote
agile, sql, oracle, Design, scrum, python, javascript

Avaloq is hiring a Remote Senior Data Engineer

Company Description

Writing the future. Together. 

Avaloq is a value-driven, fast-paced financial technology and services company, and we are committed to developing the banking solutions of tomorrow.

By joining Avaloq, you’ll become a key part of our effort to power the digital transformation of the financial services industry. Our ambition is big and bold – to provide full end-to-end digital solutions by combining our leading efficiency with a flexible, responsible digital user experience. Headquartered in Zurich, Avaloq has over 2,000 employees globally. More information is available at www.avaloq.com  

Job Description

Your Team

We are the Analytics domain teams, helping financial institutions easily adopt and use our Avaloq products, and gather and analyze data from them, within a wide ecosystem. We believe our colleagues come first, that thinking differently is an asset, and that innovation comes from putting customer experience first in our design thinking.

We are looking for a strong candidate with both business acumen and technical experience to become part of these domains, driving the interactions and collaboration with existing and potential clients.

Your mission

  • Closely collaborate with the Product Owner, Business Analysts and other Software Engineers
  • Design, develop and maintain thought-out solutions within the team’s responsibility
  • Improve and optimize existing functionalities
  • Develop and maintain automated tests
  • Take ownership and responsibility for your area of expertise
  • Ensure high-quality delivery and efficient communication

Qualifications

What you need

  • Proven capability to develop and optimize PL/SQL and SQL based solutions
  • Experience in Data Warehousing and Data Modeling
  • Strong analytical, problem solving and conceptual skills
  • Competent in one or more programming and scripting languages
  • Fluent in spoken and written English
  • Able to work alone and in a team
  • At least 2-3 years of work experience in the fintech or financial sector
  • University degree in Computer Science/Physics/Engineering/Mathematics or comparable education

You will get extra points for

  • Oracle Data Integrator know-how
  • Banking know-how or experience working on financial solutions
  • Knowledge of Public Cloud
  • Knowledge of Snowflake
  • Knowledge of JavaScript and Python
  • German, Italian or French language skills
  • Work experience in a team using Agile Scrum
  • Knowledge of and experience with the Avaloq Banking Suite

Additional Information

What you can expect:

It’s all about getting to know our teams and e-meeting with us. We will use video interviews to give you the opportunity to meet your future colleagues and get a first insight into Avaloq’s unique culture.

What we will offer you

We offer competitive base salaries and a benefits package with private health and dental care as well as a generous pension. If you go the extra mile, you might be entitled to an extraordinary achievement reward.
Avaloq aims to share its success with all its employees by paying out “Success Share Units” depending on its performance in a given year.

Don’t be shy – apply!

Please only apply online. 

See more jobs at Avaloq

Apply for this job


Data Engineer – Store Insights

Logic20/20 Inc. - Washington, DC, USA, Remote
agile, sql, oracle, Design, azure, python

Logic20/20 Inc. is hiring a Remote Data Engineer – Store Insights

Company Description

We’re a six-time “Best Company to Work For,” where intelligent, talented people come together to do outstanding work—and have a lot of fun while they’re at it. Because we’re a full-service consulting firm with a diverse client base, you can count on a steady stream of opportunities to work with cutting-edge technologies on projects that make a real difference.

Logic20/20's Global Delivery Model creates a connected experience for Logicians across geographies. You'll have access to projects in different locations, the technology to support Connected Teams, and in-person and online culture events in our Connected Hub cities.

Job Description

This is an opportunity to work with a fun, iconic brand to deliver business insights, build pipelines, and automate data generation across multiple systems. Your efforts will result in ground-level improvements across the company’s tens of thousands of brick-and-mortar locations.

As the Data Engineer on this project, you’ll deliver client value and ensure high client satisfaction that carries impact to each neighborhood location. You should be adept at recognizing, subscribing to, and applying best practices, methodologies, tools, and techniques to meet client requirements, timelines, and budgets. You will be able to take big-picture data and ideas and turn them into actionable improvements, informing business decisions on a nationwide scale.

About the team 

The Logic20/20 Advanced Analytics team is where skilled professionals in data engineering, data science, and visual analytics join forces to build simple solutions for complex data problems. We make it look like magic, but for us, it’s all in a day’s work. As part of our team, you’ll collaborate on projects that help clients spin their data into a high-performance asset, all while enjoying the company of kindred spirits who are as committed to your success as you are. And when you’re ready to level up in your career, you’ll have access to the training, the project opportunities, and the mentorship to get you where you want to go. 

“We build an environment where we really operate as one team, building up each other’s careers and capabilities.” – Adam Cornille, Director, Advanced Analytics 

About you

  • You build applications that provide measurable business value
  • You thrive in a fast-paced, agile project environment with small, focused teams delivering product regularly
  • You’ve worked in production-grade systems with real-time processing
  • You’re a natural leader with strong full-stack experience and guidance to spare on topics like application design and architecture, integration design and architecture, and enforcement of technical standards
  • You’re a pro at designing, developing, testing, reviewing, deploying, and supporting custom applications with a big impact
  • You have a holistic perspective, seeing the big picture (not just back-end components), and leverage this skill to drive the overall success of the engagement
  • You actively learn about and evaluate leading-edge technologies

Qualifications

  • 8+ years of BI-related development experience
  • Strong background in SQL Server with proficiency in SSIS, SSRS, SSAS, views, and stored procedures
  • Experienced in Databricks, Azure, Python, and PySpark
  • Well-versed in data visualization and storytelling
  • Adept at pulling data from multiple resources
  • Advanced data analysis including SQL query capabilities and deep experience in data modeling and relational database design
  • Strong knowledge of data manipulation using Python and Excel
  • Experience in OLAP Cube and MDX query development
  • Microsoft Certified Architect: Database, MCTS in SQL Server Business Intelligence, or MCDBA desired
  • Ability to research, analyze, recommend, and document technical solutions
  • Ability to provide professional, polished presentations to the company as a whole
  • MS/CS or equivalent work experience
  • Experience setting up data encryption / managing certificates

We’d also be super impressed if you have

  • Experience setting up a redundant infrastructure
  • Working knowledge of Redshift / Oracle
  • Past experience helping business owners to better understand their data

Additional Information

All your information will be kept confidential according to EEO guidelines.

Core Values 

At Logic20/20, we are guided by three core values: Drive toward Excellence, Act with Integrity & Foster a Culture of We. These values were generated and agreed upon by our employees—and they help us pursue our goal of being one of the best companies to work for and to work with. Learn more at https://www.logic2020.com/company/our-values

Equal Opportunity Statement 

We believe that people should be celebrated: for their talents, ideas, and skills, but most of all, for what makes them unique. We prohibit harassment and/or discrimination based on age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status, or any other basis as protected by federal, state, or local law. 

To learn more about our DE&I initiatives, please visit: https://www.logic2020.com/company/diversity-equity-inclusion  

About Logic20/20 

To learn more about Logic20/20, please visit: https://www.logic2020.com/careers/life-at-logic  

Privacy Policy 

During the recruitment and hiring process, we gather, process, and store some of your personal data. We consider data privacy a priority. For further information, please view our company privacy policy.

See more jobs at Logic20/20 Inc.

Apply for this job


Senior Data Engineer with Scala & Google Cloud

Accesa IT Systems SRL - Employees can work remotely, CJ, Romania, Remote
tableau, scala, nosql, airflow, postgres, sql, Design, azure, UX, qa, java, c++, python, backend, frontend

Accesa IT Systems SRL is hiring a Remote Senior Data Engineer with Scala & Google Cloud

Company Description

Part of the Ratiodata Group, Accesa is a leading technology company headquartered in Cluj-Napoca, with offices in Zurich, Oradea and Munich. Over the past 16 years, the company has been establishing itself as an employer of choice for IT professionals who are passionate about problem-solving through technology and want to have a measurable impact through their work. 

A trusted partner for major brands in Retail, Consumer Goods, Manufacturing, and Automotive, Accesa helps businesses embrace flexibility, adaptability and evolution within their digital journey, through a large spectrum of tailored IT services, leveraging mainstream, niche, as well as legacy technologies. With more than 700 IT professionals in its 20+ competence centers, Accesa is building a distinctive people-first culture that enables their people to thrive, their clients’ business to evolve and end users to succeed.

About the project

Our projects can range between 8 and 20 weeks, while an account usually addresses several projects with different deliverables. We also love to get involved in any kind of AI-related activity, whether in the discovery, prototyping, or implementation phase.

We also often deliver joined-effort projects, either for internal purposes or to help customers reach their goals, relying on collaboration with other teams: IoT, SAP, Hybris, RPA.

The projects we deliver are mainly focused on Digital Manufacturing Industry, but sometimes opportunities come from other industries such as Financial or Retail.

Your team

The team involved in delivering AI solutions and services often consists of Data Engineers, Data Scientists and Machine Learning Engineers, as part of the Delivery Team in which several other roles are present: Project Manager, Business Analyst, UX Designer, Application/DevOps Architect, Frontend and Backend Developers, QA Engineer.

 

Job Description

As part of our Artificial Intelligence Team, you will help shape the future of our software.

You will develop, test, and maintain data architectures to keep data accessible and ready for analysis. Among your tasks, you will do data modelling, ETL (Extract, Transform, Load), data architecture construction and development, and testing of the database architecture.

Your role

  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional / non-functional business requirements
  • Identify, design, and implement process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Cloud ‘big data’ technologies.
  • Build/use analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

Real impact one step at a time 

Your impact will begin in the project’s context and go beyond it, through the Competence Area community that you will be part of, with a strong focus on your technical skills.

Professional Opportunities

You will have access to AI Community trainings and programs emphasizing skills on the technical and tactical side, while being engaged in new projects and opportunities landing in our business line.

Community insights

The community consists of Data Scientists and Machine Learning Engineers, along with Data Engineers, sharing knowledge and project insights on a regular basis. We engage in projects pertaining to Computer Vision, NLP, Advanced Analytics, and Prevention and Trend Analysis.

Qualifications

 Must have

  • 3+ years of professional experience 
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets
  • Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management
  • Knowledge of manipulating, processing, and extracting value from large, disconnected datasets
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ datastores
  • Experience with:
  1. Big data tools: Apache Spark (preferred), Hadoop, Kafka, etc.
  2. Cloud services: Google Cloud (preferred), Azure
  3. Stream-processing systems: Storm, Spark Streaming, etc.
  4. Object-oriented / functional scripting languages: Scala (preferred), Python, Java, C++, etc.

 Willing to develop

  • Relational SQL and NoSQL databases, including Postgres and Cassandra
  • Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
  • Extensive knowledge of visualization tools: Power BI, Tableau, etc.

Additional Information

At Accesa & RARo you can:

Enjoy our holistic benefits program built on the four pillars that we believe come together to support our wellbeing: social, physical, and emotional wellbeing, as well as work-life fusion.

  • Physical: premium medical package for both our colleagues and their children, dental coverage up to a yearly amount, eyeglasses reimbursement every two years, voucher for sport equipment expenses, in-house personal trainer
  • Emotional: individual therapy sessions with a certified psychotherapist, webinars on self-development topics
  • Social: virtual activities, sports challenges, special occasions get-togethers
  • Work-life fusion: yearly increase in days off, flexible working schedule, birthday, holiday and loyalty gifts for major milestones, work from home bonuses

See more jobs at Accesa IT Systems SRL

Apply for this job


Director, Data Engineering

Vosker - Montreal, QC, Canada, Remote
5 years of experience, scala, Design, azure, java, python, AWS

Vosker is hiring a Remote Director, Data Engineering

Company Description

VOSKER is one of the global technology leaders in remote area surveillance.

Our desire to surpass ourselves and push the limits allows us to revolutionize the fields of artificial intelligence and the Internet of Things (IoT) with our pioneering products.

WE are motivated by performance, innovation, and family.

We are looking for a Director, Data Engineering.

Job Description

As Data Engineering Director, you will be responsible for designing and delivering innovative solutions on Amazon Web Services, Azure, and Google Cloud using leading cloud data warehouse and lake tools, Hadoop, Spark, event-streaming platforms, and other big data technologies. This position is expected to participate in the development of projects in their initial phases and deliver them as part of a team, providing objective advice, expertise, and specialized skills with the goal of creating value, maximizing growth, or improving our performance.

Responsibilities

  • Lead the analysis, architecture, design and development of cloud data warehouses and lakes and business intelligence solutions;
  • Actively contribute to the cloud and big data community at Vosker, and drive the advancement of new capabilities;
  • Oversee delivery to ensure technical excellence and design for high volume, large scale, security and reliability;
  • Provide technical and architectural guidance for various technology areas in the data lake, data warehouse, artificial intelligence/machine learning, and visual analytics space;
  • Contribute to testing activities with quality engineering partners;
  • Mentor and advise other team members.

 

Qualifications

  • At least 3 years of experience in a management or supervisory position, combined with 3 to 5 years of experience in data engineering;
  • Experience with Amazon Web Services (AWS) cloud platforms;
  • Hands-on development of data usage and migration to cloud platforms;
  • Proficiency in relational database design and development;
  • Proficiency and hands-on experience with big data technologies;
  • Experience with languages such as Python, Java, Scala or Go;
  • Analytical approach to problem solving; ability to use technology to solve business problems;
  • Bilingual English and French (oral and written).

Additional Information

Why should you choose VOSKER?

  • A work environment where performance, innovation, and family are valued!
  • A work-life balance;
  • Schedule flexibility for early and late risers;
  • No traffic, you can mainly work from home;
  • 24/7 free access to an online doctor;
  • A diversified company with a variety of challenges: you can’t get bored;
  • A group insurance, because we want to take care of our people.

Equal access to employment:

At VOSKER, we value the essence of each person and celebrate the diversity that allows us to redefine what is possible. We are committed to collaboration by providing a healthy and inclusive work environment where all voices are heard.

Please do not hesitate to contact us if you have specific needs to make this recruitment process more accessible to you.

See more jobs at Vosker

Apply for this job


Principal Data Engineer

Procore Technologies - Remote, CA, United States, Remote
scala, nosql, airflow, Design, UX, java, postgresql, kubernetes, jenkins, python, AWS

Procore Technologies is hiring a Remote Principal Data Engineer

Job Description

Procore’s Data Platform team is looking for a Principal Data Engineer to build a data platform that enables product and business teams to move faster and achieve greater results. This role is a tremendous opportunity to shape the future of the data platform at Procore.

As a Principal Data Engineer, you’ll use your expert-level technical skills to craft innovative solutions while influencing and mentoring other senior technical leaders. To be successful in this role, you’re passionate about distributed systems, including caching, streaming, and indexing technologies on the cloud, with a strong bias for action and outcomes. If you’re an inspirational leader comfortable translating vague problems into pragmatic solutions that open up the boundaries of technical possibilities—we’d love to hear from you!

This position reports to the Director of Data Infrastructure and can be based in our Austin, TX, Carpinteria, CA, or New York City office, or work remotely from a US location. We’re looking for someone to join us immediately.

What you’ll do: 

  • Design and build the next generation data platform for the construction Industry

  • Actively participate with our engineering team in all phases of the software development lifecycle, including requirements gathering, functional and technical design, development, testing and roll-out, and support

  • Stay connected with other architectural initiatives and craft a data platform architecture that supports and drives our overall platform

  • Liaise with peer teams and architects within and outside Data Platform to define technical interfaces for system dependencies

  • Provide technical leadership to efforts around building a robust and scalable data pipeline to support billions of events (see the sketch after this list)

  • Help identify and propose solutions for technical and organizational gaps in our data pipeline by running proof of concepts and experiments working with Data Platform Engineers on implementation

  • Work alongside our Product, UX, and IT teams, leveraging your experience and expertise in the data space to influence our product roadmap, developing innovative solutions that add additional capabilities to our tools

  • Lead the design and development of big data predictive analytics using object-oriented analysis, design and programming skills, and design patterns

  • Deliver observable, reliable, and secure software, embracing the “you build it, you run it” mentality, and focus on automation and GitOps

  • Coordinate with other IT stakeholders to mitigate the impact on enterprise data systems during change control procedures in order to ensure the overall health of the data lake, data warehouse, and analytical environments

  • Support the Data Governance strategy toward improving data management standards and policies for data access and compliance, and processes enhancing the quality and completeness of the organization’s data
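
To make the event-pipeline bullet concrete, here is a minimal consumer-loop sketch using the confluent-kafka Python client; the broker address and topic are hypothetical and do not describe Procore’s actual stack:

    # Minimal Kafka consumer loop: poll events and hand them to downstream processing.
    from confluent_kafka import Consumer

    def process(payload: bytes) -> None:
        # Placeholder for real transformation/loading logic.
        print(payload[:80])

    consumer = Consumer({
        "bootstrap.servers": "broker.example.com:9092",  # hypothetical broker
        "group.id": "data-platform-sketch",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["product-events"])  # hypothetical topic

    try:
        while True:
            msg = consumer.poll(timeout=1.0)
            if msg is None:
                continue
            if msg.error():
                print(f"Consumer error: {msg.error()}")
                continue
            process(msg.value())
    finally:
        consumer.close()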

What we’re looking for: 

  • BS degree in Computer Science, a similar technical field of study, or equivalent practical experience is required; MS or Ph.D. degree in Computer Science or a related field is preferred

  • 10+ years of experience building and operating cloud-based, highly available, and scalable online serving or streaming systems utilizing large, diverse data sets in production

  • Experience implementing data strategy through a platform-oriented approach that invests in tools capturing processes and decisions that scale with a growing, multi-organizational platform user base

  • Expertise with diverse DB technologies like Presto, Athena, Hive, PostgreSQL, Graph DBs, NoSQL DBs, Snowflake, etc.

  • Comfortable using the majority of commonly used data technologies and languages such as Python, Java or Scala, Kafka, Spark, Flink, Airflow, Splunk, Datadog, Kubernetes, Jenkins, or similar

  • Expertise with all aspects of data systems, including ETL/ELT, aggregation strategy, performance optimization, and technology trade-off

  • Understanding of data access patterns, streaming technology, data validation, data modeling, data performance, cost optimization

  • Experience defining data engineering/architecture best practices at a department and organizational level and establishing standards for operational excellence and code and data quality at a multi-project level

  • Current on new industry trends (e.g., DS, ML, Data Engineering) and able to design solutions for complex systems to meet ambiguous business needs

  • Strong knowledge of or familiarity with Apache Beam or AWS managed services for data (Glue, Athena, Redshift, Data Pipeline, Flink, Spark) and Snowflake

  • Experience in processing structured and unstructured data into a form suitable for analysis and reporting with integration with a variety of data metric providers ranging from advertising, web analytics, and consumer devices

  • Functional knowledge of common algorithms, data structures, object-oriented programming, and design

Additional Information

If you'd like to stay in touch and be the first to hear about new roles at Procore, join our Talent Community.

About Us

Procore Technologies is building the software that builds the world. We provide cloud-based construction management software that helps clients more efficiently build skyscrapers, hospitals, retail centers, airports, housing complexes, and more. At Procore, we have worked hard to create and maintain a culture where you can own your work and are encouraged and given resources to try new ideas. Check us out on Glassdoor to see what others are saying about working at Procore. 

We are an equal opportunity employer and welcome builders of all backgrounds. We thrive in a diverse, dynamic, and inclusive environment. We do not tolerate discrimination against employees on the basis of age, color, disability, gender, gender identity or expression, marital status, national origin, political affiliation, race, religion, sexual orientation, veteran status, or any other classification protected by law.

Perks & Benefits

You are a person with dreams, goals, and ambitions—both personally and professionally. That's why we believe in providing benefits that not only match our Procore values (Openness, Optimism, and Ownership) but enhance the lives of our team members. Here are just a few of our benefit offerings: generous paid vacation, employee stock purchase plan, enrichment and development programs, and friends and family events.

See more jobs at Procore Technologies

Apply for this job


Director, Data Engineering

NielsenIQ - Austin, TX, USA, Remote
agile, Master’s Degree, backend, frontend

NielsenIQ is hiring a Remote Director, Data Engineering

Job Description

Director, Data Engineering

Location: Remote (Must be US based)

Reference ID: REF11735U

About this job

At NielsenIQ, we process over 250 terabytes of data every single day, manage it in the latest cloud technology, and build best-in-class platforms and analytical tools to add transparency and efficiency to the consumer goods market at a scale that no other company is able to achieve.

Responsibilities

  • Hands-on leadership and end-to-end responsibility for the Backend Platform and Frontend applications, working directly with a large retailer
  • Support and lead technical conversations, including architecture review and code review, and drive the SDLC process as required
  • Lead full-cycle development of strategic initiatives and provide overall technical guidance to the team
  • Work closely with the internal Product, Technology and Business teams to drive business outcomes for the retail client

How you'll invest in talent 

  • Develop and inspire a world class team fostering the highest quality talent while scaling to meet business challenges 
  • Cultivate strong culture and engagement, ensuring associates are committed and excited about their work 
  • Identify and develop NielsenIQ’s future leaders and senior technologists acting as a talent multiplier 

Projects:

  • Data Engineering Key Work: Data Enablement and Architecture
  • App & Dev Key Work: Build & maintain custom BI applications

A little bit about you

We’re looking for a passionate, analytical, innovative leader who wants to figure out how to measure things in a complex world. You are familiar with developing BI solutions, managing cloud environments, and managing ETL. You are experienced in strategically developing data architecture, ensuring that we can be a great technical partner to our large retail client’s technical leadership.

The role requires a leader who is customer oriented, with business acumen and who acts as a true partner to the business to achieve success. If you have a passion for building scalable products and extensive experience in scalable platforms, we’re looking for you!

Qualifications

  • Bachelor’s or Master’s degree in Computer Science or related field
  • Deep experience in developing scalable and cloud native data processing frameworks using microservices and cutting-edge technologies
  • Extensive experience in leading delivery teams in agile software development practices and DevOps
  • Advanced knowledge of data structures, algorithms and designing for performance, scalability and availability
  • Demonstrated experience in designing and building multithreaded distributed systems
  • Thrives in a fast-paced, high-pressure environment and enjoys being challenged
  • Excellent communication skills, with the ability to effectively influence across cross-functional technology teams, the business and outside the organization

About NielsenIQ

We’re in tune with what the world is buying. If you can think of it, we’re measuring it. We sift through the small stuff and piece together big pictures to provide a comprehensive understanding of what’s happening now and what’s coming next for our clients. Today’s data is tomorrow’s marketplace revelation.

We like to be in the middle of the action. That’s why you can find us at work in over 90 countries. From global industry leaders to small businesses, consumer goods manufacturers to retailers, we work with them all. We’re bringing in data 24/7 and the possibilities are endless.

Become part of NielsenIQ at: www.niq.com.

Additional Information

All your information will be kept confidential according to EEO guidelines.

About NielsenIQ 

NielsenIQ is a global measurement and data analytics company that provides the most complete and trusted view available of consumers and markets worldwide. We provide consumer packaged goods manufacturers/fast-moving consumer goods and retailers with accurate, actionable information and insights and a complete picture of the complex and changing marketplace that companies need to innovate and grow. Our approach marries proprietary NielsenIQ data with other data sources to help clients around the world understand what’s happening now, what’s happening next, and how to best act on this knowledge.  We like to be in the middle of the action. That’s why you can find us at work in over 90 countries, covering more than 90% of the world’s population. For more information, visit www.niq.com.

NielsenIQ is committed to hiring and retaining a diverse workforce. We are proud to be an Equal Opportunity/Affirmative Action-Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class.

See more jobs at NielsenIQ

Apply for this job


Data Engineer (R4081)

Avalara - Remote, United States
agile, Bachelor's degree, jira, sql, Design, python, AWS

Avalara is hiring a Remote Data Engineer (R4081)

The Global Analytics and Insights Team is looking for a Senior Data Engineer to function as a lead and internal evangelist for Avalara's Data Platform. This role will be heavily involved in the build and architecture of our Snowflake-hosted Data Platform, from both a data and a technology perspective. It will also act as an internal lead, providing leadership and guidance to integrate internal applications that need to leverage the Data Platform and ensuring the interface and data support the requirements. The candidate must have deep experience in reporting platforms, cloud technologies, modern data warehousing techniques, and modern data stacks.

Responsibilities :

  • Collaborate with Data Platform team to build a modern Analytics stack for Avalara's Enterprise data

  • Act as a lead to ‘onboard’ consuming applications, providing technical guidance and functional leadership

  • Build data models to enable advanced reporting

  • Collect business requirements and convert to technical user stories for engineers

  • Design and build scalable data orchestration, transformation, and reporting streams

  • Inject SDLC best practices into the data stack, and provide guidance to other data engineers

  • Implement CI/CD pipelines with automated testing, code instrumentation, and real-time monitoring (see the testing sketch after this list)

  • Create data visualizations for data consumers, as well as providing transparency of system processes to engineering org

  • Practice/Implement data security, encryption and masking policies across various data sets and data sources

  • Work with business and engineering teams to identify scope, constraints, dependencies, and risks

  • Identify risks and opportunities across the business and drive solutions
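
For a small, concrete example of the automated-testing bullet above, here is a pytest-style sketch; the transformation function and its rules are hypothetical, not Avalara's:

    # test_transforms.py - unit tests for a simple cleansing step; run with pytest.
    def normalize_tax_code(code: str) -> str:
        """Hypothetical transform: trim whitespace and upper-case a tax code."""
        return code.strip().upper()

    def test_normalize_tax_code_strips_and_uppercases():
        assert normalize_tax_code("  ca-123 ") == "CA-123"

    def test_normalize_tax_code_is_idempotent():
        assert normalize_tax_code(normalize_tax_code("wa-7")) == "WA-7"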

Qualifications:

  • Minimum of 4 years work experience in data engineering field

  • Bachelor's degree in Computer Science or Engineering, or relevant work experience

  • Advanced SQL proficiency

  • Advanced Python proficiency

  • Experience with Data Visualization tools (Power BI a plus)

  • Working knowledge of Source Control, CI / CD, and DevOps (CI / CD Pipeline experience a plus)

  • Working knowledge of Agile frameworks and Jira

  • Proven ability to communicate effectively with technical and non-technical stakeholders across multiple business units

  • Excellent problem-solving skills

  • Demonstrated ability to debug complex environments and data pipelines


Preferred Qualifications:
  • Minimum of 8 years work experience in data engineering field
  • Minimum of 4 years work experience in data pipelines (ETL / ELT)

  • Minimum of 2 years with building cloud solutions (AWS and Snowflake)

  • Minimum of 2 years work experience with data modeling



About Avalara

Avalara helps businesses of all sizes achieve compliance with transactional taxes, including VAT, sales and use, excise, communications, and other tax types. We deliver comprehensive, automated, cloud-based solutions that are fast, accurate, and easy to use.

Avalara offers hundreds of pre-built connectors into leading accounting, ERP, ecommerce and other business applications. Each year, the company processes billions of tax transactions for customers and users, files hundreds of thousands of tax compliance documents and tax returns and manages millions of exemption certificates and other compliance related documents.

Avalara’s headquarters are in Seattle, WA and it has offices across the U.S. and in Brighton and London, England; Brussels, Belgium; and Pune, India. More information at: www.avalara.com

Avalara is an Equal Opportunity Employer. All qualified candidates will receive consideration for employment without regard to race, color, creed, religion, age, gender, national origin, disability, sexual orientation, US Veteran status, or any other factor protected by law.


See more jobs at Avalara

Apply for this job


Data Engineer Consultant - Contract (Inside IR35)

Version1 - Newcastle upon Tyne, UK, Remote
agile, terraform, nosql, sql, oracle, Design, mongodb, azure, api, AWS

Version1 is hiring a Remote Data Engineer Consultant - Contract (Inside IR35)

Company Description

Version 1 is celebrating 25 years in the IT industry this year and we continue to be trusted by global brands to deliver IT solutions that drive customer success. 

Version 1 is not just a Microsoft Gold Partner, an AWS Premier Consulting Partner and an Oracle Platform Partner; we are also an award-winning employer, and our employees are at the heart of Version 1. We invest in a strong culture of wellness through programs that help our employees create their journey toward optimal wellbeing. This framework is based on the ‘Strength in Balance’ theme, and this is seen again in our Diversity, Inclusion and Belonging Team motto “Bring Your Difference”.

Job Description

This is an exciting opportunity for an experienced developer of large-scale data solutions. In this contract position, you will join a team delivering a transformative cloud-hosted data platform for a key Version 1 customer.

The ideal candidate will have a proven track record in implementing data ingestion and transformation pipelines for large scale organisations. We are seeking someone with deep technical skills in a variety of technologies to play an important role in developing and delivering early proofs of concept and production implementation.

You will ideally have at least 5-7 years' experience building solutions using a variety of open-source tools and Microsoft Azure services, and a proven track record of delivering high-quality work to tight deadlines.

Your main responsibilities will be:

  • Designing and implementing highly performant data ingestion and transformation pipelines from multiple sources using a variety of technologies
  • Delivering and presenting proofs of concept of key technology components to prospective customers and project stakeholders
  • Developing scalable and re-usable frameworks for ingestion and transformation of large data sets
  • Master data management system and process design and implementation
  • Data quality system and process design and implementation
  • Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is maintained at all times
  • Working with event-based / streaming technologies to ingest and process data (see the sketch after this list)
  • Working with other members of the project team to support delivery of additional project components (reporting tools, API interfaces, search)
  • Evaluating the performance and applicability of multiple tools against customer requirements
  • Working within an Agile delivery / DevOps methodology to deliver proofs of concept and production implementations in iterative sprints
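
As a sketch of the event-based ingestion responsibility above, here is a minimal Spark Structured Streaming job reading from Kafka; the broker, topic, and storage paths are hypothetical:

    # Minimal streaming ingestion: Kafka topic -> Parquet landing zone.
    # Requires the spark-sql-kafka connector on the cluster (bundled with Databricks).
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("streaming-ingest-sketch").getOrCreate()

    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker.example.com:9092")  # hypothetical
        .option("subscribe", "source-events")                          # hypothetical topic
        .load()
    )

    # Kafka values arrive as bytes; cast to string before downstream parsing.
    decoded = events.selectExpr("CAST(value AS STRING) AS payload")

    query = (
        decoded.writeStream.format("parquet")
        .option("path", "abfss://landing@example.dfs.core.windows.net/events/")  # hypothetical
        .option("checkpointLocation", "abfss://landing@example.dfs.core.windows.net/_chk/")
        .start()
    )
    query.awaitTermination()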

Qualifications

You will have:

  • Microsoft Azure Big Data Architecture certification.
  • Hands-on experience designing and delivering solutions using the Azure Data Analytics platform (Cortana Intelligence Platform), including Azure Storage, Azure SQL Database, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, Azure Stream Analytics
  • Direct experience of building data pipelines using Azure Data Factory and Apache Spark (preferably Databricks).
  • Experience building data warehouse solutions using ETL / ELT tools such as SQL Server Integration Services (SSIS), Oracle Data Integrator (ODI), Talend, Wherescape Red.
  • Experience with Azure Event Hub, IoT Hub, Apache Kafka, or NiFi for use with streaming / event-based data
  • Experience with other open-source big data products, e.g. Hadoop (incl. Hive, Pig, Impala)
  • Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
  • Experience working with structured and unstructured data including imaging & geospatial data.
  • Comprehensive understanding of data management best practices including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching.
  • Experience working in a Dev/Ops environment with tools such as Microsoft Visual Studio Team Services, Chef, Puppet or Terraform

Additional Information

Day Rate: £500

We offer employee recognition in the form of Excellence Awards and V1Ps, which are awarded by your peers. Engagement is incredibly important, with local engagement teams driving our engagement events!


Data Engineer

SovTech - Remote job, Remote
10 years of experience, scala, Design, azure, java, python, AWS

SovTech is hiring a Remote Data Engineer


Hello from SovTech!

Are you looking to become part of a team that is changing the way businesses across the world build software? Our mission is to design, build, deploy and maintain innovative custom software that allows our clients to start, run and grow world-class businesses with globally distributed teams based in Johannesburg, London, Nairobi and Cape Town.

We invest in people who can see the future & who work hard to achieve it. SovTech has a young, dynamic, and fast-growing team. We’re only looking for outstanding people – those unique individuals who are brilliant, always happy to help, socialize, get involved, work hard, and enjoy what they do! We have a continuous learning culture that allows our people to grow and develop in the opportunities across our various teams. Keep reading to learn about what else we have to offer.

About the role:

Who are we looking for?

We are searching high and low for our next Data Engineer to join our world-class team.


What will you be doing?

You will help build data-centric solutions for a client's customers, applying engineering discipline to ensure high quality. You will be responsible for building data pipelines that bring together information from different source systems, integrating, consolidating, cleansing, and structuring data.
This isn’t a backroom role, and you can expect to spend time discussing design approaches and requirements with your colleagues and customers in addition to being engaged in development activities.


About our Culture:

We are Fluid

Our teams are globally distributed so we have adopted a Fluid approach to remote vs office-based work, encouraging freedom, fluidity of working location, collaboration & exploration. At SovTech, teams define their remote days to encourage collaboration & knowledge sharing whilst still creating the flexibility of remote working.


Want to know more about our culture?

Have a look through our Careers page & The SovTech Spex.

Check out our latest Blog posts.

Finally, if that does not give you enough insight into SovTech, check out our Humans of SovTech Instagram page.


You know what to do next… Click that Apply button!

See more jobs at SovTech

Apply for this job


Senior Data Engineer

tableau, nosql, airflow, postgres, sql, Design, mongodb, elasticsearch, mysql, linux, python

FreightWaves, Inc. is hiring a Remote Senior Data Engineer

If you are smart, driven, curious, resourceful, and not afraid to fail, then we want to meet you! Our team of bold, innovative, and creative teammates is what makes us the top startup to work for. FreightWaves delivers news and commentary, data, analytics, risk management tools, and actionable market insights to the logistics and supply chain industry. If you are ready to join our team, it is time for you to apply!

FreightWaves is on the hunt for a curious, tenacious, team-oriented Senior Data Engineer to join our fast-paced engineering team. The ideal candidate is curious, versatile, team-oriented, thrives on change, and has a positive attitude. If you are ready to be challenged, learn new and exciting technologies, and have the unique opportunity to work with some of the most talented developers in the country, we want you to apply!

What you will be doing:

  • Work closely with Software Engineering and DevOps management to design and implement big data solutions
  • Define and implement data processing and testing strategies
  • Define and implement data project access boundaries
  • Create and ensure data automation stability with associated monitoring tools
  • Review existing and proposed infrastructure for architectural enhancements
  • Work closely with Data Science to facilitate advanced data analysis (like Machine Learning)

What you bring to the table:

  • Certification: Professional Google Cloud Certified Data Engineer is strongly desired
  • Working knowledge of Apache Airflow
  • Strong in Linux environments and experience in scripting languages 
  • Strong in Python
  • Experience in any RDBMS (MySQL, Postgres, SQL Server, etc.) with strong SQL skills
  • Experience with NoSQL (Elasticsearch, MongoDB, or another flavor)
  • Experience modeling data and access patterns
  • Strong communication skills
  • Comfortable talking with customers or business partners as a consultant or technical advisor related to data management
  • Must have working knowledge of cloud-based functions and storage
  • Comfort level with the following (or transitioning from equivalent platform services):
    • Cloud Storage
    • Cloud Pub/Sub
    • BigQuery
    • Cloud Composer
    • Dataflow
    • Cloud Functions
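
For context, here is a minimal sketch of the kind of Cloud Composer (managed Airflow) DAG this stack implies, loading a daily file from Cloud Storage into BigQuery; the bucket, dataset, and table names are hypothetical:

    # Minimal Composer/Airflow DAG: load a daily CSV from GCS into BigQuery.
    # Assumes the apache-airflow-providers-google package, as shipped on Cloud Composer.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    with DAG(
        dag_id="gcs_to_bq_sketch",
        start_date=datetime(2022, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        load_events = GCSToBigQueryOperator(
            task_id="load_events",
            bucket="example-bucket",                               # hypothetical bucket
            source_objects=["events/{{ ds }}.csv"],                # one file per day
            destination_project_dataset_table="analytics.events",  # hypothetical table
            source_format="CSV",
            skip_leading_rows=1,
            write_disposition="WRITE_APPEND",
        )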

Bonus knowledge/experience:

  • Transforming similar data from disparate sources to create canonical data structures
  • Surfacing data to BI platforms such as Data Studio, Looker, or Tableau
  • Data migration, especially from other platforms (cloud and on prem) to GCP
  • dbt (Data Build Tool) - a big plus! Ramp up on this if you haven’t heard of it

Our Benefits:  

  • An excellent work environment, flat hierarchies, and short decision paths.
  • Competitive salary
  • Work from home
  • A generous benefits package including 100% employer-paid health, dental, vision and life insurance, STD, LTD
  • Stock options
  • 401k with up to 3.5% match
  • Training programs and career development opportunities
  • Student-loan reimbursement 
  • Annual life achievement bonus of $2000 for having a baby, buying a house, or getting married (max one per year) 
  • Flexible vacation policy with no set days off (our team takes time off as needed with supervisor approval)
  • Gym Membership (or virtual membership while COVID is still a part of our daily lives)
  • Audible or Kindle Unlimited subscription 
  • FreightWaves strives for sustainability. We offset our carbon emissions. 
  • Discount on Ford vehicles

See more jobs at FreightWaves, Inc.

Apply for this job


Cloud Data Engineer (R4871)

Avalara - Remote, United States
Bachelor's degree, nosql, sql, java, c++, python

Avalara is hiring a Remote Cloud Data Engineer (R4871)

The Cloud Data Engineer within the BI and Analytics team will be a major contributor to and architect of Avalara's cloud-based centralized reporting platform. Hence, strong experience in large-scale reporting platforms, MPP cloud data technologies, and advanced ETL/ELT and data streaming tools, plus a deep understanding of the long-term benefits and pitfalls of various data structures (NoSQL vs. SQL), is an absolute must.

A major responsibility for this role is to scale and support our customer-facing reporting logic, which includes providing end-to-end data delivery of custom and standard compliance tax logic to thousands of our customers via HTTPS and other secure protocols.

Responsibilities :

  • Build a long-term 'data analytics stack' for Avalara
  • Work with the product managers, DBA teams and broader engineering teams to build scalable data orchestration, transformation and reporting streams that can capture and prepare billions of transactions per day for customer reporting
  • Build, implement and monitor a data quality framework with the required CI/CD pipeline, code instrumentation and real-time monitoring tools (see the sketch after this list)
  • Practice/Implement data security, encryption and masking policies across various data sets and data sources
  • Develop and manage end-to-end project plans and ensure on-time delivery
  • Communicate status and big picture to the project team and management
  • Work with business and engineering teams to identify scope, constraints, dependencies, and risks
  • Identify risks and opportunities across the business and drive solutions
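
As one small illustration of the data quality framework mentioned above, here is a hedged sketch of a null-rate gate that could run inside a CI/CD pipeline; the file, columns, and threshold are hypothetical:

    # Minimal data-quality gate: fail the pipeline if null rates exceed a threshold.
    import sys

    import pandas as pd

    MAX_NULL_RATE = 0.01  # allow at most 1% nulls per critical column

    df = pd.read_csv("transactions_sample.csv")  # hypothetical extract
    failures = {
        col: rate
        for col, rate in df[["tax_code", "amount"]].isna().mean().items()
        if rate > MAX_NULL_RATE
    }

    if failures:
        print(f"Data quality check failed: {failures}")
        sys.exit(1)  # non-zero exit fails the CI job
    print("Data quality check passed.")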

Qualifications:

  • Minimum of 8 years of combined work experience in data engineering, object-oriented languages (Python, C#, or Java), DevOps, distributed cloud architecture, and high-availability production system deployment
  • Advanced SQL with big data optimization skills
  • Bachelor's degree in Computer Science or Engineering, or equivalent work experience
  • Experience managing efforts in distributed systems and/or developing large scale web/ cloud SaaS applications
  • Proven ability to combine business acumen, technical acumen and process expertise to define client (internal/external) engagement and program execution
  • Ability to communicate effectively with technical and non-technical stakeholders across multiple business units
  • Experience with Google Cloud Platform strongly desired


About Avalara:

Avalara helps businesses of all sizes achieve compliance with transactional taxes, including VAT, sales and use, excise, communications, and other tax types. We deliver comprehensive, automated, cloud-based solutions that are fast, accurate, and easy to use.

Avalara offers hundreds of pre-built connectors into leading accounting, ERP, ecommerce and other business applications. Each year, the company processes billions of tax transactions for customers and users, files hundreds of thousands of tax compliance documents and tax returns and manages millions of exemption certificates and other compliance related documents.

Avalara’s headquarters are in Seattle, WA and it has offices across the U.S. and in Brighton and London, England; Brussels, Belgium; and Pune, India. More information at: www.avalara.com

Avalara is an Equal Opportunity Employer. All qualified candidates will receive consideration for employment without regard to race, color, creed, religion, age, gender, national origin, disability, sexual orientation, US Veteran status, or any other factor protected by law.


See more jobs at Avalara

Apply for this job


Senior Data Engineer

tableau, scala, nosql, airflow, sql, Design, mobile, azure, java, c++, postgresql, python, AWS

Truvian Sciences is hiring a Remote Senior Data Engineer

ABOUT

Want to work for a fast-paced and disruptive company that is working to revolutionize blood testing? Truvian is a healthcare company at the intersection of diagnostics and consumer tech. We are developing an automated, benchtop diagnostic system to provide lab-accurate results in 20 minutes for a comprehensive suite of health tests. Our proprietary approach, for which we are seeking FDA clearance, is intended to fulfill the promise of delivering accessible and affordable blood testing from one small blood sample, in minutes, in a retail setting or private clinic.

To us, our work at Truvian is more than a job – It’s a mission.  We are a culture dedicated to discovery and empowerment. We are trailblazers on the path to put health information where it belongs - in the hands of the individual. We are partners in the belief that talented people, working as a team, can make every day an adventure. Come join us as we realize our vision to make routine health testing convenient, affordable, and actionable for today’s connected consumers!

JOB SUMMARY

Truvian is looking for a Sr. Data Engineer as we enter our next phase of our evolution. This is a highly technical role with the candidate expected to serve as a critical contributor to building compelling data-centric solutions to support Truvian’s platforms.

The individual will contribute to data management and integration work, including the development of data models, data warehousing, and transformation. This role will work closely with software/product development across the instrument software, cloud, mobile, and other informatics programs, and will support the creation of data management requirements, the deployment of data pipelines, and the integration of various data sources. She/he will also be responsible for supporting on-market products post-launch.

The individual is responsible for implementing the data architecture and design, executing the vision, and addressing relevant cross-cutting concerns that drive the integration architecture. She or he is responsible for identifying components, frameworks, and best-in-class patterns that will enable high-quality artifacts and time to market. The candidate will interact closely with cross-functional teams to understand and refine user needs and implement compelling solutions for healthcare consumers.

HERE’S WHY YOU’LL LOVE THIS JOB:

  • You'll work with a rock star team of people who are passionate about the work they do and our ability to disrupt healthcare with our innovative products
  • You’ll be a key player on a team responsible for the company’s growth and product launch
  • You thrive in a fast-paced and dynamic environment where you can implement fresh ideas, new processes, and make things happen quickly without a bunch of red tape
  • You’ll have great perks such as: Generous Benefits (Medical/Dental/Vision/EAP/Paid Life Insurance/LTD/401K), paid parental leave, and flexible PTO.

WHAT YOU WILL DO:

  • Define and build data pipelines to support internal and external product efforts
  • Capture, document, and maintain data architecture, pipelines, and data inventory/warehousing
  • Ensure our data practices are in line with industry best practices and regulatory requirements
  • Work with product owners to capture and refine data service requirements for implementation
  • Bring a deep understanding of data management and governance strategies that help ensure consistency, quality, and lineage across all our data
  • Create and execute unit, system, and integration tests

WHAT YOU WILL BRING:

  • BS degree in Computer Science, Computer Engineering, or a related field and 8-11 years of experience as a Data Engineer or in a similar role
  • Strong knowledge of various data storage and management technologies (SQL, NoSQL, PostgreSQL, Columnar, Data Warehouse, Data Lake, etc.) and their applicability
  • Strong understanding of data management, including data cataloging, governance, and lineage
  • Experience with designing and implementing databases and distributed data pipelines
  • Proficiency in data extraction, transformation, and loading (SQL/ETL, Python, etc.); a minimal sketch follows this list
  • Experience with Tableau, Power BI, or similar
  • Experience with managing data in a cloud-based environment (AWS, Azure, Google Cloud, etc.)
  • Excellent communication skills, both in explaining methodology and functionality and in listening to end users and matching implementation to customer needs
  • Self-starter who understands startup mentality (i.e., no job too small)
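
Since the list above asks for proficiency in extraction, transformation, and loading, here is a minimal, self-contained ETL sketch in Python using only the standard library; the CSV layout and table schema are invented for illustration and are not Truvian's actual data model.

    import csv
    import sqlite3

    def run_etl(csv_path: str, db_path: str) -> None:
        conn = sqlite3.connect(db_path)
        conn.execute(
            "CREATE TABLE IF NOT EXISTS results (sample_id TEXT, analyte TEXT, value REAL)"
        )
        with open(csv_path, newline="") as f:
            for row in csv.DictReader(f):        # extract
                value = float(row["value"])      # transform: enforce a numeric type
                conn.execute(
                    "INSERT INTO results VALUES (?, ?, ?)",
                    (row["sample_id"], row["analyte"].strip().lower(), value),
                )
        conn.commit()
        conn.close()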

Preferred skills:

  • Experience with one or more pipeline orchestration technologies (Argo, Airflow, Luigi, Spark, etc.); a minimal Airflow sketch follows this list
  • Proficiency in at least one modern programming language such as Scala, Java, or C#
  • Experience with Azure data technologies like Data Factory, Data Bricks, Synapse Analytics
  • Experience with IoT telemetry
  • Experience in regulated environments (FDA, IVD, Medical Device)
  • Experience in working with regulated data (HIPAA, PHI, GDPR, etc.)
  • Experience with healthcare industry standards (FHIR/HL7)
  • Demonstrated leadership and mentoring skills
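
For the orchestration bullet above, Airflow is one of the named options; a bare-bones Airflow 2.x-style DAG looks like the sketch below. The DAG id, task names, schedule, and callables are placeholders, not a description of Truvian's pipelines.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull instrument telemetry")

    def load():
        print("write to the warehouse")

    with DAG(
        dag_id="telemetry_pipeline",     # hypothetical pipeline name
        start_date=datetime(2022, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task        # run extract before load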

SUPERVISORY ROLE 

• No

TRAVEL AND/OR PHYSICAL DEMANDS

  • Prolonged periods of sitting at a desk and working on a computer
  • This role is eligible to work 100% remote with very little domestic travel. The ideal home base location would be Southern California.

If you want to stand out, please include a cover letter.

 

Truvian provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.

This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.

 

See more jobs at Truvian Sciences

Apply for this job

7d

Data Engineer - SQL/ADF/SSIS

loanDepotRemote, Plano, Texas
scalanosqlsqlmongodbazureapijavapythonAWS

loanDepot is hiring a Remote Data Engineer - SQL/ADF/SSIS

Position at loanDepot

We are at the forefront of change in this rapidly evolving lending market. mello™, the Greek word for “future,” was the product of a recent $80+ million investment in research & development to transform & streamline the home buying process into a digital experience like no other competitor offers.  But mello™ is just the beginning… loanDepot will continue to invest in developing our own advanced technology ecosystem built around serving our customers & enabling our valued employees to provide exceptional service. We have funding, we have opportunities, you have ideas—it’s a perfect match. Come join us!

loanDepot — We are America’s Lender.

Position Summary:

Responsible for delivering innovative, compelling, and coherent software solutions for our consumer, internal operations, and value chain constituents across a wide variety of enterprise applications through the creation of discrete business services and their supporting components. The job duties and requirements defined here are for the backend. Provides technical leadership and mentorship to junior team members. This position ensures the performance of all duties in accordance with the company’s policies and procedures and all U.S. state and federal laws and regulations in the jurisdictions where the company operates.

 

Responsibilities:

  • Designs, develops, and delivers solutions that meet business line and enterprise requirements.
  • Creates enterprise-grade application services.
  • Participates in rapid prototyping and POC development efforts.
  • Advances overall enterprise technical architecture and implementation best practices.
  • Assists in efforts to develop and refine functional and non-functional requirements.
  • Participates in iteration and release planning.
  • Performs functional and non-functional testing.
  • Contributes to overall enterprise technical architecture and implementation best practices.
  • Informs efforts to develop and refine functional and non-functional requirements.
  • Performs other duties and projects as assigned.

Requirements:

  • Bachelor’s degree required, and/or a minimum of four (4)+ years of related work experience.
  • Proficient in at least one major language (e.g., Java, Scala, Python)
  • SQL - must have strong SQL experience.
  • 3+ years of experience with at least one of the following cloud platforms: Microsoft Azure, Amazon Web Services (AWS), Google Cloud Platform (GCP), others.
  • Experience designing enterprise database systems using Microsoft SQL Server.
  • Experience working with ETL tools such as SSIS and Azure Data Factory.
  • Experience with advanced queries, stored procedures, views, triggers, etc.
  • Experience with indexing and normalization.
  • Experience performance-tuning queries.
  • Experience with both DDL and DML.
  • Experience with database administration.
  • Experience with Data Modeling, ETL construction with advanced job scheduling.
  • Experience working in a big data & NoSQL environment is an added advantage (Spark, HBase, Cassandra, MongoDB)
  • Knowledge of data integration patterns leveraging APIs, streaming technologies (Kafka), ETL, and ELT.
  • Experience developing REST APIs in support of data-as-a-service applications (a minimal sketch follows this list).
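
As a rough illustration of the last bullet, a minimal "data as a service" REST endpoint might look like the following FastAPI sketch. The framework choice, route, and fields are assumptions for illustration; the posting does not name a specific stack.

    from fastapi import FastAPI, HTTPException

    app = FastAPI()

    # Stand-in for a real query against a warehouse or SQL Server database.
    LOANS = {"12345": {"status": "funded", "amount": 350000}}

    @app.get("/loans/{loan_id}")
    def get_loan(loan_id: str) -> dict:
        loan = LOANS.get(loan_id)
        if loan is None:
            raise HTTPException(status_code=404, detail="loan not found")
        return loan

    # Run with: uvicorn service:app --reload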

 

Why work for #teamloanDepot:

  • Aggressive earning potential and 401K with robust company match           
  • Inclusive, diverse and collaborative culture where people from all backgrounds can thrive
  • Work with other passionate, purposeful and customer-centric people
  • Extensive internal growth and professional development opportunities including tuition reimbursement
  • Comprehensive benefits package including Medical/Dental/Vision
  • Wellness program to support both mental and physical health
  • Generous paid time off for both exempt and non-exempt positions

We are an equal opportunity employer and value diversity in our company. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.


See more jobs at loanDepot

Apply for this job

7d

Data Engineer

Points111 Richmond St W, Toronto, ON M5H 2G4, Canada, Remote
tableaunosqlairflowsqlDesignscrumdockerpostgresqlpython

Points is hiring a Remote Data Engineer

Company Description

As a trusted partner to the world’s leading loyalty programs, Points builds, powers, and grows new ways for members to get and use their favorite loyalty currency. 

More than 1 billion loyalty program members touch our products through brands like Hilton, Air Canada, Lyft, British Airways, United Airlines, Air France-KLM, Chase Bank, Etihad Airways, and many more. Our team of 250+ people across 5 global offices works together to build and launch new solutions, solve complex challenges for our partners, and create a one-of-a-kind company culture.

Job Description

This role is remote across Canada but reports into our Toronto office. 

Points is looking for an intermediate or senior Data Engineer to join our Data Engineering team for a permanent position in our downtown Toronto office.

We’re an industry-leading web-based organization that is continuously reshaping how consumers interact with their loyalty programs. We work with the world’s largest airline, hotel, financial, and retail rewards programs to tackle complex challenges and come up with innovative e-commerce solutions, with the Data Engineering team playing a critical role. If you’d like to be a part of it, we’d love to hear from you.

Reporting to the Team Lead, Data Engineering, you will:

  • Work in a scrum-based team that is passionate about enabling a data culture throughout the organization.
  • Design and develop scalable and robust pipelines for data consumption by downstream applications in support of advanced analytics, AI/ML products, and system interoperability.  
  • Improve upon existing ETL processes, through the use of automated testing and monitoring, to continually enhance data integrity and accuracy.
  • Support production systems to deliver a high degree of data availability, consistency, and accuracy.
  • Actively participate in solution design and modelling to ensure data products are developed according to best practices, standards, and architectural principles. 
  • Automate the boring manual stuff!

Qualifications

 

  • Excellent hands-on experience in working with SQL and NoSQL data sets.

  • Experience using GUI ETL tools (we use Talend).

  • Experience with data streaming architectures, such as Kafka (a minimal consumer sketch follows this list).

  • Working knowledge of DevOps principles such as CI/CD.

  • Data consumer focused, constantly driven to exceed stakeholder data and information needs.

  • Effective communicator and collaborator, within the immediate team as well as across other organizational units.
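
To make the Kafka bullet concrete, here is a minimal consumer sketch using the kafka-python client; the topic name, broker address, and message shape are assumptions, not Points' actual configuration.

    import json

    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "loyalty-transactions",                  # hypothetical topic
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
        auto_offset_reset="earliest",
    )

    for message in consumer:
        event = message.value
        # Downstream: validate, enrich, and land in Vertica/PostgreSQL.
        print(event.get("member_id"), event.get("points"))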


Nice to haves

  • Strong knowledge of general software engineering principles and practices.

  • Experience with columnar-oriented databases, such as Vertica or Snowflake.

  • Experience integrating with services, such as Dataiku and NetSuite.

  • Experience with containers and related infrastructure, such as Docker and Kubernetes. 

  • Experience with developing data products using data visualization / dashboarding tools like Tableau.

  • Experience with RESTful APIs. 

 

Technologies we use and teach:

  • Vertica, PostgreSQL, CouchDB

  • Talend Cloud

  • Kafka

  • Airflow

  • Python

  • GitLab

  • Tableau, JReports

  • Docker

Additional Information

The health and safety of Points’ employees, guests, and business partners is a very high priority. It is our view that maximizing COVID-19 vaccination rates among employees is one very important strategy to lessen the hazard of COVID-19 in our physical workspace. As such, all new Pointsters are required to be fully vaccinated in accordance with their regional guidelines.

Points is an equal opportunity employer and is committed to providing an accessible recruitment process.

We welcome applications from all qualified individuals and are committed to equal employment opportunity regardless of gender identity or expression, race, ethnic origin, creed, place of origin, age, sex, marital status, physical or mental disability, sexual orientation, and any other category protected by law. Upon request we will provide accommodation for applicants with disabilities.

All your information will be kept confidential.

See more jobs at Points

Apply for this job

8d

Senior Data Engineer - Azure

MuteSix10 Triton St, London NW1 3BF, UK, Remote
agiletableauscalasqlB2BsalesforceDesignazureAWS

MuteSix is hiring a Remote Senior Data Engineer - Azure

Company Description

About Merkle

Dentsu are merging the media skills of DWA, the creative & content skills of Gyro, the research skills of B2Bi and the data & platform skills of Merkle to create a global marketing services network that is dedicated to the end-to-end Customer Experience needs of global B2B clients.

The UK office of the merged network is currently 162 people strong and growing. It has international clients such as Cisco, Ericsson, Facebook, Salesforce, AWS, Grundfos, Brother, Fujitsu, Sky, Digital Realty, Goldman Sachs, Workplace and Shell.

Job Description

The role

We are looking for exceptional candidates to join our growing Data Engineering team at Merkle to focus on delivering solutions to clients. Successful candidates will understand modern data platform architecture and have solved complex technical problems to deliver to clients’ requirements. You will be a competent team player, developing simple and straightforward solutions that are scalable, utilizing appropriate technologies and engineering best practices.

Under the direction of a Solutions Architect, the postholder will

  • Design great data solutions for our clients
  • Work closely with local leadership as well as our global teams, bringing their key industry and product expertise and knowledge to shape and deliver client solutions
  • Support our Data Scientists and Analytical Consultants working on projects
  • Provide technical expertise and consultancy to clients to help them understand the power of big data analytics for their organization

Life as a Senior Data Engineer at Merkle

In this role, you will have the opportunity to work with clients from a wide range of sectors, understand their specific requirements, liaise with data scientists and analytical consultants, and design, develop, and deliver technical solutions. You will typically work with a few clients across stages of maturity, allowing variety in your work and the opportunity to pick up new skills as needed. You’ll work with Azure cloud technologies, BI platforms like Tableau, Power BI, and Thoughtspot, and Data Science platforms such as Databricks, Qubole, Lytics, and many more. You will do this as part of a dynamic, highly skilled, and flexible team working on exciting and challenging projects.

Qualifications

What we are looking for in you

 

You will have a deep understanding of distributed computing, data and application architectures, basic networking, security, and infrastructure with a strength in writing great code to produce production-strength data pipelines and scalable systems.

Knowledge of Big Data with a focus on data engineering (optimising data pipelines, ETLs, data modelling, and analytical tools) is a plus.

Essential:

  • Implementation of Data Lake patterns in Azure (see the PySpark sketch after this list)
  • Understanding of Data Lake partitioning policies and role-based access controls
  • Experience with Data Factory + SQL PaaS
  • Experience with Spark, Scala, or similar technologies
  • A strong understanding of data modelling, data structures, databases, and data ingestion and transformation
  • ETL development using well-architected frameworks
  • Building solutions using optimised code and repeatable, reusable techniques
  • DevOps practices such as continuous integration, automation, and Infrastructure as Code
  • CI/CD experience within an ETL / data transformation environment
  • Cross-functional teamwork, internally and with external clients
  • Strong project management and organisation skills
  • Proficiency working in an Agile environment
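
As a sketch of the first essential above, partitioned writes are the usual Data Lake pattern: land curated data partitioned on a query-friendly column so downstream jobs prune rather than scan. The storage account, container, and column names below are placeholders, and the sketch assumes a cluster already configured for ADLS access.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("lake-partitioning").getOrCreate()

    # Read raw events from a hypothetical ADLS Gen2 container.
    df = spark.read.json("abfss://raw@myaccount.dfs.core.windows.net/events/")

    # Partitioning by date lets downstream queries read only the partitions
    # they need instead of scanning the whole lake.
    (
        df.write.mode("overwrite")
        .partitionBy("event_date")
        .parquet("abfss://curated@myaccount.dfs.core.windows.net/events/")
    )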

Beneficial

  • Microsoft certification / demonstrable path towards certification
  • Experience with Synapse / Snowflake
  • Cosmos DB experience

Additional Information

At the point of application, the candidate must have the legal right to work in the UK, as we are unable to sponsor visas at this time.

Merkle does not discriminate against job applicants on the basis of age, disability, gender reassignment, marital or civil partner status, pregnancy or maternity, race, colour, nationality, ethnic or national origin, religion or belief, sex or sexual orientation. Experience stipulated in this job description serves as a guide only and all applications will be considered on their merits, irrespective of experience.

As part of our Diversity and Inclusion agenda, and as an Equal Opportunities employer, if you require reasonable adjustments during the selection process please engage directly with your Recruiter.

8d

Data Engineer

GraphiteHQRemote job, Remote
kotlinscalamobilejavadockerpythonAWSjavascriptbackend

GraphiteHQ is hiring a Remote Data Engineer

Graphite is a boutique digital marketing firm that builds scalable growth engines for consumer technology companies such as Ticketmaster, MasterClass, BetterUp, and Honey. We specialize in search engine optimization (SEO), content strategy, mobile app growth, and conversion optimization. We are a fully distributed team that is dedicated to creating an environment where you do the best work of your career. With headquarters in San Francisco and team members across North America, Latin America, Europe, and Asia, we are ready to welcome our next team member!


We’re looking for a remote Data / Backend Engineer who will be responsible for building data systems in collaboration with Graphite’s Data Scientists and Growth Managers. This role requires you to be creative, motivated, and resourceful. You are eager to be the best at what you do. You are obsessed with learning and developing new skills. You feel comfortable taking full ownership of your projects and working in a performance-driven environment while cultivating a strong sense of team and collaboration.

VetCentric is hiring a Remote Data Automation - Concierge Cybersecurity Engineer

About Us:

VetCentric is focused on delivering outstanding services to the federal government.  We have extensive experience in the fields of cyber security, supply chain & logistics management, strategy, business analytics, and IT services such as system design, continuous improvement, virtualization, and data center management.  VetCentric is an SBA certified HUBZone company and VA CVE certified Service-Disabled Veteran Owned Small Business (SDVOSB). We operate in 15 states with offices in Washington DC and Northern Virginia.

Perks Working with Us:

  • Competitive compensation
  • Comprehensive health, vision, dental benefits
  • 15 days leave and 10 days of paid Federal Holidays  
  • 401(k) with matching plan
  • Annual training budget
  • Fantastic company culture

Location(s): Anywhere, US. Candidates from HUBZones preferred.

Employment Eligibility: Eligible to work for any employer in the United States without requiring sponsorship. Sponsorship is not available currently.

Position Description: Concierge Cyber Security Engineer

As a Cyber Security Engineer on our team, you’ll use your experience to work with the VA to discover their cyber risks, understand applicable policies, and develop a mitigation plan. You’ll review technical, environmental, and personnel details from technical SMEs to assess the entire threat landscape. Then, you’ll guide your client through a plan of action with presentations, white papers, and milestones. You’ll work with your client to translate network design and security concepts so they can make the best decisions to secure their mission-critical systems.  This is your opportunity to act as an information security subject matter expert while broadening your skills in security principles, concepts, policy, and regulations. Join us as we protect the Veterans Affairs infrastructure.  This position requires up to 50% nationwide travel. This position is open to remote delivery anywhere within the U.S., to include the District of Columbia.

Must Have:

  • Experience with handling large amounts of data and performing data analysis (a minimal sketch follows this list)
  • Experience with identifying risks in security systems, working with technical experts to resolve security issues, and performing security assessments on systems with external connections
  • Knowledge of and experience utilizing CDM technologies for network monitoring and remediation, including experience building and deploying a policy engine; ForeScout experience is a plus
  • Knowledge of ITIL best practices and experience with incident management and tracking
  • Ability to effectively leverage detailed knowledge of security principles, concepts, policy, and regulations to conduct assessments, develop risk mitigation strategies, and design new processes as needed
  • Ability to identify key concepts, factors, and risks based on conversations and document these in clear and concise narrative or graphic reports
  • Ability to provide clear/concise guidance and collaboration with facility personnel
  • Ability to draft and update technical documentation as required
  • Ability to obtain and maintain a Public Trust or Suitability/Fitness determination based on client requirements
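
For the first must-have, one common way to analyze data sets that exceed memory is chunked processing; a minimal pandas sketch follows. The file name and columns are hypothetical.

    import pandas as pd

    total_by_severity = {}
    # Stream the file in 100k-row chunks so the whole CSV never sits in memory.
    for chunk in pd.read_csv("scan_findings.csv", chunksize=100_000):
        counts = chunk.groupby("severity").size()
        for severity, count in counts.items():
            total_by_severity[severity] = total_by_severity.get(severity, 0) + count

    print(total_by_severity)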

Nice If You Have: 

  • Experience with managing firewalls, server security, and cloud storage security desired
  • Experience with database development is a plus
  • Team leadership experience is highly desired
  • Knowledge of intrusion detection systems, vulnerability scanning tools, critical security practices, and all aspects of Information Security, preferred 
  • Knowledge of Risk Management Framework and project management desired 
  • Knowledge of VA Handbook/Directive 6500, Federal Information Security Management Act (FISMA), Federal Information Process Standards (FIPS), and National Institute of Standards and Technology (NIST) is preferred

See more jobs at VetCentric

Apply for this job

9d

Lead Python Developer | Data Engineer (open to PERM or Contract)

StreamhubBangalore, IN Remote
dockerelasticsearchkubernetespythonAWS

Streamhub is hiring a Remote Lead Python Developer | Data Engineer (open to PERM or Contract)

Job Type: PERM/Contract, Remote

Industry:
Audience profiling, Analytics, Ad Tech, Streaming

Company size:
11-50

Company type:
Private

Working at Streamhub brings you unparalleled access to cutting-edge big data technologies and martech / media-tech exposure in a rewarding entrepreneurial culture. Our purpose is to understand how video shapes our everyday life, and our mission is to be the most actionable data platform for the video business. We foster an environment where we can innovate and achieve goals both as a team and individually. We are at an exciting growth phase and are making a number of new hires, who will form the founding team for our tech hub in India.


The Role

This role will give you a challenging opportunity to contribute to an innovative data product as a Python (Data) Engineer on our core engineering team. You will be joining a highly productive engineering team and a thriving work culture built on individual growth, well-being, strong ethics, and an entrepreneurial approach.

Your day-to-day work will comprise designing and building our reporting platform, building versatile data pipelines that handle high volumes and a diversity of data, and working on the service layer that interacts with the core data platform.

You have

- Solid knowledge and experience in Python.
- Some experience with data pipelines.
- Understanding of containerized applications (like Docker) and container services (like Kubernetes or ECS).
- Some experience with document-based databases or search engines, like Elasticsearch or MongoDB (a minimal query sketch follows this list).
- Experience with AWS will be a great plus!
- Exposure to or interest in data technologies like Spark, databases, and streaming.
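
Since the list above names Elasticsearch, here is a minimal query sketch using the official Python client (8.x-style keyword arguments); the index name, field, and host are assumptions, not Streamhub's actual setup.

    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")

    response = es.search(
        index="view-events",                   # hypothetical index
        query={"match": {"device": "mobile"}},
        size=10,
    )

    for hit in response["hits"]["hits"]:
        print(hit["_source"])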

The Package

  • Immense learning, exposure to niche data technologies, and handling TBs of data.
  • Work with an open, diverse, and autonomous team. Entrepreneurial team culture.
  • Remote working / flexible timing.

If you've read all the way to the bottom of this description, thank you for your interest in Streamhub!

We are committed to equal employment opportunities regardless of race, colour, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender, gender identity or expression, or veteran status. We are proud to be an equal opportunity workplace.