Data Engineer Remote Jobs

109 Results

+30d

Data Engineer

Creative CFO (Pty) Ltd, Cape Town, ZA - Remote
Lambda, NoSQL, Postgres, SQL, Design, Azure, API, AWS

Creative CFO (Pty) Ltd is hiring a Remote Data Engineer

Become part of a vibrant, quality-focused team that leverages trust and autonomy to deliver exceptional services to diverse, high-growth clients. Receive recognition for your committed, results-producing approach to problem-solving, and opportunities for learning to realise your own passion for personal growth. All while working with some of the country’s most exciting growing businesses - from local entertainers, gin distilleries and ice-cream parlours, to enterprises revolutionising traditional spaces like retail, property and advertising or treading on the cutting edge of fintech.

About the company

At Creative CFO (Pty) Ltd, we provide companies with the best financial tools and services to plan, structure, invest and grow. We believe that innovative solutions are an interconnected web of small problems solved brilliantly. By walking through all the financial processes for each company and solving problems along the way, we have developed a full-service solution that business owners really appreciate.

We are committed to solving the challenges that small business owners face. Accounting and tax is one part of this, but we also cover business process analysis, financial systems implementation and investment support. We unlock value by creating a platform through which business owners can manage and focus their creativity, energy and financial resources.

Position Summary

As a Data Engineer at Creative CFO, you will be at the forefront of architecting, building, and maintaining our data infrastructure, supporting data-driven decision-making processes.

We are dedicated to optimising data flow and collection to enhance our financial clarity services for high-growth businesses. Join our dynamic and rapidly expanding team, committed to building a world where more SMEs thrive.

The Ideal Candidate

To be successful in the role you will need to:

Build and optimise data systems:

    • Design, construct, install, test, and maintain highly scalable data management systems.
    • Ensure systems meet business requirements and industry practices.

    Expertly process data:

    • Develop batch processing and real-time data streaming capabilities.
    • Read, extract, transform, stage, and load data to selected tools and frameworks as required.
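The extract-transform-stage-load cycle described above can be sketched with nothing but the standard library. This is a minimal illustration only: the CSV payload, field names, and SQLite staging table are invented for the example and are not part of this posting.

```python
import csv
import io
import sqlite3

# Illustrative ETL sketch: extract rows from a CSV source, transform them,
# and load them into a staging table. Source data and schema are invented.
RAW_CSV = """client,amount
Acme Gin,1200.50
Scoop Parlour,340.00
"""

def extract(raw: str) -> list[dict]:
    """Read semi-structured CSV text into dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Normalise client names and cast amounts to integer cents."""
    return [(r["client"].strip().lower(), int(float(r["amount"]) * 100))
            for r in rows]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Stage the cleaned rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staging (client TEXT, amount_cents INTEGER)")
    conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount_cents) FROM staging").fetchone()[0]
print(total)  # 154050
```

The same three-function shape scales up when the in-memory CSV is replaced by a real source and SQLite by a warehouse.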

    Build data integrations:

    • Work closely with data analysts, data scientists, and other stakeholders to assist with data-related technical issues and support their data infrastructure needs.
    • Collaborate with data analytics and BI teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision-making across the organisation.

    Be versatile with technology:

    • Exhibit proficiency in ETL tools such as Apache NiFi or Talend and a deep understanding of SQL and NoSQL databases, including Postgres and Cassandra.
    • Demonstrate expertise in at least one cloud services platform like Microsoft Azure, Google Cloud Platform/Engine, or AWS.

    Focus on quality assurance:

    • Implement systems for monitoring data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
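A monitoring system of the kind described can start as simply as rule-based checks run over each production batch. The rules and record shape below are hypothetical examples, not anything specified by the posting.

```python
# Minimal rule-based data-quality monitor (illustrative; the rules and
# record shape are invented, not from the posting).
RULES = {
    "id_present": lambda r: r.get("id") is not None,
    "amount_non_negative": lambda r: isinstance(r.get("amount"), (int, float))
                                     and r["amount"] >= 0,
    "currency_known": lambda r: r.get("currency") in {"ZAR", "USD", "EUR"},
}

def check_batch(records):
    """Return a dict mapping rule name -> indexes of failing records."""
    failures = {name: [] for name in RULES}
    for i, record in enumerate(records):
        for name, rule in RULES.items():
            if not rule(record):
                failures[name].append(i)
    return failures

batch = [
    {"id": 1, "amount": 99.0, "currency": "ZAR"},
    {"id": None, "amount": -5, "currency": "GBP"},
]
report = check_batch(batch)
print(report)
```

In production the report would feed an alerting channel rather than `print`, but the check-per-rule structure is the core idea.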

    Have a growth mindset:

    • Stay informed of the latest developments in data engineering, and adopt best practices to continuously improve the robustness and performance of data processing and storage systems.

    Requirements

    Key skills & qualifications:

    • Bachelor’s degree in Statistics, Data Science, Computer Science, Information Technology, or Engineering.
    • Minimum of 2 years of professional experience in a Data Engineering role, with a proven track record of successful data infrastructure projects.
    • Proficiency in data analysis tools and languages such as SQL, R, and Python.
    • Strong knowledge of data modeling, data access, and data storage techniques.
    • Proficiency in at least one of Microsoft Azure, Google Cloud Platform/Engine, or AWS Lambda environments.
    • Familiarity with data visualisation tools such as PowerBI, Pyramid, and Google Looker Studio.
    • Excellent analytical and problem-solving skills.
    • Effective communication skills to convey technical findings to non-technical stakeholders.
    • Eagerness to learn and adapt to new technologies and methodologies

    Relevant experience required:

    • Previous roles might include Data Engineer, Data Systems Developer, or similar positions with a focus on building and maintaining scalable, reliable data infrastructures.
    • Strong experience in API development, integration, and management. Proficient in RESTful and SOAP services, with a solid understanding of API security best practices
    • Experience in a fast-paced, high-growth environment, demonstrating the ability to adapt to changing needs and technologies.


    Why Apply

    Vibrant Community

    • Be part of a close-knit, vibrant community that fosters creativity, wellness, and regular team-building events.
    • Celebrate individual contributions to team wins, fostering a culture of recognition.

    Innovative Leadership

    • Work under forward-thinking leadership shaping the future of data analytics.
    • Receive intentional mentorship for your professional and personal development.

    Education and Growth

    • Receive matched pay on education spend and extra leave days for ongoing education.
    • Enjoy a day's paid leave on your birthday - a celebration of you!

    Hybrid Work Setup

    • Experience the flexibility of a hybrid work setup, currently with one in-office day per month.
    • Choose to work in a great office space, if preferred.

    Professional and Personal Resources

    • Use the best technology, provided by the company
    • Benefit from Parental and Maternity Leave policies designed for our team members.

    See more jobs at Creative CFO (Pty) Ltd

    Apply for this job

    +30d

    Data Engineer

    Brushfire, Fort Worth, TX - Remote
    DevOps, Bachelor's degree, Tableau, SQL, Firebase, Azure, C++, TypeScript, Kubernetes, Python

    Brushfire is hiring a Remote Data Engineer

    Job Description

    The primary responsibility of this position is to manage our data warehouse and the pipelines that feed to/from that warehouse. This requires advanced knowledge of our systems, processes, data structures, and existing tooling. The secondary responsibility will be administering our production OLTP database to ensure it runs smoothly and follows standard best practices.

    The ideal candidate for this position is someone who is extremely comfortable with the latest technology and trends, and who favors concrete execution over abstract ideation. We are proudly impartial when it comes to solutions – we like to try new things and the best idea is always the winning idea, regardless of the way we’ve done it previously.

    This is a full-time work-from-home position.

    Qualifications

    The following characteristics are necessary for a qualified applicant to be considered:

    • 3+ years of experience working with data warehouses (BigQuery preferred, Redshift, Snowflake, etc)

    • 3+ years of experience working with dbt, ETL (Fivetran, Airbyte, etc), and Reverse ETL (Census) solutions 

    • 3+ years of experience with Database Administration (Azure SQL, Query Profiler, Redgate, etc)

    • 1+ years of experience with BI/Visualization tools (Google Data/Looker Studio, Tableau, etc)

    • Experience with PubSub databases, specifically Google Cloud Firestore and Firebase Realtime Database

    • Experience with GitHub (or other similar version control systems) and CI/CD pipeline tools like Azure DevOps and GitHub Actions

    • Ability to communicate fluently, pleasantly, and effectively—both orally and in writing, in the English language—with customers and co-workers.

    • Concrete examples and evidence of work product and what role the applicant played in it

    The following characteristics are not necessary but are highly desired:

    • Experience with Kubernetes, C#, TypeScript, Python

    • Bachelor's degree or higher in computer science or related technical field

    • Ability to contribute to strategic and planning sessions around architecture and implementation

    See more jobs at Brushfire

    Apply for this job

    +30d

    Senior Data Engineer

    EquipmentShare, Remote; Chicago; Denver; Kansas City; Columbia, MO
    Lambda, Agile, Airflow, SQL, Design, C++, PostgreSQL, Python, AWS

    EquipmentShare is hiring a Remote Senior Data Engineer

    EquipmentShare is Hiring a Senior Data Engineer.

    Your role in our team

    At EquipmentShare, we believe it’s more than just a job. We invest in our people and encourage you to choose the best path for your career. It’s truly about you, your future, and where you want to go.

    We are looking for a Senior Data Engineer to help us continue to build the next evolution of our data platform in a scalable, performant, and customer-centric architecture.

    Our main tech stack includes Snowflake, Apache Airflow, AWS cloud infrastructure (e.g., Kinesis, Kubernetes/EKS, Lambda, Aurora RDS PostgreSQL), Python, and TypeScript.
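On a stack like this, a pipeline is typically expressed as a DAG of dependent tasks (the model Airflow uses). Airflow itself isn't needed to show the idea; this pure-stdlib sketch resolves an execution order for a toy DAG whose task names and dependencies are invented, not EquipmentShare's actual pipeline.

```python
from graphlib import TopologicalSorter

# Toy illustration of DAG-style orchestration (the idea behind Airflow).
# Keys are tasks; values are the tasks they depend on.
DAG = {
    "extract_telemetry": set(),
    "load_snowflake": {"extract_telemetry"},
    "build_marts": {"load_snowflake"},
    "refresh_dashboards": {"build_marts"},
}

# static_order() raises CycleError on circular dependencies, which is
# exactly the guarantee an orchestrator needs before scheduling.
order = list(TopologicalSorter(DAG).static_order())
print(order)
# ['extract_telemetry', 'load_snowflake', 'build_marts', 'refresh_dashboards']
```

An orchestrator like Airflow layers scheduling, retries, and observability on top of this same topological-ordering core.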

    What you'll be doing

    We are typically organized into agile cross-functional teams composed of Engineering, Product, and Design, which allows us to develop deep expertise and rapidly deliver high-value features and functionality to our next-generation T3 Platform.

    You’ll be part of a close-knit team of data engineers developing and maintaining a data platform built with automation and self-service in mind to support analytics and machine learning data products for the next generation of our T3 Fleet that enable end users to track, monitor, and manage the health of their connected vehicles and deployed assets.

    We'll be there to support you as you become familiar with our teams, product domains, tech stack and processes — generally how we all work together.

    Primary responsibilities for a Senior Data Engineer

    • Collaborate with Product Managers, Designers, Engineers, Data Scientists and Data Analysts to take ideas from concept to production at scale.
    • Design, build and maintain our data platform to enable automation and self-service for data scientists, machine learning engineers and analysts.
    • Design, build and maintain data product framework to support EquipmentShare application data science and analytics features.
    • Design, build and maintain CI/CD pipelines and automated data and machine learning deployment processes.
    • Develop data monitoring and alerting capabilities.
    • Document architecture, processes and procedures for knowledge sharing and cross-team collaboration.
    • Mentor peers to help them build their skills.

    Why We’re a Better Place to Work

    We can promise that every day will be a little different with new ideas, challenges and rewards.

    We’ve been growing as a team and we are not finished just yet; there is plenty of opportunity to shape how we deliver together.

    Our mission is to enable the construction industry with tools that unlock substantial increases to productivity. Together with our team and customers, we are building the future of construction.

    T3 is the only cloud-based operating system that brings together construction workflows & data from constantly moving elements in one place.

    • Competitive base salary and market leading equity package.
    • Unlimited PTO.
    • Remote first.
    • True work/life balance.
    • Medical, Dental, Vision and Life Insurance coverage.
    • 401(k) + match.
    • Opportunities for career and professional development with conferences, events, seminars and continued education.
    • On-site fitness center at the Home Office in Columbia, Missouri, complete with weightlifting machines, cardio equipment, group fitness space, racquetball courts, a climbing wall, and much more!
    • Volunteering and local charity support that help you nurture and grow the communities you call home through our Giving Back initiative.
    • Stocked breakroom and full kitchen with breakfast and lunch provided daily by our chef and kitchen crew.

    About You

    You're a hands-on developer who enjoys solving complex problems and building impactful solutions.  Most importantly, you care about making a difference.

    • Take the initiative to own outcomes from start to finish — knowing what needs to be accomplished within your domain and how we work together to deliver the best solution.
    • You are passionate about developing your craft — you understand what it takes to build quality, robust and scalable solutions.
    • You’ll see the learning opportunity when things don’t quite go to plan — not only for you but for how we continuously improve as a team.
    • You take a hypothesis-driven approach — knowing how to source, create and leverage data to inform decision making, using data to drive how we improve, to shape how we evaluate and make platform recommendations.

    So, what is important to us?

    Above all, you’ll get stuff done. More importantly, you’ll collaborate to do the right things in the right way to achieve the right outcomes.

    • 7+ years of relevant data platform development experience building production-grade solutions.
    • Proficient with SQL and a high-order object-oriented language (e.g., Python).
    • Experience with designing and building distributed data architecture.
    • Experience building and managing production-grade data pipelines using tools such as Airflow, dbt, DataHub, and MLflow.
    • Experience building and managing production-grade data platforms using distributed systems such as Kafka, Spark, Flink and/or others.
    • Familiarity with event data streaming at scale.
    • Proven track record learning new technologies and applying that learning quickly.
    • Experience building observability and monitoring into data products. 
    • Motivated to identify opportunities for automation to reduce manual toil.

    EquipmentShare is committed to a diverse and inclusive workplace. EquipmentShare is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.

     

    #LI-Remote

     

    See more jobs at EquipmentShare

    Apply for this job

    +30d

    Data Engineer

    Kalderos, Remote, United States
    Bachelor's degree, Slack, Azure, C++

    Kalderos is hiring a Remote Data Engineer

    About Us

    At Kalderos, we are building unifying technologies that bring transparency, trust, and equity to the entire healthcare community with a focus on pharmaceutical pricing.  Our success is measured when we can empower all of healthcare to focus more on improving the health of people. 

That success is driven by Kalderos’ greatest asset, our people. Our team thrives on the problems that we solve, is driven to innovate, and values the feedback of their peers. Our team is passionate about what they do and we are looking for people to join our company and our mission.

    That’s where you come in! 

    What You’ll Do:

    For the position, we’re looking for a Data Engineer II to solve complex problems in the Drug Discounting space. Across all roles, we look for future team members who will live our values of Collaboration, Inspired, Steadfast, Courageous, and Excellence. 

    We’re a team of people striving for the best, so naturally, we want to hire the best! We know that job postings can be intimidating, and don’t expect any candidate to check off every box we have listed (though if you can, AWESOME!). We encourage you to apply if your experience matches about 70% of this posting.

    • Work with product teams to understand and develop data models that can meet requirements and operationalize well
    • Build out automated ETL jobs that reliably process large amounts of data, and ensure these jobs run consistently and well
    • Build tools that enable other data engineers to work more efficiently
    • Try out new data storage and processing technologies in proof of concepts and make recommendations to the broader team
    • Tune existing implementations to run more efficiently as they become bottlenecks, or migrate existing implementations to new paradigms as needed
    • Learn and apply knowledge about the drug discount space, and become a subject matter expert for internal teams to draw upon


    General Experience Guidelines

    We know your experience extends beyond what you have on paper. The following is a guideline of general experience we’re looking for in this role:

    • Bachelor’s degree in computer science or similar field
    • 4+ years work experience as a Data Engineer in a professional full-time role
    • Experience building ETL pipelines and other services for the healthcare industry 
    • Managing big data implementations – experience scaling data implementations that manage millions of records, both vertically and horizontally.
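The horizontal-scaling idea in the guideline above usually starts with partitioning work into independent chunks that can be fanned out to workers. This single-process Python sketch, with simulated records, shows only that chunking core; the chunk size and workload are invented.

```python
from itertools import islice

def chunked(iterable, size):
    """Yield successive fixed-size chunks so each can be processed (or
    dispatched to a worker) independently -- the core of horizontal scaling."""
    it = iter(iterable)
    while chunk := list(islice(it, size)):
        yield chunk

def process(chunk):
    # Stand-in for real per-partition work (aggregation, dedup, writes).
    return sum(chunk)

records = range(1, 1_000_001)          # simulate a million records
partials = [process(c) for c in chunked(records, 100_000)]
print(len(partials), sum(partials))    # 10 500000500000
```

Swapping the list comprehension for a process pool or a distributed framework changes the dispatch mechanism, not the partitioning shape.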

    Set Yourself Apart

    • Experience with dbt and Snowflake
    • Professional experience in application programming with an object oriented language. 
    • Experience with streaming technologies such as Kafka or Event Hubs 
    • Experience with orchestration frameworks like Azure Data Factory
    • Experience in the healthcare or pharmaceutical industries
    • Experience in a startup environment 

     

    Anticipated compensation range for this role is $110-150K/year USD plus bonus.

    ____________________________________________________________________________________________

    Highlighted Company Perks and Benefits

    • Continuous growth and development: Annual continuing education stipend supporting all employees on their ongoing knowledge journey.
    • Celebrating employee milestones: bi-annual stipend for all full-time employees to help you celebrate your exciting accomplishments throughout the year.
    • Work From Home Reimbursement: quarterly funds provided through the pandemic to help ensure all employees have what they need to be productive and effective while working from home.
    • A fair PTO system that allows for a healthy work-life balance. There’s no maximum, but there is a minimum; we want you to take breaks for yourself.
    • 401K plan with a company match.
    • Choose your own Technology: We’ll pay for your work computer. 


    What It’s Like Working Here

    • Competitive Salary, Bonus, and Equity Compensation. 
    • We thrive on collaboration, because we believe that all voices matter and we can only put our best work into the world when we work together to solve problems.
    • We empower each other: from our DEI Council to affinity groups for underrepresented populations, we believe in ensuring all voices are heard.
    • We know the importance of feedback in individual and organizational growth and development, which is why we've embedded it into our practice and culture. 
    • We’re curious and go deep. Our Slack channels fill throughout the day with insightful articles and discussions about our industry and healthcare, and our book club is always bursting with questions.

    To learn more: https://www.kalderos.com/company/culture

    Kalderos is an equal opportunity workplace.  We are committed to equal opportunity regardless of race, color, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or veteran status.

     

    See more jobs at Kalderos

    Apply for this job

    +30d

    Senior Data Engineer

    Modern Health, Remote - US
    DevOps, Master’s Degree, Terraform, Scala, NoSQL, SQL, Design, MongoDB, Azure, Java, Docker, PostgreSQL, MySQL, Kubernetes, Python, AWS

    Modern Health is hiring a Remote Senior Data Engineer

    Modern Health 

    Modern Health is a mental health benefits platform for employers. We are the first global mental health solution to offer employees access to one-on-one, group, and self-serve digital resources for their emotional, professional, social, financial, and physical well-being needs—all within a single platform. Whether someone wants to proactively manage stress or treat depression, Modern Health guides people to the right care at the right time. We empower companies to help all their employees be the best version of themselves, and believe in meeting people wherever they are in their mental health journey.

    We are a female-founded company backed by investors like Kleiner Perkins, Founders Fund, John Doerr, Y Combinator, and Battery Ventures. We partner with 500+ global companies like Lyft, Electronic Arts, Pixar, Clif Bar, Okta, and Udemy that are taking a proactive approach to mental health care for their employees. Modern Health has raised more than $170 million in less than two years with a valuation of $1.17 billion, making Modern Health the fastest entirely female-founded company in the U.S. to reach unicorn status. 

    We tripled our headcount in 2021 and as a hyper-growth company with a fully remote workforce, we prioritize our people-first culture (winning awards including Fortune's Best Workplaces in the Bay Area 2021). To protect our culture and help our team stay connected, we require overlapping hours for everyone. While many roles may function from anywhere in the world—see individual job listing for more—team members who live outside the Pacific time zone must be comfortable working early in the morning or late at night; all full-time employees must work at least six hours between 8 am and 5 pm Pacific time each workday. 

    We are looking for driven, creative, and passionate individuals to join in our mission. An inclusive and diverse culture is a key component of mental well-being in the workplace, and that starts with how we build our own team. If you're excited about a role, we'd love to hear from you!

    The Role

    The Senior Data Engineer will play a critical role in designing, developing, and maintaining our data infrastructure. This role requires a deep understanding of data architecture, data modeling, and ETL processes. The ideal candidate will have extensive experience with big data technologies, cloud platforms, and a strong ability to collaborate with cross-functional teams to deliver high-quality data solutions.

    This position is not eligible to be performed in Hawaii.

    What You’ll Do

    • Design, develop, and maintain scalable data pipelines and ETL processes to support data integration and analytics
    • Architect and implement data storage solutions, including data warehouses, data lakes, and databases
    • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business needs
    • Optimize and tune data systems for performance, reliability, and scalability
    • Ensure data quality and integrity through rigorous testing, validation, and monitoring
    • Develop and enforce data governance policies and best practices
    • Stay current with emerging data technologies and industry trends, and evaluate their potential impact on our data infrastructure
    • Troubleshoot and resolve data-related issues in a timely manner

    Our Stack

    • AWS RDS
    • Snowflake
    • Fivetran
    • DBT
    • Prefect
    • Looker
    • Datadog

    Who You Are

    • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field
    • 5+ years of experience in data engineering in a modern tech stack
    • Proficiency in programming languages such as Python, Java, or Scala
    • Strong experience with big data technologies
    • Expertise in SQL and experience with relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra)
    • Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud
    • Familiarity with data warehousing solutions like Redshift, BigQuery, or Snowflake
    • Knowledge of data modeling, data architecture, and data governance principles
    • Experience with infrastructure-as-code technologies (e.g., Terraform)
    • Excellent problem-solving skills and attention to detail
    • Strong communication and collaboration skills

    Bonus Points

    • Experience with containerization and orchestration tools like Docker and Kubernetes
    • Knowledge of machine learning and data science workflows
    • Experience with CI/CD pipelines and DevOps practices
    • Certification in cloud platforms (e.g., AWS Certified Data Analytics, Google Professional Data Engineer)

    Benefits

    Fundamentals:

    • Medical / Dental / Vision / Disability / Life Insurance 
    • High Deductible Health Plan with Health Savings Account (HSA) option
    • Flexible Spending Account (FSA)
    • Access to coaches and therapists through Modern Health's platform
    • Generous Time Off 
    • Company-wide Collective Pause Days 

    Family Support:

    • Parental Leave Policy 
    • Family Forming Benefit through Carrot
    • Family Assistance Benefit through UrbanSitter

    Professional Development:

    • Professional Development Stipend

    Financial Wellness:

    • 401k
    • Financial Planning Benefit through Origin

    But wait there’s more…! 

    • Annual Wellness Stipend to use on items that promote your overall well being 
    • New Hire Stipend to help cover work-from-home setup costs
    • ModSquad Community: Virtual events like active ERGs, holiday themed activities, team-building events and more
    • Monthly Cell Phone Reimbursement

    Equal Pay for Equal Work Act Information

    Please refer to the ranges below to find the starting annual pay range for individuals applying to work remotely from the following locations for this role.


    Compensation for the role will depend on a number of factors, including a candidate’s qualifications, skills, competencies, and experience and may fall outside of the range shown. Ranges are not necessarily indicative of the associated starting pay range in other locations. Full-time employees are also eligible for Modern Health's equity program and incredible benefits package. See our Careers page for more information.

    Depending on the scope of the role, some ranges are indicative of On Target Earnings (OTE) and includes both base pay and commission at 100% achievement of established targets.

    San Francisco Bay Area: $138,500 - $162,900 USD
    All Other California Locations: $138,500 - $162,900 USD
    Colorado: $117,725 - $138,500 USD
    New York City: $138,500 - $162,900 USD
    All Other New York Locations: $124,650 - $146,600 USD
    Seattle: $138,500 - $162,900 USD
    All Other Washington Locations: $124,650 - $146,600 USD

    Below, we are asking you to complete identity information for the Equal Employment Opportunity Commission (EEOC). While we are required by law to ask these questions in the format provided by the EEOC, at Modern Health we know that gender is not binary, and we recognize that these categories do not reflect our employees' full range of identities.

    See more jobs at Modern Health

    Apply for this job

    +30d

    Data Engineer II

    Agile Six, United States, Remote
    ML, Agile, Design, API, Git, C++, Python, Backend

    Agile Six is hiring a Remote Data Engineer II

    Agile Six is a people-first, remote-work company that serves shoulder-to-shoulder with federal agencies to find innovative, human-centered solutions. We build better by putting people first. We are animated by our core values of Purpose, Wholeness, Trust, Self-Management and Inclusion. We deliver our solutions in autonomous teams of self-managed professionals (no managers here!) who genuinely care about each other and the work. We know that’s our company’s purpose – and that we can only achieve it by supporting a culture where people feel valued, self-managed, and love to come to work.

    The role

    Agile Six is looking for a Data Engineer for an anticipated role on our cross-functional agile teams. Our partners include: the Department of Veterans Affairs (VA), Centers for Medicare & Medicaid Services (CMS), Centers for Disease Control and Prevention (CDC) and others.

    The successful candidate will bring their experience in data formatting and integration engineering to help us expand a reporting platform. As part of the team, you will primarily be responsible for data cleaning and data management tasks, building data pipelines, and data modeling (designing the schema/structure of datasets and relationships between datasets). We are looking for someone who enjoys working on solutions to highly complex problems and someone who is patient enough to deal with the complexities of navigating the Civic Tech space. The successful candidate for this role is an excellent communicator, as well as someone who is curious about where data analysis, backend development, data engineering, and data science intersect.

    We embrace open source software and an open ethos regarding software development, and are looking for a candidate who does the same. Most importantly, we are looking for someone with a passion for working on important problems that have a lasting impact on millions of users and make a difference in our government!

    Please note, this position is anticipated, pending contract award response.

    Responsibilities

    • Contribute as a member of a cross-functional Agile team, using your expertise in data engineering, critical thinking, and collaboration to solve problems related to the project
      • Experience with Java/Kotlin/Python, command line, and Git is required
      • Experience with transport protocols including: REST, SFTP, SOAP is required
      • Experience with HL7 2.5.1 and FHIR is strongly preferred
    • Extract, transform, and load data. Pull together datasets, build data pipelines, and turn semi-structured and unstructured data into datasets that can be used for machine learning models.
    • Evaluate and recommend
    • We expect the responsibilities of this position to shift and grow organically over time, in response to considerations such as the unique strengths and interests of the selected candidate and other team members and an evolving understanding of the delivery environment.
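Turning semi-structured data into a usable dataset, as the responsibilities above describe, typically means flattening nested records into fixed-column rows. The lab-report JSON below is an invented example for illustration, not a real HL7 or FHIR payload.

```python
import json

# Illustrative sketch of turning semi-structured JSON into a flat, tabular
# dataset; the lab-report shape here is invented, not a real HL7/FHIR payload.
RAW = """
[{"patient": {"id": "p1"}, "results": [{"test": "glucose", "value": 5.4},
                                       {"test": "hdl", "value": 1.2}]},
 {"patient": {"id": "p2"}, "results": [{"test": "glucose", "value": 6.1}]}]
"""

def flatten(records):
    """Emit one output row per (patient, result) pair with a fixed column set."""
    rows = []
    for rec in records:
        for res in rec.get("results", []):
            rows.append({
                "patient_id": rec["patient"]["id"],
                "test": res["test"],
                "value": res["value"],
            })
    return rows

rows = flatten(json.loads(RAW))
print(len(rows))  # 3
```

Rows in this shape load directly into a table or a model-training frame, which is the point of the flattening step.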

    Basic qualifications

    • 2+ years of hands-on data engineering experience in a production environment
    • Experience with Java/Kotlin/Python, command line, and Git
    • Demonstrated experience with extract, transform, load (ETL) and data cleaning, data manipulation, and data management
    • Demonstrated experience building and orchestrating automated data pipelines in Java/Python
    • Experience with data modeling: defining the schema/structure of datasets and the relationships between datasets
    • Ability to create usable datasets from semi-structured and unstructured data
    • Solution-oriented mindset and proactive approach to solving complex problems
    • Ability to be autonomous, take initiative, and effectively communicate status and progress
    • Experience successfully collaborating with cross-functional partners and other designers and researchers, seeking and providing feedback in an Agile environment
    • Adaptive, empathetic, collaborative, and holds a positive mindset
    • Has lived and worked in the United States for 3 out of the last 5 years
    • Some of our clients may request or require travel from time to time. If this is a concern for you, we encourage you to apply and discuss it with us at your initial interview

    Additional desired qualifications

    • Familiarity with the Electronic Laboratory Reporting workflows and data flow
    • Knowledge of FHIR data / API standard, HL7 2.5.1
    • Experience building or maintaining web service APIs
    • Familiarity with various machine learning (ML) algorithms and their application to common ML problems (e.g. regression, classification, clustering)
    • Statistical experience or degree
    • Experience developing knowledge of complex domain and systems
    • Experience working with government agencies
    • Ability to work across multiple applications, components, languages, and frameworks
    • Experience working in a cross-functional team, including research, design, engineering, and product
    • You are a U.S. Veteran. As a service-disabled veteran-owned small business, we recognize the transition to civilian life can be tricky, and welcome and encourage Veterans to apply

    At Agile Six, we are committed to building teams that represent a variety of backgrounds, perspectives, and skills. Even if you don't meet every requirement, we encourage you to apply. We’re eager to meet people who believe in our mission and who can contribute to our team in a variety of ways.

    Salary and Sixer Benefits

    To promote equal pay for equal work, we publish salary ranges for each position.

    The salary range for this position is $119,931-$126,081

    Our benefits are designed to reinforce our core values of Wholeness, Self Management and Inclusion. The following benefits are available to all employees. We respect that only you know what balance means for your life and season. While we offer support from coaches, we expect you to own your wholeness, show up for work whole, and go home to your family the same. You will be seen, heard and valued. We expect you to offer the same for your colleagues, be kind (not controlling), be caring (not directive) and ready to participate in a state of flow. We mean it when we say “We build better by putting people first”.

    All Sixers Enjoy:

    • Self-managed work/life balance and flexibility
    • Competitive and equitable salary (equal pay for equal work)
    • Employee Stock Ownership (ESOP) for all employees!
    • 401K matching
    • Medical, dental, and vision insurance
    • Employer paid short and long term disability insurance
    • Employer paid life insurance
    • Self-managed and generous paid time off
    • Paid federal holidays and Election day off
    • Paid parental leave
    • Self-managed professional development spending
    • Self-managed wellness days

    Hiring practices

    Agile Six Applications, Inc. is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, national origin, ancestry, sex, sexual orientation, gender identity or expression, religion, age, pregnancy, disability, work-related injury, covered veteran status, political ideology, marital status, or any other factor that the law protects from employment discrimination.

    Note: We participate in E-Verify. Upon hire, we will provide the federal government with your Form I-9 information to confirm that you are authorized to work in the U.S. Unfortunately, we are unable to sponsor visas at this time.

    If you need assistance or reasonable accommodation in applying for any of these positions, please reach out to careers@agile6.com. We want to ensure you have the ability to apply for any position at Agile Six.

    Please read and respond to the application questions carefully. Interviews are conducted on a rolling basis until the position has been filled.

     

    Apply for this job

    +30d

    Data Engineer H/F

    SocotecPalaiseau, France, Remote
    S3, Lambda, nosql, airflow, sql, git, kubernetes, AWS

    Socotec is hiring a Remote Data Engineer H/F

    Job description

    SOCOTEC Monitoring France, a leader in inspection and certification, provides services in the construction, infrastructure, and industrial sectors.

    The SOCOTEC Data & AI Hub, made up of Data Engineering and Data Science specialists, is responsible not only for managing and optimising data, but also for implementing data processing and analysis. We develop data-driven applications to support SOCOTEC's business activities.

    We are looking for a Data Engineer apprentice to join our SOCOTEC Data team.

    By joining the team, you will play an active part in maintaining and optimising our Datalake, as well as creating and updating data flows. You will be responsible for documenting and validating these flows, and for building and rolling out reporting tools such as Power BI. You will also propose new solutions, take part in technical qualifications, and contribute to the continuous improvement of our data infrastructure.

     

    You will work on three main missions:

    • Within the Socotec Monitoring France entity (20%), you will help define the optimal data strategy for Socotec Monitoring (structuring, processes, open data, purchases of external data)
    • On behalf of the Socotec group (60%), you will help build the worldwide Data Lake. Your objective will be to develop data flows for analysis, in collaboration with the BI and Data Science teams. You will learn to organise and schedule data extraction, transformation, and loading flows while guaranteeing their reliability, availability, etc.
    • With clients (20%), you will take part in end-to-end delivery of final projects: data collection, preprocessing pipelines, modelling, and deployment.

    You will demonstrate autonomy, sound judgement, and strong skills in writing and communicating code and technical documentation.

    The technical stack used:

    • Amazon Web Services (AWS)
    • Apache Airflow as the orchestrator
    • Spark for ETL pipelines
    • GitLab for source version control
    • Kubernetes
    • DeltaLake
    • S3
    • OpenMetadata for metadata management
    • Power BI as the BI tool, managed with the BI teams
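    For illustration only: the orchestration work described above (Airflow scheduling dependent extract/transform/load flows) reduces to running tasks in dependency order. A minimal stdlib sketch, with hypothetical task names standing in for an Airflow DAG, might look like:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on,
# in the spirit of an Airflow DAG (extract -> transform -> load -> report).
DAG = {
    "extract_sales": set(),
    "extract_sensors": set(),
    "transform": {"extract_sales", "extract_sensors"},
    "load_lake": {"transform"},
    "refresh_dashboard": {"load_lake"},
}

def run(dag):
    """Execute tasks in dependency order, as a scheduler would."""
    executed = []
    for task in TopologicalSorter(dag).static_order():
        executed.append(task)  # a real scheduler would invoke the task here
    return executed

order = run(DAG)
print(order)
```

    A real Airflow DAG adds scheduling, retries, and monitoring on top of exactly this dependency ordering.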

    Qualifications

    • Master en Big Data ou diplôme d'ingénieur en informatique avec une forte appétence pour la data
    • Maîtrise des bases de données SQL et NoSQL, ainsi que des concepts associés
    • Connaissance de la stack Big Data (Airflow, Spark, Hadoop)
    • Expérience avec les outils collaboratifs de développement (Git, GitLab, Jupyter Notebooks, etc.)
    • Connaissance appréciée des services AWS (Lambda, EMR, S3)
    • Intérêt marqué pour les technologies innovantes
    • Esprit d'équipe
    • Anglais courant, y compris un bon niveau technique

    See more jobs at Socotec

    Apply for this job

    +30d

    Senior Data Engineer

    PostscriptRemote, Anywhere in North America
    Lambda, terraform, nosql, RabbitMQ, Design, c++, python, AWS, backend

    Postscript is hiring a Remote Senior Data Engineer

    Postscript is redefining marketing for ecommerce companies. By introducing SMS as an entirely new channel for ecommerce stores to engage, retain, and convert their customer base, brands are seeing huge ROI with Postscript. Backed by Greylock, Y Combinator and other top investors, Postscript is growing fast and looking for remarkable people to help build a world class organization. 


    Job Description

    As a Senior Data Engineer for the Data Platform team at Postscript, you will provide the company with best-in-class data foundations to support a broad range of key engineering and product initiatives. The Data Platform team at Postscript focuses on data integration from sources like our production application and various 3rd-party integrations. You will focus on designing and building end-to-end data pipeline solutions: data ingestion, propagation, persistence, and services to support both our product and our internal BI organization. This role is critical in ensuring data and events are reliable and actionable throughout the Postscript Platform.

     

    Primary duties

    • Design and build performant, scalable data systems
    • Architect cloud native data solutions in AWS
    • Write high quality code to make your software designs a reality
    • Build services to support our product with cross domain data
    • Advise the team and organization on Data Engineering best practices to level up our competency in the organization
    • Mentor and support your fellow engineers via code reviews, design reviews and peer feedback

    What We’ll Love About You

    • You’re a polyglot technologist who is passionate about data problems at scale
    • You have a proven track record designing and implementing complex data systems from scratch
    • You’ve built data engineering solutions in an AWS environment and have working experience with several AWS services (Lambda, Redshift, Glue, RDS, DMS, etc.)
    • You have several years (5+) of experience writing high quality production code, preferably in Python or Go
    • You have a broad range of experience with data persistence technologies (RDBMS, NoSQL, OLAP, etc.) and know how to select the right tool for the job
    • You’ve worked in event driven systems and have experience with technologies like Kafka, Kinesis, RabbitMQ, etc.
    • You’ve gotten your hands dirty with infrastructure and have used infrastructure as code technologies like Terraform
    • You’re comfortable with ambiguity and like to dig into the problems as much as you love creating solutions
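    As a purely illustrative sketch of the event-driven pattern mentioned above (Kafka, Kinesis, RabbitMQ), here is the consume-and-route loop reduced to an in-memory stdlib queue; the event shapes and handlers are invented:

```python
import json
import queue

# Stand-in for a Kafka/RabbitMQ topic: an in-memory queue of JSON events.
events = queue.Queue()
for payload in [{"type": "order", "amount": 40},
                {"type": "click"},
                {"type": "order", "amount": 25}]:
    events.put(json.dumps(payload))
events.put(None)  # sentinel marking end of stream, for this sketch only

def consume(q):
    """Drain the stream, routing each event to simple handlers by type."""
    totals = {"order": 0, "other": 0}
    while (msg := q.get()) is not None:
        event = json.loads(msg)
        if event["type"] == "order":
            totals["order"] += event["amount"]
        else:
            totals["other"] += 1
    return totals

totals = consume(events)
print(totals)  # → {'order': 65, 'other': 1}
```

    A real broker replaces the queue and sentinel with partitions, offsets, and consumer groups, but the consume/route/aggregate shape is the same.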

    What You’ll Love About Us

    • Salary range of USD $170,000-$190,000 base plus significant equity (we do not have geo based salaries) 
    • High growth startup - plenty of room for you to directly impact the company and grow your career!
    • Work from home (or wherever)
    • Fun - We’re passionate and enjoy what we do
    • Competitive compensation and opportunity for equity
    • Flexible paid time off
    • Health, dental, vision insurance

     

    What to expect from our hiring process :

    • Intro Call: You’ll hop on a quick call with the Recruiter so we can get to know you better — and you can learn a little more about the role and Postscript. 
    • Hiring Manager Intro: You’ll hop on a quick call with the Hiring Manager so your future Manager can get to know you better — This is a great time to learn more about the team & position. 
    • Homework Assignment: We will send over an exercise that challenges you to solve a problem & come up with a creative solution, or outline how you've solved a problem in the past. Get a feel for what you’ll be doing on a daily basis!
    • Virtual Onsite Interviews: You’ll be meeting with 2-4 team members on a series of video calls. This is your chance to ask questions and see who this role interacts with on a daily basis.
    • Final FEACH Interview: This is our interview to assess your ability to represent how you work via our FEACH values. As we build the #1 team in Ecommerce, we look for individuals who embody FEACH professionally and personally. We want to hear about this in your final interview!
    • Reference Checks: We ask to speak with at least two references who have previously worked with you, at least one should be someone who has previously managed your work.
    • Offer: We send over an offer and you (hopefully) accept! Welcome to Postscript!

    You are welcome here. Postscript is an ever-evolving place of equal employment for talented individuals.

    See more jobs at Postscript

    Apply for this job

    +30d

    Junior/Mid Data Analytics Engineer

    EXUSAthens,Attica,Greece, Remote

    EXUS is hiring a Remote Junior/Mid Data Analytics Engineer

    EXUS is an enterprise software company, founded in 1989 with the vision of simplifying risk management software. EXUS launched its Financial Suite (EFS) in 2003 to support financial entities worldwide in improving their results. Today, our EXUS Financial Suite (EFS) is trusted by risk professionals in more than 32 countries worldwide (MENA, EU, SEA). We introduce simplicity and intelligence into their business processes through technology, improving their collections performance.

    Our people constitute the source of inspiration that drives us forward and helps us fulfill our purpose of being role models for a better world.
    This is your chance to be part of a highly motivated, diverse, and multidisciplinary team, which embraces breakthrough thinking and technology to create software that serves people.

    Our shared Values:

    • We are transparent and direct
    • We are positive and fun, never cynical or sarcastic
    • We are eager to learn and explore
    • We put the greater good first
    • We are frugal and we do not waste resources
    • We are fanatically disciplined, we deliver on our promises

    We are EXUS! Are you?

    Join our dynamic Data Analytics Team as we expand our capabilities into data lakehouse architecture. We are seeking a Junior/Mid Data Analytics Engineer who is enthusiastic about creating compelling data visualizations, communicating them effectively to customers, conducting training sessions, and gaining experience in managing ETL processes for big data.

    Key Responsibilities:

    • Develop and maintain reports and dashboards using leading visualization tools, and craft advanced SQL queries for additional report generation.
    • Deliver training sessions on our Analytic Solution and effectively communicate findings and insights to both technical and non-technical customer audiences.
    • Collaborate with business stakeholders to gather and analyze requirements.
    • Debug issues in the front-end analytic tool, investigate underlying causes, and resolve these issues.
    • Monitor and maintain ETL processes as part of our transition to a data lakehouse architecture.
    • Proactively investigate and implement new data analytics technologies and methods.
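    To illustrate the "advanced SQL queries for report generation" responsibility above, here is a hedged sketch using SQLite and an invented payments table; the schema and the collections metric are hypothetical, not EXUS's actual model:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE payments (account TEXT, due REAL, paid REAL);
INSERT INTO payments VALUES
  ('A', 100, 100), ('A', 200, 150), ('B', 300, 0), ('B', 50, 50);
""")

# A dashboard-style aggregate: amount collected and collections rate per account.
REPORT_SQL = """
SELECT account,
       SUM(paid)                              AS collected,
       ROUND(100.0 * SUM(paid) / SUM(due), 1) AS pct_collected
FROM payments
GROUP BY account
ORDER BY pct_collected DESC
"""

rows = conn.execute(REPORT_SQL).fetchall()
for row in rows:
    print(row)
```

    Visualization tools such as MicroStrategy or Superset would sit on top of exactly this kind of aggregate query.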

    Required Skills and Qualifications:

    • A BSc or MSc degree in Computer Science, Engineering, or a related field.
    • 1-5 years of experience with data visualization tools and techniques. Knowledge of MicroStrategy and Apache Superset is a plus.
    • 1-5 years of experience with Data Warehouses, Big Data, and/or Cloud technologies. Exposure to these areas in academic projects, internships, or entry-level roles is also acceptable.
    • Familiarity with PL/SQL and practical experience with SQL for data manipulation and analysis. Hands-on experience through academic coursework, personal projects, or job experience is valued.
    • Familiarity with data Lakehouse architecture.
    • Excellent analytical skills to understand business needs and translate them into data models.
    • Organizational skills with the ability to document work clearly and communicate it professionally.
    • Ability to independently investigate new technologies and solutions.
    • Strong communication skills, capable of conducting presentations and engaging effectively with customers in English.
    • Demonstrated ability to work collaboratively in a team environment.

    What we offer:

    • Competitive salary
    • Friendly, pleasant, and creative working environment
    • Remote Working
    • Development Opportunities
    • Private Health Insurance

    Privacy Notice for Job Applications: https://www.exus.co.uk/en/careers/privacy-notice-f...

    See more jobs at EXUS

    Apply for this job

    +30d

    Data Engineer

    Maker&Son LtdBalcombe, United Kingdom, Remote
    golangtableauairflowsqlmongodbelasticsearchpythonAWS

    Maker&Son Ltd is hiring a Remote Data Engineer

    Job Description

    We are looking for a highly motivated individual to join our team as a Data Engineer.

    We are based in Balcombe [40 mins from London by train, 20 minutes from Brighton] and we will need you to be based in our offices at least 3 days a week.

    You will report directly to the Head of Data.

    Candidate Overview

    As a part of the Technology Team your core responsibility will be to help maintain and scale our infrastructure for analytics as our data volume and needs continue to grow at a rapid pace. This is a high impact role, where you will be driving initiatives affecting teams and decisions across the company and setting standards for all our data stakeholders. You’ll be a great fit if you thrive when given ownership, as you would be the key decision maker in the realm of architecture and implementation.

    Responsibilities

    • Understand our data sources, ETL logic, and data schemas and help craft tools for managing the full data lifecycle
    • Play a key role in building the next generation of our data ingestion pipeline and data warehouse
    • Run ad hoc analysis of our data to answer questions and help prototype solutions
    • Support and optimise existing ETL pipelines
    • Support technical and business stakeholders by providing key reports and supporting the BI team to become fully self-service
    • Own problems through to completion both individually and as part of a data team
    • Support digital product teams by performing query analysis and optimisation

     

    Qualifications

    Key Skills and Requirements

    • 3+ years experience as a data engineer
    • Ability to own data problems and help to shape the solution for business challenges
    • Good communication and collaboration skills; comfortable discussing projects with anyone from end users up to the executive company leadership
    • Fluency with a programming language - we use NodeJS and Python and are looking to adopt Golang
    • Ability to write and optimise complex SQL statements
    • Familiarity with ETL pipeline tools such as Airflow or AWS Glue
    • Familiarity with data visualisation and reporting tools, like Tableau, Google Data Studio, Looker
    • Experience working in a cloud-based software development environment, preferably with AWS or GCP
    • Familiarity with NoSQL databases such as ElasticSearch, DynamoDB, or MongoDB
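    One small, illustrative example of the "write and optimise complex SQL statements" point above, using SQLite with made-up table and index names: adding the right index turns a full table scan into an index search, which EXPLAIN QUERY PLAN makes visible.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders (customer, total) VALUES (?, ?)",
                 [(f"cust{i % 100}", float(i)) for i in range(1000)])

query = "SELECT COUNT(*), SUM(total) FROM orders WHERE customer = ?"

# Without an index this query scans the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, ("cust7",)).fetchall()

# Adding an index on the filter column turns the scan into an index search.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, ("cust7",)).fetchall()

print(plan_before[-1][-1])  # e.g. SCAN orders
print(plan_after[-1][-1])   # e.g. SEARCH orders USING INDEX idx_orders_customer (customer=?)
```

    The same scan-vs-search reasoning applies to Postgres (`EXPLAIN ANALYZE`) and most warehouses, just with different plan output.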

    See more jobs at Maker&Son Ltd

    Apply for this job

    +30d

    Principal Data Engineer

    ML, airflow, sql, B2C, RabbitMQ, Design, java, c++, python, AWS

    hims & hers is hiring a Remote Principal Data Engineer

    Hims & Hers Health, Inc. (better known as Hims & Hers) is the leading health and wellness platform, on a mission to help the world feel great through the power of better health. We are revolutionizing telehealth for providers and their patients alike. Making personalized solutions accessible is of paramount importance to Hims & Hers and we are focused on continued innovation in this space. Hims & Hers offers nonprescription products and access to highly personalized prescription solutions for a variety of conditions related to mental health, sexual health, hair care, skincare, heart health, and more.

    Hims & Hers is a public company, traded on the NYSE under the ticker symbol “HIMS”. To learn more about the brand and offerings, you can visit hims.com and forhers.com, or visit our investor site. For information on the company’s outstanding benefits, culture, and its talent-first flexible/remote work approach, see below and visit www.hims.com/careers-professionals.

    ​​About the Role:

    We're looking for an experienced Principal Data Engineer to join our Data Platform Engineering team. Our team is responsible for enabling H&H business (Product, Analytics, Operations, Finance, Data Science, Machine Learning, Customer Experience, Engineering) by providing a platform with a rich set of data and tools to leverage.

    You Will:

    • Serve as a technical leader within the Data Platform org. Provide expert guidance and hands-on development of complex engineering problems and projects
    • Collaborate with cross-functional stakeholders including product management, engineering, analytics, and key business representatives to align the architecture, vision, and roadmap with stakeholder needs
    • Establish guidelines, controls, and processes to make data available for developing scalable data-driven solutions for Analytics and AI
    • Create and set best practices for data ingestion, integration, and access patterns to support both real-time and batch-based consumer data needs
    • Implement and maintain data governance practices to ensure compliance, data security, and privacy.
    • Design and lead development on scalable, high-performance data architecture solutions that support both the consumer side of the business and analytic use cases
    • Plan and oversee large-scale and complex technical migrations to new data systems and platforms
    • Drive continuous data transformation to minimize technical debt
    • Display strong thought leadership and execution in pursuit of modern data architecture principles and technology modernization
    • Define and lead technology proof of concepts to ensure feasibility of new data technology solutions
    • Provide technical leadership and mentorship to the members of the team, fostering a culture of technical excellence
    • Create comprehensive documentation for designs and processes to support ongoing maintenance and knowledge sharing
    • Conduct design reviews to ensure that proposed solutions address platform and stakeholder pain points, as well as meet business, and technical requirements, with alignment to standards and best practices
    • Prepare and deliver efficient communications to convey architectural direction and how it aligns with company strategy. Be able to explain the architectural vision and implementation to executives
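    As a hedged sketch (not Hims & Hers' actual tooling) of the data governance, security, and privacy practices listed above, one common pattern is pseudonymising PII fields with a salted hash before rows reach the analytics layer, so analysts can join on a stable key without seeing the raw values. The field names and salt below are invented:

```python
import hashlib

# Hypothetical governance helper: which fields count as PII, and a salt that in
# practice would live in a secrets manager, not in code.
PII_FIELDS = {"email", "phone"}
SALT = b"rotate-me-in-a-secrets-manager"  # placeholder value

def pseudonymise(record: dict) -> dict:
    """Replace PII values with a salted, deterministic hash; pass the rest through."""
    clean = {}
    for key, value in record.items():
        if key in PII_FIELDS:
            digest = hashlib.sha256(SALT + str(value).encode()).hexdigest()
            clean[key] = digest[:16]  # stable join key, no raw PII
        else:
            clean[key] = value
    return clean

row = {"user_id": 42, "email": "pat@example.com", "ltv": 310.0}
masked = pseudonymise(row)
print(masked["user_id"], masked["ltv"], masked["email"] != row["email"])
```

    Because the hash is deterministic, the same customer pseudonymises to the same key across datasets, which is what makes joins work under CCPA/GDPR-style access restrictions.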

    You Have:

    • Bachelor's or Master's degree in Computer Science or equivalent, with over 12 years of Data Architecture and Data Engineering experience, including team leadership
    • Proven expertise in designing data platforms for large-scale data and diverse data architectures, including warehouses, lakehouses, and integrated data stores.
    • Proficiency and hands-on knowledge in a variety of technologies such as SQL, Bash, Python, Java, Presto, Spark, AWS, and data streaming systems like Kafka and RabbitMQ
    • Hands-on experience and proficiency with data stacks including Airflow, Databricks, and dbt, as well as data stores such as Cassandra, Aurora, and ZooKeeper
    • Experience with data security (including PHI and PII), as well as data privacy regulations (CCPA and GDPR)
    • Proficient in addressing data-related challenges through analytical problem-solving and aligning data architecture with organizational business goals and objectives
    • Exposure to analytics techniques using ML and AI to assist data scientists and analysts in deriving insights from data
    • Analytical and problem-solving skills to address data-related challenges and find optimal solutions
    • Ability to manage projects effectively, plan tasks, set priorities, and meet deadlines in a fast-paced and ever-changing environment

    Nice To Have:

    • Experience working in healthcare or in a B2C company

    Our Benefits (there are more but here are some highlights):

    • Competitive salary & equity compensation for full-time roles
    • Unlimited PTO, company holidays, and quarterly mental health days
    • Comprehensive health benefits including medical, dental & vision, and parental leave
    • Employee Stock Purchase Program (ESPP)
    • Employee discounts on hims & hers & Apostrophe online products
    • 401k benefits with employer matching contribution
    • Offsite team retreats

     

    #LI-Remote

     

    Outlined below is a reasonable estimate of H&H’s compensation range for this role for US-based candidates. If you're based outside of the US, your recruiter will be able to provide you with an estimated salary range for your location.

    The actual amount will take into account a range of factors that are considered in making compensation decisions including but not limited to skill sets, experience and training, licensure and certifications, and location. H&H also offers a comprehensive Total Rewards package that may include an equity grant.

    Consult with your Recruiter during any potential screening to determine a more targeted range based on location and job-related factors.

    An estimate of the current salary range for US-based employees is
    $210,000-$250,000 USD

    We are focused on building a diverse and inclusive workforce. If you’re excited about this role, but do not meet 100% of the qualifications listed above, we encourage you to apply.

    Hims is an Equal Opportunity Employer and considers applicants for employment without regard to race, color, religion, sex, orientation, national origin, age, disability, genetics or any other basis forbidden under federal, state, or local law. Hims considers all qualified applicants in accordance with the San Francisco Fair Chance Ordinance.

    Hims & hers is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation due to a disability, you may contact us at accommodations@forhims.com. Please do not send resumes to this email address.

    For our California-based applicants – Please see our California Employment Candidate Privacy Policy to learn more about how we collect, use, retain, and disclose Personal Information. 

    See more jobs at hims & hers

    Apply for this job

    +30d

    Data Quality Engineer

    ML, Mid Level, Full Time, Bachelor's degree, sql, mobile, ui, qa

    Pixalate, Inc. is hiring a Remote Data Quality Engineer

    See more jobs at Pixalate, Inc.

    Apply for this job

    +30d

    Data Engineer

    IncreasinglyBengaluru, India, Remote
    S3, Lambda, Design, git, jenkins, python, AWS

    Increasingly is hiring a Remote Data Engineer

    Job Description

    Working experience in data integration and pipeline development

    Qualifications

    3+ years of relevant experience with AWS Cloud on data integration with Databricks, Apache Spark, EMR, Glue, Kafka, Kinesis, and Lambda in S3, Redshift, RDS, MongoDB/DynamoDB ecosystems

    Strong real-world experience in Python development, especially PySpark, in the AWS Cloud environment.

    Design, develop, test, deploy, maintain, and improve data integration pipelines.

    Experience in Python and common python libraries.

    Strong analytical experience with the database in writing complex queries, query optimization, debugging, user-defined functions, views, indexes, etc.
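    For illustration, the "user-defined functions, views, indexes" piece above can be sketched in a few lines with SQLite; the table, function, and banding rule are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 120.0), ("north", 80.0), ("south", 50.0)])

# A user-defined function registered into the SQL engine...
def to_band(amount):
    return "high" if amount >= 100 else "low"

conn.create_function("to_band", 1, to_band)

# ...wrapped in a view, so downstream queries stay simple and reusable.
conn.execute("CREATE VIEW sales_banded AS "
             "SELECT region, amount, to_band(amount) AS band FROM sales")

result = conn.execute(
    "SELECT band, COUNT(*) FROM sales_banded GROUP BY band ORDER BY band"
).fetchall()
print(result)  # → [('high', 1), ('low', 2)]
```

    In production engines the same idea appears as UDFs in Redshift/Spark and as views layered over raw tables.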

    Strong experience with source control systems such as Git, Bitbucket, and Jenkins build and continuous integration tools.

     

    See more jobs at Increasingly

    Apply for this job

    +30d

    Data Engineer

    AmpleInsightIncToronto, Canada, Remote
    DevOps, airflow, sql, python

    AmpleInsightInc is hiring a Remote Data Engineer

    Job Description

    We are looking for a data engineer who is passionate about analytics and helping companies build and scale data. You enjoy working with data and are motivated to produce high quality data tools and pipelines that help empower other data scientists. You are experienced in architecting data ETL workflows and schemas. Critical thinking and problem-solving skills are essential for this role.

    Qualifications

    • BS (or higher, e.g., MS, or PhD) in Computer Science, Engineering, Math, or Statistics
    • Hands on experience working with user engagement, social, marketing, and/or finance data
    • Proficient in Python (e.g. Pandas, NumPy, scikit-learn), R, TensorFlow, and other data science related tools and libraries
    • Extensive experience working on relational databases, designing complex data schemas, and writing SQL queries
    • Deep knowledge on performance tuning of ETL Jobs, SQL, and databases
    • Working knowledge of Snowflake
    • Experience working with Airflow is a strong plus
    • DevOps experience is a plus

    See more jobs at AmpleInsightInc

    Apply for this job

    +30d

    Data Engineer

    JLIConsultingVaughan, Canada, Remote
    oracle, azure, api, git, AWS

    JLIConsulting is hiring a Remote Data Engineer

    Job Description

    Data Engineer Job Responsibilities:

     

    • Work with stakeholders to understand data sources and the Data, Analytics and Reporting team strategy supporting our on-premises environment and enterprise AWS cloud solution
    • Work closely with the Data, Analytics and Reporting Data Management and Data Governance teams to ensure all industry standards and best practices are met
    • Ensure metadata and data lineage are captured and compatible with enterprise metadata and data management tools and processes
    • Run quality assurance and data integrity checks to ensure accurate reporting and data records
    • Ensure ETL pipelines are produced to the highest quality standards, with metadata, and validated for completeness and accuracy
    • Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity
    • Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization
    • Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it
    • Write unit/integration tests, contribute to the engineering wiki, and document work
    • Perform data analysis required to troubleshoot data-related issues and assist in their resolution
    • Define company data assets (data models) and Spark/SparkSQL jobs to populate them
    • Design data integrations and a data quality framework
    • Design and evaluate open source and vendor tools for data lineage
    • Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture
    • Focus on structured problem solving
    • Communicate well, with strong business awareness
    • Work with ETL tools, querying languages, and data repositories
    • Support technical Data Management Solutions
    • Provide support to the development and testing teams to resolve data issues
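    The "quality assurance and data integrity checks" responsibility above can be illustrated with a small, hypothetical validation pass over a batch of rows before they reach a reporting table; the rule names and fields are invented:

```python
# Hypothetical completeness/validity rules for an incoming batch.
REQUIRED = ("id", "date", "amount")

def quality_report(rows):
    """Count rows failing basic completeness and validity rules."""
    failures = {"missing_field": 0, "negative_amount": 0, "duplicate_id": 0}
    seen = set()
    for row in rows:
        if any(row.get(f) in (None, "") for f in REQUIRED):
            failures["missing_field"] += 1
            continue  # don't apply further rules to incomplete rows
        if row["amount"] < 0:
            failures["negative_amount"] += 1
        if row["id"] in seen:
            failures["duplicate_id"] += 1
        seen.add(row["id"])
    return failures

batch = [
    {"id": 1, "date": "2024-05-01", "amount": 10.0},
    {"id": 1, "date": "2024-05-01", "amount": 12.0},   # duplicate id
    {"id": 2, "date": "", "amount": 99.0},             # missing date
    {"id": 3, "date": "2024-05-02", "amount": -5.0},   # negative amount
]
print(quality_report(batch))
# → {'missing_field': 1, 'negative_amount': 1, 'duplicate_id': 1}
```

    In practice these checks would run inside the ETL pipeline and feed a monitoring dashboard rather than a print statement.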

    Qualifications

    • Experience in database, storage, collection, and aggregation models, techniques, and technologies, and how to apply them in business
    • Working knowledge of a source code control tool such as Git
    • Knowledge of file formats (e.g. XML, CSV, JSON), databases (e.g. Redshift, Oracle), and different types of connectivity is also very useful
    • Working experience with the following cloud platforms is a plus: Amazon Web Services, Google Cloud Platform, Azure
    • Working experience with data modeling, relational modeling, and dimensional modeling
    • The interpersonal skills: You have a way of speaking that engages your audience and instills confidence and credibility. You know how to leverage communication tools and methodologies. You can build relationships with internal and external team members, positioning yourself as a trusted advisor. You are always looking for ways to improve processes, and you always ensure your communications have been received and are clearly understood. Your commitment and focus influence those around you to do better.

    See more jobs at JLIConsulting

    Apply for this job

    +30d

    Data Center Design Engineer

    Cloudflare - Hybrid or Remote
    Jira, Design

    Cloudflare is hiring a Remote Data Center Design Engineer

    About Us

    At Cloudflare, we are on a mission to help build a better Internet. Today the company runs one of the world’s largest networks that powers millions of websites and other Internet properties for customers ranging from individual bloggers to SMBs to Fortune 500 companies. Cloudflare protects and accelerates any Internet application online without adding hardware, installing software, or changing a line of code. Internet properties powered by Cloudflare all have web traffic routed through its intelligent global network, which gets smarter with every request. As a result, they see significant improvement in performance and a decrease in spam and other attacks. Cloudflare was named to Entrepreneur Magazine’s Top Company Cultures list and ranked among the World’s Most Innovative Companies by Fast Company. 

    We realize people do not fit into neat boxes. We are looking for curious and empathetic individuals who are committed to developing themselves and learning new skills, and we are ready to help you do that. We cannot complete our mission without building a diverse and inclusive team. We hire the best people based on an evaluation of their potential and support them throughout their time at Cloudflare. Come join us! 

    Available Location: Lisbon, Portugal; London, UK; Singapore or Remote US 

    About the Role

    We are seeking a Data Center Design Engineer to design Cloudflare’s pending and future infrastructure deployments for generational improvements in cost, quality, and speed of deployment. We are looking for someone who excels at progressing many projects in parallel, managing dynamic day-to-day priorities with many stakeholders, and has experience implementing and refining data center design best practices in a high-growth environment. Getting stuff done is a must!

    The Data Center Strategy team is part of Cloudflare’s global Infrastructure (INF) team. The INF team grows and manages Cloudflare’s global data center/PoP footprint, enabling Cloudflare’s tremendous growth with compute, network, and data center infrastructure, utilizing an extensive range of global partner vendors.  

    What you get to do in this role:

    • Translate data center capacity requirements into actionable white space designs and/or rack plans within individual data center contract constraints for power and cooling
    • Manage implementation phase of cage projects with data center providers
    • Design low voltage structured cabling, fiber, cross-connect & conveyance infrastructure as well as any supporting infrastructure on the data center ceiling/floor
    • Work with supply chain team on rack integration plans and location deployment qualification
    • Work cross-functionally with Cloudflare data center engineering team and other internal teams (capacity planning, network strategy, security) to verify scope and solution and create repeatable standard installation procedures
    • Take ownership of and lead projects to design and implement data center expansions or new data centers on tight deadlines with minimal oversight 
    • Provide technical support in negotiations with external data center partners
    • Assist in RFP preparation, review and cost/engineering analysis
    • Review one-line diagrams and cooling equations for new and existing data centers (Data Center M&E)
    • Review and approve power components (PDUs) for the hardware sourcing team
    • Implement, document and maintain power consumption tracking tools and fulfil ad-hoc reporting requests
    • Research new and innovative power efficiency technologies and programs
    • Travel up to 25% to perform infrastructure audits, validate data center construction work and buildouts, and participate in commercial processes.
    • Other duties as assigned

    Requirements

    • Bachelor's degree or equivalent experience plus 5+ years of experience in data center mechanical and electrical design and operations/deployment/installation; P.E. certification or equivalent a plus
    • Experience in HVAC, Chilled Water Systems, Condenser Water Systems, Pump controls, Glycool/Glycols, AHU units (DX, split, RTU, CRAC, etc.), CRAH, Raised Floor Systems, HOT/COLD aisle containment and Building Management Systems
    • Understanding of basic electrical theory (voltage, current, power), basic circuit design & analysis, and single- and three-phase power systems
    • Familiarity with Data Center M&E infrastructure design concepts, electrical/UPS topologies, cooling methodologies (central plant, room cooling, high density thermal strategies)
    • Familiarity with industry standards for resilient Data Center design and Uptime Institute Tier Classifications
    • Excellent verbal, written communication and presentation skills
    • Experience working with multiple time zones and multiple cross-functional teams
    • Experience working on time sensitive projects with delivery responsibility under pressure

    Bonus Points

    • Degree in electrical/mechanical engineering or IT a plus
    • Experience in large-scale mission critical facility infrastructure design, construction, commissioning, and/or operations a plus
    • Experience with industry standards, building codes and safety standards including UMC, NFPA, ASHRAE, UBC, LEED, and Uptime Institute
    • Knowledge of programming languages a plus
    • JIRA, Confluence admin-level experience a plus
    • AutoCAD experience a plus
    • Experience with FLOTHERM or Tileflow a plus

    What Makes Cloudflare Special?

    We’re not just a highly ambitious, large-scale technology company. We’re a highly ambitious, large-scale technology company with a soul. Fundamental to our mission to help build a better Internet is protecting the free and open Internet.

    Project Galileo: We equip politically and artistically important organizations and journalists with powerful tools to defend themselves against attacks that would otherwise censor their work, technology already used by Cloudflare’s enterprise customers--at no cost.

    Athenian Project: We created Athenian Project to ensure that state and local governments have the highest level of protection and reliability for free, so that their constituents have access to election information and voter registration.

    1.1.1.1: We released 1.1.1.1 to help fix the foundation of the Internet by building a faster, more secure and privacy-centric public DNS resolver. This is available publicly for everyone to use - it is the first consumer-focused service Cloudflare has ever released. Here’s the deal - we never, ever store client IP addresses. We will continue to abide by our privacy commitment and ensure that no user data is sold to advertisers or used to target consumers.

    Sound like something you’d like to be a part of? We’d love to hear from you!

    This position may require access to information protected under U.S. export control laws, including the U.S. Export Administration Regulations. Please note that any offer of employment may be conditioned on your authorization to receive software or technology controlled under these U.S. export laws without sponsorship for an export license.

    Cloudflare is proud to be an equal opportunity employer. We are committed to providing equal employment opportunity for all people and place great value in both diversity and inclusiveness. All qualified applicants will be considered for employment without regard to their, or any other person's, perceived or actual race, color, religion, sex, gender, gender identity, gender expression, sexual orientation, national origin, ancestry, citizenship, age, physical or mental disability, medical condition, family care status, or any other basis protected by law. We are an AA/Veterans/Disabled Employer.

    Cloudflare provides reasonable accommodations to qualified individuals with disabilities. Please tell us if you require a reasonable accommodation to apply for a job. Examples of reasonable accommodations include, but are not limited to, changing the application process, providing documents in an alternate format, using a sign language interpreter, or using specialized equipment. If you require a reasonable accommodation to apply for a job, please contact us via e-mail at hr@cloudflare.com or via mail at 101 Townsend St. San Francisco, CA 94107.

    See more jobs at Cloudflare

    Apply for this job

    +30d

    Sr. Data Engineer - Data Analytics

    R.S.Consultants - Pune, India, Remote
    SQS, Lambda, Bachelor's degree, Scala, Airflow, SQL, Design, TypeScript, Python, AWS, Node.js

    R.S.Consultants is hiring a Remote Sr. Data Engineer - Data Analytics

    Job Description

    We are looking for a Sr. Data Engineer for an international client. This is a 100% remote job. The person will work from India and collaborate with a global team.

    Total Experience: 7+ Years

    Your role

    • Have key responsibilities within the requirements analysis, scalable & low latency streaming platform solution design, architecture, and end-to-end delivery of key modules in order to provide real-time data solutions for our product
    • Write clean scalable code using Go, Typescript / Node.js / Scala / python / SQL and test and deploy applications and systems
    • Solve our most challenging data problems, in real-time, utilizing optimal data architectures, frameworks, query techniques, sourcing from structured and unstructured data sources.
    • Be part of an engineering organization delivering high quality, secure, and scalable solutions to clients
    • Involvement in product and platform performance optimization and live site monitoring
    • Mentor team members through giving and receiving actionable feedback.

    Our tech. stack:

    • AWS (Lambda, SQS, Kinesis, KDA, Redshift, Athena, DMS, Glue, DynamoDB), Go/TypeScript, Airflow, Flink, Spark, Looker, EMR
    • A continuous deployment process based on GitLab

    A little more about you:

    • A Bachelor's degree in a technical field (e.g. computer science or mathematics)
    • 3+ years of experience with real-time, event-driven architecture
    • 3+ years of experience with a modern programming language such as Scala, Python, Go, or TypeScript
    • Experience designing complex data processing pipelines
    • Experience with data modeling (star schema, dimensional modeling, etc.)
    • Experience with query optimisation
    • Experience with Kafka is a plus
    • Experience shipping and maintaining code in production
    • You like sharing your ideas, and you're open-minded

    Why join us?

    • Key moment to join in terms of growth and opportunities

    • Our people matter; work-life balance is important

    • Fast-learning environment, entrepreneurial and strong team spirit

    • 45+ nationalities: cosmopolitan & multicultural mindset

    • Competitive salary package & benefits (health coverage, lunch, commute, sport)

    DE&I Statement: 

    We believe diversity, equity and inclusion, irrespective of origins, identity, background and orientations, are core to our journey. 

    Qualifications

    Hands-on experience in Scala/Python with data modeling and real-time/streaming data, including building complex data processing pipelines.

    BE/ BTech in Computer Science

    See more jobs at R.S.Consultants

    Apply for this job

    +30d

    Senior Data Engineer

    Balsam Brands - Mexico City, Mexico, Remote
    Postgres, SQL, Oracle, Design, API, MySQL, Python

    Balsam Brands is hiring a Remote Senior Data Engineer

    Job Description

    In this hands-on role as a Senior Data Engineer, your primary responsibility will be to partner with key business stakeholders, data analysts and software engineers to design and build a robust, scalable, company-wide data infrastructure to move and translate data that will be used to inform strategic business decisions. You will ensure performance, stability, cost-efficiency, security, and accuracy of the data on the centralized data platform. The ideal candidate will possess advanced knowledge and hands-on experience in data integration, building data pipelines, batch processing frameworks, and data modeling techniques to facilitate seamless data movement. You will collaborate with various technology and business stakeholders to define requirements and design and deliver data products that meet user needs. The candidate should demonstrate intellectual acumen, excel in engineering best practices, and have a strong interest in developing enterprise-scale solutions using industry-recognized cloud platforms, data warehouses, and data integration and orchestration tools.

    This full-time position reports to the Senior Manager, Data Engineering and requires in-office presence twice a week (Tuesdays and Wednesdays) to facilitate effective collaboration with both local and remote team members. Some flexibility in the regular work schedule is necessary, as most teams have overlapping hours in the early morning and/or early evening PST. Specific scheduling needs for this role will be discussed in the initial interview.

    What you’ll do:

    • Data Infrastructure Design: Develop and maintain robust, scalable, and high-performance data infrastructure to meet the company-wide data and analytics needs
    • Data Lifecycle Management: Manage the entire data lifecycle, including ingestion, modeling, warehousing, transformation, access control, quality, observability, retention, and deletion
    • Strategic Data Movement: Define and implement data integration strategies to collect and ingest various data sources. Design, build and launch efficient and reliable data pipelines to process data of different structures and sizes using Python, APIs, SQL, and platforms like Snowflake
    • Collaboration and Consultation: Serve as a trusted partner to collaborate with technical and cross-functional teams to support their data needs, address data-related technical issues, and provide expert consultation
    • Process Efficiency and Stability: Apply engineering best practices to streamline manual processes, optimize data pipelines, and establish observability capabilities to monitor and alert data quality and infrastructure health and stability
    • Innovative Solutions: Stay updated on the latest technologies and lead the evaluation and deployment of cutting-edge tools to enhance data infrastructure and processes
    • Coaching and Mentorship: Foster a culture of knowledge sharing by acting as a subject matter expert, leading by example, and mentoring others

    What you bring to the table:

    • Must be fluent in English, both written and verbal
    • 8+ years of professional experience in data engineering
    • Extensive hands-on experience designing and maintaining scalable, efficient, secure and fault-tolerant distributed databases on the Snowflake Cloud Data Platform; in-depth knowledge of cloud platforms, particularly GCP and Microsoft Azure
    • Proficient in designing and implementing data movement pipelines for diverse data sources including databases, external data providers, and streaming sources, for both inbound and outbound data workflows
    • Deep understanding of relational databases (SQL Server, Oracle, Postgres, and MySQL) with advanced SQL and Python skills for building API integrations, ETLs, and data models
    • Proven experience in building efficient and reliable data pipelines with comprehensive data quality checks, workflow management, and CI/CD integration
    • Excellent analytical thinking skills for performing root cause analysis on external and internal processes and data, resolving data incidents, and identifying opportunities for improvement
    • Effective communication skills for articulating complex technical details in simple business terms to non-technical audiences across various business functions
    • Strong understanding of coding standards, best practices, and data governance

    Location and Travel: At Balsam Brands, we believe that time spent together, in-person, collaborating and building relationships is important. To be considered for this role, it is preferred that candidates live within the Mexico City, Guadalajara, or Monterrey metropolitan areas in order to attend occasional team meetings, offsites, or learning and development opportunities that will be planned in a centralized location. Travel to the U.S. may be required for companywide and broader team retreats.

    Notes: This is a full-time (40 hours/week), indefinite position with benefits. Candidates must be Mexican nationals to be eligible for this position; this screening question will be asked during the application process. Velocity Global is the Employer of Record for Balsam Brands' Mexico City location, and you will be employed and provided benefits under their payroll. Balsam Brands has partnered with Velocity Global to act as your Employer of Record to ensure your employment will comply with all local laws and regulations and you will receive an exceptional employment experience.

    Benefits Offered:

    • Competitive compensation; salary is reviewed yearly and may be adjusted as part of the normal compensation review process
    • Career development and growth opportunities; access to online learning solutions and annual stipend for continuous learning
    • Fully remote work and flexible schedule
    • Collaborate in a multicultural environment; learn and share best practices around the globe
    • Government mandated benefits (IMSS, INFONAVIT, SAR, 50% vacation premium)
    • Healthcare coverage provided for the employee and dependents
    • Life insurance provided for the employee
    • Monthly grocery coupons
    • Monthly non-taxable amount for electricity and internet services
    • 20 days Christmas bonus
    • Paid Time Off: Official Mexican holidays and 12 vacation days (increases with years of service), plus additional wellness days available at start of employment 

     

     

    Qualifications

    See more jobs at Balsam Brands

    Apply for this job

    +30d

    Data Engineer

    Zensark Tecnologies Pvt Ltd - Hyderabad, India, Remote
    S3, EC2, NoSQL, Postgres, SQL, Oracle, Design, Java, Python, AWS

    Zensark Tecnologies Pvt Ltd is hiring a Remote Data Engineer

    Job Description

    Job Title:              Data Engineer

    Department:      Product Development

    Reports to:         Director, Software Engineering

     

    Summary:

    Responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. Supports our software developers, database architects, data analysts and data scientists on data initiatives, and ensures that optimal data delivery architecture is consistent throughout ongoing projects. Responsible for optimizing or even re-designing Tangoe’s data architecture to support our next generation of products and data initiatives.

     

    Responsibilities:

    • Create and maintain optimal data pipeline architecture.
    • Assemble large, complex data sets that meet functional / non-functional business requirements.
    • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater performance and scalability, etc.
    • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
    • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
    • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
    • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
    • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
    • Work with data and analytics experts to strive for greater functionality in our data systems.

     

    Skills & Qualifications:

    • 5+ years of experience in a Data Engineer role
    • Experience with relational SQL and NoSQL databases, including Postgres, Oracle and Cassandra.
    • Experience with data pipeline and workflow management tools.
    • Experience with AWS cloud services: S3, EC2, EMR, RDS, Redshift.
    • Experience with stream-processing systems: Storm, Spark-Streaming, Amazon Kinesis, etc.
    • Experience with object-oriented/object-function scripting languages: Python, Java, Node.js.
    • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
    • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
    • Strong analytic skills related to working with both structured and unstructured datasets.
    • Experience building processes supporting data transformation, data structures, metadata, dependency and workload management.
    • A successful history of manipulating, processing and extracting value from large disconnected datasets.
    • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
    • Strong project management and organizational skills.
    • Experience supporting and working with cross-functional teams in a dynamic environment.

     

     

    Education:

    • Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.

     

    Working conditions: 

    • Remote

     

    Tangoe reaffirms its commitment to providing equal opportunities for employment and advancement to qualified employees and applicants. Individuals will be considered for positions for which they meet the minimum qualifications and are able to perform without regard to race, color, gender, age, religion, disability, national origin, veteran status, sexual orientation, gender identity, current unemployment status, or any other basis protected by federal, state or local laws. Tangoe is an Equal Opportunity Employer -Minority/Female/Disability/Veteran/Current Unemployment Status.

     

    Qualifications

    • Bachelor’s degree in Computer Science, Engineering or a related subject

    See more jobs at Zensark Tecnologies Pvt Ltd

    Apply for this job

    +30d

    Data Engineer--US Citizens/Green Card

    Software Technology Inc - Brentsville, VA, Remote
    Lambda, NoSQL, SQL, Azure, API, Git

    Software Technology Inc is hiring a Remote Data Engineer--US Citizens/Green Card

    Job Description

    I am a Lead Talent Acquisition Specialist at STI (Software Technology Inc) and currently looking for a Data Engineer.

    Below is a detailed job description. Should you be interested, please feel free to reach me via call or email. Amrutha.duddula@ AT tiorg.com/732-664-8807

    Title:  Data Engineer
    Location: Manassas, VA (Remote until Covid)
    Duration: Long Term Contract

     Required Skills:

    • Experience working in Azure Databricks and Apache Spark
    • Proficient programming in Scala/Python/Java
    • Experience developing and deploying data pipelines for streaming and batch data from multiple sources
    • Experience creating data models and implementing business logic using the tools and languages listed
    • Working knowledge of Kafka, Structured Streaming, the DataFrame API, SQL, and NoSQL databases
    • Comfortable with APIs, Azure Data Lake, Git, notebooks, Spark clusters, Spark jobs, and performance tuning
    • Must have excellent communication skills
    • Familiarity with Power BI, Delta Lake, Lambda architecture, Azure Data Factory, and Azure Synapse a plus
    • Telecom domain experience is not required but is helpful

    Thank you,
    Amrutha Duddula
    Lead Talent Acquisition Specialist
    Software Technology Inc (STI)

    Email: amrutha.duddula@ AT tiorg.com
    Phone : 732-664-8807
    www.stiorg.com
    www.linkedin.com/in/amruthad/

    Qualifications

    See more jobs at Software Technology Inc

    Apply for this job