Airflow Remote Jobs

347 Results

2d

Principal Engineer - Cloud Data Platforms

Blue Orange Digital - Netherlands Remote
5 years of experience, 10 years of experience, terraform, scala, nosql, airflow, sql, Design, azure, java, docker, mysql, python, AWS

Blue Orange Digital is hiring a Remote Principal Engineer - Cloud Data Platforms

This is a full-time fully remote position. Must be willing to work within +/- 2 hours of the Central European Time Zone (CET).

Resume must demonstrate professional English ability.


Blue Orange is seeking a Principal Engineer to lead a data engineering team working on building data platforms in the cloud.

Our Platform Engineers require a diverse skill set including system administration, DevOps, infrastructure automation, data modeling, and workflow orchestration. Blue Orange builds enterprise data platforms and systems for a variety of clients, so this candidate should have experience with supporting modern data technologies. The ideal candidate will have experience with multiple data engineering technologies across multiple clouds and deployment scenarios. In particular, we’re looking for someone with strong technical skills in the areas of data orchestration (e.g. Airflow & dbt), data streaming (e.g. Apache Kafka, Spark, & Flink), and a variety of databases (e.g. MySQL, Snowflake & Cassandra).
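
For context on the orchestration skills named above, here is a minimal sketch of the Airflow-plus-dbt pattern this role describes: an Airflow DAG that triggers a dbt build once an ingestion step succeeds. The DAG id, ingestion callable, and dbt project path are hypothetical placeholders, not details from the posting.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator

    def ingest_orders():
        # Placeholder extract step: pull raw data into a warehouse landing area.
        pass

    with DAG(
        dag_id="daily_orders",  # hypothetical pipeline name
        start_date=datetime(2022, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        ingest = PythonOperator(task_id="ingest", python_callable=ingest_orders)
        transform = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt",  # hypothetical project path
        )
        ingest >> transform  # run dbt models only after ingestion succeeds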

Core Responsibilities & Impact

  • Lead one or more engineering teams working on high-impact projects for our most strategically important clients
  • Design reference architectures, implement PoCs and production-grade MVPs, write how-to documents and standard operating procedures, and develop the culture of the delivery team on your projects
  • Provide technical consulting to clients on architectural and design issues and guide them to successful implementation and deployment of data solutions
  • Provide leadership in applying software development principles and best practices, including Continuous Integration, Continuous Delivery/Deployment, Infrastructure as Code, and Automated Testing across multiple software applications.
  • Train and supervise software engineers and help support their career development
  • Work hands-on as needed to ensure the successful delivery of your projects

Skills & Qualifications

  • BA/BS degree in Computer Science or a related technical field, or equivalent practical experience
  • At least 5 years of experience building and supporting cloud data platforms on AWS, Azure, and/or GCP
  • At least 5 years of experience with streaming technologies such as Kafka, Flink, and Spark
  • Experience with modern data stack tools and technologies such as Airflow, dbt, & Terraform
  • At least 5 years of hands-on experience with database systems - on-prem & cloud, OLTP & OLAP, SQL & NoSQL
  • At least 10 years of experience writing code in modern programming languages (e.g. Python, Java, Node, Golang, etc.)
  • At least 5 years of experience working on distributed systems - architectural design, implementation, debugging and optimization
  • Experience with technical project delivery - managing scope and timelines, writing documentation, white-boarding skills, etc.
  • Experience working with clients and managing conflicts
  • Experience with SaaS data warehouses and Snowflake in particular
  • Advanced level Python, SQL, and Bash scripting
  • Experience designing and building robust CI/CD pipelines
  • Comfortable with Docker, configuration management, and monitoring tools
  • Experience with microservices-based architectures and deployment of containerized applications
  • Experience designing event-driven architectures and using event based data orchestration tools
  • Knowledge of best practices related to security, performance, and disaster recovery
  • Excellent verbal and written English communication
  • Experience with leading delivery teams - prioritizing features, making build vs buy decisions, optimizing time-to-value, etc.
  • Ability to interact with others using sound judgment, good humor, and consistent fairness in a fast-paced environment

Bonus Points:

  • Certifications for AWS, Airflow, dbt, or Snowflake
  • Experience programming with Scala
  • Broad understanding of distributed systems concepts (e.g. race conditions, CAP Theorem, Paxos algorithm, etc.)
  • Experience with CDC and database replication
  • Experience with Identity and Access Management (IAM) and IAM-integration
  • Experience with data observability tools and solutions

EEO Statement:

Blue Orange Digital is proud to be an equal opportunity employer. We do not discriminate in hiring or any employment decision based on race, color, religion, national origin, age, sex (including pregnancy, childbirth, or related medical conditions), marital status, ancestry, physical or mental disability, genetic information, veteran status, gender identity or expression, sexual orientation, or other applicable legally protected characteristics. Blue Orange Digital is committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures.

See more jobs at Blue Orange Digital

Apply for this job

10d

Lead Data Engineer

InnovateEDU - Remote, New York, United States
jira, terraform, airflow, sql, slack, docker, python, backend

InnovateEDU is hiring a Remote Lead Data Engineer

About InnovateEDU

InnovateEDU is a non-profit whose mission is to eliminate the opportunity gap by accelerating innovation in standards-aligned, next generation learning models and tools that serve, inform, and enhance teaching and learning. InnovateEDU is committed to massively disrupting K-12 public education by focusing on the development of scalable tools and practices that leverage innovation, technology, and new human capital systems to improve education for all students and close the opportunity gap.


About the Project

InnovateEDU strives to create real tooling and projects that greatly assist a school, district, or state in moving toward embracing data standards, a data-driven culture, and data interoperability. Landing Zone, a project at InnovateEDU, provides school districts with a comprehensive cloud-based data infrastructure through the implementation of an Ed-Fi Operational Data Store (ODS), a data mart for analytics in Google BigQuery, and the necessary data workflows in Apache Airflow to connect previously siloed, disparate educational data systems. Landing Zone simplifies the process a district must go through to implement an Ed-Fi ODS, connect Ed-Fi certified data sources, and consume non-Ed-Fi certified data once it has been aligned to the standard. This project has a heavy focus on data engineering, backend work, DevOps, and using data analytics tools to verify data.
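
As a rough illustration of the workflow Landing Zone describes (moving data from an educational data system into BigQuery via Airflow), here is a hedged sketch of a single task body. The Ed-Fi style endpoint, the bearer token handling, and the destination table are placeholders, not details from the project.

    import requests
    from google.cloud import bigquery

    def sync_attendance_to_bigquery():
        # Pull one resource from a hypothetical Ed-Fi style API.
        resp = requests.get(
            "https://example-district.example/data/v3/ed-fi/studentSchoolAttendanceEvents",
            headers={"Authorization": "Bearer <token>"},  # token acquisition elided
            timeout=30,
        )
        resp.raise_for_status()
        rows = resp.json()

        # Load the raw records into a placeholder BigQuery table.
        client = bigquery.Client()
        job = client.load_table_from_json(rows, "landing_zone.attendance_events")
        job.result()  # block until the load job finishes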


Who You Are

You are a mission-driven individual and believe in working to close the educational opportunity gap through the use of data and technical solutions. You are excited about bringing order to disparate data and writing data pipelines, and don’t mind being relentless in pursuing data accuracy. You’ve previously worked with SQL and Python and written code that interacts with APIs.


You are an optimistic problem-solver. You believe that together we can create real solutions that help the entire education sector move forward despite its complexity. You are excited to join a small but growing team working on an early-stage product and are looking forward to working on many different pieces of that product. You are open to feedback, bring your best every day, and are ready to grow in all areas of your work. You want to join a team of folks who share your vision for mission-driven work at the intersection of education and technology.  Finally, you know that sharing often is key to this work, and are ready to document everything that you do so that data people in schools everywhere can benefit. This is not a big data project; we have smaller amounts of data across many domains.  


Experience and Skills

You are a good fit if you:

  • Have worked as a data analyst or data engineer in the past and are familiar with validating data and tools like Google BigQuery and Google Data Studio
  • Have strong computer science fundamentals and experience with Python and specifically with Apache Airflow 
  • Have experience with dbt
  • Have experience with ETL and tools like Pandas and Jupyter Notebooks
  • Consider yourself as having a very high attention to detail
  • Have strong communication skills with both technical and non-technical people
  • Are passionate about making an impact in K-12 education
  • Are comfortable doing many different types of tasks and having to context switch between tasks relatively often
  • Are passionate about building the best version of whatever you’re working on
  • Are highly motivated to work autonomously, with strong organizational and time management skills


You’ll have an edge if you:

  • Have experience with and knowledge of Kubernetes, Docker, and Terraform
  • Have worked in K-12 education in the past


Responsibilities


The Lead Data Engineer’s primary professional responsibilities will include, but not be limited to:

  • Managing and supervising a team of engineers and data analysts
  • Establishing a team culture and communication cadence which includes daily standup, code reviews, and ensuring timely customer responses
  • Collaborating with the Customer Success Lead and team to ensure cohesion between engineering and implementation
  • Leading estimates and work scope development for custom engineering work for customers 
  • Mentoring, teaching, and aiding in the professional development of team members
  • Implementing and maintaining Landing Zone for new and returning customers
  • Leading the creation, troubleshooting, and maintenance of data processing pipelines in Apache Airflow (ETL work)
  • Running reports and exports in EdTech source systems as well as Landing Zone infrastructure to perform data validation checks and communicate those back to our customers
  • Maintaining Landing Zone documentation to ensure it is always up-to-date and reflective of how integrations function
  • Deploying code updates across the Landing Zone customer base
  • Leading the deployment of infrastructure on the Google Cloud Platform for new customers
  • Leading in the development of a historical/longitudinal data storage system (data warehouse)
  • Responding to customer support tickets (this is a shared responsibility on our team)
  • Working with internal systems such as JIRA, Asana, Slack to stay organized and ensure communication with team members
  • Other duties as assigned 


What to expect in the hiring process:

  • An introductory phone call with a Manager
  • A coding project that will take about 2 hours. This will be in Python and be related to processing data
  • A project review and feedback call with the team 
  • Final round interviews, likely including our Executive Director


The range for this position will be $110,000 to $148,000.  Salary is commensurate with education and experience.  


Application Instructions

Please submit an application on this platform. Applications without both a resume and cover letter will not be considered.



See more jobs at InnovateEDU

Apply for this job

10d

Senior Data QA Engineer

Falkon AI - Seattle, WA Remote
marketo, airflow, sql, salesforce, qa, python, backend

Falkon AI is hiring a Remote Senior Data QA Engineer

At Falkon we're building a revenue growth platform. Our mission is to transform business operations through insights and automation.

Our target customers are revenue growth teams (sales, marketing, customer success and customer support). Falkon's capabilities help them understand their prospect and customer behavior so they can market and sell more effectively to those customers. By combining data analytics with data science, our platform provides powerful tools to solve use cases like:

  • Understanding what marketing/sales tactics are working and what are not
  • Understanding what content is leading to activated, converted and engaged customers
  • Understanding where the sales pipeline is coming from - and what revenue is forecasted for the month and quarter
  • Understanding what rep behaviors are truly making a difference to the sales pipeline
  • Understanding whether the company is on track to meet its revenue targets, and drilling down into issues that are holding back revenue
  • Targeting the most likely candidate customers for deal expansion
  • And many more...

Our team consists of product, engineering and research veterans from Microsoft, Amazon, Dropbox, Amperity and Zulily.

We are looking for a results-oriented data QA engineer to join the Falkon team and help us deliver the platform to our customers. The day-to-day work includes:

  • Writing SQL queries and data transforms that bring customer data into the Falkon platform
  • Performing data validation - ensuring data in Falkon matches data in customer's systems.
  • Validating implementations and finding/fixing bugs in SQL queries and data transforms
  • Supporting the CSM team in responding to ongoing customer requests for new metrics and reports.
  • Making improvements to existing metrics, reports, DBT data transforms and SQL queries.
  • Writing Python scripts to automate data validation, data transforms and customer implementations.
  • Scripting SQL queries using Jinja (see the sketch below).
  • Improving the performance of complex SQL queries running in Snowflake.

Our backend infrastructure is extremely modern and fully containerized on top of Kubernetes. We're big fans of ELT and use tools like DBT and Airflow to power our data pipelines.
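
To make the Jinja item above concrete, here is a minimal sketch of rendering a parameterized metric query with the jinja2 library; the template, schema, and column names are invented for illustration.

    from jinja2 import Template

    # Hypothetical per-tenant metric query; {{ ... }} slots are filled at render time.
    METRIC_SQL = Template("""
    select date_trunc('{{ grain }}', created_at) as period,
           count(*) as {{ metric_name }}
    from {{ tenant_schema }}.opportunities
    group by 1
    order by 1
    """)

    sql = METRIC_SQL.render(
        grain="month", metric_name="new_opportunities", tenant_schema="acme_prod"
    )
    print(sql)  # the rendered query would then run against the warehouse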

What you will do

  • Partner with CSM to implement customer tenants and respond to customer requests
  • Debug/fix problems in SQL queries and data models
  • Validate customer implementations for data correctness
  • Optimize data pipelines and speed up implementations.
  • Build new data transforms and improve existing data transforms.
  • Help shape Falkon's culture, and build the workplace of your dreams.

What you will need to succeed

  • Extremely comfortable writing/debugging complex SQL queries that run in data warehouses
  • Very comfortable writing data transformations using tools like DBT
  • Experience scripting SQL using Jinja, and writing automation scripts/tools in Python
  • Experience improving the performance of complex/expensive SQL queries.
  • Ability to work on customer-facing projects, delivering data implementations on a deadline
  • Solid computer science fundamentals.
  • A willingness to put in the hard work it takes to make a startup successful.
  • Alignment on Falkon principles - Think big, Deliver results with urgency, Be radically transparent, Follow the golden rule, Get better every day

Very nice-to-haves:

  • Prior experience working with data from revenue tools like Salesforce, Hubspot and Marketo is a strong plus.
  • Experience working with large data ingestion/processing pipelines.
  • Experience working with Redshift, Snowflake and Bigquery

If you're interested in rapid career growth, there is no better place to be than Falkon.

Growth comes from Impact x Learning

At Falkon you'll do your best work, develop new skills, learn from the best, discover what technical areas you're truly passionate about and help our customers grow their businesses. If you're interested in starting your own business, you'll get the opportunity to see how a venture-funded business is built from the ground up. As an early and critical member of our growing team you will help shape our business, our processes and our culture.

See more jobs at Falkon AI

Apply for this job

15d

Senior Data Engineer

Playvox - Australia Remote
airflow, sql, Design, mongodb, docker, kubernetes, python, AWS

Playvox is hiring a Remote Senior Data Engineer

We believe that a great customer experience starts with people! Playvox provides cloud native, digital first Workforce Engagement Management software comprising seven modules - Workforce Management, Quality, Coaching, Performance, Learning, Motivation and Customer AI. Our solutions are designed to get the very best out of your workforce, while improving every part of your customer and employee engagement.

Due to rapid company growth, our Australia-based team is now looking for a like-minded Data Engineer to join us as a key contributor during this exciting growth phase. Our Aussie team is small but fast evolving and pulls together to produce and deliver amazing client outcomes.

About the role

This is a key role in our continued focus in data and machine learning to build world leading solutions in WFM. The successful candidate will be involved in the design, creation, and maintenance of a scalable platform for training, deploying, and monitoring our ML models. Success is measured by enabling our ML and Feature teams to build scalable, reliable, and easy-to-use machine learning workflows that underpin the customer experience.

You will be empowered to show your innovative thinking both at the application and infrastructure level, making this a challenging and dynamic role. We’re looking for someone who is a proactive thinker and who loves to solve problems. Someone who takes accountability, builds trust and easily networks with people of all backgrounds. This role can be located in Sydney or elsewhere in Australia for the right applicant.

Role description

As a Data Engineer, you will:

  • Partner with ML engineers, data scientists/engineers, and product engineering teams to understand business needs, find the right solution to a problem, and ship products.
  • Apply your systems software experience and collaborate with BI and ML engineers to build scalable, reliable, and easy-to-use machine learning workflows and deploy them in live environments.
  • Architect, create, and maintain real-time data streaming from numerous sources for training, deploying, and monitoring ML models, microservices, and APIs in AWS environments (see the sketch after this list).
  • Work closely with other teams to ensure that applications that require ML services work seamlessly.
  • Identify technical requirements and deliver solutions within a distributed team.
  • Automate steps in the ML workflow, from model training to inference and deployment.
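
As a small sketch of the real-time streaming responsibility above, the snippet below polls records from an AWS Kinesis stream with boto3. The stream name and payload shape are assumptions, and checkpointing and error handling are omitted for brevity.

    import json
    import time

    import boto3

    kinesis = boto3.client("kinesis")
    STREAM = "ml-feature-events"  # hypothetical stream name

    # Read from the first shard only; a real consumer would track every shard.
    shard_id = kinesis.describe_stream(StreamName=STREAM)["StreamDescription"]["Shards"][0]["ShardId"]
    iterator = kinesis.get_shard_iterator(
        StreamName=STREAM, ShardId=shard_id, ShardIteratorType="LATEST"
    )["ShardIterator"]

    while True:
        out = kinesis.get_records(ShardIterator=iterator, Limit=100)
        for record in out["Records"]:
            event = json.loads(record["Data"])  # assumes JSON-encoded payloads
            print(event)
        iterator = out["NextShardIterator"]
        time.sleep(1)  # stay under Kinesis per-shard read limits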

Requirements

We are looking for experience in the following skills and qualifications:

  • BS/MS degree in Software Engineering, Computer Science, Information Technology discipline or a related field.
  • At least 3 years of hands-on experience as a Data Engineer or in similar positions engineering or implementing solutions on AWS (in lieu of a degree, a minimum of five years of related work experience).
  • Strong hands-on design and development background using AWS to build data-intensive infrastructure, especially for machine learning applications
  • Strong AWS experience (S3, Glue, Athena, Kinesis, RDS)
  • Apache Airflow experience, or the desire to learn it
  • Strong experience with MongoDB or at least document databases. Extra points if you have used MongoDB Atlas
  • Solid knowledge of SQL, databases, data warehousing, ETL and other data tools
  • Proven experience with Data Lake concepts and relational database concepts
  • Experience building and maintaining real-time data streaming from numerous sources for low-latency data processing
  • Proficient in Python. Solid experience writing and maintaining high-quality production code in Python and other scripting languages
  • Hands-on experience with development resources: GitHub, containerization, and deployment tools
  • Experience with systems engineering and software development using continuous integration and delivery.

The following skills will be considered a plus:

  • CI/CD automation with Python
  • Experience with data visualization tools (e.g., Datadog)
  • Experience with Docker, Kubernetes or similar orchestration and configuration management
  • Knowledge of IAM Roles and Security Groups within AWS
  • Experience with building APIs (Flask)
  • Familiarity working with Data Science teams and productionizing machine learning models
  • Good documentation and communication skills
  • Experience with distributed systems and microservice architecture

We encourage you to apply even if you may not meet every requirement in this posting. We value diversity and our environment is supportive, challenging and focused on the consistent delivery of high quality, meaningful work.

Why join Playvox?

In this fast-paced period of growth, it is genuinely an exciting time to be a Playvoxer. We are a supportive, high energy global collective that loves to celebrate wins, lift each other up and recognise each other’s contributions. We strive for excellence in every interaction, all whilst enjoying the little things along the way.

A few of our Playvoxer perks include:

  • Training and learning opportunities
  • Monthly wellness hours program
  • Complete remote working
  • Additional paid leave for your birthday and Playvoxsary (work anniversary)

If you’re ready to contribute to a driven and supportive team through this challenging yet rewarding opportunity, we’d love to hear from you! APPLY TODAY!

Please note: Due to high volume of applications, we will be contacting shortlisted candidates only

See more jobs at Playvox

Apply for this job

16d

Senior Data Engineer

Falkon AI - Seattle, WA Remote
marketo, airflow, sql, salesforce, Design, python, backend

Falkon AI is hiring a Remote Senior Data Engineer

At Falkon we're building a revenue growth platform. Our mission is to transform business operations through insights and automation.

Our target customers are revenue growth teams (sales, marketing, customer success and customer support). Falkon's capabilities help them understand their prospect and customer behavior so they can market and sell more effectively to those customers. By combining data analytics with data science, our platform provides powerful tools to solve use cases like:

  • Understanding what marketing/sales tactics are working and what are not
  • Understanding what content is leading to activated, converted and engaged customers
  • Understanding where the sales pipeline is coming from - and what revenue is forecasted for the month and quarter
  • Understanding what rep behaviors are truly making a difference to the sales pipeline
  • Understanding whether the company is on track to meet its revenue targets, and drilling down into issues that are holding back revenue
  • Targeting the most likely candidate customers for deal expansion
  • And many more...

Our team consists of product, engineering and research veterans from Microsoft, Amazon, Dropbox, Amperity and Zulily.

We are looking for a results-oriented data engineer who can design, develop and scale the internal data systems that power Falkon's core product. The day-to-day work includes:

  • bringing customer data into Falkon via various data pipelines
  • building SQL and Python data transforms for customer data, including DBT transforms
  • implementing customer tenants
  • writing and updating SQL queries that power metrics in a customer tenant
  • scripting SQL queries using Jinja
  • building programmable data ingestion and processing pipelines that automate data ingestion and extraction (see the sketch below)
  • writing Python scripts to automate tenant implementations
  • building infrastructure components to quickly process and transform terabytes of data
  • improving the performance of large data processing tasks
  • and many others

Our backend infrastructure is extremely modern and fully containerized on top of Kubernetes. We're big fans of ELT and use tools like DBT and Airflow to power our data pipelines.
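
To give one concrete shape to the programmable ingestion work in the list above, here is a hedged sketch that pages through a cursor-based REST API and lands each raw batch in S3 for downstream ELT. The endpoint, bucket, key layout, and response fields are assumptions, not Falkon internals.

    import json

    import boto3
    import requests

    s3 = boto3.client("s3")
    BUCKET = "example-raw-landing"  # hypothetical bucket

    def ingest(endpoint: str, tenant: str) -> None:
        # Page through a cursor-based API, writing each page to S3 as JSON.
        cursor, page = None, 0
        while True:
            params = {"cursor": cursor} if cursor else {}
            resp = requests.get(endpoint, params=params, timeout=30)
            resp.raise_for_status()
            body = resp.json()
            s3.put_object(
                Bucket=BUCKET,
                Key=f"{tenant}/opportunities/page={page:05d}.json",
                Body=json.dumps(body["records"]),
            )
            cursor, page = body.get("next_cursor"), page + 1
            if not cursor:  # no further pages
                break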

What you will do

  • Partner with CSM to implement customer tenants
  • Design, implement and run Falkon's data pipelines
  • Design data models that represent the standard shape of data ingested by Falkon
  • Implement data transforms for data to conform to the data models
  • Build the large-data processing infrastructure that enables Falkon to scale horizontally.
  • Build the next generation revenue growth platform from scratch.
  • Help shape Falkon's culture, and build the workplace of your dreams.

What you will need to succeed

  • Extremely comfortable writing complex SQL queries that run in data warehouses
  • Very comfortable writing data transformations using tools like DBT
  • Experience scripting SQL using Jinja, and writing automation scripts/tools in Python
  • Ability to work on customer-facing projects, delivering data implementations on a deadline
  • Ability to operate with autonomy in highly ambiguous situations.
  • Solid computer science fundamentals.
  • A willingness to put in the hard work it takes to make a startup successful.
  • Alignment on Falkon principles - Think big, Deliver results with urgency, Be radically transparent, Follow the golden rule, Get better every day

Very nice-to-haves:

  • Prior experience working with data from revenue tools like Salesforce, Hubspot and Marketo is a strong plus.
  • Experience building very large data ingestion/processing pipelines.
  • Experience developing/deploying/running metrics processing and monitoring systems


If you're interested in rapid career growth, there is no better place to be than Falkon.

Growth comes from Impact x Learning

At Falkon you'll do your best work, develop new skills, learn from the best, discover what technical areas you're truly passionate about and help our customers grow their businesses. If you're interested in starting your own business, you'll get the opportunity to see how a venture-funded business is built from the ground up. As an early and critical member of our growing team you will help shape our business, our processes and our culture.

See more jobs at Falkon AI

Apply for this job

20d

Director Data Engineering

Domestika - Spain Remote
scala, airflow, sql, Design, python, AWS

Domestika is hiring a Remote Director Data Engineering

At Domestika we are looking for a Director of Data Engineering - Remote.

Domestika is the fastest-growing creative community where the best creative experts share their knowledge and skills through professionally produced online courses.

It all started as an online forum and a small but dynamic showcase of creative professionals, designed to help them connect and learn from each other.

Inspired by their thriving community, Domestika widened its reach by producing online courses for anyone interested in unleashing their creative potential and connecting with like-minded creatives from around the world.

Domestika carefully curates the teacher roster and produces all the courses in-house, to ensure a high-quality online learning experience for everyone. Today, the online community is home to over 8 million people from around the world who are curious and passionate about learning new creative skills.


Functions

• Lead Domestika's data engineering team
• Build the long-term vision of Domestika's data infrastructure
• Lead the architectural design and implementation of our growing data platform, covering all development life cycles
• Implement and monitor best in class security measures in our data warehouse and analytics environment.
• Orchestrate, design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
• Design, implement and maintain robust data pipelines to support our reporting needs.
• Work with Product, Data and Marketing teams to assist with data-related technical issues and support their data infrastructure needs.
• Work with data and analytics experts to strive for greater functionality in our data systems.
• Mentor less experienced team members on their day-to-day responsibilities.


Requirements

• Minimum of 6 years of experience in data engineering.
• 2+ years of experience leading a data engineering team.
• Proven record of architecting large-scale, highly available, fully monitored Big Data infrastructures.
• Experience working on AWS with technologies such as S3, Redshift, Spectrum, Athena, DataLake Formation, Glue, EMR, etc.
• Working experience with Spark using Python or Scala, as well as workflow management tools like Airflow, Luigi, etc. (see the sketch after this list).
• Advanced working SQL knowledge and query optimization.
• Strong analytic skills and experience working with unstructured datasets.
• Experience in building processes supporting data transformation, data structures, metadata, dependency and workload management.
• A successful history of manipulating, processing and extracting value from disconnected datasets.
• Working knowledge of message queuing, stream processing, and data stores.
• A professional level of written and spoken English is a must.
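
As a minimal illustration of the Spark-with-Python experience asked for above, the sketch below computes a small daily aggregate with PySpark; the S3 paths and column names are placeholders.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("course-engagement").getOrCreate()

    # Hypothetical event data; the path and schema are placeholders.
    events = spark.read.parquet("s3://example-bucket/events/")

    daily = (
        events
        .withColumn("day", F.to_date("event_ts"))
        .groupBy("day", "course_id")
        .agg(F.countDistinct("user_id").alias("active_learners"))
    )

    daily.write.mode("overwrite").parquet("s3://example-bucket/marts/daily_engagement/")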

What do we offer?

- Working in one of the leading companies in the creative industry.

- A fun, creative, collaborative, and multicultural team.

- Domestika pro-account to have access to all of our courses for free.

- Possibility to work remotely.

- Private health insurance.

- A salary according to your experience and profile.

See more jobs at Domestika

Apply for this job

29d

Software Engineer - Data Engineering

Unbounce - Remote
terraform, nosql, airflow, sql, docker, kubernetes, linux, AWS

Unbounce is hiring a Remote Software Engineer - Data Engineering


See more jobs at Unbounce

Apply for this job

+30d

ETL Developer

airflow, Design, postgresql, linux, python, AWS

SimplyAnalytics is hiring a Remote ETL Developer


See more jobs at SimplyAnalytics

Apply for this job

+30d

Data Engineer

Tutela Technologies - Boston, Massachusetts, United States
airflow, sql, Design, python, AWS

Tutela Technologies is hiring a Remote Data Engineer

Description

Comlinkdata is looking for a Data Engineer to join our Engineering team. If you have a passion for solving difficult problems, a desire to continue learning and strong programming fundamentals, then we want to speak with you.


As a Data Engineer, you’ll join a team focused on building data pipelines to support new and existing products as well as optimizing existing processes and integrating new data sets. The candidate will be involved in all aspects of the software development life cycle, including gathering business requirements, analysis, design, development and production support. The successful candidate will be responsible for implementing and supporting highly efficient and scalable MSSQL and Python processes. The developer should be able to work collaboratively with other team members, as well as users for operational support. The candidate must be focused, hard-working and self-motivated, and enjoy working on complex problems.


Responsibilities (including but not limited to):

  • Support, maintain and evolve existing data pipelines utilizing MSSQL, SSIS, and Python.
  • Implement business rule changes and enhancements in existing data pipelines.
  • Automate existing processes.
  • Document data mappings, data dictionaries, processes, programs and solutions as per established standards for data governance.
  • Troubleshoot data issues and defects to determine root cause.
  • Perform job monitoring, root cause analysis and resolution, and support production processes.
  • Perform tuning of SQL queries, and recommend and implement query tuning techniques
  • Recommend corrective action when necessary to improve performance, capture exceptions, and maintain scheduled jobs.
  • Assist in data migration needs of the department and company where applicable.
  • Develop ETL technical specifications, design, code, test, implement, and support optimal data solutions.
  • Create new pipelines in SQL / Python supporting new product development (see the sketch after this list).
  • Design and develop SQL Server stored procedures, functions, views, transformation queries and triggers.
  • Take directions and complete tasks on-time with minimal supervision.
  • Recommend backup strategies for all data migration projects/applications.
  • Interact with IT management regarding work assignments and report status.
  • Simultaneously work on multiple projects with changing priorities.
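
For a sense of what one small SQL / Python pipeline step might look like here (see the bullet above on new pipelines), this sketch calls a SQL Server stored procedure from Python with pyodbc; the connection string and procedure name are invented for illustration.

    import pyodbc

    # Placeholder connection details; a real pipeline would read these from config.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=example-server;DATABASE=warehouse;Trusted_Connection=yes;"
    )
    try:
        cursor = conn.cursor()
        # Hypothetical transformation step wrapped in a stored procedure.
        cursor.execute("EXEC dbo.usp_load_daily_usage ?", "2022-06-01")
        conn.commit()
    finally:
        conn.close()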


Candidate Profile

Required Skills:

  • Minimum of 5-6 years of SQL development experience, including design, development, testing, implementation, and maintenance of SQL Server processes on both AWS and on-prem servers.
  • Basic familiarity with Python. 
  • Solid experience developing complex SQL statements, T-SQL wrappers & procedures, SSIS, functions, views, triggers, etc.
  • Excellent query-writing skills.
  • Strong knowledge of ETL and Data Warehouse development best practices.
  • Experience working with high-volume databases.
  • Experience in data mapping, data migration and data analysis.
  • Ability to work independently with minimal supervision.
  • Strong analytical and problem-solving skills.
  • Excellent verbal, written and interpersonal skills.
  • Ability to function effectively in a fast paced, team-oriented work environment.
  • Willingness to adapt and learn new technologies and methodologies.

 

Desired Skills:

  • Familiarity with Redshift, Aurora, Airflow, etc is highly desired.
  • Experience with AWS Environment or similar Cloud Services is highly preferred.
  • Strong data analysis and troubleshooting skills.
  • Production support experience.
  • Experience in the Telecom/cable industry.


About Us

Three leaders in the telecom industry - Comlinkdata, Tutela, and Opensignal - have recently come together to accelerate their mission of advancing connectivity for all. As a trifecta, we’ve created data and analytics solutions that deliver new levels of understanding about customers’ true network experience and what is really happening in the market, so communications providers and other industry stakeholders can constantly optimize their actions.

Our solutions already create extraordinary outcomes for many top communications brands around the world and will create even more exciting opportunities as we bring them together in the future, uniquely enabling operators to link their network experience and market performance in a way that has never before been possible.

With offices in London, Boston and Victoria, British Columbia, we are truly international, with employees working across four continents and representing over 25 nationalities. 


See more jobs at Tutela Technologies

Apply for this job

+30d

Senior Data Engineer

Leap - Remote
tableau, airflow, sql, Design, python

Leap is hiring a Remote Senior Data Engineer

About Leap:

Leap is building the world's largest network of branded retail stores – powered by data, systems, and scale.  The Leap Platform enables brands to deploy stores that work in concert with ecommerce more rapidly and at significantly reduced cost and risk.  Brand stores powered by Leap bring modern brands to life with compelling, immersive customer experience and data driven operations. 

At Leap, our diverse, growing team is excited by the opportunity to power the next generation of leading consumer brands with a vibrant presence in local communities throughout the country.  We're one of the fastest growing companies in the retail/ecommerce space - since launch we've powered stores for dozens of brands, and we're adding more brands and stores each week.

Our brand customers are modern brands who lead or aspire to lead their categories today and tomorrow, and *outstanding* people are literally at the core of our product.  Our organization is composed of a diverse range of talented individuals and teams.  With functions like Real Estate, Store Design & Development, Retail Operations, Marketing, Engineering, Product Management and Data Science, we're a truly unique company and our shared ambitions and core values tightly align and drive us to succeed.

Our staff are what make our organization so special and honoring our culture and values as we hire, onboard, engage, develop and support our teams is paramount.

Come take this leap with us. Your ideas, thinking, and voice are wanted.

Senior Data Engineer

Mission for the Position:

The Analytics Engineering team effectively and sustainably provides data solutions and tools across the organization. We integrate, transform, and improve volumes of data at the project or enterprise level for streamlined processes, greater efficiencies, and smarter, more informed decision-making. This team is high-energy, dynamic and in a business-critical domain space. This role is an opportunity to make a real difference in the data space, and we need confident, experienced people eager to bring in solutions, with a demonstrated ability to learn fast and make that happen.

Key Responsibilities:

You'll work closely with our business teams to make sure the most awesome, high-potential brands join the Leap platform. You'll also help analyze our existing brands and store operations to make sure we're maximizing and optimizing their performance. You'll pair regularly with analytics and engineering to make sure that common analyses get automated, so that we can spend our collective brainpower on the new and unusual.

What You'll Do:

  • Support the data needs of Analytics, Machine Learning, and Business.
  • Lead technical initiatives by architecting the solution and collaborating with team members and peers to execute the solution
  • Architect, design, implement, and maintain multi-layered SQL and Python processes
  • Design flexible and scalable data models
  • Enhance the tooling and frameworks to support complex Extract Transform Load (ETL) processes
  • Troubleshoot discrepancies in existing databases, data pipelines, warehouses, and reporting
  • Function as mentor and adviser for team members
  • Advise on Best Practices and innovative designs/solutions

Our Ideal Candidate Has:

  • A Bachelor’s or Master’s degree in Computer Science, Engineering, or equivalent experience.
  • 5+ years of previous experience in data engineering with a focus on database-related technologies
  • Expert technical knowledge of SQL and database-related technologies.
  • 2+ years of experience working with Cloud Data Warehouse Technologies such as BigQuery, Snowflake, or Redshift.
  • 1+ years of experience working with Python, dbt, Airflow or other workflow orchestration frameworks.
  • Deep understanding of relational database modeling principles and techniques.
  • Experience architecting, designing, and implementing ETL solutions with peers and stakeholders.
  • Experience with data streaming technologies (Kafka, Kinesis, Apache Flink, Apache Beam, etc).
  • Experience with data visualization technologies such as Looker, Tableau, or Microstrategy.
  • Experience supporting organizations using Machine Learning.

In your first month, you will:

  • Go to our stores, ask questions, try on products, buy products, return products, and experience being a customer first-hand
  • Plunge into our existing databases to answer straightforward analytics questions
  • Pair with our engineers to understand our existing ETLs and data 
  • Communicate constantly with an encouraging, supportive team

Leap EEO Statement

However you identify, whatever your path to get here: Leap celebrates diversity and is committed to maintaining a safe, rewarding, and inclusive environment where Leapers thrive individually and as a team. To achieve our mission of building the world's largest network of branded retail stores – powered by data, systems, and scale – we need to work hard to foster a diverse community to support the brands and customers we serve. These aren't just words; this is who we are. We know that our differences are what make our organization special and are paramount to our culture. Your age, skin color, beliefs, sexual orientation, nationality, disability, parental status, vet status, and gender identity are valued.

See more jobs at Leap

Apply for this job

Vera Institute of Justice is hiring a Remote Associate Director, Information Technology and Security, Acacia Center for Justice (Remote)

 

Who we are:

The Vera Institute of Justice, founded in 1961, is a non-profit criminal justice organization that strives to build just government institutions and safe communities free from the scourge of racism, white supremacy, profit, and inequity that is pervasive in this country’s legal systems. Vera is an “inside” lane organization that drives change at scale with ambitious public sector leaders who share our commitment to building anti-racist, reparative systems that deliver justice.  Vera is committed to securing equal justice, ending mass incarceration, and strengthening families and communities.

About Acacia Center for Justice:

The Acacia Center for Justice is a new non-profit created through a collaboration between the Vera Institute of Justice (“Vera”) and the Capital Area Immigrants’ Rights (“CAIR”) Coalition. The CAIR Coalition is a non-profit organization focused on providing legal assistance to adult and child immigrants detained by the government in the Capital Region. CAIR adheres to the fundamental belief that all people – no matter their story – deserve to be free, safe, supported, and have access to a just legal system. 

The objective of the Acacia Center for Justice (“Acacia”) is to expand on Vera’s work over the past twenty years in providing legal support and representation to immigrants facing deportation through the development, coordination and management of national networks of legal services providers serving immigrants across the country. Acacia’s goals are two-fold: to support immigrant legal services and defense networks to provide exceptional legal services to immigrants and to advocate for the expansion of these programs and the infrastructure critical to guaranteeing immigrants access to justice, fairness and freedom. Acacia will focus the collective power of both Vera and CAIR on delivering accountable, independent, zealous and person-centered legal services and representation to protect the rights of all immigrants at risk of deportation.

Please note: This career opportunity will begin as a position with the Vera Institute of Justice that will transition to Acacia Center for Justice on or before July 1, 2022.

Who you are:

The Associate Director of ITS will serve an essential function in establishing and implementing foundational IT and security policies and procedures from the early stages of organizational development. This will be a hands-on position, contributing to the strategy development of technology system implementation to support a fully remote staff and ensuring the stability, integrity, and sustainability of Acacia’s infrastructure and services. They will partner closely with internal teams, maintaining fluency in institute initiatives and strategic aims to ensure that the necessary technology-based solutions are established and that appropriate resources are in place. They will serve as the central point of contact for external technology vendors, including a Managed Services Provider. The Associate Director of ITS provides direction to ensure that Acacia’s race equity and inclusion priorities are a core aspect of all work, including in hiring and staffing, and in external and internal communications.

Responsibilities include, but are not limited to:

Endpoint management

  • Serve as primary point of contact for all technology-related vendors, providing direction in day-to-day support, including with a Managed Service Provider;
  • Coordinate on-boarding of new staff, including acquisition of endpoint software and hardware as well as development and administration of training and documentation resources;
  • Coordinate off-boarding of exiting staff, including revoking all hardware and software system access and maintaining documentation of access management processes;
  • Evaluate and implement security protocols for the network, data, and storage in coordination with legal, research, finance, and programs teams;
  • Develop and maintain policies and procedures regarding system back-up, restore protocol, and disaster recovery;
  • Track and maintain licensing and accounts for all cloud-based services;
  • Handle vendor procurement and contract review in coordination with end users;
  • Maintain current knowledge of hardware and software to assist with the planning and procurement of new or upgraded solutions;
  • Maintain accountability for necessary record retention to ensure contract compliance;
  • Oversee the management and support of the company’s smartphone and mobile device portfolio.

Strategic project management

  • Manage technology system implementation and development projects in partnership with key internal and external stakeholders;
  • Establish operating policies and provide consultative support to internal staff and external vendors to solve technology migration, management, and integration challenges;
  • Handle vendor procurement and contract review in coordination with end users.

Team leadership, mentorship, and development

  • Participate in planning and development of the Information Technology and Security (ITS) team, including identifying needs (people, process, and technology solutions);
  • Supervise, support, and mentor ITS team members across roles and levels, including DevOps and technology and personnel security roles;
  • Strategize and build opportunities for professional growth, development, and skill building among supervisees;
  • Lead recruitment and hiring for the ITS team.

What qualifications do you need?

Required:

  • Prior supervision experience
  • Demonstrated proficiency with LAN, WAN design and implementation (network engineering) and network administration required;
  • Strong hands-on technical knowledge of PC operating systems, networking software & hardware, cabling & protocols (DHCP, DNS, FTP, TCP/IP, VOIP, etc.);
  • Experience in web conferencing, network switch configuration, wireless networks, email, smartphone & firewall administration;
  • Experience with scripting, virtualization and PC imaging;
  • Experience implementing cybersecurity controls and tools, with preference for experience with federal contracting standards such as NIST and FISMA standards;
  • Experience supporting a data science technology stack (including GitHub, the AWS data product ecosystem, secure data transmission protocols, etc.)
  • Ability to work in a dynamic fast-paced environment, managing and navigating cross-functional priorities and complex requirements.

Preferred:

  • Advanced degree in information technology, computer science, or a related field + 9-12 years of experience in information security or risk management.  In lieu of a degree, applicable work or life experience may be considered.
  • Certifications in one or more of the following: VMWARE, CCNA, MCSA, MCSE, or MCP
  • Working knowledge of the AWS data product ecosystem (e.g., S3, EC2, ECS, Athena, Redshift, Glue, Airflow, etc.) or similar cloud-based products to support users of an end-to-end data science stack 
  • Prior experience in an IT consulting capacity that required fast-paced landscape assessment and recommendation development
  • Prior experience with direct supervision

Additional eligibility requirements:

  • In accordance with federal contracts, a National Crime Information Center (NCIC) check is required for this position. While this is not a current hiring requirement, this position may require Electronic Questionnaire for Investigations Processing (e-QIP) security clearance in the future (for additional information: https://www.dcsa.mil/is/eqip/).

List of Required Software Applications:

  • Experience managing Office 365 and enterprise products (for example, SharePoint, OneDrive and Exchange)
  • VOIP
  • Veeam
  • SAAS integrations and management
  • Remote Management Monitoring (RMM) Tools

Compensation and Benefits

The compensation range for this position is $134,000 - $140,000. Actual salaries will vary depending on factors including but not limited to experience and performance. The salary range listed is just one component of Vera Institute’s total compensation package for employees. As an employer of choice in our field, supporting Vera staff—both personally and professionally—is a priority. To do this, we invest in the well-being of our staff through other rewards including merit pay, generous paid time off, a comprehensive health insurance plan, student loan repayment benefits, professional development training opportunities, up to $2,000 annually for education costs and fees relevant to Vera work, an employer-funded retirement plan, and flexible time and remote work schedules. To learn more about Vera’s benefits offerings, click here.

Applications may also be faxed to:

ATTN: People Resources / Associate Director, Information Technology and Security
Vera Institute of Justice
34 35th St, Suite 4-2A, Brooklyn, NY 11232
Fax: (212) 941-9407
Please use only one method (online, mail or fax) of submission.
No phone calls, please. Only applicants selected for interviews will be contacted.

As a federal contractor, and in order to ensure a healthy and safe work environment, Vera Institute of Justice is requiring all employees to be fully vaccinated and provide proof of their COVID-19 vaccine before their start date. Employees who cannot receive the vaccine because of a disability/medical contraindication or sincerely-held religious belief may request an accommodation (e.g., an exemption) to this requirement.

Vera is an equal opportunity/affirmative action employer.  All qualified applicants will be considered for employment without unlawful discrimination based on race, color, creed, national origin, sex, age, disability, marital status, sexual orientation, military status, prior record of arrest or conviction, citizenship status, current employment status, or caregiver status. 

Vera works to advance justice, particularly racial justice, in an increasingly multicultural country and globally connected world. We value diverse experiences, including with regard to educational background and justice system contact, and depend on a diverse staff to carry out our mission. 

For more information about Vera, please visit www.vera.org

 

See more jobs at Vera Institute of Justice

Apply for this job

+30d

Data Engineer

Unbounce - Remote
terraform, airflow, sql, docker, kubernetes, python, AWS

Unbounce is hiring a Remote Data Engineer

Join Unbounce and help the world experience better marketing. We’re a people-first, customer-obsessed company focused on helping employees do their best work. Our landing page and conversion platform empower digital marketing teams and agencies to launch campaigns, increase conversions and get significantly better ROI on their marketing spend in a way that nobody else does today.

The Data Engineering team enables other teams by creating the ecosystem upon which data processing is built. It is a small team with lots of opportunities to create high-impact solutions and collaborate with a large part of the company.

 

A little bit about you:

  • You understand the data lifecycle – ultimately delivering high-quality data that is able to evolve and grow
  • You work as part of a team – your success is the success of the team
  • You are not afraid of taking initiative; taking on new work, improving existing documentation or exploring better ways to achieve a goal – these things start with you
  • You own your work – from idea to delivery, you’ll take your projects all the way

We believe the ideal candidate will have experience in:

  • Delivering production-grade Python
  • Writing reasonably optimized SQL
  • Deploying on containerized infrastructure, such as Docker and Kubernetes
  • Building event stream architecture and using tooling such as Kafka (see the sketch after this list)
  • Building batch architecture and using tooling such as Airflow
  • Writing Infrastructure as Code, using tools such as CloudFormation and Terraform
  • Creating and managing databases; designing warehouse usage for distributed systems
  • Developing and consuming synchronous and asynchronous APIs
  • Scripting using Unix-based shells and Make
  • Writing and maintaining permission systems such as RBAC within tools such as IAM or K8s
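
As a small sketch of the event-stream tooling in the list above, the snippet below consumes a Kafka topic with the kafka-python library; the topic name, broker address, and payload fields are placeholders.

    import json

    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "page-events",  # hypothetical topic
        bootstrap_servers="localhost:9092",  # placeholder broker
        group_id="data-eng-sandbox",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        auto_offset_reset="earliest",
    )

    for message in consumer:
        event = message.value
        print(event.get("event_type"), event.get("page_id"))  # assumed payload fields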

 

What you’ll be doing:

  • Enable our Data Analysts and Data Scientists by developing pipelines, tooling, infrastructure, and training, and by pioneering best practices
  • Develop and improve existing data solutions on our warehouses and event streams
  • Develop ETL pipelines and supporting infrastructure for data-driven insights
  • Write IaC for AWS services, including RDS, S3, EKS (Kubernetes manifests), Kinesis and MSK
  • Coordinate with other teams to deliver high-impact solutions

 

What’s in it for you:

  • A remote-friendly office with flexible hours – for this role we will consider all applications from those based in Canada with the option to work from our Vancouver office
  • 4 weeks vacation plus Christmas holiday closure – you're entitled to the week of Christmas off with pay through to and including Jan 1st
  • Vacation bonus - $1,000.00
  • 12 personal wellness days (this includes: personal day, moving day, sick day, etc)
  • Health and wellness budget - $500.00
  • WFH budget - $500.00
  • A paid day off for your birthday
  • One paid volunteer day per year
  • All Unbouncers are encouraged to dedicate 10% of their time to Pro-D time

Please note that we currently do not have a legal entity set up to operate as an employer of record in Quebec. We thank you for your consideration but we are unable to accept candidates from Quebec at this time.

 

Share our values:

  • Courage
  • Ambition
  • Being Real
  • Empathy
  • Diversity

 

Unbounce Welcomes You to be YOU!

At Unbounce, we want every employee to be excited to bring their full, authentic self to work. When you do this – when you bring your unique experiences, background, knowledge, perspective, and self-expression while embracing the same from others – we learn from each other, we innovate, and we co-create an environment where Unbouncers can do the best work of their careers. We’re bolder and more brilliant together.

We’re dedicated to ensuring each Unbouncer feels a sense of belonging, feels safe, cared for, respected and valued for who they are, and trusts that their unique voice is heard, embraced, and meaningfully contributes to decision-making. We’re committed to equitable employee experience, opportunity, pay and support for every employee regardless of gender identity or expression, race, ethnicity, family or marital status, religion, socio-economic status, veteran status, national origin, age, sexual orientation, education, disability, or any other characteristic that makes you unique. 

We have no tolerance for sexism, racism, xenophobia, homophobia, transphobia, ableism, ageism, or any other forms of hateful/harmful discrimination and we’re taking action against unequal pay in our community through leading the #PayUpforProgress movement.

Please let us know if you require any accommodations or support during the recruitment process.

See more jobs at Unbounce

Apply for this job

+30d

Staff Infrastructure Engineer

Hustle - Remote
terraform, airflow, Design, mobile, ansible, mongodb, c++, docker, AWS

Hustle is hiring a Remote Staff Infrastructure Engineer

Location: Remote - CA, DC, VA, MD, NY, NH, FL, NC, MA, CT

Salary: Competitive with similar Software Engineering roles (120k - 175K DoE)

 

What We’re Building + Who We Work With:

Hustle enables organizations to run large-scale text messaging campaigns by empowering their team members and volunteers to efficiently have thousands of personal 1-to-1 conversations.

Conversations driven by our platform are geared towards driving measurable meaningful outcomes such as voter turnout, event attendance, or dollars raised for clients such as Planned Parenthood, Sierra Club, the DNC, large non-profits, unions, and universities, as well as several 2020 presidential candidates.

To do that, our team works on building systems that scale up 100x in a matter of hours and are able to send 100 million messages a day. Our clients’ bursty appetite for Hustle requires that we are able to scale up and down two orders of magnitude quickly and efficiently on a dime, so that they can reach voters, volunteers, benefactors, or attendees at the right time, with the right message, sent by the right person!

Sound interesting? Keep reading!

What you’ll do

  • Collaborate with engineers to analyze current processes and design new processes, workflows, and solutions that better automate the delivery of infrastructure and applications.
  • Improve and simplify existing processes; design and implement new processes for automated deployments of cloud-based infrastructure.
  • Migrate our self-hosted Airflow instance to a cloud-based deployment (an illustrative DAG sketch follows this list).
  • Simplify and document our Terraform configuration.
  • Upgrade and maintain MongoDB and Kafka.
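
Since the headline task above is an Airflow migration, a minimal DAG sketch may help calibrate the work involved. DAGs like the one below typically move to a managed deployment (for example, AWS MWAA, which would fit the AWS stack named in this posting) with little or no code change. Every identifier in it (dag_id, task ids, commands) is hypothetical, not taken from Hustle's codebase.

# Minimal, illustrative Airflow 2.x DAG of the kind that typically moves
# unchanged from a self-hosted instance to a managed deployment such as
# AWS MWAA. All identifiers below are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def aggregate_delivery_stats() -> None:
    # Placeholder for the real aggregation logic.
    print("aggregating message-delivery stats")


with DAG(
    dag_id="campaign_stats_hourly",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = BashOperator(
        task_id="extract_raw_events",
        bash_command="echo 'pull raw events from the message store'",
    )
    aggregate = PythonOperator(
        task_id="aggregate_stats",
        python_callable=aggregate_delivery_stats,
    )
    extract >> aggregate

In migrations like this, the effort is usually less about DAG code and more about connections, secrets, and custom dependencies, which is where the Terraform and documentation bullets above come in.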

We are looking for someone with

  • Preferred: development experience with MongoDB, Kafka, Airflow, Terraform, Ansible, AWS, and Docker
  • Ability to effectively communicate a detailed understanding of DevOps concepts & tools.
  • 4+ years of IT experience, including hands-on DevOps work and familiarity with its best practices
  • Extensive experience designing, building, and supporting automated DevOps capabilities.
  • An expert understanding of DevOps automation concepts, CI/CD Pipeline configuration best practices, source code management tools, and mock server endpoints for local developer environments.
  • Strong problem solving and debugging skills
  • Comfortable working with a distributed, fully remote team

 

About The Engineering Team at Hustle

The Hustle engineering team is entirely remote. However, you must be located in NY, CA, MA, FL, DC, VA, MD, NC, NH, or CT.

 

Come help us build efficient cross-platform mobile + web applications tailored for scale and speed!

See more jobs at Hustle

Apply for this job

+30d

Web/Data Analyst

Skylumremote, Ukraine
tableauairflowsqlDesignFirebasepython

Skylum is hiring a Remote Web/Data Analyst

Skylum allows millions of photographers to make incredible images faster. We automate photo editing with the power of Artificial Intelligence yet leave all the creative control in the hands of the artist. Our software gives rise to entirely new ways to enjoy photography, and we simply can’t do it without an awesome team of engineers and visionaries behind every release. You can change the way people imagine their photos to be.


Requirements:

  • 3+ years of experience working as a web or data analyst;
  • Knowledge of web/app analytics (GA/GA4, GTM, Firebase, AppsFlyer, etc.);
  • Experience in developing ETL processes;
  • Advanced experience writing SQL queries and working with large data sets;
  • Experience with product analytics and BI tools (e.g. Tableau preferred, or Amplitude, Power BI, or similar);
  • Knowledge of statistics fundamentals and experimental design (conducting A/B tests);
  • Strong analytical and problem-solving skills with close attention to detail;
  • Deep understanding of how to calculate and interpret the core internet-marketing metrics;
  • Basic understanding of how the various traffic-acquisition channels operate;
  • Good communication skills and the ability to work in a team.


Will be a plus:

  • Knowledge of dbt, Dataform, or Airflow;
  • Knowledge of Python (NumPy, pandas, and other data libraries).


Responsibilities:

  • Conducting audits and setting up the analytics process;
  • Building end-to-end analytics;
  • Detecting patterns and causes of changes in performance indicators;
  • Generating hypotheses for improving key metrics;
  • Managing A/B tests and analyzing their results (a minimal analysis sketch follows this list);
  • Preparing reports and creating dashboards.
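
For the A/B-testing responsibility above, here is a minimal sketch of the kind of readout involved: a two-proportion z-test comparing conversion between two variants. The counts are invented for the example, and statsmodels is one common tool choice among several.

# Illustrative A/B-test readout: compare conversion rates of two variants
# with a two-proportion z-test. All counts are made up for the example.
from statsmodels.stats.proportion import proportions_ztest

conversions = [412, 468]       # converted users in variants A and B
exposures = [10_000, 10_050]   # users who saw each variant

z_stat, p_value = proportions_ztest(count=conversions, nobs=exposures)
lift = conversions[1] / exposures[1] - conversions[0] / exposures[0]
print(f"absolute lift: {lift:.4f}, z = {z_stat:.2f}, p = {p_value:.4f}")
# Roll out variant B only if p is below the pre-registered threshold.

In practice, the raw counts would come from the SQL queries and GA/Amplitude exports mentioned in the requirements.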


What we offer:

For personal growth:

  • A chance to work with a strong team and a unique opportunity to make substantial contributions to our award-winning photo editing tools;
  • An educational allowance to ensure that your skills stay sharp;
  • English and German classes to strengthen your capabilities and widen your knowledge.

For comfort:

  • A great environment where you’ll work with true professionals and amazing colleagues whom you’ll call friends quickly;
  • The choice of working remotely or in our office space in Podil, equipped with everything you might need for productive and comfortable work.

For health:

  • Medical insurance;
  • Twenty-one days of paid sick leave per year;
  • Healthy fruit snacks full of vitamins to keep you energized.

For leisure:

  • Twenty-one days of paid vacation per year;
  • Corporate events at least two times per year.

See more jobs at Skylum

Apply for this job

+30d

Data Engineer (Boosters)

GenesisUkraine
scalaairflowsqlapidockerjenkinspythonAWS

Genesis is hiring a Remote Data Engineer(Boosters)

✨✨✨Hello!✨✨✨

Shall we get acquainted? :)

We are Boosters, a product team building products that improve people's lives and deliver real value. We currently have four products; here they are in detail:

  • Words Booster – an app for learning foreign languages (among the top 10 language apps worldwide)
  • Avrora – an app for improving sleep (a top-5 H&F app in more than 82 countries)
  • Manifest – an affirmations app (our affirmations have been reshared more than 22,000 times)
  • RiseSpace – a life-coaching platform and our newest direction (released in December 2021)

Our main advantage is our people: people who aim to be better than they were yesterday and to win together. There are already 60 of us, and we don't plan to stop there.

We currently have an opening for a Data Engineer who will be responsible for setting up the infrastructure for ingesting, storing, and processing data, so that data analysts can turn it into information that drives decision-making and product development.

Your tasks will include:

  • Delivering and transforming data by working with the APIs of Facebook, Google Ads, AppsFlyer, and other ad networks (see the hypothetical sketch after this list)
  • Designing the data warehouse architecture and setting up data delivery into it
  • Optimizing existing ETL processes
  • Configuring cloud services (AWS)
  • Processing, validating, and documenting data models
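
As a rough illustration of the first task, here is a minimal sketch of pulling a daily report from an ad-network-style REST API and landing it in S3 for downstream transformation. The endpoint URL, bucket, and environment-variable names are hypothetical; real ad-network APIs (Facebook, Google Ads, AppsFlyer) each have their own clients and auth flows.

# Hypothetical sketch: fetch a daily spend report from an ad-network REST
# API and land it in S3 as raw JSON. Endpoint, bucket, and env-var names
# are illustrative only.
import json
import os
from datetime import date

import boto3
import requests

API_URL = "https://ads.example.com/v1/reports/daily"  # hypothetical endpoint


def land_daily_report(report_date: date) -> str:
    resp = requests.get(
        API_URL,
        params={"date": report_date.isoformat()},
        headers={"Authorization": f"Bearer {os.environ['ADS_API_TOKEN']}"},
        timeout=30,
    )
    resp.raise_for_status()

    key = f"raw/ads/daily/{report_date.isoformat()}.json"
    boto3.client("s3").put_object(
        Bucket=os.environ["DATA_LAKE_BUCKET"],
        Key=key,
        Body=json.dumps(resp.json()).encode("utf-8"),
    )
    return key


if __name__ == "__main__":
    print("landed:", land_daily_report(date.today()))

Landing raw JSON first and transforming later (with Athena or Glue, per the requirements below) keeps the pipeline replayable when an upstream API misbehaves.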

What you need to join us:

  • 1+ year of experience in a similar role
  • Solid Python skills, plus experience building data pipelines and working with dataframes (using Pandas, PySpark)
  • Ability to work with APIs
  • Excellent knowledge of SQL and database architecture fundamentals, and experience with a variety of DBMSs
  • Experience with data warehouses and cloud services (preferably Amazon Web Services: S3, Athena, Glue, RDS, Lambda, etc.)
  • Ability to work with Docker
  • Experience with version control systems and an understanding of CI/CD principles

It will be a plus if you:

  • Have experience with Apache Airflow and Jenkins
  • Have development experience in Golang or Scala
  • Have worked on performance analysis, load testing, and module/system optimization

What do we offer?

  • Work in a team of professionals, on products with an audience of more than one million per month;
  • A philosophy and conditions for your continuous growth and development;
  • Plenty of room to implement your own ideas and influence the product.

We also offer these benefits:

  • A corporate doctor and medical insurance;
  • Relocation assistance for employees and their families;
  • Reimbursement for external training courses and seminars, plus a Business and Management School for employees;
  • A large e-library and access to paid online courses and conferences, internal talks and workshops, and English classes.

Send us your resume and join Boosters!

See more jobs at Genesis

Apply for this job

+30d

Senior Infrastructure Engineer

HustleRemote
terraformairflowDesignmobileansiblemongodbc++dockerAWS

Hustle is hiring a Remote Senior Infrastructure Engineer

Location: Remote - CA, DC, VA, MD, NY, NH, FL, NC, MA, CT

Salary: Competitive with similar Software Engineering roles ($120k - $175k, DOE)

 

What We’re Building + Who We Work With:

Hustle enables organizations to run large-scale text messaging campaigns by empowering their team members and volunteers to efficiently have thousands of personal 1-to-1 conversations.

Conversations driven by our platform are aimed at measurable, meaningful outcomes such as voter turnout, event attendance, or dollars raised for clients such as Planned Parenthood, Sierra Club, the DNC, large non-profits, unions, and universities, as well as several 2020 presidential candidates.

To do that, our team builds systems that can scale up 100x in a matter of hours and send 100 million messages a day. Our clients' bursty appetite for Hustle requires that we scale up and down by two orders of magnitude quickly and efficiently, so they can reach voters, volunteers, benefactors, or attendees at the right time, with the right message, sent by the right person!

Sound interesting? Keep reading!

What you’ll do

  • Collaborate with engineers to analyze current processes and design new processes, workflows, and solutions that better automate the delivery of infrastructure and applications.
  • Improve and simplify existing processes; design and implement new processes for automated deployments of cloud-based infrastructure.
  • Migrate our self-hosted Airflow instance to a cloud-based deployment.
  • Simplify and document our Terraform configuration.
  • Upgrade and maintain MongoDB and Kafka.

We are looking for someone with

  • Preferred: development experience with MongoDB, Kafka, Airflow, Terraform, Ansible, AWS, and Docker
  • Ability to effectively communicate a detailed understanding of DevOps concepts & tools.
  • 4+ years of IT experience, including hands-on DevOps work and familiarity with its best practices
  • Extensive experience designing, building, and supporting automated DevOps capabilities.
  • An expert understanding of DevOps automation concepts, CI/CD Pipeline configuration best practices, source code management tools, and mock server endpoints for local developer environments.
  • Strong problem solving and debugging skills
  • Comfortable working with a distributed, fully remote team

 

About The Engineering Team at Hustle

The Hustle engineering team is entirely remote. However, you must be located in NY, CA, MA, FL, DC, VA, MD, NC, NH, or CT.

 

Come help us build efficient cross-platform mobile + web applications tailored for scale and speed!

See more jobs at Hustle

Apply for this job

+30d

Lead Product Designer

CareRevRemote
kotlinairflowsketchsqlsalesforceDesignIllustratorswiftapiiosandroidelasticsearchpostgresqllinuxpythonAWSjavascriptbackendfrontend

CareRev is hiring a Remote Lead Product Designer

Lead Product Designer at CareRev (S16)
A Marketplace for Healthcare professionals to find shifts.
Remote / Remote
Full-time
About CareRev

Healthcare costs represent 17% of US GDP. That’s 3 trillion dollars! A large portion of that is costs associated with medical doctors, nurses and healthcare staff in general. The world of healthcare staffing is dominated by telephones, fax machines, and paper calendars. We’re building a future where the people who work in healthcare are better managed and are empowered to be the healers they want to be.

CareRev is an online marketplace for healthcare staff. Hundreds of medical centers in California and Florida find highly specialized nurses and technologists on carerev.com every day.

About the role

At TalentEdge, we work with 30+ startup clients at any given time ranging from series A - series F. Below is a general job description from one of our clients but each of our clients use different technologies, development practices and org charts.

Job Responsibilities:

  • Identify opportunities for new products
  • Analyze how a new product ties in with market needs and consumer preferences
  • Set design requirements based on briefs from internal teams and external partners
  • Research materials and techniques
  • Sketch drafts on paper or digitally (for example, using CAD)
  • Use 3D modeling software to design products and components
  • Produce prototypes and test functionality
  • Improve the design of existing products
  • Gather feedback from product users

Qualifications / Skills:

  • Work experience as a Product Designer or similar role
  • Experience in industrial design
  • Creativity in mixing up colors, shapes and techniques
  • Hands-on experience with computer-aided design (CAD) tools
  • Good knowledge of 3D modeling software
  • Experience with design programs (like Illustrator and Sketch)
  • Time-management skills
  • Adaptability
  • BSc/MSc in Product Design, Manufacturing Design Engineering or relevant field

Please apply if you are interested in these responsibilities or others!

Technology

  • API Stack - Ruby/Rails, PostgreSQL, Linux, Redis, Lambda, Heroku, AWS
  • Android - Kotlin
  • iOS - Swift
  • Web - Elm, React, JavaScript, Node, Heroku, AWS
  • Data - PostgreSQL, Kafka, Redshift, Elasticsearch, SQL, Python, Mode
  • ML - SageMaker, H2O, Airflow
Apply Now

See more jobs at CareRev

Apply for this job

+30d

Senior Backend Engineer

CareRevRemote
Master’s DegreekotlinairflowsqlsalesforceDesignswiftapiiosrubyjavaandroidelasticsearchpostgresqllinuxpythonAWSjavascriptbackendfrontend

CareRev is hiring a Remote Senior Backend Engineer

Senior Backend Engineer at CareRev (S16)
A Marketplace for Healthcare professionals to find shifts.
Remote / Remote
Full-time
About CareRev

Healthcare costs represent 17% of US GDP. That’s 3 trillion dollars! A large portion of that is costs associated with medical doctors, nurses and healthcare staff in general. The world of healthcare staffing is dominated by telephones, fax machines, and paper calendars. We’re building a future where the people who work in healthcare are better managed and are empowered to be the healers they want to be.

CareRev is an online marketplace for healthcare staff. Hundreds of medical centers in California and Florida find highly specialized nurses and technologists on carerev.com every day.

About the role

At TalentEdge, we work with 30+ startup clients at any given time ranging from series A - series F. Below is a general job description from one of our clients but each of our clients use different technologies, development practices and org charts.

Job Responsibilities:

  • Develop information systems by designing, developing, and installing software solutions.
  • Determine operational feasibility by evaluating analysis, problem definition, requirements, solution development, and proposed solutions.
  • Develop software solutions by studying information needs, conferring with users, and studying systems flow, data usage, and work processes.
  • Investigate problem areas.
  • Follow the software development lifecycle.
  • Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code.
  • Prepare and install solutions by determining and designing system specifications, standards, and programming.
  • Improve operations by conducting systems analysis and recommending changes in policies and procedures.
  • Obtain and license software by gathering required information from vendors, recommending purchases, and testing and approving products.
  • Protect operations by keeping information confidential.
  • Provide information by collecting, analyzing, and summarizing development and service issues.
  • Accomplish the engineering and organization mission by completing related tasks as needed.

Qualifications / Skills:

  • Analyzing information
  • General programming skills
  • Software design
  • Software debugging
  • Software documentation
  • Software testing
  • Problem solving
  • Teamwork
  • Software development fundamentals
  • Software development process
  • Software requirements

Education, Experience, and Licensing Requirements:

  • Bachelor’s and/or Master’s degree in Computer Science, Computer Engineering or related technical discipline
  • 5+ years of professional software development experience
  • Proficiency in Java, Python, Go, Rust or Ruby and object-oriented design skills
  • Application architecture and design patterns
  • Experience serving as technical lead throughout the full software development lifecycle, from conception, architecture definition, detailed design, scoping, planning, implementation, testing to documentation, delivery and maintenance is preferred
  • Knowledge of professional software engineering and best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
  • Experience in development of distributed/scalable systems and high-volume transaction applications

Please apply if you are interested in these responsibilities or others!

Technology

  • API Stack - Ruby/Rails, PostgreSQL, Linux, Redis, Lambda, Heroku, AWS
  • Android - Kotlin
  • iOS - Swift
  • Web - Elm, React, JavaScript, Node, Heroku, AWS
  • Data - PostgreSQL, Kafka, Redshift, Elasticsearch, SQL, Python, Mode
  • ML - SageMaker, H2O, Airflow
Apply Now

See more jobs at CareRev

Apply for this job

+30d

Senior Frontend Engineer

CareRevRemote
Master’s DegreekotlinairflowsqlsalesforceDesignswiftapiiosandroidelasticsearchpostgresqllinuxpythonAWSjavascriptbackendfrontend

CareRev is hiring a Remote Senior Frontend Engineer

Senior Frontend Engineer at CareRev (S16)
A Marketplace for Healthcare professionals to find shifts.
Remote / Remote
Full-time
About CareRev

Healthcare costs represent 17% of US GDP. That’s 3 trillion dollars! A large portion of that is costs associated with medical doctors, nurses and healthcare staff in general. The world of healthcare staffing is dominated by telephones, fax machines, and paper calendars. We’re building a future where the people who work in healthcare are better managed and are empowered to be the healers they want to be.

CareRev is an online marketplace for healthcare staff. Hundreds of medical centers in California and Florida find highly specialized nurses and technologists on carerev.com every day.

About the role

At TalentEdge, we work with 30+ startup clients at any given time ranging from series A - series F. Below is a general job description from one of our clients but each of our clients use different technologies, development practices and org charts.

Job Responsibilities:

  • Develop information systems by designing, developing, and installing software solutions.
  • Determine operational feasibility by evaluating analysis, problem definition, requirements, solution development, and proposed solutions.
  • Develop software solutions by studying information needs, conferring with users, and studying systems flow, data usage, and work processes.
  • Investigate problem areas.
  • Follow the software development lifecycle.
  • Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code.
  • Prepare and install solutions by determining and designing system specifications, standards, and programming.
  • Improve operations by conducting systems analysis and recommending changes in policies and procedures.
  • Obtain and license software by gathering required information from vendors, recommending purchases, and testing and approving products.
  • Protect operations by keeping information confidential.
  • Provide information by collecting, analyzing, and summarizing development and service issues.
  • Accomplish the engineering and organization mission by completing related tasks as needed.

Qualifications / Skills:

  • Analyzing information
  • General programming skills
  • Software design
  • Software debugging
  • Software documentation
  • Software testing
  • Problem solving
  • Teamwork
  • Software development fundamentals
  • Software development process
  • Software requirements

Education, Experience, and Licensing Requirements:

  • Bachelor’s and/or Master’s degree in Computer Science, Computer Engineering or related technical discipline
  • 5+ years of professional software development experience
  • Proficiency in React or other modern frontend technologies, and object-oriented design skills
  • Application architecture and design patterns
  • Experience serving as technical lead throughout the full software development lifecycle, from conception, architecture definition, detailed design, scoping, planning, implementation, testing to documentation, delivery and maintenance is preferred
  • Knowledge of professional software engineering and best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
  • Experience in development of distributed/scalable systems and high-volume transaction applications

Please apply if you are interested in these responsibilities or others!

Technology

  • API Stack - Ruby/Rails, PostgreSQL, Linux, Redis, Lambda, Heroku, AWS
  • Android - Kotlin
  • iOS - Swift
  • Web - Elm, React, JavaScript, Node, Heroku, AWS
  • Data - PostgreSQL, Kafka, Redshift, Elasticsearch, SQL, Python, Mode
  • ML - SageMaker, H2O, Airflow
Apply Now

See more jobs at CareRev

Apply for this job

+30d

Senior Data Scientist, Square for Retail

SquareAtlanta, GA, USA, Remote
tableauairflowsqlDesignpython

Square is hiring a Remote Senior Data Scientist, Square for Retail

Company Description

Since we first opened our doors in 2009, the world of commerce has evolved immensely – and so has Square. After enabling anyone to take a payment and never miss a sale, we saw sellers stymied by disparate, outmoded products and tools that wouldn’t work together. So we expanded into software and started building integrated, omnichannel solutions – to help sellers sell online, manage inventory, run a busy kitchen, book appointments, engage loyal buyers, and hire and pay staff. And across it all, we’ve embedded financial services tools at the point of sale, so merchants can access a business loan and manage their cash flow all in one place.

Today, we’re a partner to sellers of all sizes – large, enterprise-scale businesses with complex commerce operations, sellers just starting out, as well as merchants who began selling with Square and have grown larger over time. As our sellers scale, so do our solutions. We all grow together.

There is a massive opportunity in front of us. We’re building a business that is big, meaningful, and lasting. And we are helping sellers around the world do the same.

Job Description

From the cozy local wine store to the trendy sneaker pop-up, the Retail team at Square is equipping retail sellers with simple solutions to complex problems - like predictive analytics and fluid, omni-channel capabilities to meet customers wherever they are. Square’s mission is economic empowerment, and Data Scientists support this by using data to understand and empathize with our customers, thereby enabling us to build a remarkable product experience. You and your fellow data science & analytics teammates on the Retail Analytics team are embedded within product and marketing sub-teams (“squads”), leveraging data engineering, analytics, statistics, and machine learning to empower data-driven decision making in the full life cycle of product development.

You will:

  • Partner with the product, design, and marketing stakeholders to identify, prioritize, and answer the most important questions where analytics can have a material impact

  • Apply a diverse set of tactics such as statistics, quantitative reasoning, and machine learning; discerning where simple analytics solutions (e.g. a quick heuristic or visualization) are preferable to complex solutions (e.g. machine learning)

  • Contribute to the data strategy of product engineering, influencing engineers to make well-informed architecture and design decisions that affect data at Square

  • Provide comprehensive day-to-day analytics support to partner teams, developing tools and resources to empower data access and self-service so your advanced expertise can be leveraged where it is most impactful

  • Lead and mentor others on medium and long-term cross-functional initiatives that span product domains

Qualifications

You have:

  • 5+ years of analytics experience and a Bachelor’s degree or equivalent

  • Fluency in SQL, with experience exploring and understanding large, complex datasets & data systems

  • Experience working with technical and non-technical partners, such as product managers and product marketing managers

  • Experience with statistical and machine-learning techniques to solve practical business problems such as hypothesis testing, cross-selling, clustering user archetypes, and predicting churn

  • Very strong communication skills: ability to clearly communicate complex results to technical and non-technical audiences in verbal, visual, and written media

  • Proven ability to lead cross-functional projects that depend on the contributions of others in a variety of disciplines

  • Experience maintaining a backlog and performing independently

  • Experience with a BI tool such as Looker or Tableau 

 

Even better:

  • An advanced degree (M.S., Ph.D.) in Mathematics, Statistics, Computer Science, Physical Sciences, Economics, or a related technical field

  • Experience with data warehouse design, development and best practices

 

Technologies we use and teach (an illustrative sketch combining a few of these follows the list):

  • SQL (Snowflake)

  • Optimizely

  • Airflow (ETL)

  • Looker, Amplitude, Tableau

  • Python (pandas, scikit-learn, etc.)
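
Since the qualifications above call out predicting churn and the stack includes pandas and scikit-learn, a small sketch of that combination may be useful. The features, synthetic data, and model choice are all invented for the example; this is not Square's actual approach.

# Illustrative churn sketch with pandas + scikit-learn. Features, synthetic
# data, and the logistic-regression choice are invented for the example.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000
df = pd.DataFrame({
    "weekly_sales": rng.gamma(2.0, 150.0, n),        # hypothetical features
    "days_since_last_txn": rng.integers(0, 90, n),
    "support_tickets": rng.poisson(0.5, n),
})
# Synthetic label: inactivity and ticket volume drive churn in this toy data.
logit = -2.0 + 0.04 * df["days_since_last_txn"] + 0.6 * df["support_tickets"]
df["churned"] = rng.random(n) < 1 / (1 + np.exp(-logit))

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="churned"), df["churned"], test_size=0.2, random_state=0
)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("test AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

As the job description notes, a simple, interpretable model like this is often preferable to a complex one when the goal is informing product decisions.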

Additional Information

We’re working to build a more inclusive economy where our customers have equal access to opportunity, and we strive to live by these same values in building our workplace. Block is a proud equal opportunity employer. We work hard to evaluate all employees and job applicants consistently, without regard to race, color, religion, gender, national origin, age, disability, pregnancy, gender expression or identity, sexual orientation, citizenship, or any other legally protected class. 

We believe in being fair, and are committed to an inclusive interview experience, including providing reasonable accommodations to disabled applicants throughout the recruitment process. We encourage applicants to share any needed accommodations with their recruiter, who will treat these requests as confidentially as possible. Want to learn more about what we’re doing to build a workplace that is fair and square? Check out our I+D page

Additionally, we consider qualified applicants with criminal histories for employment on our team, and always assess candidates on an individualized basis.

Perks

We want you to be well and thrive. Our global benefits package includes:

  • Healthcare coverage
  • Retirement Plans
  • Employee Stock Purchase Program
  • Wellness perks
  • Paid parental leave
  • Paid time off
  • Learning and Development resources

Block, Inc. (NYSE: SQ) is a global technology company with a focus on financial services. Made up of Square, Cash App, Spiral, TIDAL, and TBD, we build tools to help more people access the economy. Square helps sellers run and grow their businesses with its integrated ecosystem of commerce solutions, business software, and banking services. With Cash App, anyone can easily send, spend, or invest their money in stocks or Bitcoin. Spiral (formerly Square Crypto) builds and funds free, open-source Bitcoin projects. Artists use TIDAL to help them succeed as entrepreneurs and connect more deeply with fans. TBD is building an open developer platform to make it easier to access Bitcoin and other blockchain technologies without having to go through an institution.

See more jobs at Square

Apply for this job