Scala Remote Jobs

112 Results

+30d

Senior Analytics Engineer

PRI | United States, Remote
tableau, scala, design, python, javascript

PRI is hiring a Remote Senior Analytics Engineer

PRI Talent is hiring a Senior Analytics Engineer on behalf of our client. This role is a full-time, 1099 contract staff-augmentation position with a company that is a leader in reducing electronic waste and finding value in gently used electronics. Our client has seen staggering growth and made an extraordinary impact in protecting the planet, all while providing a work culture unlike any other.

The Senior Analytics Engineer role sits at the intersection of data analytics and data engineering, with an emphasis on building excellent data products (e.g., dashboards, visualizations, and other self-service solutions) that push the organization forward and create better product experiences. You will partner closely with senior leadership to develop self-service tools that give everybody access to relevant insights and build the foundation of a data-driven culture.

Key Responsibilities

    • Design and implement high-performance, reusable, and scalable data models for our data warehouse using dbt and Snowflake
    • Design and implement Tableau structures (extracts, datasets, flows, etc.) which will enable users across the organization to self-serve analytics
    • Work closely with data analysts and business teams to understand business requirements and provide data ready for analysis and reporting
    • Continuously discover, transform, test, deploy, and document data sources and data models
    • Apply, help define, and champion data warehouse governance: data quality, testing, documentation, coding best practices, and peer reviews
    • Take the initiative to improve and optimize analytics engineering workflows and platforms
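The first two responsibilities describe the now-standard pattern of a reusable warehouse model paired with automated data-quality tests. As a rough illustration of that pattern only (dbt, Snowflake, and Tableau are not shown here, and every table and column name is hypothetical), here is the model-plus-test idea sketched with the standard library's sqlite3:

```python
import sqlite3

# Stand-in warehouse: sqlite3 illustrates the staging-model-plus-test
# pattern; a real stack would use dbt models compiled against Snowflake.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, amount REAL, status TEXT);
    INSERT INTO raw_orders VALUES (1, 19.99, 'shipped'),
                                  (2, 5.00,  'cancelled'),
                                  (1, 19.99, 'shipped');  -- duplicate row
""")

# "Model": a reusable, deduplicated view, analogous to a dbt staging model.
conn.execute("""
    CREATE VIEW stg_orders AS
    SELECT DISTINCT order_id, amount, status
    FROM raw_orders
    WHERE status != 'cancelled'
""")

# "Test": a data-quality check, analogous to dbt's unique test on a key.
dupes = conn.execute("""
    SELECT order_id, COUNT(*) FROM stg_orders
    GROUP BY order_id HAVING COUNT(*) > 1
""").fetchall()
assert dupes == [], f"uniqueness test failed: {dupes}"

stg_rows = conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
print(stg_rows)  # cancelled and duplicate rows removed by the model
```

The point of the pattern is that the quality check lives next to the model and runs on every deploy, which is what the "continuously test, deploy, and document" bullet is asking for.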

    Knowledge, Skills, and Abilities

      • Experience with Python, Spark, Scala, etc.
      • Varied experience across data engineering, data science, and/or analytics.
      • Experience with developing visualizations in JavaScript, D3, etc.
      • Experience directly supporting product and/or growth teams.
      • Effective communication skills.

      Education and Experience

      • 5+ years of experience building self-service data products for internal partners, or in other relevant data engineering or data analytic capacities.
      • Bachelor’s degree in a quantitative field (e.g., Statistics, Computer Science); advanced degree preferred.
      • Excellent at working directly with business partners to understand their needs and create impactful and visually compelling data products.
      • Expert with Tableau, Looker, or similar business intelligence tools.
      • Experienced querying extremely large data sets and developing robust data pipelines to support reporting and other analytical use cases.
      • Analytically minded and experienced in developing insights to inform strategic business questions; in-depth experience with metric definition, productionization, and socialization is a plus.
      • Comfortable with ambiguity and able to self-prioritize multiple workstreams in partnership with stakeholders.

      Please note we will not accept applications that do not include a cover letter and work examples.

      See more jobs at PRI

      Apply for this job

      +30d

      BMC Remedy/ITSM Helix Developer

      Devoteam | Milan, Italy, Remote
      scala

      Devoteam is hiring a Remote BMC Remedy/ITSM Helix Developer

      Job Description

      Do you want to join a fast-growing company that values local support and management, in a caring environment that gives you every key to success?

      Do you want to take part in the digital transformation, leading large-scale projects for major companies in collaboration with our strategic partner BMC Software?

      What we offer:

      A personalized, tailored training program; our employees are fully trained and certified.
      Go further and deepen your knowledge by learning alongside our partners and experts, supported by managers and project leads.
      The opportunity to join our Knowledge Up Programs, a career-acceleration program that lets you become a senior consultant within 3 years.

      The candidate will join project teams, developing custom applications on the Remedy platform, analyzing functional requirements, and writing development specifications. In some contexts, this also includes writing and executing project test books.

      Implementation and development of applications with ARS Remedy and Helix ITSM, within project teams with deep expertise in the product.

      And compensation? We will discuss this topic openly during our next steps.

      Qualifications

      Education: a degree, or proven experience with the Remedy development environment.

      Proven development experience on the BMC Remedy or Helix ITSM platform, plus SQL. ITIL 4 certification and knowledge of BMC Helix ITSM are a plus.

      Technical English is required.

      See more jobs at Devoteam

      Apply for this job

      +30d

      (Full-time Only)-Azure Data Engineer

      ChabezTech | Smyrna, GA, Remote
      agile, 5 years of experience, scala, design, azure, java, python

      ChabezTech is hiring a Remote (Full-time Only)-Azure Data Engineer

      Job Description

      Role: Azure Data Engineer

      Location: Atlanta, GA

      Duration: Long Term

      Experience Required: 10+ years 

       

      • Experience in technical leadership in architecture, design, and engineering in the modernization of legacy Data Ingestion, ETL, and databases to new technologies in the Public Cloud Azure and Big Data Space 
      • Strong understanding of the Azure cloud and its components (compute, storage, data, serverless compute) to deliver end-to-end infrastructure architecture solutions for clients
      • Experience defining the technical direction and roadmap for cloud migrations, and managing the implementation of big data technology solutions.
      • Experience designing, developing, or maintaining a big data application.
      • Produce solutions that are models of performance, scalability, and extensibility.
      • Can create proofs of concept, demos, and/or scripts from scratch or by leveraging reusable components.
      • At least 5 years of experience in Data warehousing and designing solutions on Modern Data Lake. 
      • At least 5 years of Experience with major Big Data technologies and frameworks including but not limited to Hadoop, Apache Spark, Hive, Kafka, Apache Sqoop, ZooKeeper, and HBase. 
      • Proficiency in any of the programming languages: Scala, Python, or Java  
      • Hands-on Experience with Spark (Python/Scala), and SQL. 
      • Practical expertise in performance tuning and optimization 
      • Experience working on Production grade projects with Terabyte to Petabyte size data sets. 
      • Experience reviewing and implementing Agile standards and practices.

      Thanks & Regards

      Shankar, US IT Recruiter

      ChabezTech LLC | M & K Technovation

      4 Lemoyne Dr #102, Lemoyne, PA 17043, USA

      Email: shankar(at)chabeztech.com | www.chabeztech.com


      See more jobs at ChabezTech

      Apply for this job

      +30d

      Mid Level Data Engineer

      agile, Bachelor's degree, 3 years of experience, jira, terraform, scala, postgres, sql, oracle, design, mongodb, pytest, azure, mysql, jenkins, python

      FuseMachines is hiring a Remote Mid Level Data Engineer

      Mid Level Data Engineer - Fusemachines - Career Page

      See more jobs at FuseMachines

      Apply for this job

      +30d

      Software Engineer, Backend

      Atticus | Los Angeles, CA or Remote
      scala, sql, design, graphql, git, ruby, java, c++, docker, kubernetes, jenkins, python, AWS, backend

      Atticus is hiring a Remote Software Engineer, Backend

      About Atticus

      At any given time, 16 million Americans are experiencing a crisis that requires urgent help from our legal system or government. The right assistance could transform their lives. But today, most never get it. 

      Atticus makes it easy for any sick or injured person in crisis to get the life-changing aid they deserve. In just three years, we’ve become the leading platform connecting people with disabilities to government benefits. We also help victims of accidents, misconduct, and violence get compensation from insurance. So far, we’ve gotten thousands of people access to over $2B in life-changing aid, and we’re just getting started.

      We've helped more than 20,000 people in need (see our 6,000+ five-star reviews) and raised more than $50 million from top VC firms like Forerunner, GV (Google Ventures), and True Ventures. (We just closed our Series B round in May 2023, so we're well-funded for the foreseeable future.) We're small but moving fast — our team grew from 52 to 91 last year and we expect to grow again in 2024.

      The Job

      Atticus works in an industry dominated by outdated technology that is ripe for fresh thinking: our core competitors rely on massive call centers to screen clients, antiquated CRMs to track and manage cases, and paper checks to get paid (provided they’re sent to the right address). 

      Conversely, as a VC-backed tech company our product & engineering department powers everything we do: from creating an engaging online experience for people in crisis to providing tools for our network lawyers as they serve our clients, Atticus relies on technology to fulfill our mission.

      We’re looking for Software Engineers to join our team. You’ll work on the back-end, and will partner with every department at Atticus as we continue to grow our platform in an effort to help people in need find trusted legal support.  

      What You'll Do:

      • Design, build, and operate Atticus’ APIs with a focus on performance, modularity, extensibility, and reliability.
      • Work with product and software architects to plan and deliver features, fixes, and performance enhancements
      • Leverage your peers as multipliers for your skills to create excellent products and services.

      The role is a rare opportunity to join a fast-growing Series B startup that doubles as a B Corp social enterprise. Every project you take on will help clients in need get the help they deserve, and you’ll shape our company culture as we scale. We’re looking for engineers who are excited about our mission and the challenges it entails.

      Who You Are:

      • You have 3+ years of experience writing idiomatic JavaScript/Node.js, Golang, Java, Python, Scala, or Ruby
      • You use a modern version-control system for your source code repository (Git, Mercurial, GitHub, BitBucket).
      • You lint all your code or know you should.
      • You know what parts of your code require tests and you write those tests.
      • You use objective judgement in leveraging the right frameworks and technologies.
      • You are versed in cloud computing systems (GCP, AWS, etc.) and SAAS concepts.
      • You build modern, resilient and operationally sane backend systems exemplifying industry standards (HTTP REST, GraphQL, Stream processing, Big Data).
      • You leverage continuous integration systems to their full extent (CircleCI, Bamboo, Jenkins, TravisCI).
      • You plan for, build, evolve and scrutinize monitoring and alerting for your production systems.
      • You are willing and able to deploy, troubleshoot, and maintain your systems in production and staging environments.
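The backend bullets above center on building and operating sane HTTP REST services. As a minimal sketch of that idea only (Atticus's actual stack is not shown; the `/health` route and all names are hypothetical), here is a self-contained REST-style endpoint and a request against it, using just the standard library:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    # One JSON endpoint; anything else gets a proper 404 rather than a crash.
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):  # keep example output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0: OS picks a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/health") as resp:
    payload = json.load(resp)
server.shutdown()
print(payload["status"])
```

A production service would add the things the bullets list around this skeleton: monitoring, alerting, CI, and deliberate framework choices.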

      Extra Credit:

      • Experience with Google Cloud Platform, Kubernetes, Docker, CircleCI, Git, Golang, Java
      • Experience with GraphQL, GraphQL Federation, REST APIs and supporting network protocols
      • Experience with a distributed SQL platform like CockroachDB or Google Spanner
      • Experience with Hadoop, MapReduce, or other “Big Data” systems

      We are strongly committed to building a diverse team. If you’re from a background that’s underrepresented in tech, we’d love to meet you!

      Salary & Benefits

      This is a rare opportunity to join a startup that has strong traction (substantial funding, well-respected backers, tremendous growth, and many happy customers) but is still small enough that you can have a huge impact and play a role in shaping our culture.

      We’re a certified B Corporation tackling a critical social problem. Our mission to help people in need drives everything we do, and your work here will touch many lives.

      We offer competitive pay — including equity — and generous benefits:

      • Medical and dental insurance with 100% of employee premiums covered
      • 15 vacation days & ~19 paid holidays each year (including two weeks at end-of-year)
      • Free memberships to ClassPass and OneMedical
      • $1,000 reimbursable stipend for education and training outside of work
      • Student loan repayment assistance, 401(k), and optional HSA
      • Free snacks, drinks, weekly lunches, and regular team dinners/events/retreats
      • Humble, thoughtful, smart, fun colleagues
      We anticipate the base salary band for this role will be between $115,000 to $180,000 in addition to equity and benefits. The salary at offer will be determined by a number of factors such as candidate's experience, knowledge, skills and abilities, as well as internal equity among our team

      Location & Covid

      Today, about half our team are in Los Angeles and half are fully remote and spread across the U.S. There are two options for this job:

      1. Live in Los Angeles, work a few days a week (or more) out of our beautiful office in the Arts District.
      2. Live wherever, work remotely (ideally PST hours), and travel to LA (on the company dime) as needed to be with your colleagues, somewhere between monthly and quarterly.

      In short: You can do this job well remotely, and we’re committed to empowering everyone with flexibility. But we care a lot about building a great culture and we think some interactions need to happen in person, so we put a lot of thought into retreats, offsites, and other ways to gather. 

      As for Covid: When the pandemic started, we immediately shifted to fully remote to protect our team and shuttered our office. Today, everyone on the team is vaccinated, and many come in often (though we don’t require it). Going forward, you can expect that vaccinations will be required for all employees (unless medically unable) and that if a variant emerges that makes in-person work unsafe for vaccinated people, we’ll close our office, cease any travel, and do whatever it takes to protect and support our team.

      See more jobs at Atticus

      Apply for this job

      +30d

      Senior Cloud Support Engineer - US Public Sector

      snowflakecomputing | Remote, TX, USA
      scala, sql, salesforce, azure, api, java, c++, .net, docker, python, AWS

      snowflakecomputing is hiring a Remote Senior Cloud Support Engineer - US Public Sector

      Build the future of data. Join the Snowflake team.

      Snowflake Support is committed to providing high-quality resolutions to help deliver data-driven business insights and results. We are a team of subject matter experts collectively working toward our customers’ success. We form partnerships with customers by listening, learning, and building connections.  

      Snowflake’s values are key to our approach and success in delivering world-class Support.  Putting customers first, acting with integrity, owning initiative and accountability, and getting it done are Snowflake core values and are reflected in everything we do.

      Snowflake’s Support team is expanding! We are looking for a Senior Cloud Support Engineer who likes working with data and solving a wide variety of issues utilizing their technical experience having worked on a variety of operating systems, database technologies, big data, data integration, connectors, and networking.

      MANDATORY REQUIREMENTS FOR THE ROLE:

      • The position may require access to U.S. export-controlled technologies, technical data, or sensitive government data.
      • Employment with Snowflake is contingent on Snowflake verifying that you: (i) may legally access U.S. export-controlled technologies, technical data, or sensitive government data; or (ii) are eligible to obtain, in a timely manner, any necessary license or other authorization(s) from the U.S. Government to allow access to U.S. export-controlled technology, technical data, or sensitive government data.

      SPECIAL REQUIREMENTS:

      • Participate in pager duty rotations during nights, weekends, and holidays
      • Applicants should be flexible with schedule changes to meet business needs

      As a Senior Cloud Support Engineer, your role is to delight our customers with your passion and knowledge of the Snowflake Cloud Data Platform.  Customers will look to you for technical guidance and advice in addressing their product usage and issue needs.  You will be the voice of the customer into Snowflake’s product and engineering teams for product feedback and improvements.  You will play an integral role in building knowledge within the team and be part of strategic initiatives for organizational and process improvements.  Ideally, you have worked in a 24x7 environment, handled all tiers of support issues, been on-call during weekends, and are familiar with Salesforce Service Cloud.

      YOU WILL: 

      • Provide technical support to commercial and public sector customers 
      • Drive technical solutions to complex problems providing in-depth analysis and guidance to Snowflake customers and partners using the following methods of communication: email, web, and phone
      • Adhere to response and resolution SLAs and escalation processes in order to ensure fast resolution of customer issues that exceed expectations
      • Utilize the Snowflake environment, connectors, 3rd party partner software, and tools to investigate issues 
      • Document known solutions to the internal and external knowledge base
      • Submit well-documented bugs and feature requests arising from customer-submitted requests
      • Partner with engineering teams in prioritizing and resolving customer requests
      • Participate in incident management and on-call rotation
      • Participate in a variety of Support initiatives
      • Provide support coverage during holidays and weekends based on business needs

      OUR IDEAL SENIOR CLOUD SUPPORT ENGINEER WILL HAVE:

      • Bachelor's or Master's degree in Computer Science or equivalent discipline
      • 5+ years experience in a Technical Support environment or a similar technical function in a customer-facing role
      • Experience in a designated customer-focused support role providing a personalized support experience with a deeper knowledge of the customer environment
      • Good understanding of Amazon AWS (SQS, SNS, API Gateway, S3, Route 53) and/or similar services in Microsoft Azure ecosystems, Google Cloud
      • Experience in ETL, ELT tools like AWS Glue, EMR, Azure Data Factory, Informatica, Data Pipelines
      • Experience in Spark, Kafka systems
      • Experience in using 3rd Party troubleshooting tools like Wireshark, Fiddler, Process Monitor/Explorer
      • Experience analyzing tcpdump output and Fiddler traces; capturing and analyzing heap dumps
      • Experience capturing and analyzing stack traces
      • Foundational SQL experience/usage
      • Experience troubleshooting database connectivity issues using a variety of methods (client software, drivers/connectors)
      • Experience in Configuring/Troubleshooting Drivers like ODBC, JDBC, GO, .Net 
      • Excellent ability to troubleshoot on a variety of operating systems (Windows, Mac, *Nix) 
      • Good understanding of the technical fundamentals of the Internet. You should have knowledge of internet protocols such as TCP/IP, HTTP/S, SFTP, DNS as well as the ability to use diagnostic tools to troubleshoot connectivity issues
      • Debugging / Development experience in  Python, Java, or Scala 
      • Excellent writing and communication skills in English with an attention to detail
      • Strong teaming skills in a highly collaborative environment and the ability to function in global arenas
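Several of the bullets above come down to first-pass connectivity triage: resolve the hostname, then see whether a TCP connection opens, before reaching for Wireshark or driver-level debugging. As a small illustrative sketch (a hypothetical helper, not a Snowflake tool), those two steps look like this with stdlib sockets:

```python
import socket

def probe(host: str, port: int, timeout: float = 3.0) -> dict:
    """First-pass connectivity check: DNS resolution, then a TCP connect.
    These are the same two steps usually done by hand with nslookup/telnet."""
    result = {"host": host, "port": port}
    try:
        result["ip"] = socket.gethostbyname(host)        # DNS step
    except socket.gaierror as exc:
        result["error"] = f"DNS failure: {exc}"
        return result
    try:
        with socket.create_connection((result["ip"], port), timeout=timeout):
            result["tcp"] = "open"                       # TCP step
    except OSError as exc:
        result["error"] = f"TCP failure: {exc}"
    return result

print(probe("localhost", 9))  # port 9 is usually closed, so expect a TCP failure
```

Separating the DNS result from the TCP result matters in support work: it tells you immediately whether to look at name resolution or at firewalls, proxies, and the service itself.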

      NICE TO HAVES:

      • Experience working with big data and/or MPP (massively parallel processing) databases
      • Understanding of Data Warehousing fundamentals and concepts
      • Database migration and ETL experience
      • Scripting/coding experience in any of the following: .Net, NodeJS, R, GO
      • Experience supporting applications running on either Amazon AWS or MS Azure
      • Experience with virtualization solutions (VMware, Docker, Virtualbox, etc)
      • Understanding of cloud computing security concepts

      Every Snowflake employee is expected to follow the company’s confidentiality and security standards for handling sensitive data. Snowflake employees must abide by the company’s data security plan as an essential part of their duties. It is every employee's duty to keep customer information secure and confidential.

      See more jobs at snowflakecomputing

      Apply for this job

      +30d

      Senior Software Engineer in Test

      Brightcove | Mexico - Remote
      agile, scala, sql, design, mobile, api, UX, qa, c++, docker, javascript, backend, PHP

      Brightcove is hiring a Remote Senior Software Engineer in Test

      Brightcove is looking to hire a Senior Software Engineer in Test to help ensure the quality of services in the Brightcove Platform. These services receive millions of requests per day from millions of viewers per day. This is an exciting position with high visibility across the company and growth potential.

      Job Responsibilities 

      • Work with the development team to design and automate test cases and prioritize testing activities.
      • Execute test cases and report results both directly and through automation pipelines.
      • Create testing plans, mostly automated, for new and/or existing features.
      • Create and manage bug reports, communicate with the team, and verify bug fixes.
      • Contribute to existing tools, frameworks and related solutions, as well as building new tools that test things in new ways.
      • Work collaboratively with the development team to plan delivery, manage risk, improve quality, and streamline our software deployments.
      • Act as an advocate for Software Quality and the end users.
      • Work with Customer Support and Account Management to respond to customer impacting issues.

      Qualifications/Experience 

      • 5+ years of experience in software quality assurance.
      • Experience performing a broad range of testing activities, including exploratory, end to end, regression and specially API testing.
      • Experience with backend automation testing tools.
      • Deep expertise in writing test plans, test cases and parsing.
      • Proficient in programming languages and scripting such as Bash and Python.
      • Knowledge of docker containers / K8s.
      • Comfortable working in an agile/scrum setting.
      • Ability to identify dependencies/risks during sprint planning.
      • Collaborate with other software engineers, QA engineers, product owners, and UX designers for continuous improvement of product quality.
      • Experience with Agile Development Process and agile “whole team approach” testing.
      • Experience using Postman for API testing is preferred.
      • Experience testing payment and subscription flows on multiple paywalls (Google Play, Apple Store, Stripe, or others) is preferred.
      • Experience testing on mobile devices; additional experience with Smart TVs and Roku devices is preferred.
      • Knowledge of Scala and Gatling performance tests is preferred.
      • Knowledge of Go, JavaScript, PHP or SQL is preferred.
      • Experience with video or media is preferred.
      • Work Arrangement - Remote, On-site, or Hybrid. Prefer Hybrid/On-site in GDL.

      How we’ll help you grow!

      • Collaborate with Media industry experts to learn about the business, our team loves to mentor.
      • Company provides access to professional development and training courses so you can grow into a testing expert.
      • Opportunity to expand scope and increase responsibility at an aggressive pace.

      About Brightcove 

      Brightcove is a diverse, global team of smart, passionate people who are revolutionizing the way organizations deliver video. We’re hyped up about storytelling, and about helping organizations reach their audiences in bold and innovative ways. When video is done right, it can have a powerful and lasting effect. Hearts open. Minds change. 

      Since 2004, Brightcove has been supporting customers that are some of the largest media companies, enterprises, events, and non-profit organizations in the world. There are over 600 Brightcovers globally, each of us representing our unique talents and we have built a culture that values authenticity, individual empowerment, excellence and collaboration. This culture enables us to harness the incredible power of video and create an environment where you will want to grow, stay and thrive. Bottom line: We take our video seriously, and we take great pride in doing it as #oneteam.

      WORKING AT BRIGHTCOVE 

      We strive to provide our employees with an environment where they can do their best work and be their best selves. This includes a focus on our employees’ work experience, actively creating a culture where inclusion and growth are at the center, and hiring, recognizing, promoting employees who are committed to living and breathing these same ideals.  

      While remote work arrangements are available for most positions we also offer hybrid or on-site working options in our vibrant Guadalajara office located right in front of Andares shopping mall where employees enjoy access to fully-stocked kitchens, company events and social activities as well as an inspiring work environment. We are focused on creating a culture where inclusion and growth are at the center. We value collaboration, creativity, work/life balance, professional growth and providing an empowering space for open communication. You will have plenty of opportunities to meet your colleagues around the globe as we also celebrate a variety of personal interests with organized groups and clubs including an Employee Action Committee, Women of Brightcove, Pride of Brightcove, Parents of Brightcove … and more to come!

      We recognize that no candidate is perfect and Brightcove would love to have the chance to get to know you. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender, gender identity or expression, or veteran status. Brightcove embraces diversity and seeks candidates who support persons of all identities and backgrounds. We strongly encourage individuals from underrepresented and/or marginalized identities to apply. If you need any accommodations for your interview, please email recruiting@brightcove.com

      BC21025

      See more jobs at Brightcove

      Apply for this job

      +30d

      Data Engineer AWS Python

      agile, Bachelor's degree, 3 years of experience, jira, terraform, scala, postgres, sql, oracle, design, mongodb, azure, api, mysql, jenkins, python, AWS

      FuseMachines is hiring a Remote Data Engineer AWS Python

      Data Engineer AWS Python - Fusemachines - Career Page

      See more jobs at FuseMachines

      Apply for this job

      +30d

      Staff Machine Learning Engineer

      Tubi | San Francisco, CA; Remote
      scala, design, c++, python, backend

      Tubi is hiring a Remote Staff Machine Learning Engineer

      Join Tubi (www.tubi.tv), Fox Corporation's premium ad-supported video-on-demand (AVOD) streaming service leading the charge in making entertainment accessible to all. With over 200,000 movies and television shows, including a growing library of Tubi Originals, 200+ local and live news and sports channels, and 455 entertainment partners featuring content from every major Hollywood studio, Tubi gives entertainment fans an easy way to discover new content that is available completely free. Tubi's library has something for every member of our diverse audience, and we're committed to building a workforce that reflects that diversity. We're looking for great people who are creative thinkers, self-motivators, and impact-makers looking to help shape the future of streaming.

      About the Role:

      The Machine Learning team at Tubi works on core algorithms that define the entire experience of its 33+ million users. We work on different areas such as recommendations, search, content understanding and ads. We are searching for a talented and motivated Machine Learning Engineer to join our team. In this role you will work with a variety of machine learning algorithms, from traditional models to cutting-edge LLM technologies, to address relevant problems. You need a strong background in machine learning, hands-on development skills to tackle some of the challenges in a fast-paced dynamic environment, and the ability to collaborate well in a cross-functional setting. The tech stack you will be working with is Spark, Scala, and Python, including PyTorch. 

      Responsibilities:

      • Design and build end-to-end machine learning models from analysis through production
      • Conduct A/B tests to prove your ideas and share your learnings from the experiment results
      • Collaborate with Product and backend engineering teams to ship high-impact features 
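The A/B testing responsibility above usually starts with a simple significance check before any learnings are shared. As a sketch only (the numbers are made up, and production experimentation platforms do considerably more, e.g. sequential testing and variance reduction), a two-proportion z-test on conversion rates fits in a few stdlib lines:

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-proportion z-test, a common first read on an A/B experiment.
    Returns the z statistic; |z| > 1.96 is significant at the 5% level."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)      # pooled rate under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Made-up example: control converts 5.0%, treatment 5.6%, 20k users each.
z = two_proportion_z(1000, 20000, 1120, 20000)
print(round(z, 2))  # about 2.68, above 1.96, so the lift is significant
```

In practice the experiment result (effect size and its uncertainty, not just the z score) is what gets shared back with Product when deciding whether to ship.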

      Requirements:

      • 6+ years of experience in machine learning engineering with production systems using Scala, Python, and Apache Spark
      • Experience owning recommendation or search models 
      • MSc or PhD in Computer Science, Statistics, Applied Mathematics, Physics, or other technical field. PhD preferred

      #LI-Remote #LI-MQ1

      Pursuant to state and local pay disclosure requirements, the annual pay range for this role is listed below, with the final offer amount dependent on education, skills, experience, and location. This role is also eligible for an annual discretionary bonus, a long-term incentive plan, and various benefits including medical/dental/vision insurance, a 401(k) plan, paid time off, and other benefits in accordance with applicable plan documents.

      California, New York City, Westchester County, NY, and Seattle, WA
      $192,000 to $274,000 USD
      Colorado and Washington (excluding Seattle, WA)
      $172,000 to $245,000 USD

      Tubi is a division of Fox Corporation, and the FOX Employee Benefits summarized here, covers the majority of all US employee benefits.  The following distinctions below outline the differences between the Tubi and FOX benefits:

      • For US-based non-exempt Tubi employees, the FOX Employee Benefits summary accurately captures the Vacation and Sick Time.
      • For all salaried/exempt employees, in lieu of the FOX Vacation policy, Tubi offers a Flexible Time off Policy to manage all personal matters.
      • For all full-time, regular employees, in lieu of FOX Paid Parental Leave, Tubi offers a generous Parental Leave Program, which allows parents twelve (12) weeks of paid bonding leave within the first year of the birth, adoption, surrogacy, or foster placement of a child. This time is 100% paid through a combination of any applicable state, city, and federal leaves and wage-replacement programs in addition to contributions made by Tubi.
      • For all full-time, regular employees, Tubi offers a monthly wellness reimbursement.

      Tubi is proud to be an equal opportunity employer and considers qualified applicants without regard to race, color, religion, sex, national origin, ancestry, age, genetic information, sexual orientation, gender identity, marital or family status, veteran status, medical condition, or disability. Pursuant to the San Francisco Fair Chance Ordinance, we will consider employment for qualified applicants with arrest and conviction records. We are an E-Verify company.

      See more jobs at Tubi

      Apply for this job

      +30d

      Manager, Specialist Partner Sales Engineering

      snowflakecomputingRemote Bay Area, CA, USA
      Master’s Degree10 years of experiencescalasqlazurejavapythonAWSjavascript

      snowflakecomputing is hiring a Remote Manager, Specialist Partner Sales Engineering

      Build the future of data. Join the Snowflake team.

      ABOUT SNOWFLAKE:

      Snowflake is pioneering the future of data with its Cloud Data Platform, transforming how businesses leverage data to drive insights and innovation. As we expand our Sales Engineering team, we seek a Manager for the Specialist Partner Sales Engineering who will play a pivotal role in leading a team dedicated to empowering strategic partners in their application development journey on Snowflake's platform.

      Role Overview:

      The Manager, Specialist Partner Sales Engineering, will be a player-coach, leading a team of Applications Specialists and Engineers focused on guiding partners like Blue Yonder, Fiserv, and Maxa.ai in developing high-quality applications rapidly on the Snowflake Cloud Data Platform. This leadership role requires a balance of technical depth, strategic thinking, and effective team management to ensure the success of our partners and, by extension, our customers.

      The ideal candidate will possess deep technical expertise in managing highly technical resources, along with full-stack development and data architecture experience, coupled with the ability to inspire and mentor a team. As a liaison between Snowflake and its partners, the Manager will foster innovation, streamline the application development lifecycle, and ensure that our partners can leverage the full capabilities of Snowflake's technology.

      KEY RESPONSIBILITIES:

      • Lead and mentor a team of technical specialists and engineers, fostering a culture of innovation, collaboration, and accountability.
      • Serve as a technical thought leader, guiding partners through their application development journey with Snowflake, ensuring the delivery of high-quality applications efficiently.
      • Develop strong relationships with key stakeholders across sales, product management, and technology partnerships, enhancing collaboration and alignment.
      • Provide hands-on technical guidance and support to partners, addressing application requirements, accelerating development cycles, and ensuring operational success.
      • Present Snowflake’s technology and vision to technical and executive audiences, emphasizing its utility and impact.
      • Collaborate with internal teams, including product management and alliances, to tailor solutions that expedite the development cycle and address product gaps and limitations.
      • Influence product roadmaps and strategies based on partner feedback and market needs.

      QUALIFICATIONS:

      • A minimum of 10 years of experience in full-stack application architecture, with a strong understanding of scaling applications with large data sets.
      • At least 4 years of direct people management experience in a Manager or Director capacity, with experience in hiring, performance management, and strategy development.
      • At least 3 years of customer-facing application development experience with major Cloud Providers (AWS, Azure, GCP).
      • Familiarity with Snowflake and its application framework is highly desirable.
      • Exceptional leadership skills with a proven track record of managing technical teams to success.
      • Excellent presentation skills, capable of engaging both technical and executive audiences.
      • A broad range of experience with databases/data warehouses in on-prem and cloud environments.
      • Proficiency in SQL and in programming languages such as JavaScript, Python, Java, Scala, Go, or Ruby.
      • A Bachelor’s Degree in computer science, engineering, mathematics, or related fields is required; a Master’s Degree or equivalent experience is preferred.

      Every Snowflake employee is expected to follow the company’s confidentiality and security standards for handling sensitive data. Snowflake employees must abide by the company’s data security plan as an essential part of their duties. It is every employee's duty to keep customer information secure and confidential.

      See more jobs at snowflakecomputing

      Apply for this job

      +30d

      Applications Specialist, Partner Sales Engineer

      snowflakecomputingRemote Bay Area, CA, USA
      scalasqlDesignazurerubyjavapythonAWSjavascript

      snowflakecomputing is hiring a Remote Applications Specialist, Partner Sales Engineer

      Build the future of data. Join the Snowflake team.

      ABOUT SNOWFLAKE:

      At Snowflake, we are on a mission to revolutionize the world of data. We invite you to join our team and contribute to building the future of data, empowering our strategic partners to expedite their applications journey with our cutting-edge Cloud Data Platform. Our dynamic Sales Engineering team is expanding, and we are in search of an Applications Specialist in Partner Sales Engineering who will be a specialist in the Native Apps, Managed Apps, and Connected Apps frameworks for the Snowflake Data Cloud. This role is embedded within our Partner Sales Engineering organization, dedicated to guiding strategic Snowflake partners through the design and architecture of applications on the Snowflake Cloud Data Platform, ensuring it becomes an integral part of their enterprise data strategy and broader ecosystem.

      ROLE OVERVIEW:

      In the role of Applications Specialist, Partner Sales Engineer, you will serve as a critical thought leader, primarily focused on empowering our partners to accelerate their application development journey using Snowflake's Cloud Data Platform. Your responsibilities will encompass being the subject matter expert on the Snowflake Application Framework, providing technical guidance to Partners on their application ideas, and accelerating the development lifecycle. Your goal will be to help Partners such as Blue Yonder, Fiserv, and Maxa.ai produce as many high-quality applications as possible in the shortest amount of time.

      Your role demands a highly technical skill set, with tenured hands-on full-stack development experience involving databases/data warehouses and programming frameworks such as Java. You will engage with executives and technical teams to articulate Snowflake's value proposition and technical capabilities. A significant part of your role involves collaborative efforts with product management and partners to drive innovation, as well as working alongside sales and channel partners to develop tailored solutions that address customer needs. This approach is aimed at accelerating the sales process and guaranteeing the success of our customers and partners.

      KEY RESPONSIBILITIES: 

      • Act as a Technical thought leader to guide partners through their application development journey with Snowflake
      • Build and maintain relationships with key stakeholders across sales, product management, and technology partnerships
      • Support Partners in understanding Application requirements, developing strategies to accelerate development cycles, and providing enterprise architecture solutions that ensure operational success.
      • Present Snowflake’s technology and vision to both executive and technical audiences, emphasizing its impact and utility.
      • Collaborate with internal teams and partners to innovate and refine Snowflake’s offerings based on partner feedback and market needs.
      • Work closely with the Alliances team to create customized solutions that expedite the development cycle and ensure successful outcomes for our Partners.
      • Collaborate with the Product Management team to provide feedback and enhance awareness of potential product gaps and limitations.

      ON DAY ONE, WE WILL EXPECT YOU TO HAVE:

      • 10+ years of full stack application architecture experience with a deep understanding of how to scale applications with large data sets
      • 3+ years of Cloud Provider (AWS, Azure, GCP) customer-facing application development experience
      • Experience with Snowflake a plus
      • Outstanding presentation skills to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos
      • Broad range of experience with on-prem and/or cloud databases
      • Knowledge of SQL and of JavaScript, Python, Java, Scala, Go, Ruby, or other languages
      • Industry focus a plus (Education, Federal, Financial Services, Healthcare & Life Sciences, Insurance, Adv. Media, Retail CPG, Technology, and Telecom)
      • Bachelor’s Degree required; Master’s Degree in computer science, engineering, mathematics, or related fields, or equivalent experience, preferred.

      Every Snowflake employee is expected to follow the company’s confidentiality and security standards for handling sensitive data. Snowflake employees must abide by the company’s data security plan as an essential part of their duties. It is every employee's duty to keep customer information secure and confidential.

      See more jobs at snowflakecomputing

      Apply for this job

      +30d

      Data Engineer 3

      agilescalaairflowsqloracleDesignazuregitc++mysqlpython

      Blueprint Technologies is hiring a Remote Data Engineer 3

      Who is Blueprint?

      We are a technology solutions firm headquartered in Bellevue, Washington, with a strong presence across the United States. Unified by a shared passion for solving complicated problems, our people are our greatest asset. We use technology as a tool to bridge the gap between strategy and execution, powered by the knowledge, skills, and the expertise of our teams, who all have unique perspectives and years of experience across multiple industries. We’re bold, smart, agile, and fun.

      What does Blueprint do?

      Blueprint helps organizations unlock value from existing assets by leveraging cutting-edge technology to create additional revenue streams and new lines of business. We connect strategy, business solutions, products, and services to transform and grow companies.

      Why Blueprint?

      At Blueprint, we believe in the power of possibility and are passionate about bringing it to life. Whether you join our bustling product division, our multifaceted services team or you want to grow your career in human resources, your ability to make an impact is amplified when you join one of our teams. You’ll focus on solving unique business problems while gaining hands-on experience with the world’s best technology. We believe in unique perspectives and build teams of people with diverse skillsets and backgrounds. At Blueprint, you’ll have the opportunity to work with multiple clients and teams, such as data science and product development, all while learning, growing, and developing new solutions. We guarantee you won’t find a better place to work and thrive than at Blueprint.

      What will I be doing?

      Job Summary:

      The Data Engineer’s responsibilities include designing, developing, and deploying Data Integration (ETL and/or ELT) solutions using agreed-upon design patterns and technologies, and working with a wide variety of data sources, including JSON, CSV, Oracle, SQL Server, Azure Synapse, Azure Analysis Services, Azure SQL DB, Data Lake, PolyBase, and streaming data sets.

       

      Supervisory Responsibilities:

      • None

       

      Duties/Responsibilities:

      • Create workflows, templates, and design patterns
      • Communicate with stakeholders to obtain accurate business requirements
      • Create and perform unit tests for solutions
      • Convert existing SSIS packages into Azure Data Factory pipelines
      • Perform other related duties as assigned

       

      Required Skills/Abilities:

      •    Familiarity with SQL programming fundamentals
      •    Basic understanding of Python, R, or Scala
      •    Awareness of distributed/parallel computing with Python, Synapse, or Snowflake
      •    Basic understanding of modeling tools such as ERWin, DBeaver, Lucid, or Visio
      •    Awareness of Git for version control of code repositories
      •    Awareness of RDBMS development tools: SQL Enterprise Manager, Visual Studio, Azure Data Studio
      •    Awareness of open-source database platforms such as MySQL
      •    Awareness of Big Data frameworks such as PySpark, Hadoop, etc.
      •    Familiarity with Modern Data Estate patterns: Source to Raw to Stage to Curated
      •    Familiarity with Databricks concepts: batch, streaming, Auto Loader, etc.
      •    Familiarity with cloud diagnostics, logging, and performance monitoring/tuning
      •    Familiarity with data movement tools: ADF, Fivetran, Airflow, etc.

      •    Familiarity with database I/O -- writing/reading structured and unstructured DBs
      •    Awareness of debugging, documentation, testing, and optimization skills
      •    Ability to convert business needs into technical requirements
      •    Experience explaining DE concepts to business stakeholders
      •    Experience communicating and collaborating effectively with a remote team
      •    Experience communicating effectively with interdisciplinary teams of various technical skill levels
      •    Ability to work effectively and deliver value in ambiguous settings
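      The "Source to Raw to Stage to Curated" pattern named in the skills above is, at its core, a chain of increasingly refined transformations. A toy in-memory sketch in Scala (real pipelines would run this on Spark/Databricks; the record shapes and names are invented for illustration):

```scala
// Toy "Raw -> Stage -> Curated" flow over in-memory data.
// Real pipelines would use Spark/Databricks; this only illustrates the
// layering. Record shapes and names are invented for illustration.
final case class StageRow(customer: String, amount: Double)

// Raw -> Stage: parse CSV-like lines, dropping malformed records
def toStage(raw: Seq[String]): Seq[StageRow] =
  raw.flatMap { line =>
    line.split(",") match
      case Array(c, a) => a.trim.toDoubleOption.map(StageRow(c.trim, _))
      case _           => None
  }

// Stage -> Curated: business-level aggregate (total amount per customer)
def toCurated(stage: Seq[StageRow]): Map[String, Double] =
  stage.groupMapReduce(_.customer)(_.amount)(_ + _)
```

      The point of the layering is that each hop has one job: the raw layer preserves sources as-is, the stage layer types and cleans, and the curated layer answers business questions.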

      Education and Experience:

      • Bachelor’s degree in Computer Science, Industrial Engineering, Business Analytics, or equivalent.
      • 3+ years of broad-based IT experience with technical knowledge of data Integration (ETL, ELT) technologies and approaches, Data Warehousing, Data Lake Methodologies.
      • 3+ years’ experience with SQL Server. Expert level TSQL knowledge required.
      • 3+ years’ experience designing and implementing scalable ETL processes including data movement (SSIS, replication, etc.) and quality tools.
      • 2+ years’ experience building cloud hosted data systems. Azure preferred.
      • 2+ years’ experience with SQL Server Analysis Services (SSAS)
      • 2+ years’ experience with SQL Server Integration Services (SSIS)

       

      Physical Requirements:

      • The employee is frequently required to sit at a workstation for extended and long periods of time. The employee will occasionally walk; will frequently use hands to finger, handle, grasp or feel; and reach with hands, wrists, or arms in repetitive motions.
      • The employee will frequently use fingers for manipulation of computers (laptop and desktops) and telephone equipment including continuous 10-key, handwriting, use of mouse (or alternative input device), use of keyboard (or alternative input device), or sporadic 10-Key, telephone, or telephonic headsets. This position will also frequently use other office productivity tools such as the printer/scanner.
      • Role requires the ability to lift, carry, push, pull, and/or move up to 10 lbs on a frequent basis and may require twisting actions of the upper body (e.g., picking up and carrying a laptop, or twisting to work at an L-shaped desk).
      • Specific vision abilities required by this job include close vision, distance vision, peripheral vision, depth perception, and ability to adjust focus. This position requires frequent use of a computer monitor and visual acuity to perform email responses, prepare and analyze data; transcribe; extensive reading and online communication.
      • Role requires being able to hear and use verbal communication for interactions with internal clients and dependent on role with external clients via conference calls.

       

      Cognitive Ability Requirements: The employee must have the ability to:

      • Work with others (co-workers, professionals, public, customers, clients)
      • Work professionally in alignment with the organization’s code of conduct
      • Interact face to face with others (co-workers, superiors)
      • Constant verbal and email communication with others (co-workers, supervisors, vendors, client, customers etc.) to exchange information
      • Ability to take constructive feedback and show courtesy to co-workers, professionals, public, customers, clients
      • Make quick, accurate decisions without supervision
      • Evaluate or make decisions based on experience or knowledge
      • Divide attention between issues requiring multi-tasking
      • Use judgment on routine matters
      • Distinguish situations requiring judgment and adaptation of procedures from one task to another
      • Adapt to tightly scheduled and hurried pace of work activities
      • Meet frequent project deadlines
      • Organize own work
      • Ask questions or request assistance when needed
      • Follow instructions received both orally and in writing

      Work Environment:

      • The work environment is usually a traditional office, indoor setting with no exposure to outside elements
      • This position requires no travel    
      • The employee will frequently be required to work closely with others and occasionally work alone
      • This position may require a work schedule across weekends and holidays
      • This position is subject to blackout dates which may include holidays where PTO is not approved
      • May work remotely based on adherence to the organizations work from home policy
      • Reasonable accommodations may be made to enable individuals with disabilities to perform the job

      Salary Range

      Pay ranges vary based on multiple factors including, without limitation, skill sets, education, responsibilities, experience, and geographical market. The pay range for this position reflects geographic based ranges for Washington state: $88,300 - $115,300 USD/annually. The salary/wage and job title for this opening will be based on the selected candidate’s qualifications and experience and may be outside this range.

      Equal Opportunity Employer

      Blueprint Technologies, LLC is an equal employment opportunity employer. Qualified applicants are considered without regard to race, color, age, disability, sex, gender identity or expression, orientation, veteran/military status, religion, national origin, ancestry, marital, or familial status, genetic information, citizenship, or any other status protected by law.

      If you need assistance or a reasonable accommodation to complete the application process, please reach out to: recruiting@bpcs.com

      Blueprint believes in the importance of a healthy and happy team, which is why our comprehensive benefits package includes:

      • Medical, dental, and vision coverage
      • Flexible Spending Account
      • 401k program
      • Competitive PTO offerings
      • Parental Leave
      • Opportunities for professional growth and development

       

      Location: Remote - USA

      See more jobs at Blueprint Technologies

      Apply for this job

      +30d

      Data Engineering Development Manager

      agilescalaairflowsqlDesignazuregitc++pythonAWS

      Blueprint Technologies is hiring a Remote Data Engineering Development Manager

      Who is Blueprint?

      We are a technology solutions firm headquartered in Bellevue, Washington, with a strong presence across the United States. Unified by a shared passion for solving complicated problems, our people are our greatest asset. We use technology as a tool to bridge the gap between strategy and execution, powered by the knowledge, skills, and the expertise of our teams, who all have unique perspectives and years of experience across multiple industries. We’re bold, smart, agile, and fun.

      What does Blueprint do?

      Blueprint helps organizations unlock value from existing assets by leveraging cutting-edge technology to create additional revenue streams and new lines of business. We connect strategy, business solutions, products, and services to transform and grow companies.

      Why Blueprint?

      At Blueprint, we believe in the power of possibility and are passionate about bringing it to life. Whether you join our bustling product division, our multifaceted services team or you want to grow your career in human resources, your ability to make an impact is amplified when you join one of our teams. You’ll focus on solving unique business problems while gaining hands-on experience with the world’s best technology. We believe in unique perspectives and build teams of people with diverse skillsets and backgrounds. At Blueprint, you’ll have the opportunity to work with multiple clients and teams, such as data science and product development, all while learning, growing, and developing new solutions. We guarantee you won’t find a better place to work and thrive than at Blueprint.

      We are looking for a Data Engineering Development Manager to join us as we build cutting-edge technology solutions!  This is your opportunity to be part of a team that is committed to delivering best in class service to our customers.

       In this role, you will lead and mentor a remote team of highly skilled data engineers, overseeing their development plans and performance reviews. Beyond managerial responsibilities, this role demands a hands-on approach, with an expected 50% involvement in coding for Data Engineer projects. This hands-on engagement serves as a model for your team, showcasing your commitment to technical excellence. Join us on this exciting journey where your leadership and technical acumen will play a vital role in shaping the success of our technology solutions.

      Responsibilities:

      Supervisory Responsibilities

      • Interview, hire, and train new staff.
      • Oversee the training and development of the Data Engineer team.
      • Provide constructive and timely performance evaluations.
      • Handle discipline and termination of employees in accordance with company policy.

      Duties/Responsibilities

      • Make high-level architecture decisions and execute them, with the ability to explain and defend those decisions to all stakeholders, both internal and external.
      • Plan and execute successful complex technical projects in an Agile process.
      • Model, query, optimize, and analyze large, business-critical datasets.
      • Host design and code reviews.
      • Collaborate on project plans, deliverables, and timeline estimates for Data Engineer projects.
      • Identify resourcing requirements for Data Engineer projects.
      • Participate in all stages of implementation, from early brainstorming to design, coding, and bug fixing.
      • Evaluate and identify use cases for new technologies.
      • Drive vision and alignment internally and across external stakeholders.
      • Comfortably speak to patterns and best practices for Data Engineering teams.

      Qualifications:

      Technical Skills Foundation

      • Proficient in Python, R, or Scala with a fundamental understanding of their use.
      • Skilled with Cloud technologies such as Azure, AWS, GCP, Snowflake.
      • Skilled with Big Data frameworks such as PySpark, Hadoop, etc.
      • Skilled with Open-source database platforms, particularly MySQL.
      • Skilled with Git for version control of code repositories.
      • Proficient in modeling tools such as ERWin, DBeaver, Lucid, SQLDBM, or Visio.
      • Skilled with RDBMS Development tools: SQL Enterprise Manager, Visual Studio, Azure Data Studio.

      Data Processing and Management

      • Skilled with Modern Data Estate patterns: Medallion architecture.
      • Skilled with Databricks concepts: batch, streaming, autoloader, etc.
      • Skilled with Cloud diagnostics, logging, and performance monitoring/tuning.
      • Skilled with understanding data shoveling tools: ADF, Fivetran, Airflow, etc.
      • Skilled with Data Governance concepts and tools.
      • Skilled with data rule and business rule application (schema enforcement vs. Great Expectations).
      • Skilled with CI/CD.
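      Expectation-style rule application of the kind contrasted above (as opposed to hard schema enforcement) amounts to running named predicates over each row and collecting failures instead of aborting. A minimal Scala sketch, with the row shape and rule names invented for illustration:

```scala
// Minimal expectation-style data-quality check: each rule is a named
// predicate run over every row; failures are counted, not thrown.
// The row shape and rule names are invented for illustration.
final case class Record(id: Int, email: String, age: Int)
final case class Expectation(name: String, check: Record => Boolean)

def failedCounts(rows: Seq[Record], rules: Seq[Expectation]): Map[String, Int] =
  rows
    .flatMap(row => rules.filterNot(_.check(row)).map(_.name))
    .groupBy(identity)
    .view.mapValues(_.size).toMap

val defaultRules = Seq(
  Expectation("age_non_negative", _.age >= 0),
  Expectation("email_has_at", _.email.contains("@"))
)
```

      A schema check would reject the whole batch on the first bad row; the expectation style instead produces a per-rule failure report that can gate promotion between pipeline layers.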

      Advanced Data Engineering

      • Expert in data wrangling with CSV, TSV, Parquet, and JSON files.
      • Expert in database I/O -- writing/reading structured and unstructured DBs.
      • Expert in Debugging, documentation, testing, and optimization skills.
      • Expert in explaining DE concepts to business stakeholders.
      • Skilled with providing hands-on-code support for blocked/struggling team members.

      Databricks Expertise

      • Proficiency in configuring and fine-tuning Databricks settings for optimal performance and resource utilization.
      • Experience in managing the Unity Catalog within Databricks, ensuring efficient organization and retrieval of metadata.
      • Competency in implementing robust access control measures within Databricks to safeguard data and maintain compliance.
      • Expertise in scheduling and monitoring jobs within Databricks, ensuring timely and accurate execution.
      • Proficiency in configuring security settings within Databricks to protect sensitive data and maintain a secure environment.

      Leadership and Collaboration

      • Strong people skills, ability to manage multiple tasks and projects, and operate within ambiguity.
      • Skilled in working effectively and delivering value in ambiguous settings.
      • Skilled in communicating and collaborating effectively with a remote team.
      • Skilled in communicating effectively with interdisciplinary teams of various technical skill levels.
      • Expert in communicating effectively with leadership and executives.
      • Expert in defining incremental deliverables to deliver value quickly and iterate.
      • Expert in prioritizing new projects/features in accordance with LOE and potential value.
      • Skilled in establishing short and long-term vision/goals for the team.
      • Skilled in establishing policies and principles for the team.
      • Expert in converting business needs into technical requirements.
      • Skilled in mentoring other Data Engineers.
      • Skilled with interviewing and selecting new team members according to the needs of the team.
      • Skilled in working with internal groups such as Marketing and Sales on collaborative strategies.
      • Skilled in contributing to presales conversations with prospective clients.

      Preferred Qualifications:

      • Experience with Azure required; AWS strongly preferred.

      Salary Range

      Pay ranges vary based on multiple factors including, without limitation, skill sets, education, responsibilities, experience, and geographical market. The pay range for this position reflects geographic based ranges for Washington state: $164,900 to $207,200 USD/annually. The salary/wage and job title for this opening will be based on the selected candidate’s qualifications and experience and may be outside this range.

      Equal Opportunity Employer

      Blueprint Technologies, LLC is an equal employment opportunity employer. Qualified applicants are considered without regard to race, color, age, disability, sex, gender identity or expression, orientation, veteran/military status, religion, national origin, ancestry, marital, or familial status, genetic information, citizenship, or any other status protected by law.

      If you need assistance or a reasonable accommodation to complete the application process, please reach out to: recruiting@bpcs.com

      Blueprint believes in the importance of a healthy and happy team, which is why our comprehensive benefits package includes:

      • Medical, dental, and vision coverage
      • Flexible Spending Account
      • 401k program
      • Competitive PTO offerings
      • Parental Leave
      • Opportunities for professional growth and development

      Location: Remote

      See more jobs at Blueprint Technologies

      Apply for this job

      +30d

      Software Engineer, Trust & Safety

      GeminiRemote (USA)
      scalaDesignAWSbackend

      Gemini is hiring a Remote Software Engineer, Trust & Safety

      About the Company

      Gemini is a global crypto and Web3 platform founded by Tyler Winklevoss and Cameron Winklevoss in 2014. Gemini offers a wide range of crypto products and services for individuals and institutions in over 70 countries.

      Crypto is about giving you greater choice, independence, and opportunity. We are here to help you on your journey. We build crypto products that are simple, elegant, and secure. Whether you are an individual or an institution, we help you buy, sell, and store your bitcoin and cryptocurrency. 

      At Gemini, our mission is to unlock the next era of financial, creative, and personal freedom.

      In the United States, we have a flexible hybrid work policy for employees who live within 30 miles of our office headquartered in New York City. Employees within the New York Metropolitan area are expected to work from the NYC office twice a week, unless there is a job-specific requirement to be in the office every workday. Employees outside of this area are considered part of our remote-first workforce. We believe our hybrid approach for those near our NYC office increases productivity through more in-person collaboration where possible.

      The Department: Service Fundamentals (Trust & Safety)

      The Role: Software Engineer

      As an engineer on the Trust & Safety team at Gemini, you’ll be directly involved in building controls that prevent fraud for all users of Gemini. We build controls that monitor customer activity for malicious behavior and illicit transactions, and raise alarms when a transaction is suspicious. We are also focused on building controls to combat account takeovers and on continuously monitoring customer transactions for hints of suspicious activity. We are a primarily backend team working in Scala. We have a strong culture of code reviews and a focus on security, with the end goal of writing and shipping high-quality, highly scalable and available systems. We want to continue building the best product we can as we scale and grow our business. If you get excited about solving technical challenges that directly impact our customers, clients, and the rest of the Gemini team, we’d love to hear from you.

      Responsibilities:

      • Develop new products and product features on the Gemini platform, as part of a tight-knit team of seven to eight developers.
      • Write automated tests to ensure the operation and correctness of new product features.
      • Provide technical input and knowledge to the planning, design, and requirements process for new products and features.
      • Review other software engineers’ code for correctness, style, and information security concerns.
      • Improve the performance, maintainability, and operations of the Gemini codebase by engaging in occasional refactoring and upgrade projects.
      • Support your team’s production software by responding to an occasional alert or bug report.

      Minimum Qualifications:

      • At least 2 years of software engineering experience.
      • Proficiency with the JVM (Scala preferred).
      • The ability to adapt and handle multiple competing priorities in collaboration with peers.
      • A customer and product-focused mindset, with the ability to make well-reasoned tradeoffs between speed and quality.
      • A proven track record of working with distributed systems.
      • Familiarity with writing highly observable, well-monitored code.

      Preferred Qualifications:

      • Familiarity with AWS cloud infrastructure.
      • Interest in working with Functional Programming paradigms.
      • Prior experience working with gRPC and/or protobuf.

      It Pays to Work Here

      The compensation & benefits package for this role includes:
      • Competitive starting salary
      • A discretionary annual bonus
      • Long-term incentive in the form of a new hire equity grant
      • Comprehensive health plans
      • 401K with company matching
      • Annual Learning & Development stipend
      • Paid Parental Leave
      • Flexible time off

      Salary Range: The base salary range for this role is between $120,000 - $168,000 in the State of New York, the State of California and the State of Washington. This range is not inclusive of our discretionary bonus or equity package. When determining a candidate’s compensation, we consider a number of factors including skillset, experience, job scope, and current market data.

      At Gemini, we strive to build diverse teams that reflect the people we want to empower through our products, and we are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, or Veteran status. Equal Opportunity is the Law, and Gemini is proud to be an equal opportunity workplace. If you have a specific need that requires accommodation, please let a member of the People Team know.

      #LI-REMOTE

      Apply for this job

      +30d

      Software Engineer

      HomeAdvisor & Angie's ListNew York, NY - Remote
      agilescalagitrubyjavadockerelasticsearchkubernetesredux

      HomeAdvisor & Angie's List is hiring a Remote Software Engineer

      Angi® is transforming the home services industry, creating an environment for homeowners, service professionals and employees to feel right at “home.” For most home maintenance needs, our platform makes it easier than ever to find a qualified service professional for indoor and outdoor jobs, home renovations (or anything in between!). We are on a mission to become the home for everything home by helping small businesses thrive and providing solutions to financing and booking home jobs with just a few clicks.  

      Over the last 25 years we have opened our doors to a network of over 200K service professionals and helped over 150 million homeowners love where they live. We believe home is the most important place on earth and are embarking on a journey to redefine how people care for their homes. Angi is an amazing place to build your dream career, join us—we cannot wait to welcome you home!

      The Opportunity:

      We are looking for a Software Developer with 4+ years of experience to join our growing team! In this role, you'll take on the rewarding challenge of not just writing clean, efficient code, but also nurturing it through testing, documentation, and clear communication. It's a chance to truly own your work and contribute to a team where learning and growth are part of the daily routine, all within a supportive environment that values collaboration and personal development. Our software is currently written with Java, Ruby, Scala and JavaScript. This position can be hybrid in NYC, Denver, or Indianapolis, or fully remote.

      As a Software Engineer:

      • You will foster a collaborative environment for you and your teammates to deliver high-quality, reliable, and well-tested features to ensure our customers and professionals enjoy the best possible experience with our product.
      • Collaborating with professionals across the organization to gain a shared understanding of the initiative, you will analyze requirements and propose solutions that meet product and business needs while balancing time and cost.
      • You'll write, modify and review clean, maintainable code and implement features to enhance our application's performance and scalability. You'll engage actively in code sprints and agile processes, contributing to all development lifecycle phases. As you go, you'll create or modify a suite of tests to exercise the initiative’s functionality in an automated manner.
      • You're an owner - you're responsible for operating what you and your teammates built in production. You'll ensure that the code meets performance, reliability, quality, security, and testability standards.
      • Your voice matters - you'll actively participate and lead discussions in team and project meetings to ensure we're solving the right problems, designing systems in a scalable way, and delivering products that help customers love where they live & pros build their businesses.

      Who you are:

      • You have at least 4 years of hands-on development experience, ideally in a tech or marketplace environment
      • You have a BS or MS in Computer Science or related STEM field
      • You have experience in developing enterprise-level features with an emphasis on functional programming, ideally in Java, Scala, or Ruby on Rails
      • You understand how to use code versioning tools, such as Git
      • You're an exceptional communicator & can work with and effectively collaborate across multiple technical and non-technical teams

      Preferred:

      • Familiarity with microservices and creating RESTful APIs
      • Understanding of React.js and Redux and their core principles
      • Experience with ElasticSearch and Kafka
      • Experience working with application monitoring tools such as New Relic
      • Experience with containerization tools (Docker, Kubernetes)
      • Experience working with less experienced engineers, providing them with coaching and mentorship to help them become better engineers
      • Willingness to learn and apply new skills and technologies

      Compensation & Benefits:

      • The salary band for this position ranges from $110,000-175,000, commensurate with experience and performance.
      • Full medical, dental, vision package and a retirement plan to fit your needs
      • Flexible vacation policy; work hard and take time when you need it
      • The rare opportunity to work with sharp, motivated teammates solving some of the most unique challenges and changing the world

      #LI-Remote

      Apply for this job

      +30d

      Sr. Data Engineer - AWS & Databricks

      agile5 years of experiencescalaairflowDesignc++pythonAWS

      Blueprint Technologies is hiring a Remote Sr. Data Engineer - AWS & Databricks

      Who is Blueprint?

      We are a technology solutions firm headquartered in Bellevue, Washington, with a strong presence across the United States. Unified by a shared passion for solving complicated problems, our people are our greatest asset. We use technology as a tool to bridge the gap between strategy and execution, powered by the knowledge, skills, and the expertise of our teams, who all have unique perspectives and years of experience across multiple industries. We’re bold, smart, agile, and fun.

      What does Blueprint do?

      Blueprint helps organizations unlock value from existing assets by leveraging cutting-edge technology to create additional revenue streams and new lines of business. We connect strategy, business solutions, products, and services to transform and grow companies.

      Why Blueprint?

      At Blueprint, we believe in the power of possibility and are passionate about bringing it to life. Whether you join our bustling product division, our multifaceted services team or you want to grow your career in human resources, your ability to make an impact is amplified when you join one of our teams. You’ll focus on solving unique business problems while gaining hands-on experience with the world’s best technology. We believe in unique perspectives and build teams of people with diverse skillsets and backgrounds. At Blueprint, you’ll have the opportunity to work with multiple clients and teams, such as data science and product development, all while learning, growing, and developing new solutions. We guarantee you won’t find a better place to work and thrive than at Blueprint.

      We are looking for a Sr. Data Engineer – AWS & Databricks to join us as we build cutting-edge technology solutions! This is your opportunity to be part of a team that is committed to delivering best-in-class service to our customers.

      In this role, you will play a crucial part in designing, developing, and maintaining robust data infrastructure solutions, ensuring the efficient and reliable flow of data across our organization. If you are passionate about data engineering, have a strong background in AWS and Databricks, and thrive in a collaborative and innovative environment, we want to hear from you.

      Responsibilities:

      • Design, implement, and maintain scalable data architectures that support our client’s data processing and analysis needs.
      • Collaborate with cross-functional teams to understand data requirements and translate them into efficient and effective data pipeline solutions.
      • Develop, optimize, and maintain ETL (Extract, Transform, Load) processes to ensure the timely and accurate movement of data across systems.
      • Implement best practices for data pipeline orchestration and automation using tools like Apache Airflow.
      • Leverage AWS services, such as S3, Redshift, Glue, EMR, and Lambda, to build and optimize data solutions.
      • Utilize Databricks for big data processing, analytics, and machine learning workflows.
      • Implement data quality checks and ensure the integrity and accuracy of data throughout the entire data lifecycle.
      • Establish and enforce data governance policies and procedures.
      • Optimize data processing and query performance for large-scale datasets within AWS and Databricks environments.
      • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and provide the necessary infrastructure.
      • Document data engineering processes, architecture, and configurations.
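The responsibilities above describe an ETL pipeline with quality gates. Stripped of the AWS and Databricks machinery, the pattern reduces to a stdlib-only sketch; the record shapes and the drop-ratio threshold are invented for illustration:

```python
def extract() -> list[dict]:
    # Stand-in for reading from S3/Glue; these records are invented.
    return [
        {"order_id": 1, "amount": 19.99},
        {"order_id": 2, "amount": None},  # fails the quality rule below
        {"order_id": 3, "amount": 5.00},
    ]

def transform(rows: list[dict]) -> list[dict]:
    # Keep only records that pass the (hypothetical) quality rule.
    return [r for r in rows if r["amount"] is not None]

def check_quality(clean: list[dict], source_count: int,
                  max_drop_ratio: float = 0.5) -> None:
    # Fail the pipeline run if too large a share of records was rejected.
    dropped = source_count - len(clean)
    if source_count and dropped / source_count > max_drop_ratio:
        raise ValueError(f"quality check failed: dropped {dropped}/{source_count}")

raw = extract()
clean = transform(raw)
check_quality(clean, len(raw))
print(len(clean))  # → 2
```

In a real pipeline each stage would be an orchestrated task (e.g., in Airflow) and the quality check would block downstream loads rather than raise locally.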

      Qualifications:

      • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
      • Minimum of 5 years of experience in data engineering roles, with a focus on AWS and Databricks.
      • Proven expertise in AWS services (S3, Redshift, Glue, EMR, Lambda) and Databricks.
      • Strong programming skills in languages such as Python, Scala, or Java.
      • Experience with data modeling, schema design, and database optimization.
      • Proficiency in using data pipeline orchestration tools (e.g., Apache Airflow).
      • Familiarity with version control systems and collaboration tools.
      • Ability to troubleshoot complex data issues and implement effective solutions.
      • Strong communication and interpersonal skills.
      • Ability to work collaboratively in a team-oriented environment.
      • Proactive in staying updated with industry trends and emerging technologies in data engineering.

      Salary Range

      Pay ranges vary based on multiple factors including, without limitation, skill sets, education, responsibilities, experience, and geographical market. The pay range for this position reflects geographic based ranges for Washington state: $146,400 to $175,100 USD/annually. The salary/wage and job title for this opening will be based on the selected candidate’s qualifications and experience and may be outside this range.

      Equal Opportunity Employer

      Blueprint Technologies, LLC is an equal employment opportunity employer. Qualified applicants are considered without regard to race, color, age, disability, sex, gender identity or expression, orientation, veteran/military status, religion, national origin, ancestry, marital, or familial status, genetic information, citizenship, or any other status protected by law.

      If you need assistance or a reasonable accommodation to complete the application process, please reach out to: recruiting@bpcs.com

      Blueprint believes in the importance of a healthy and happy team, which is why our comprehensive benefits package includes:

      • Medical, dental, and vision coverage
      • Flexible Spending Account
      • 401k program
      • Competitive PTO offerings
      • Parental Leave
      • Opportunities for professional growth and development

      Location: Remote

      See more jobs at Blueprint Technologies

      Apply for this job

      +30d

      Sr. Data Engineer - Azure & Databricks

      agile5 years of experiencescalaDesignazurec++python

      Blueprint Technologies is hiring a Remote Sr. Data Engineer - Azure & Databricks

      Who is Blueprint?

      We are a technology solutions firm headquartered in Bellevue, Washington, with a strong presence across the United States. Unified by a shared passion for solving complicated problems, our people are our greatest asset. We use technology as a tool to bridge the gap between strategy and execution, powered by the knowledge, skills, and the expertise of our teams, who all have unique perspectives and years of experience across multiple industries. We’re bold, smart, agile, and fun.

      What does Blueprint do?

      Blueprint helps organizations unlock value from existing assets by leveraging cutting-edge technology to create additional revenue streams and new lines of business. We connect strategy, business solutions, products, and services to transform and grow companies.

      Why Blueprint?

      At Blueprint, we believe in the power of possibility and are passionate about bringing it to life. Whether you join our bustling product division, our multifaceted services team or you want to grow your career in human resources, your ability to make an impact is amplified when you join one of our teams. You’ll focus on solving unique business problems while gaining hands-on experience with the world’s best technology. We believe in unique perspectives and build teams of people with diverse skillsets and backgrounds. At Blueprint, you’ll have the opportunity to work with multiple clients and teams, such as data science and product development, all while learning, growing, and developing new solutions. We guarantee you won’t find a better place to work and thrive than at Blueprint.

      We are looking for a Sr. Data Engineer – Azure & Databricks to join us as we build cutting-edge technology solutions! This is your opportunity to be part of a team that is committed to delivering best-in-class service to our customers.

      In this role, you will be responsible for designing, developing, and maintaining efficient data infrastructure solutions, ensuring seamless data flow across our organization. You must be passionate about data engineering, possess a solid background in Azure and Databricks, and thrive in a collaborative and innovative environment.

      Responsibilities:

      • Architect, implement, and maintain scalable data architectures to meet our client's data processing and analytics requirements.
      • Collaborate with cross-functional teams to understand data needs and translate them into effective data pipeline solutions.
      • Develop, optimize, and maintain ETL processes to facilitate the smooth and accurate movement of data across systems.
      • Implement best practices for data pipeline orchestration and automation using Azure Data Factory or similar tools.
      • Leverage Azure services, including Azure Blob Storage, Azure Synapse Analytics, Azure Databricks, and Azure Functions, to build and optimize data solutions.
      • Utilize Databricks for big data processing, analytics, and machine learning workflows.
      • Establish data quality checks and ensure data integrity and accuracy throughout the data lifecycle.
      • Implement and enforce data governance policies and procedures.
      • Optimize data processing and query performance for large-scale datasets within Azure and Databricks environments.
      • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and provide necessary infrastructure.
      • Document data engineering processes, architecture, and configurations.
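Orchestration tools like Azure Data Factory resolve task dependencies into an execution order before running anything. The core idea is a topological sort over the dependency graph, which Python's standard library can illustrate directly (the task names are hypothetical):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
pipeline = {
    "ingest_blob": set(),
    "clean": {"ingest_blob"},
    "aggregate": {"clean"},
    "load_synapse": {"aggregate"},
    "refresh_dashboard": {"load_synapse"},
}

# static_order() yields tasks so that every dependency runs first.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
# → ['ingest_blob', 'clean', 'aggregate', 'load_synapse', 'refresh_dashboard']
```

Real orchestrators add retries, scheduling, and parallel execution of independent branches on top of this ordering.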

      Qualifications:

      • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
      • Minimum of 5 years of experience in data engineering roles, with a focus on Azure and Databricks.
      • Proficient in Azure services such as Azure Blob Storage, Azure Synapse Analytics, Azure Databricks, and Azure Functions.
      • Strong programming skills in languages such as Python, Scala, or Java.
      • Experience with data modeling, schema design, and database optimization.
      • Experience with data pipeline orchestration tools (e.g., Azure Data Factory).
      • Familiarity with version control systems and collaboration tools.
      • Ability to troubleshoot complex data issues and implement effective solutions.
      • Strong communication and interpersonal skills.
      • Ability to work collaboratively in a team-oriented environment.
      • Proactive in staying updated with industry trends and emerging technologies in data engineering.

      Salary Range

      Pay ranges vary based on multiple factors including, without limitation, skill sets, education, responsibilities, experience, and geographical market. The pay range for this position reflects geographic based ranges for Washington state: $146,400 to $175,100 USD/annually. The salary/wage and job title for this opening will be based on the selected candidate’s qualifications and experience and may be outside this range.

      Equal Opportunity Employer

      Blueprint Technologies, LLC is an equal employment opportunity employer. Qualified applicants are considered without regard to race, color, age, disability, sex, gender identity or expression, orientation, veteran/military status, religion, national origin, ancestry, marital, or familial status, genetic information, citizenship, or any other status protected by law.

      If you need assistance or a reasonable accommodation to complete the application process, please reach out to: recruiting@bpcs.com

      Blueprint believes in the importance of a healthy and happy team, which is why our comprehensive benefits package includes:

      • Medical, dental, and vision coverage
      • Flexible Spending Account
      • 401k program
      • Competitive PTO offerings
      • Parental Leave
      • Opportunities for professional growth and development

      Location: Remote - USA

      See more jobs at Blueprint Technologies

      Apply for this job

      +30d

      Customer Support Engineer-Canada

      h2o.aiRemote
      terraformscalaDesignazurejavac++dockerkuberneteslinuxpythonAWS

      h2o.ai is hiring a Remote Customer Support Engineer-Canada


      See more jobs at h2o.ai

      Apply for this job

      +30d

      Lead Data Engineer (AdTech)

      Sigma SoftwareSão Paulo, Brazil, Remote
      7 years of experienceBachelor's degreescalasqlDesignAWS

      Sigma Software is hiring a Remote Lead Data Engineer (AdTech)

      Job Description

      • Design and implement robust data infrastructure using Spark with Scala
      • Collaborate with our cross-functional teams to design data solutions that meet business needs
      • Build out our core data pipelines, store data in optimal engines and formats, and feed our machine-learning models
      • Leverage and optimize AWS resources
      • Collaborate closely with the Data Science team 
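The production pipelines here run on Spark with Scala, but the basic map/reduce shape of such a pipeline, grouping records by a key and aggregating, can be shown with the Python standard library (the ad-impression events are invented):

```python
from collections import defaultdict

# Invented events; a real pipeline would read these from a data lake.
events = [
    {"campaign": "a", "clicks": 1},
    {"campaign": "b", "clicks": 3},
    {"campaign": "a", "clicks": 2},
]

# "Map" each record to its campaign key, then "reduce" by summing clicks --
# the same shape a Spark groupBy/agg job has at much larger scale.
clicks_by_campaign: dict = defaultdict(int)
for event in events:
    clicks_by_campaign[event["campaign"]] += event["clicks"]

print(dict(clicks_by_campaign))  # → {'a': 3, 'b': 3}
```

Spark distributes exactly this computation across partitions and executors, which is what makes the single-machine sketch scale to billions of events.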

      Qualifications

      • Minimum of 7 years of experience in data engineering
      • Proven experience building data infrastructure using Spark with Scala
      • Familiarity with data lakes, cloud warehouses, and storage formats 
      • Strong proficiency in AWS services 
      • Expertise in SQL for data manipulation and extraction 
      • Excellent written and verbal communication skills 
      • Bachelor's degree in Computer Science or a related field 

      WILL BE A PLUS:

      • Experience in AdTech
      • Familiarity with Elastic Map Reduce (EMR) 
      • Previous experience leading and/or building out a Data Engineering function 
      • Proven experience working closely with Data Science teams on machine learning pipelines 

      See more jobs at Sigma Software

      Apply for this job

      +30d

      Middle/Senior Java Developer (AdTech)

      Sigma SoftwareMedellín, Colombia, Remote
      kotlinscalapostgressqloracleDesigngitjavamysqlAWS

      Sigma Software is hiring a Remote Middle/Senior Java Developer (AdTech)

      Job Description

      • Implement portions of software following given classes/components design and using your primary tech stack 
      • Ensure quality, maintainability, and conformance of software to best practices 
      • Produce clean code 
      • Participate in requirements clarification sessions, collect inputs and requirements of assigned tasks 
      • Proactively review own code with peers to ensure its quality 
      • Participate in estimation and planning sessions 
      • Play supervisory, advisory, and coaching roles for one or several junior specialists, ensuring their assigned tasks are delivered thanks to the guidance and peer reviews you provide 
      • Develop technical project documentation and user documentation 
      • Participate in project and team meetings, provide relevant contributions and information 

      Qualifications

      • At least 4 years of working experience with Java 
      • Knowledge of concurrency, multithreading, and performance optimization 
      • Experience with CI/CD and collaboration tools, such as GitHub 
      • Deep understanding of software development principles, methodologies, design patterns, and best practices 
      • At least one modern build tool (Maven, Gradle, sbt) 
      • Experience with Spring Boot 
      • Experience with IO, network IO, and serialization  
      • Experience with at least one RDBMS (Oracle, Postgres, MySQL, SQL Server, etc.) 
      • Experience with AWS (at least on a user level) 
      • Experience with unit and module testing 
      • Proficiency with such tools as Git, IDEs, etc. 
      • At least an Upper-Intermediate level of English 
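The role is Java-centric, but the producer/consumer pattern behind the "concurrency, multithreading, and performance optimization" requirement is language-agnostic. A minimal sketch using a thread-safe queue (the squaring workload is a placeholder for real work):

```python
import queue
import threading

q: queue.Queue = queue.Queue()
results = []
SENTINEL = None  # signals "no more work"

def producer(n: int) -> None:
    # Enqueue n work items, then the sentinel.
    for i in range(n):
        q.put(i)
    q.put(SENTINEL)

def consumer() -> None:
    # Drain the queue until the sentinel arrives.
    while True:
        item = q.get()
        if item is SENTINEL:
            break
        results.append(item * item)  # stand-in for real work

t_prod = threading.Thread(target=producer, args=(5,))
t_cons = threading.Thread(target=consumer)
t_prod.start(); t_cons.start()
t_prod.join(); t_cons.join()
print(results)  # → [0, 1, 4, 9, 16]
```

The Java equivalent would use a `BlockingQueue` with worker threads (or, per the plus list below, Java 21 virtual threads); the coordination logic is the same.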

      WILL BE A PLUS

      • Knowledge or experience building high-load concurrent, low-latency applications 
      • Knowledge/experience with Java 21 Virtual Threads and structured concurrency 
      • Experience with UNIX systems 
      • Experience with Docker/Kubernetes 
      • Knowledge of the AdTech domain 
      • Knowledge or experience related to Bidder development 
      • Previous experience with Kotlin or Scala 
      • Experience with any data framework (e.g., Spark, Flink, Hadoop) or data store (e.g., Hive, Redshift, Presto, Snowflake) 

       

      See more jobs at Sigma Software

      Apply for this job