Python Remote Jobs

1558 Results

4d

(Senior) Data Engineer (m/f/x) onsite or remote in Germany

Scalable GmbH · Berlin, Germany, Remote
Sales, S3, agile, Kotlin, Terraform, SQL, Scrum, Java, Python, AWS, backend

Scalable GmbH is hiring a Remote (Senior) Data Engineer (m/f/x) onsite or remote in Germany

Job Description

  • Develop our scalable cloud-based data backbone which drives our data-driven company using the most up-to-date technologies in the data space
  • Shape an AWS based streaming and batch data processing solution, ingesting data from 3rd party as well as our internal backend services
  • Create a financial data warehouse combining the latest technologies with the features required by regulation
  • Prepare and clean structured and unstructured data and develop high-quality data models for advanced analytics, machine learning and AI use cases
  • Work with highly ambitious and skilled people in our growing data department
  • Work closely with our data scientists and product & development colleagues to release smart features for our product
  • Build interactive dashboards and reporting solutions to support stakeholders, such as management, marketing, sales and quantitative research
  • Share your expert knowledge about data best practices within the company

Qualifications

  • Excellent university degree in computer science, mathematics, natural sciences, or a similar field, plus relevant work experience
  • Experience designing and operating data pipelines in AWS
  • Excellent SQL skills, including advanced concepts such as window functions; experience with dbt is a plus
  • Very good programming skills in Python, including frameworks like PySpark
  • Knowledge of Java and Kotlin is a plus
  • Experience with AWS services like S3, Athena, Redshift, and Glue
  • Experience using infrastructure-as-code tools such as Terraform
  • A passion for everything-as-code and code that is well architected, testable and documented
  • Data-driven and good with numbers whilst being able to explain complex concepts in simple terms
  • Experience using agile frameworks like Scrum
  • Interest in financial services and markets
  • Fluent English communication and presentation skills
  • Sense of humour and positive outlook on life
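As an illustration of the advanced SQL mentioned above, the following sketch shows a window function (a running total per account) executed against an in-memory SQLite database. The table, columns, and data are invented for this example, and it assumes a Python build whose bundled SQLite supports window functions (SQLite 3.25+).

```python
# Illustrative window-function query run against an in-memory SQLite database.
# Table name, columns, and rows are made up for demonstration purposes.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades (account TEXT, ts INTEGER, amount REAL);
    INSERT INTO trades VALUES
        ('a', 1, 100.0), ('a', 2, 50.0), ('b', 1, 200.0), ('b', 2, -30.0);
""")

# Running total per account: SUM over a window partitioned by account,
# ordered by timestamp -- a classic window-function pattern.
rows = conn.execute("""
    SELECT account, ts, amount,
           SUM(amount) OVER (PARTITION BY account ORDER BY ts) AS running_total
    FROM trades
    ORDER BY account, ts
""").fetchall()

for row in rows:
    print(row)
```

The same pattern (with `PARTITION BY` and `ORDER BY` inside `OVER`) carries over directly to warehouse engines such as Redshift or Athena.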

See more jobs at Scalable GmbH

Apply for this job

5d

Lead Data Engineer

Blend36 · Edinburgh, United Kingdom, Remote
Design, API, Git, Docker, Kubernetes, Python

Blend36 is hiring a Remote Lead Data Engineer

Job Description

Life as a Lead Data Engineer at Blend

We are looking for someone who is excited by the idea of leading a data engineering squad to develop best-in-class analytical infrastructure and pipelines.

However, they also need to be aware of the practicalities of making a difference in the real world – whilst we love innovative advanced solutions, we also believe that sometimes a simple solution can have the most impact.

Our Lead Data Engineer is someone who is most comfortable solving problems, answering questions and proposing solutions. We place a high value on the ability to communicate and translate complex analytical thinking into non-technical and commercially oriented concepts, and experience working on difficult projects and/or with demanding stakeholders is always appreciated.

Reporting to the Director of Data Engineering, the role will work closely with the other Data Engineering Leads and with other capabilities in the organisation, such as the Data Science, Data Strategy and Business Development teams.

This role will be responsible for driving high delivery standards and innovation within the data engineering team and the wider company. This involves delivering data solutions to support the provision of actionable insights for stakeholders.

Our Data Engineering Leads remain hands-on and work with the Data Engineers within their squad.

What can you expect from the role?

  • Lead project delivery: oversee the end-to-end delivery of projects and ensure robust project governance from a Data Engineering perspective.
  • Squad management: manage a team of Data Engineers, from Junior to Senior levels, covering resource utilisation, training, mentoring, and recruitment.
  • Stakeholder engagement: prepare and present data-driven solutions to stakeholders, translating complex technical concepts into actionable insights.
  • Data pipeline ownership: design, develop, deploy, and maintain scalable, reliable, and efficient data pipelines.
  • Domain expertise: keep up to date on emerging trends and advancements within data ecosystems, ensuring the team remains at the forefront of innovation.
  • Business development support: provide expert input into proposal submissions and business development initiatives from a Data Engineering perspective.
  • Champion engineering excellence: promote best practices for high-quality engineering standards that reduce future technical debt.
  • Evolve best practices: continuously refine the Data Engineering team's ways of working, driving alignment across the squad and wider teams.
  • Strategic contributions: collaborate with the Data Engineering Director on strategic workstreams, contributing to the Data Engineering strategy and Go-to-Market (GTM) initiatives.

Qualifications

What do you need to have?

Experience:

  • Proven experience in leading technical teams and building applications using microservice architecture.
  • Extensive experience with Python and FastAPI.
  • Strong understanding of microservices principles and components such as logging, monitoring, health checks, scalability, resilience, service discovery, API gateways, and error handling.

Technical Skills:

  • Proficiency with Pydantic and data validation in FastAPI.
  • Experience with containerization technologies (e.g., Docker, Kubernetes).
  • Familiarity with CI/CD pipelines and tools.
  • Knowledge of API design and implementation.
  • Experience with monitoring and logging tools (e.g., Prometheus, Grafana).
  • Knowledge of security best practices in microservices architecture.
  • Familiarity with version control systems (e.g., Git) and collaborative development workflows.
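To make the Pydantic bullet above concrete without assuming any third-party packages, here is a standard-library sketch of the kind of request-payload validation that Pydantic automates inside FastAPI from type hints alone. The model name and fields are invented for this example.

```python
# Hypothetical sketch of request-payload validation, using only the standard
# library. Pydantic derives checks like these automatically from type hints;
# this hand-rolled version just illustrates the idea. Field names are made up.
from dataclasses import dataclass

@dataclass
class CreateUserRequest:
    username: str
    age: int

    def __post_init__(self):
        # Reject payloads that a Pydantic model would also refuse.
        if not isinstance(self.username, str) or not self.username:
            raise ValueError("username must be a non-empty string")
        if not isinstance(self.age, int) or self.age < 0:
            raise ValueError("age must be a non-negative integer")

ok = CreateUserRequest(username="ada", age=36)
print(ok)

try:
    CreateUserRequest(username="", age=-1)
except ValueError as exc:
    print("rejected:", exc)
```

In a real FastAPI service the validated model would be declared as an endpoint parameter, and invalid payloads would be turned into 422 responses automatically.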

Nice to Have:

  • Experience working with Retriever models, including implementation of chunking strategies.
  • Knowledge and use of vector databases.
  • Understanding of optimal approaches in querying LLM models via API.
  • Prompt engineering knowledge, with familiarity across different strategies and exposure to their use in different scenarios.

See more jobs at Blend36

Apply for this job

5d

Threat Intelligence Researcher

SecurityScorecard · Remote (India)
Sales, Bachelor's degree, 5 years of experience, Design, Ruby, C++, Python, JavaScript

SecurityScorecard is hiring a Remote Threat Intelligence Researcher

About SecurityScorecard:

SecurityScorecard is the global leader in cybersecurity ratings, with over 12 million companies continuously rated, operating in 64 countries. Founded in 2013 by security and risk experts Dr. Alex Yampolskiy and Sam Kassoumeh and funded by world-class investors, SecurityScorecard’s patented rating technology is used by over 25,000 organizations for self-monitoring, third-party risk management, board reporting, and cyber insurance underwriting, making all organizations more resilient by allowing them to easily find and fix cybersecurity risks across their digital footprint.

Headquartered in New York City, our culture has been recognized by Inc Magazine as a "Best Workplace," by Crain’s NY as a "Best Place to Work in NYC," and as one of the 10 hottest SaaS startups in New York for two years in a row. Most recently, SecurityScorecard was named to Fast Company’s annual list of the World’s Most Innovative Companies for 2023 and to the Achievers 50 Most Engaged Workplaces in 2023 award, recognizing “forward-thinking employers for their unwavering commitment to employee engagement.” SecurityScorecard is proud to be funded by world-class investors including Silver Lake Waterman, Moody’s, Sequoia Capital, GV and Riverwood Capital.

About the Team:

The Threat Research team within STRIKE (SecurityScorecard Threat Research, Intelligence, Knowledge & Engagement) drives both basic and applied security research that directly and indirectly contributes to the security posture of our customers. The team has several objectives: tracking, investigating, and analyzing the latest advanced threats and campaigns affecting our customers and their vendors; planning, developing, and designing threat data collection systems that produce signals which can automatically highlight active threats or intrusions to customers; and advising both internal and external stakeholders, up to the C-level, on their security risk posture as part of threat intel’s professional services.

The tight-knit SecurityScorecard Threat Research team brings together staff with skills spanning cyber threat intelligence gathering, software engineering, vulnerability analysis, Internet measurement, malware research, digital forensics, machine learning and data analysis, and networking and operating systems fundamentals. Together, these lead to the sourcing of active threats and data that help SecurityScorecard's customers protect their assets, understand their vendors, and educate their staff.

This team works in tandem with other teams in STRIKE, as well as teams outside it, including Data Science, Attribution, Scoring, and Data Analytics and Engineering, and publishes and communicates research with the outside world through conferences, partnerships, and organizations like the Cyber Threat Alliance, CISA, ISACs, and the FBI.

About the Role:

In this role, we are looking for an experienced threat hunter/threat researcher who is comfortable with ambiguity, has significant experience writing automation code to gather threat intelligence, has demonstrated expertise at the upper levels of the security community, and is self-driven and able to work in an environment where every day presents a new challenge.

The right candidate will be expected to lead and/or play a major role in the following activities:

  • Tracking active campaigns from major threat actors both known and unknown against public, private, and government entities and automating collection of data on these topics
  • Writing automation code in Python to collect new in-house developed threat intelligence data that will be consumed by upstream teams and products
  • Maintaining knowledge of Advanced Persistent Threat (APT), ransomware, and major cybercrime Tactics, Techniques, and Procedures (TTPs)
  • Writing and publishing reports and then sharing with the security research community through our partnerships
  • Teaching and training others in the company on the tactics and methods of tracking advanced threats
  • Providing threat context and integration support to multiple SecurityScorecard products, customers, and sales architects
  • Analyzing technical data to extract attacker TTPs, identify unique attributes of malware, map attacker infrastructure, and pivot to related threat data
  • Identifying and hunting for emerging threat activity across all internal/external sources
  • Establishing standards, taxonomy, and processes for threat modeling and integration
  • Performing threat research and analysis during high-severity cyber-attacks impacting SecurityScorecard customers globally

Required Qualifications:

  • 2-5 years of experience in security research broadly, including hunting threat actors (criminals or nation-states), with specific technical experience (analysis of campaigns, malware involved, command and control (C2) servers, and CVEs exploited)
  • Analysis of campaigns and actors that extends beyond data breaches and traditional attacks (e.g., DDoS, publicly leaked credentials for network access) to sophisticated, nation-state or cybercrime-driven campaigns
  • Fluent in at least one high-level programming language (Python, Go, Ruby, JavaScript, etc.) and ability to use the experience to automate threat hunting and threat intelligence gathering activities (in Threat Research we use Python on a daily basis)
  • Experience working with threat intelligence platforms such as MISP and related analysis systems such as Splunk, VirusTotal Intelligence Graph Explorer, Silobreaker, or other commercial tools for analyzing our data
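The automation style the qualifications above describe can be sketched in a few lines of standard-library Python: pulling candidate IPv4 indicators out of raw report text. The sample text, function name, and validation rule here are invented for illustration, not taken from SecurityScorecard's tooling.

```python
# Illustrative threat-intel automation sketch: extract IPv4 indicators from
# raw report text. Sample text and helper name are made up for this example.
import re

SAMPLE_REPORT = """
C2 traffic observed to 203.0.113.7 and 198.51.100.23;
invalid artifact 999.1.2.3 should be ignored.
"""

IPV4 = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def extract_ipv4(text):
    """Return dotted-quad candidates whose octets are all in the 0-255 range."""
    found = []
    for match in IPV4.findall(text):
        if all(0 <= int(octet) <= 255 for octet in match.split(".")):
            found.append(match)
    return found

print(extract_ipv4(SAMPLE_REPORT))  # -> ['203.0.113.7', '198.51.100.23']
```

In practice a collector like this would feed validated indicators into a platform such as MISP rather than printing them.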

Preferred Qualifications:

  • Experience with C and/or Assembly or another low level programming language that ties into development of exploits for software, firmware, and hardware products
  • Experience with producing and consuming data from streaming platforms such as Confluent Kafka, which we use internally to centralize all our threat intelligence data for consumption by upstream products
  • Functional understanding of vulnerabilities and related exploit code, capable of writing automation and detection for various CVEs
  • Experience in developing automation to statically and dynamically analyze malware and subsequent campaigns
  • Experience with reverse engineering using IDA, Radare, Ghidra, or another malware analysis program, as well as working knowledge of debuggers such as OllyDbg and Hopper

Additional Qualifications: 

  • Excellent communication and presentation skills with the ability to present to technical and non-technical audiences
  • Exceptional written communication skills
  • Strong decision making skills with the ability to prioritize and execute
  • Ability to set and manage expectations with senior stakeholders and team members
  • Strong problem solving, troubleshooting, and analysis skills
  • Experience working in fast-paced, often chaotic environments during major incidents
  • Excellent interpersonal and teamwork skills in a fully remote environment

Benefits:

Specific to each country, we offer a competitive salary, stock options, health benefits, unlimited PTO, parental leave, tuition reimbursement, and much more!

SecurityScorecard is committed to Equal Employment Opportunity and embraces diversity. We believe that our team is strengthened through hiring and retaining employees with diverse backgrounds, skill sets, ideas, and perspectives. We make hiring decisions based on merit and do not discriminate based on race, color, religion, national origin, sex or gender (including pregnancy), gender identity or expression (including transgender status), sexual orientation, age, marital, veteran, disability status or any other protected category in accordance with applicable law.

We also consider qualified applicants regardless of criminal histories, in accordance with applicable law. We are committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. If you need assistance or accommodation due to a disability, please contact talentacquisitionoperations@securityscorecard.io.

Any information you submit to SecurityScorecard as part of your application will be processed in accordance with the Company’s privacy policy and applicable law. 

SecurityScorecard does not accept unsolicited resumes from employment agencies.  Please note that we do not provide immigration sponsorship for this position.   #LI-DNI

See more jobs at SecurityScorecard

Apply for this job

5d

DevSecOps Engineer - Top Secret

Full Time, agile, Terraform, Azure, Java, Jenkins, Python, AWS, JavaScript, Node.js

Maania Consultancy Services is hiring a Remote DevSecOps Engineer - Top Secret

See more jobs at Maania Consultancy Services

Apply for this job

5d

Operations Associate, Claims Design

SQL, Design, QA, C++, Python

Oscar Health is hiring a Remote Operations Associate, Claims Design

Hi, we're Oscar. We're hiring an Operations Associate to join our Claims Design team.

Oscar is the first health insurance company built around a full stack technology platform and a focus on serving our members. We started Oscar in 2012 to create the kind of health insurance company we would want for ourselves—one that behaves like a doctor in the family.

About the role

This role is responsible for designing, optimizing and implementing processes and improvements for Oscar’s claims adjudication platform. This person will identify, scope, build, test and implement solutions, with a focus on creating efficiency and scale through technology and automation. They will understand Oscar’s infrastructure, underlying logic, and data models, and will work cross-functionally to understand and translate requirements/rules from business stakeholders to enable technical implementation.

You will report to the Manager, Claims Design.

Work Location:

If you live within commutable distance to our New York City office (in Hudson Square), our Tempe office (off the 101 at University Dr), or our Los Angeles office (in Marina Del Rey), you will be expected to come into the office at least two days each week. Otherwise, this is a remote / work-from-home role.

You must reside in one of the following states: Alabama, Arizona, California, Colorado, Connecticut, Florida, Georgia, Illinois, Indiana, Iowa, Kansas, Kentucky, Maine, Maryland, Massachusetts, Michigan, Minnesota, Missouri, Nevada, New Hampshire, New Jersey, New Mexico, New York, North Carolina, Ohio, Oregon, Pennsylvania, Rhode Island, South Carolina, Tennessee, Texas, Utah, Vermont, Virginia, Washington, or Washington, D.C. Note, this list of states is subject to change.

Pay Transparency:

The base pay for this role in the states of California, Connecticut, New Jersey, New York, and Washington is: $98,400 - $129,150 per year. The base pay for this role in all other locations is: $88,560 - $116,235 per year. You are also eligible for employee benefits, participation in Oscar’s unlimited vacation program and annual performance bonuses.

Responsibilities

  • Designs and prioritizes operational processes by working cross-functionally to gather business requirements and implementing process- and technology-enabled solutions
  • Leads and collaborates cross-functionally on iterative problem definition and technical design and scoping to build solutions with the end user in mind
  • Utilizes data to independently drive business decisions, impact cross-functional strategy, and develop KPIs to measure the effectiveness of your domain
  • Serves as a subject matter expert in the claims payments and overpayments process within the claims lifecycle
  • Proactively identifies risks; responds to and resolves issues/errors/escalations through data-driven investigation to produce insights for short-, medium-, and long-term technology-enabled solutions
  • Distills the requirements of new product and market expansions and designs automated workflows to reduce manual work requirements
  • Works with engineers, analysts, and other Operations team members to resolve escalations, develop training, and create technical tooling to enable others
  • Understands the operations ecosystem, technology, and data models; their current strengths, weaknesses, and gaps; and connects the dots for other teams
  • Ensures compliance with all applicable laws and regulations
  • Other duties as assigned

Qualifications

  • A bachelor’s degree or 4+ years of commensurate experience
  • 3+ years of experience in a technical role (QA analyst, PM, operations analyst, finance, consulting, industrial engineering)
  • 3+ years of experience using data software such as SQL or Python
  • Experience in analytics and the ability to derive insights from complex, broad datasets
  • Proficiency in designing and improving workflows, as well as standing up accompanying operating and technical procedures

Bonus Points

  • An academic and professional background in coding, math, statistics, engineering or data science
  • Knowledge management, training, or content development in operational settings
  • Process Improvement or Lean Six Sigma training
  • Experience in healthcare or startup environments - specifically with claims adjudication
  • Proficiency with Python

This is an authentic Oscar Health job opportunity. Learn more about how you can safeguard yourself from recruitment fraud here.

At Oscar, being an Equal Opportunity Employer means more than upholding discrimination-free hiring practices. It means that we cultivate an environment where people can be their most authentic selves and find both belonging and support. We're on a mission to change health care -- an experience made whole by our unique backgrounds and perspectives.

Pay Transparency: Final offer amounts, within the base pay set forth above, are determined by factors including your relevant skills, education, and experience. Full-time employees are eligible for benefits including: medical, dental, and vision benefits, 11 paid holidays, paid sick time, paid parental leave, 401(k) plan participation, life and disability insurance, and paid wellness time and reimbursements.

Reasonable Accommodation: Oscar applicants are considered solely based on their qualifications, without regard to applicant’s disability or need for accommodation. Any Oscar applicant who requires reasonable accommodations during the application process should contact the Oscar Benefits Team (accommodations@hioscar.com) to make the need for an accommodation known.

California Residents: For information about our collection, use, and disclosure of applicants’ personal information as well as applicants’ rights over their personal information, please see our Notice to Job Applicants.

See more jobs at Oscar Health

Apply for this job

5d

Staff Data Science Engineer

InstacartUnited States - Remote
SQL, API, Git, Python, backend

Instacart is hiring a Remote Staff Data Science Engineer

We're transforming the grocery industry

At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.

Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.

Instacart is a Flex First team

There’s no one-size fits all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work—whether it’s from home, an office, or your favorite coffee shop—while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.

OVERVIEW

ABOUT THE ROLE - We are currently seeking a Staff Data Science Engineer to join our cutting-edge team at Instacart, where we're transforming online grocery delivery with data-driven technical solutions. Our mission is to create impactful 0 to 1 intelligent systems that revolutionize the way we do business. As a key player in our dynamic environment, you'll engage in strategic development, pioneering new data collection and integration methods, while building intuitive tools to empower fellow data scientists. Collaborate cross-functionally to influence multiple facets of our product, merging data science with engineering to drive innovation.

ABOUT THE TEAM - The Data & Systems team is focused on driving complex, data-driven solutions that span multiple teams, with an emphasis on online grocery delivery.

ABOUT THE JOB

The way we will execute on our mission is to lead:

  1. Intelligent systems strategy: Engage across the company and identify the most promising opportunity areas across the product along with contributing data expertise to refine and develop new product ideas
  2. Data collection and distribution: Identify and introduce new and valuable datasets to Instacart and onboard teams to use the new datasets
  3. Intelligent systems tooling: Build interfaces to Instacart’s infrastructure that enable data scientists to easily train and deploy new intelligent systems while automatically complying with all company policies
  4. Technical work with implications across multiple teams: Own building systems and performing analysis that impact multiple teams and is often deeply complex or critical to get right

The role requires strong XFN collaboration where an ideal candidate can:

  1. Drive critical efforts to completion with little oversight, while jumping into roles adjacent to data science (e.g., data engineering, machine learning engineering).
  2. Bring new ideas to the team that get incorporated into the product roadmap
  3. Ruthlessly prioritize among requests from multiple competing stakeholders
  4. Act as a senior cross-functional leader, aligning the org on principles, processes, and goals

ABOUT YOU

MINIMUM QUALIFICATIONS

  • 6+ years of work experience in a data science or related field
  • Expert in Python, SQL, Git, and Jupyter notebooks
  • Deployed several machine learning models to production and supported those models
  • Know how machine learning algorithms work
  • Deeply passionate about building tools to enable data scientists to maximize their impact

PREFERRED QUALIFICATIONS

  • 8+ years of work experience in a data science or related field
  • Significant experience as a backend software engineer in a data-related space
  • Expert in scikit-learn - you know the philosophy behind the API extremely well
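The scikit-learn API philosophy the last bullet alludes to can be illustrated with a toy estimator: every estimator exposes `fit()` returning `self` and `predict()` on new data, and fitted state carries a trailing underscore. This mean-predictor is an invented example in plain Python, not scikit-learn code.

```python
# Toy illustration of the scikit-learn estimator convention: fit() returns
# self, predict() operates on new data, and fitted attributes end in "_".
# The estimator itself is made up for this example; no third-party code.

class MeanRegressor:
    """Predicts the mean of the training targets, sklearn-style."""

    def fit(self, X, y):
        self.mean_ = sum(y) / len(y)   # trailing underscore marks fitted state
        return self                    # returning self enables chaining

    def predict(self, X):
        return [self.mean_ for _ in X]

model = MeanRegressor().fit([[1], [2], [3]], [10, 20, 30])
print(model.predict([[4], [5]]))  # -> [20.0, 20.0]
```

Because the interface matches the convention, an estimator like this slots into pipelines and tooling that only assume `fit`/`predict`, which is the design insight behind the library's API.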



Instacart provides highly market-competitive compensation and benefits in each location where our employees work. This role is remote and the base pay range for a successful candidate is dependent on their permanent work location. Please review our Flex First remote work policy here.

Offers may vary based on many factors, such as candidate experience and skills required for the role. Additionally, this role is eligible for a new hire equity grant as well as annual refresh grants. Please read more about our benefits offerings here.

For US based candidates, the base pay ranges for a successful candidate are listed below.

CA, NY, CT, NJ
$255,000 - $283,000 USD
WA
$245,000 - $272,000 USD
OR, DE, ME, MA, MD, NH, RI, VT, DC, PA, VA, CO, TX, IL, HI
$234,000 - $260,000 USD
All other states
$212,000 - $235,000 USD

See more jobs at Instacart

Apply for this job

5d

ML Platform Engineer

Stitch Fix · Remote, USA
ML, Redis, 3 years of experience, Postgres, Design, Java, Python, AWS

Stitch Fix is hiring a Remote ML Platform Engineer

About Stitch Fix, Inc.

Stitch Fix (NASDAQ: SFIX) is the leading online personal styling service that helps people discover the styles they will love that fit perfectly so they always look - and feel - their best. Few things are more personal than getting dressed, but finding clothing that fits and looks great can be a challenge. Stitch Fix solves that problem. By pairing expert stylists with best-in-class AI and recommendation algorithms, the company leverages its assortment of exclusive and national brands to meet each client's individual tastes and needs, making it convenient for clients to express their personal style without having to spend hours in stores or sifting through endless choices online. Stitch Fix, which was founded in 2011, is headquartered in San Francisco.

 

About the Team

The mission of the ML Platform team at Stitch Fix is to build exceptional systems for deploying and serving machine learning models in production. As a member of this team, you will help develop APIs, systems, and platforms that unlock critical algorithmic capabilities. You will also create self-service tools and platforms to empower our data scientists with scalable research and development capabilities.

We are passionate about crafting elegant, scalable abstractions that stand the test of time. Our work is centered on reducing friction for colleagues running production algorithms, ultimately enhancing the velocity at which the business can innovate and grow.

About the Role

As an ML Platform Engineer at Stitch Fix, you will play a key role in building and maintaining the critical infrastructure that powers machine learning across our organization. You will design, develop, and support scalable, resilient services and frameworks for ML model training and deployment.

In this role, you'll contribute to the day-to-day operations of the ML Platform team, ensuring the smooth functioning of existing systems while driving improvements. You’ll collaborate closely with full-stack data scientists, offering consultation and support to help them unlock the full potential of our platform.

With significant autonomy, you’ll have the opportunity to shape the future of ML at Stitch Fix. Your ideas and expertise will drive improvements, codify best practices, and influence how we approach machine learning infrastructure and operations.

You're excited about this opportunity because you will…

  • Collaborate with cross-functional teams, including data scientists, engineers, and business partners, to solve complex distributed systems and business challenges at scale.
  • Be part of a team with high visibility across the organization, driving impactful solutions that make a difference.
  • Share your ideas and help guide the team’s investments toward high-value opportunities.
  • Foster a culture of technical collaboration and contribute to the development of scalable, resilient systems.

We’re excited about you because…

  • You bring 1–3 years of experience in software development, with a focus on data and ML infrastructure.
  • You have a proven track record of building scalable, distributed production systems.
  • Your coding and design skills are exceptional. While Python and Java are our primary tools, your overall engineering expertise and problem-solving abilities matter most.
  • You’re proficient in leveraging big data technologies such as Kafka, Spark, Hive/Iceberg, Postgres, Redis, and similar tools.
  • You have hands-on experience with AWS or other cloud-based providers.
  • You excel at thinking globally, developing solutions that address multiple needs while prioritizing business impact.
  • Your strong cross-functional communication skills enable you to simplify complex challenges and collaborate effectively with business partners to drive solutions forward.

Why you'll love working at Stitch Fix...

  • We are a group of bright, kind people who are motivated by challenge. We value integrity, innovation and trust. You’ll bring these characteristics to life in everything you do at Stitch Fix.
  • We cultivate a community of diverse perspectives— all voices are heard and valued.
  • We are an innovative company and leverage our strengths in fashion and tech to disrupt the future of retail. 
  • We win as a team, commit to our work, and celebrate grit together because we value strong relationships.
  • We boldly create the future while keeping equity and sustainability at the center of all that we do. 
  • We are the owners of our work and are energized by solving problems through a growth mindset lens. We think broadly and creatively through every situation to create meaningful impact.
  • We offer comprehensive compensation packages and inclusive health and wellness benefits.

Compensation and Benefits

This role will receive a competitive salary and benefits. The salary for US-based employees will be aligned with the range below, which includes our three geographic areas. A variety of factors are considered when determining someone’s compensation, including a candidate’s professional background, experience, location, and performance. This position is eligible for an annual cash award depending on employee and company performance. In addition, the position is eligible for medical, dental, vision, and other benefits. Applicants should apply via our internal or external careers site.

Salary Range
$100,600–$148,000 USD

This link leads to the machine readable files that are made available in response to the federal Transparency in Coverage Rule and includes negotiated service rates and out-of-network allowed amounts between health plans and healthcare providers. The machine-readable files are formatted to allow researchers, regulators, and application developers to more easily access and analyze data.

Please review Stitch Fix's US Applicant Privacy Policy and Notice at Collection here: https://stitchfix.com/careers/workforce-applicant-privacy-policy

Recruiting Fraud Alert: 

To all candidates: your personal information and online safety are top of mind for us.  At Stitch Fix, recruiters only direct candidates to apply through our official career pages at https://www.stitchfix.com/careers/jobs or https://web.fountain.com/c/stitch-fix.

Recruiters will never request payments, ask for financial account information or sensitive information like social security numbers. If you are unsure if a message is from Stitch Fix, please email careers@stitchfix.com.

You can read more about Recruiting Scam Awareness on our FAQ page here: https://support.stitchfix.com/hc/en-us/articles/1500007169402-Recruiting-Scam-Awareness 

 

See more jobs at Stitch Fix

Apply for this job

5d

Machine Learning Engineer - Only IIT/NIT's - Remote

PromptcloudBengaluru, India, Remote
MLsqlelasticsearchpython

Promptcloud is hiring a Remote Machine Learning Engineer - Only IIT/NIT's - Remote

Job Description

Machine learning engineer

Responsibilities :

• Work closely with product teams to implement new products and features using ML.
• Develop highly scalable classifiers and tools leveraging machine learning, deep learning, and rules-based models.
• Own and drive all ML projects.
• Inculcate an ML culture within the organization.
• Deployment of models and experience in the cloud.
• Good knowledge of scripting and well versed with Linux.

Requirements :

1 to 4 years of relevant work experience.  

Python programming skill is a must. Strong coding capabilities in TensorFlow or PyTorch. 

Good understanding of Machine Learning algorithms (Regression algorithms, Decision Trees), Distributions, and Deep Learning algorithms.

Good understanding of Data structures & Algorithms.

Good problem-solving skills.

Knowledge of Elasticsearch is a plus.

Flexible with cross-functional work.

Any prior research publication is a plus.
Strong programming skills with proven experience crafting, prototyping, and delivering advanced algorithmic solutions.
A passion for making ML methods robust and scalable.
Experience in extraction from structured/unstructured text (knowledge or statistics based).
Experience in one or more of the following areas: entity/relation extraction, normalization, text summarization, semantic search, word/paragraph/document embedding, ranking, etc.
NLP algorithm implementation experience as well as the ability to modify standard algorithms (e.g. change objectives, work-out the math and implement).
Experience in deep learning approaches to NLP: word/paragraph embedding representation learning, text/sentiment classification, word2vec.
Experience with neural networks and deep learning frameworks (such as Keras, TensorFlow, torch)
Familiarity with database queries and data analysis processes (SQL, Python).
Strong in natural language processing, information extraction, and text mining algorithms and tech stacks, with experience on both benchmark and real-world datasets.
Track record of leading, building, and deploying production NLP systems on large scale text data.
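The semantic-search and ranking skills listed above can be illustrated with a minimal sketch: cosine similarity over term-frequency vectors as a toy stand-in for learned embeddings. All names and example documents below are invented for illustration, not taken from the posting.

```python
from collections import Counter
from math import sqrt

def tf_vector(text):
    """Term-frequency vector for a document (toy stand-in for an embedding)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def rank(query, docs):
    """Return documents ordered by similarity to the query, best first."""
    q = tf_vector(query)
    return sorted(docs, key=lambda d: cosine(q, tf_vector(d)), reverse=True)

docs = [
    "deep learning for text classification",
    "grocery delivery logistics",
    "text mining and information extraction",
]
print(rank("text classification with deep learning", docs)[0])
# deep learning for text classification
```

In a production system the term-frequency vectors would be replaced by dense embeddings from a model such as BERT, with the same ranking logic on top.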

Should be very good at Natural language processing

Must understand Transformer/BERT architectures and be able to fine-tune them for downstream tasks.

Monitor the performance of existing pipelines and services; continuously look for opportunities to improve the developer experience and optimize costs.

Troubleshoot platform-related incidents.

Create and generate periodic and ad hoc reports

Experience with Bard, Gemini, etc. is an added advantage.

We are a 100% remote company and we strive to build a unique and thriving workplace where work is designed around people (and not the other way round). More details on PromptCloud’s culture and how we work can be found in the PC 2.0 Culture Handbook shared here  - https://www.promptcloud.com/life-at-promptcloud.   

Perks

  • An environment where each employee is celebrated. 

  • A one-time home office setup allowance, monthly allowances for internet bills, child care allowance for new mothers/single parents.

  • Half-yearly performance appraisals

  • Flexible working hours

  • Competitive salary

 

 

 

Qualifications

See more jobs at Promptcloud

Apply for this job

5d

Staff Automation Engineer

AddeparRemote, USA
DevOPSagileterraformjavac++dockerkubernetesjenkinspythonAWSjavascript

Addepar is hiring a Remote Staff Automation Engineer

Who We Are

Addepar is a global technology and data company that helps investment professionals provide the most informed, precise guidance for their clients. Hundreds of thousands of users have entrusted Addepar to empower smarter investment decisions and better advice over the last decade. With client presence in more than 45 countries, Addepar’s platform aggregates portfolio, market and client data for over $6 trillion in assets. Addepar’s open platform integrates with more than 100 software, data and services partners to deliver a complete solution for a wide range of firms and use cases. Addepar embraces a global flexible workforce model with offices in Silicon Valley, New York City, Salt Lake City, Chicago, London, Edinburgh and Pune.

We are seeking a highly skilled Staff Automation Engineer to drive the development and optimization of automation solutions that enhance developer productivity, improve software quality, and reduce technical debt. This role focuses on implementing scalable frameworks, advancing service-oriented architecture (SOA) practices, and enabling teams to achieve operational excellence through technical innovation. The ideal candidate is passionate about empowering developers, improving engineering efficiency, leveraging data-driven insights, and fostering a culture of continuous improvement.

The current range for this role is $144,000 - $225,000 + bonus + equity + benefits. Your recruiter can share more about the specific salary range for your preferred location during the hiring process. Additionally, these ranges reflect the base salary only, and do not include bonus, equity, or benefits.

Key Responsibilities:

  1. Technical Leadership and Innovation:
  • Architect, develop, and maintain scalable automation frameworks and tools.
  • Lead technical efforts to support both developer enablement and quality assurance needs.
  • Provide expertise in designing and optimizing automation systems for maintainability, scalability, and performance.
  • Developer Enablement:
    • Build and deploy automation solutions that streamline the SDLC, including tools for test data generation, environment setup, and deployment pipelines.
    • Integrate automation into CI/CD workflows to deliver rapid feedback and seamless deployments.
    • Drive the adoption of developer-first tools to minimize friction and accelerate development processes.
    • Implement and manage centralized software catalogs to improve discoverability, compliance, and reuse of automation tools and services.
  • Technical Debt Reduction and Modernization:
    • Identify and address technical debt in automation frameworks and infrastructure.
    • Modernize legacy systems and align them with modern tools and methodologies.
    • Evaluate emerging technologies to continuously improve developer and quality enablement capabilities.
  • Quality Assurance and Metrics:
    • Define and implement robust test automation strategies to ensure reliable and consistent quality across releases.
    • Ensure comprehensive testing coverage, focusing on functional, integration, and end-to-end tests with an emphasis on SOA and microservices.
    • Track and improve delivery performance metrics, such as DORA metrics (deployment frequency, lead time for changes, change failure rate, and mean time to recovery).
  • Collaboration and Cross-Team Alignment:
    • Partner with engineering, DevOps, and architecture teams to align automation strategies with organizational goals.
    • Collaborate on service integration testing, contract testing, and system-level quality initiatives.
    • Advocate for best practices and provide technical leadership in automation priorities across teams.
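The DORA metrics named in the responsibilities above can be derived from a simple deployment log. The sketch below is illustrative only (the log format and function names are invented, not Addepar's tooling) and computes two of the four metrics:

```python
from datetime import date

# Toy deployment log: (deploy date, whether the change caused a production failure).
deploys = [
    (date(2024, 1, 1), False),
    (date(2024, 1, 3), True),
    (date(2024, 1, 5), False),
    (date(2024, 1, 8), False),
]

def deployment_frequency(log):
    """Average deployments per day over the observed window."""
    days = (log[-1][0] - log[0][0]).days + 1
    return len(log) / days

def change_failure_rate(log):
    """Fraction of deployments that caused a failure in production."""
    return sum(1 for _, failed in log if failed) / len(log)

print(f"{deployment_frequency(deploys):.2f} deploys/day")   # 0.50 deploys/day
print(f"{change_failure_rate(deploys):.0%} change failure rate")  # 25% change failure rate
```

Lead time for changes and mean time to recovery follow the same pattern, computed from commit and incident timestamps instead of deploy dates.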

    Required Qualifications:

    • Proven experience as a technical leader in automation engineering or developer enablement.
    • Expertise in designing and implementing automation frameworks for SOA and microservices architectures.
    • Strong programming skills in languages such as Python, Java, or JavaScript, emphasizing maintainable and scalable code.
    • Hands-on experience with CI/CD pipelines, infrastructure-as-code (IaC), and build automation tools (e.g., Jenkins, GitHub Actions, Terraform).
    • Deep understanding of service integration testing, contract testing, and system reliability testing.
    • Demonstrated success in improving DORA metrics across engineering teams.

    Preferred Qualifications:

    • Experience with cloud-native technologies and platforms (e.g., AWS).
    • Familiarity with developer tooling and software cataloging tools and practices (e.g., Backstage, SonarQube, IDE plugin development).
    • Expertise in pipeline technologies, including CI/CD orchestration, artifact management, and automated rollbacks (e.g., Jenkins, GitHub Actions, ArgoCD, Artifactory).
    • Proficiency with build tools and dependency management systems (e.g., Maven, Gradle, Bazel, npm, Yarn).
    • Experience with containerization and orchestration tools (e.g., Docker, Kubernetes).
    • Strong understanding of Agile and DevOps principles.

    Our Values 

    • Act Like an Owner -Think and operate with intention, purpose and care. Own outcomes.
    • Build Together -Collaborate to unlock the best solutions. Deliver lasting value. 
    • Champion Our Clients -Exceed client expectations. Our clients’ success is our success. 
    • Drive Innovation -Be bold and unconstrained in problem solving. Transform the industry. 
    • Embrace Learning -Engage our community to broaden our perspective. Bring a growth mindset. 

    In addition to our core values, Addepar is proud to be an equal opportunity employer. We seek to bring together diverse ideas, experiences, skill sets, perspectives, backgrounds and identities to drive innovative solutions. We commit to promoting a welcoming environment where inclusion and belonging are held as a shared responsibility.

    We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

    PHISHING SCAM WARNING: Addepar is among several companies recently made aware of a phishing scam involving con artists posing as hiring managers recruiting via email, text and social media. The imposters are creating misleading email accounts, conducting remote “interviews,” and making fake job offers in order to collect personal and financial information from unsuspecting individuals. Please be aware that no job offers will be made from Addepar without a formal interview process. Additionally, Addepar will not ask you to purchase equipment or supplies as part of your onboarding process. If you have any questions, please reach out to TAinfo@addepar.com.

    See more jobs at Addepar

    Apply for this job

    5d

    Senior Machine Learning Engineer - Marketplace

    InstacartUSA-Remote
    MLsqlDesignjavapythonbackend

    Instacart is hiring a Remote Senior Machine Learning Engineer - Marketplace

    We're transforming the grocery industry

    At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.

    Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.

    Instacart is a Flex First team

    There’s no one-size fits all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work—whether it’s from home, an office, or your favorite coffee shop—while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.

    Overview



    About the Role

    As a Senior Machine Learning Engineer, you will leverage your deep technical expertise to tackle critical and complex problems within our marketplace systems. You will apply machine learning to create and optimize solutions in impactful areas such as delivery ETA, pricing, and incentives targeting. Through continuous innovations and improvements, your work will be instrumental in driving sustainable growth.



    About the Team 

    Our team comprises highly skilled Machine Learning Engineers and ML Infrastructure Backend Engineers focused on ML-driven systems such as ETA prediction, pricing elasticity modeling, and incentives optimization and personalization. The team thrives on innovation and collaboration, driving advancements that directly affect Instacart’s growth and efficiency.




    About the Job 

    • Design, develop, and deploy machine learning solutions to tackle practical challenges in the marketplace.
    • Collaborate closely with product managers, data scientists, and backend engineers to deeply understand business needs and create impactful ML applications.
    • Actively engage with diverse stakeholders to ensure that solutions are well-integrated and aligned with business goals.
    • Push the envelope on our operational efficiency by continually refining and advancing our algorithms and models.


    About You

    Minimum Qualifications

    • MS or PhD in Computer Science/Engineering or a related field, with 0-3+ years of industry experience focused on machine learning.
    • Strong programming skills in Python, Java, or other relevant languages.
    • Fluency in data manipulation (e.g., SQL, Spark, Pandas) and machine learning tools (e.g., scikit-learn, XGBoost, TensorFlow, and PyTorch).
    • Strong analytical skills and problem-solving ability.
    • Self-motivated individual with a strong ownership mindset.
    • Strong communicator who can collaborate with diverse stakeholders.

    Preferred Qualifications

    • Experience with managing large-scale machine learning model training and deployment.
    • Knowledge of deep learning frameworks and methodologies.
    • Proven track record of effectively identifying and addressing high-impact problems with solutions that balance urgency and quality.

    Instacart provides highly market-competitive compensation and benefits in each location where our employees work. This role is remote and the base pay range for a successful candidate is dependent on their permanent work location. Please review our Flex First remote work policy here.

    Offers may vary based on many factors, such as candidate experience and skills required for the role. Additionally, this role is eligible for a new hire equity grant as well as annual refresh grants. Please read more about our benefits offerings here.

    For US based candidates, the base pay ranges for a successful candidate are listed below.

    CA, NY, CT, NJ
    $198,000–$220,000 USD
    WA
    $190,000–$211,000 USD
    OR, DE, ME, MA, MD, NH, RI, VT, DC, PA, VA, CO, TX, IL, HI
    $182,000–$202,000 USD
    All other states
    $165,000–$183,000 USD

    See more jobs at Instacart

    Apply for this job

    5d

    Machine Learning Engineer II

    InstacartCanada - Remote (ON, AB, BC, or NS Only)
    MLsqlDesignpythonbackend

    Instacart is hiring a Remote Machine Learning Engineer II

    We're transforming the grocery industry

    At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.

    Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.

    Instacart is a Flex First team

    There’s no one-size fits all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work—whether it’s from home, an office, or your favorite coffee shop—while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.

    Overview



    About the Role

    As a Machine Learning Engineer, you will apply machine learning to create and optimize solutions in impactful areas such as incentives targeting. Through continuous innovations and improvements, your work will be instrumental in driving sustainable growth.



    About the Team 

    Our team comprises highly skilled Machine Learning Engineers focused on ML-driven systems such as incentive optimization, contributing to Instacart’s growth.



    About the Job 

    • Design, develop, and deploy machine learning solutions to tackle practical problems.
    • Collaborate closely with product managers, data scientists, and backend engineers to ensure the ML solutions are well-integrated and aligned with business goals.
    • Improve the reliability and scalability of our ML systems.

    About You

     

    Minimum Qualifications

    • MS or PhD in Computer Science/Engineering or a related field, with 0-3+ years of industry experience focused on machine learning.
    • Strong programming skills in Python.
    • Fluency in data manipulation (e.g., SQL, Spark) and machine learning tools (e.g., scikit-learn, TensorFlow, and PyTorch).
    • Strong analytical skills and problem-solving ability.
    • Self-motivated individual with a strong ownership mindset.
    • Strong communicator who can collaborate with diverse stakeholders.

    Preferred Qualifications

    • 1+ years of industry experience in productionizing large-scale machine learning models and systems.
    • Proven track record of solving impactful problems with urgency and quality.
    • Knowledge of deep learning frameworks and methodologies.

    Instacart provides highly market-competitive compensation and benefits in each location where our employees work. This role is remote and the base pay range for a successful candidate is dependent on their permanent work location. Please review our Flex First remote work policy here. Currently, we are only hiring in the following provinces: Ontario, Alberta, British Columbia, and Nova Scotia.

    Offers may vary based on many factors, such as candidate experience and skills required for the role. Additionally, this role is eligible for a new hire equity grant as well as annual refresh grants. Please read more about our benefits offerings here.

    For Canadian based candidates, the base pay ranges for a successful candidate are listed below.

    CAN
    $153,000–$170,000 CAD

    See more jobs at Instacart

    Apply for this job

    6d

    Data Driven | MLOps Engineer

    DevoteamLisboa, Portugal, Remote
    MLsqlgitc++dockerpythonAWSfrontend

    Devoteam is hiring a Remote Data Driven | MLOps Engineer

    Job Description

    • Ensure the quality of the end-to-end lifecycle of our AI solutions: this includes everything from data preparation and model training to deployment (both POC and at scale), monitoring, and maintenance.
    • Close collaboration with the AI Tech Lead and other team members, helping the team to build and deliver AI-powered applications in a secure, efficient, and reliable manner.
    • Advise and guide the team on how to automate and optimize the deployment of AI-powered applications. Take the lead on establishing best practices and adopting the most suitable frameworks, tools, and platforms for machine learning operations in the team.
    • Develop, deploy, maintain and continually improve AI-powered applications and ML pipelines, which might include multiple components, such as data ingestion and transformation, model training and serving, frontend interfaces, databases and APIs.
    • Implement continuous integration and delivery (CI/CD) for our AI solutions.
    • Monitor model performance and data quality, and iterate on models in production.
    • Collaborate with data engineers, data scientists and software engineers to make models, applications or use cases operational.
    • Ensure the scalability and reliability of our AI and ML infrastructure.
    • Ensure our practices keep up to date with MLOps standards.
    • Actively share your knowledge with the team.

    Qualifications

    • Fluency in at least one OOP language, such as C# or Java.
    • Experience with common data science languages, such as Python or R. 
    • Clear understanding of the machine learning project lifecycle.
    • Proficiency with open-source tools, such as Jenkins.
    • Experience with AWS Cloud Platform.
    • Proficiency in using query languages such as SQL, Spark SQL, etc.
    • Experience with container technologies like Docker and Kubernetes.
    • Experience with software build and release processes, unit testing, version control, etc.
    • Experience using Git source control.
    • A passion for ML/AI.
    • A collaborative and can-do attitude.
    • Excellent written and verbal communication skills, comfortable with audiences including product and engineering management.

    See more jobs at Devoteam

    Apply for this job

    6d

    Machine Learning Engineer

    MobicaRemote, Poland
    MLMaster’s DegreeDesignazurepythonAWS

    Mobica is hiring a Remote Machine Learning Engineer

    Job Description

    As a Machine Learning Engineer, you will play a crucial role in designing, developing, and deploying cutting-edge machine learning solutions. Your work will impact millions of users, specifically in the field of natural language processing (NLP), cloud-based applications, and deep learning.

    Key responsibilities:

    • Design and develop machine learning models, algorithms, and systems
    • Implement appropriate ML algorithms for tasks such as NLP, recommendation systems, and computer vision
    • Conduct experiments and analyze data to improve model performance
    • Optimize algorithms for scalability, efficiency, and accuracy
    • Collaborate with cross-functional teams, including data scientists, software engineers, and domain experts
    • Stay up-to-date with the latest advancements in machine learning and AI
    • Leverage cloud services (such as Azure, AWS, or Google Cloud) for model deployment and scalability
    • Explore and apply GenAI techniques to enhance model capabilities

    Qualifications

    Must have:

    • Bachelor’s or master’s degree in computer science, data science, or a related field
    • Strong programming skills in languages like Python or R
    • Proficiency in machine learning libraries (e.g., TensorFlow, PyTorch, scikit-learn)
    • Experience with NLP techniques, including text classification, sentiment analysis, and named entity recognition
    • Knowledge of deep learning architectures (e.g., CNNs, RNNs, Transformers).
    • Familiarity with cloud platforms and services
    • Excellent problem-solving abilities and analytical thinking
    • Good command of English
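As a toy illustration of the sentiment-analysis skill listed above: the sketch below is a rule-based lexicon counter, a deliberately simplified stand-in for the ML approaches the role actually calls for, with invented word lists.

```python
# Tiny illustrative lexicons; a real system would learn these from data.
POSITIVE = {"good", "great", "excellent", "love"}
NEGATIVE = {"bad", "poor", "terrible", "hate"}

def sentiment(text):
    """Return 'positive', 'negative', or 'neutral' by counting lexicon hits."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("great product love it"))  # positive
print(sentiment("terrible poor service"))  # negative
```

A model-based classifier (e.g., a fine-tuned Transformer) replaces the hand-written lexicons but exposes the same text-in, label-out interface.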

    Nice to have:

    • Contributions to open-source projects or research publications
    • Experience with distributed computing and big data frameworks
    • Certifications related to machine learning or cloud platforms

    See more jobs at Mobica

    Apply for this job

    6d

    Azure Data Engineer

    MobicaRemote, Poland
    DevOPSLambdaagile10 years of experiencesqlazurepython

    Mobica is hiring a Remote Azure Data Engineer

    Job Description

    As an Azure Data Engineer, you will help the team set up a new DWH environment in Microsoft Azure and migrate the existing on-premise data ingestion and data analytics functionality to the Azure cloud. You will also contribute to fine-tuning the target architecture with the most suitable tools and technologies. In addition to hands-on work, you will guide the team on best practices, technology standards, and possible automations in Azure cloud architecture.

    Qualifications

    Must have skills:

    • 5-10 years of experience primarily in Cloud Data engineering, Data warehousing projects
    • Experience in Microsoft Azure enterprise Data Lake platform with
      • Azure Data Lake Storage
      • Azure Data Factory
      • Azure Data Factory Dataflow
      • Databricks using Python
      • Azure Functions
      • Azure SQL DB / Synapse
    • Experience working in Python and PowerShell
    • Experience in building and deploying pipelines in Azure DevOps
    • Familiarity with Azure security services – Active Directory, Azure Key Vault, Managed Identities, RBAC, Firewall on VNET & SubNets
    • Extensive experience in software development in an agile DevOps environment
    • Good communication and interfacing skills; able to work in a structured manner
    • Strong collaboration skills, flexibility, and entrepreneurship
    • Eager to be open, learn, and advise on and implement new technologies
    • Good command of English

    Nice to have skills:

    • Development of APIs to expose data through REST APIs
    • Knowledge on Big Data batch and stream ingestion (Lambda architecture)
    • Fair knowledge of Data modelling (specifically Dimensional modelling)

    See more jobs at Mobica

    Apply for this job

    6d

    Snowflake Data Engineer

    MobicaRemote, Poland
    Bachelor's degreesqlDesignazurepythonAWS

    Mobica is hiring a Remote Snowflake Data Engineer

    Job Description

    As a Snowflake Data Engineer, you will be responsible for designing, developing, and maintaining data pipelines and solutions using Snowflake. You will work closely with data architects, data scientists, and business stakeholders to ensure the efficient and effective use of data. Your role will involve building scalable data solutions, optimizing performance, and ensuring data quality and security.

    Key Responsibilities:

    • Design and develop data pipelines and ETL processes using Snowflake.
    • Collaborate with data architects and business stakeholders to understand data requirements and translate them into technical solutions.
    • Implement data models, data warehouses, and data marts in Snowflake.
    • Optimize Snowflake performance, including query performance tuning and resource management.
    • Ensure data quality, data governance, and data security best practices are followed.
    • Troubleshoot and resolve data-related issues, ensuring timely and effective solutions.
    • Develop and maintain documentation for data pipelines, data models, and data processes.
    • Provide technical support and guidance to team members and stakeholders.
    • Lead and execute data migration projects, including migrating data from legacy systems to Snowflake.
    • Develop and implement strategies for data migration, ensuring minimal downtime and data integrity during the migration process.
    • Design and implement data lake solutions, integrating them with Snowflake to support large-scale data storage and processing.
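The pipeline and ETL responsibilities above can be sketched roughly as follows: stage raw rows, apply a cleaning transform, and load an aggregate target table. In this illustrative example sqlite3 stands in for Snowflake, and all table and column names are invented.

```python
import sqlite3

def run_pipeline(raw_rows):
    """Minimal extract-transform-load: stage raw rows, clean them, load a target table."""
    con = sqlite3.connect(":memory:")  # sqlite3 stands in for a Snowflake warehouse
    con.execute("CREATE TABLE staging (customer TEXT, amount REAL)")
    con.executemany("INSERT INTO staging VALUES (?, ?)", raw_rows)

    # Transform: trim whitespace, drop NULL amounts, aggregate per customer.
    con.execute("""
        CREATE TABLE sales_mart AS
        SELECT TRIM(customer) AS customer, SUM(amount) AS total
        FROM staging
        WHERE amount IS NOT NULL
        GROUP BY TRIM(customer)
        ORDER BY customer
    """)
    return con.execute("SELECT * FROM sales_mart").fetchall()

rows = [(" acme ", 10.0), ("acme", 5.0), ("globex", None), ("globex", 7.5)]
print(run_pipeline(rows))  # [('acme', 15.0), ('globex', 7.5)]
```

In Snowflake the same shape appears as staging tables fed by Snowpipe or COPY INTO, with transforms expressed in SQL or a tool like dbt.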

    Qualifications

    Must have:

    • Bachelor's degree in Computer Science, Information Technology, or a related field.
    • Proven experience in designing and developing data solutions using Snowflake.
    • Strong knowledge of SQL and ETL processes.
    • Experience with data modeling, data warehousing, and data integration.
    • Proficiency in scripting languages such as Python or Java.
    • Familiarity with cloud platforms such as AWS, Azure, or GCP.
    • Excellent problem-solving and troubleshooting skills.
    • Strong communication and interpersonal skills.
    • Ability to work collaboratively in a team environment.
    • Good command of English

    Nice to have:

    • Exposure to data governance best practices is a plus
    • Certification in Snowflake is an advantage

    See more jobs at Mobica

    Apply for this job

    6d

    Senior Gen AI Data Scientist

    MobicaRemote, Poland
    agileazureapipython

    Mobica is hiring a Remote Senior Gen AI Data Scientist

    Job Description

    We are looking for a skilled Data Scientist to join our Generative AI team to build LLM-based enterprise solutions. Main responsibilities would be testing, fine-tuning and deploying generative AI models, in addition to collaborating with cross-functional teams to define project requirements and objectives, ensuring alignment with overall business goals.

    Responsibilities:

    • Conceptualize, prototype, and work with state-of-the-art LLMs to drive technical solutions
    • Enhance and innovate existing model completions with prompt engineering or prompt tuning techniques
    • Develop performance monitoring and evaluation pipelines to ensure the quality and effectiveness of generative models
    • Stay up-to-date with the latest research and advancements in Generative AI and actively contribute to upskilling of the team

    Qualifications

    Must have:

    • Minimum of a BSc or higher degree in a STEM subject
    • Proficient in Python
    • Solid knowledge of machine learning concepts (model training, hyperparameter optimisation, experiment tracking, learning curves, etc.) with a focus on NLP and Gen AI
    • Familiarity with RAG-based chatbots
    • Experience with large language models - ChatGPT, LLaMA, Falcon, Bard, etc.
    • Experience with deep learning frameworks - Tensorflow, Pytorch, Keras, HuggingFace, LangChain etc.
    • Familiarity with NLP tasks - NER, POS tagging, Classification, Topic Modeling, Summarization, Question-Answering, etc.
    • Can conceptualise, formulate, prototype and implement algorithms to solve technical problems.
    • Writing and debugging Python/SQL code adhering to best coding practices (debugging, unit testing, etc.)
    • Data Wrangling skills (NULLs, duplicates, outliers, regex cleaning, etc.)
    • Examples of building models and data pipelines.
    • Experience with language models - GPT-3, Word2Vec, GloVe, BERT, etc.
    • Experience with version/source control using Git.
    • Experiment tracking (using mlflow, comet, Neptune, tensorboard, weights & biases, sagemaker, etc.)
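The data-wrangling items above (NULLs, duplicates, regex cleaning) can be sketched in plain Python. In practice pandas would handle this; the dependency-free toy version below just shows the three operations, with invented example data.

```python
import re

def clean(records):
    """Drop missing values, normalize with a regex, and de-duplicate preserving order."""
    seen, out = set(), []
    for value in records:
        if value is None:          # NULL handling
            continue
        # Regex cleaning: lowercase, strip non-alphanumerics, collapse whitespace.
        norm = re.sub(r"[^a-z0-9 ]", "", value.lower())
        norm = re.sub(r"\s+", " ", norm).strip()
        if norm and norm not in seen:   # duplicate removal
            seen.add(norm)
            out.append(norm)
    return out

print(clean(["  New York ", "new york!", None, "Boston", "BOSTON", ""]))
# ['new york', 'boston']
```

The pandas equivalents are `dropna`, `str.replace` with a regex, and `drop_duplicates`.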

    Nice to have:

    • Experience with cloud platform, particularly Azure
    • Experience deploying machine learning models through REST API
    • Understanding of Agile methodologies

    See more jobs at Mobica

    Apply for this job

    6d

    Databricks Engineer

    MobicaRemote, Poland
    DevOPSBachelor's degreesqlazure.netpython

    Mobica is hiring a Remote Databricks Engineer

    Job Description

    As a Databricks Engineer, this role offers the opportunity to work with a team in building, optimizing, and maintaining data analytics solutions using Azure cloud services and the Databricks platform.

    Key responsibilities:

    • Provide technical expertise on integrating Databricks with Azure and optimizing data processing workflows
    • Collaborate with stakeholders to translate business requirements into technical solutions
    • Implement DevOps practices using Azure DevOps for project management and deployment automation
    • Ensure adherence to security and monitoring best practices in the production environment

    Qualifications

    Must have:

    • Bachelor's degree in Computer Science, Engineering, or related field
    • Experience in SQL, Python, PySpark, and Databricks
    • Strong understanding of cloud services, particularly Azure Databricks and Azure SQL
    • Certifications in Azure or Databricks are preferred
    • Strong problem-solving and analytical skills
    • Ability to effectively communicate technical concepts to non-technical stakeholders
    • Proven track record of leading development teams and delivering successful projects
    • Continuous learning mindset to keep up with the latest technologies and best practices
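    As a rough illustration of the SQL/Python requirement above, here is a minimal sketch using Python's built-in sqlite3 as a local stand-in for a Databricks SQL warehouse. The `events` table and its columns are invented for illustration.

```python
import sqlite3

# In-memory database standing in for a warehouse table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("a", 10.0), ("a", 5.0), ("b", 7.5)],
)

# Aggregate spend per user - the kind of transformation a pipeline step might run
rows = conn.execute(
    "SELECT user_id, SUM(amount) AS total FROM events GROUP BY user_id ORDER BY user_id"
).fetchall()
# rows == [("a", 15.0), ("b", 7.5)]
```

    On Databricks the same GROUP BY would typically run via `spark.sql(...)` or the PySpark DataFrame API over Delta tables; the SQL itself carries over largely unchanged.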

    Nice to have:

    • Experience with Azure DevOps for version control and CI/CD pipelines
    • Knowledge of Azure Web Apps, .NET, Azure Functions, Azure Automation, Azure Logic Apps, and Azure Event Hubs

    See more jobs at Mobica

    Apply for this job

    6d

    Azure Data Architect

    MobicaRemote, Poland
    agilesqlDesignazurepython

    Mobica is hiring a Remote Azure Data Architect

    Job Description

    As an Azure Data Architect you will deliver high-quality, innovative, data-driven solutions for our client’s advanced analytical needs. Working within a team on the client’s site, this will include understanding the client’s pain points, designing an analytical approach, implementing a high-quality solution, and leading and mentoring multi-discipline technology teams

    Your Key Result Areas:

    • You will be part of experienced onsite consulting/delivery teams developing and providing architectures for a highly resilient, fault-tolerant, horizontally scalable data platform.
    • You will build multiple layers within the platform covering data ingestion, integration, metadata management, data security and data presentation
    • Working with the client Architects to refine ideas on the Analytics Architecture and help with standards and guidelines.
    • Building a data engineering community and helping customers/clients build a centre of excellence for engineering.
    • Ensuring development teams deliver the highest quality data products through the adoption and continuous improvement of patterns and practices, tools and frameworks and processes.
    • Applying years of experience and expertise to help teams resolve and overcome technical challenges of any size or complexity and assist where necessary to accelerate problem resolution.

    Qualifications

    Must have:

    • You will be a skilled Architect helping customer teams to design, strategize and build the data foundations to drive Analytics business use cases on Azure.
    • You will engineer and orchestrate data flows & pipelines in a cloud environment using a progressive tech stack e.g. ADF, Databricks, Spark, Python, PySpark, Synapse, Delta Lake, SQL, Logic Apps, Azure Functions, ADLS
    • Strong experience building data transformations using Spark and Databricks.
    • You will design the architecture to Ingest and integrate data from a large number of disparate data sources
    • You will design and build complex data processing solutions for both batch and real-time use cases. Knowledge of Kafka/Spark is a plus, but experience with Event Hubs/Stream Analytics is required for designing real-time solutions.
    • You will work closely with analysts, data scientists and technology partners to understand their requirements
    • You will exercise fluid communication with your team of data engineers and analysts
    • You will enthusiastically evolve your technical skillset, engage in training, and learn new technologies and techniques
    • You will work comfortably in an agile and fast-paced environment.
    • Able to compare technologies within each layer – e.g., stream processing, messaging, search, etc.
    • Strong experience delivering data solutions using big-data technologies and cloud platforms
    • Able to propose architectures that incorporate security controls for authentication, authorization, and encryption
    • Able to recommend migration tools and design a data migration strategy
    • Design and implement metadata management and data catalog processes on top of the built Azure Data Platform

    Nice to have:

    • Experience architecting multi-cloud solutions
    • Experience delivering big data solutions using a leading Hadoop distribution like Cloudera, or Snowflake
    • Knowledge of RDBMS, ETL and data warehouse technologies

    See more jobs at Mobica

    Apply for this job

    6d

    Cloud Engineer

    MobicaRemote, Poland
    EC2LambdaMaster’s DegreeterraformDesignazuredockerlinuxjenkinspythonAWS

    Mobica is hiring a Remote Cloud Engineer

    Job Description

    We are seeking a skilled Cloud Engineer to join our team. The ideal candidate will be responsible for the setup and deployment of required infrastructure in Amazon AWS for data quality assessments and implementing basic cloud functionalities like serverless code and integrations.

    Responsibilities:

    • Automated multi-stage and multi-region deployments of AWS infrastructure using Terraform
    • Implementation and maintenance of a multi-level CI/CD pipeline infrastructure using several GitLab repositories, Linux build servers, Bash scripts, and Docker builds
    • Maintain IBM ODM components running as serverless Fargate services, using custom-built Docker images and connections to other AWS services
    • Develop technical documentation to accurately represent application design, processes, and code
    • Setup integration with data pipeline to feed datasets into rules engine
    • Setup integration with Logging & Monitoring framework
    • Reduce redundancies in storage and processing
    • Setup integration with metadata management solution (using AWS Glue)
    • Analyze, setup, and maintain AWS cloud infrastructure including ECS tasks and services, RDS databases, Lambda functions, APIs and models, Route 53 DNS zones, EC2 instances, Docker repositories, and security roles and policies
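    The "serverless code" mentioned in the description can be sketched minimally: an AWS Lambda-style handler in Python, invoked locally with a mock API Gateway proxy event. The payload shape and field names are invented for illustration.

```python
import json

def handler(event, context=None):
    # Minimal Lambda-style handler: parse the request body, validate, respond
    body = json.loads(event.get("body") or "{}")
    name = body.get("name")
    if not name:
        return {"statusCode": 400, "body": json.dumps({"error": "name is required"})}
    return {"statusCode": 200, "body": json.dumps({"greeting": f"Hello, {name}"})}

# Local invocation, mimicking an API Gateway proxy event
resp = handler({"body": json.dumps({"name": "Ada"})})
```

    In AWS the same function would be packaged (or built into a Docker image), deployed via Terraform, and wired to API Gateway; the handler signature and dict-shaped response are the Lambda conventions.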

    Qualifications

    Must Have:

    • Proficiency in Amazon Web Services (AWS)
    • Experience with AWS Glue
    • Experience with CI/CD pipeline tools (e.g., GitLab, Jenkins)
    • Strong understanding of cloud security best practices
    • Experience with serverless architectures (e.g., AWS Lambda)
    • Bachelor’s or Master’s degree
    • Good problem-solving skills and attention to detail
    • Effective communication skills in English

    Nice to Have:

    • Experience with Python
    • AWS Certifications
    • Familiarity with other cloud platforms (e.g., Azure, Google Cloud)
    • Knowledge of data integration and ETL processes
    • Experience with monitoring and logging tools (e.g., CloudWatch, ELK stack)
    • Experience with Terraform for infrastructure as code

    See more jobs at Mobica

    Apply for this job

    6d

    Principal Consultant

    DatacomAuckland,New Zealand, Remote Hybrid
    SalesDevOPSagileDesignazure.netangularpython

    Datacom is hiring a Remote Principal Consultant

    Our Why 

    Datacom works with organisations and communities across Australia and New Zealand to make a difference in people’s lives and help organisations use the power of tech to innovate and grow. 

     

    About the Role (your why)  

    We are seeking a highly skilled and experienced Principal Consultant to join our team. The ideal candidate will have a deep understanding of integration platforms, techniques and approaches along with general application development & cloud architecture methodologies and a proven track record in delivering complex solutions. This role will be pivotal in driving our pre-sales consulting activities, including RFx, discovery, solution design, and estimation for various integration, and application development projects. This is a client facing role, so you'll need to be an excellent communicator and good with people. The Principal Consultant works closely with practice leads, project managers, and client stakeholders and teams.

     

    What you’ll do   

    Pre-Sales Consulting:

    • Lead and participate in client discovery workshops to understand business requirements and technical challenges, and provide customers with compelling solutions
    • Design and architect robust application solutions, aligning with business objectives and technical constraints.
    • Develop compelling technical proposals and presentations to win new business.
    • Provide accurate and timely estimates for project scopes and timelines.

    Solution Architecture:

    • Define and document detailed application architecture blueprints, including component diagrams, data flows, and integration points.
    • Identify and mitigate potential risks and challenges.
    • Ensure alignment with industry best practices and standards.
    • Technical Leadership:
    • Mentor and guide junior team members, fostering a culture of technical excellence.
    • Stay up-to-date with the latest application development trends and technologies.
    • Contribute to the development of reusable frameworks and methodologies.

    Client Engagement:

    • Build strong relationships with clients and stakeholders.
    • Effectively communicate technical concepts to both technical and non-technical audiences.
    • Proactively address client concerns and questions.

     

    What you’ll bring  

    • Proven experience as a Principal Consultant in application development & cloud architecture.
    • Deep understanding of software development methodologies (Agile, Waterfall, DevOps).
    • Strong experience in designing and implementing enterprise-grade applications.
    • Proficiency in multiple programming languages and frameworks (e.g., .NET, Python, React, Angular).
    • Proficient in Azure services, networking, and security best practices.
    • Excellent problem-solving and troubleshooting skills.
    • Strong communication and presentation skills.
    • Ability to work independently and as part of a team.

     

    Why join us here at Datacom? 

    Datacom is one of Australia and New Zealand’s largest suppliers of Information Technology professional services. We have managed to maintain a dynamic, agile, small business feel that is often diluted in larger organisations of our size. It's our people that give Datacom its unique culture and energy that you can feel from the moment you meet with us. 

    We care about our people and provide a range of perks such as social events, chill-out spaces, remote working, flexi-hours and professional development courses to name a few. You’ll have the opportunity to learn, develop your career, connect and bring your true self to work. You will be recognised and valued for your contributions and be able to do your work in a collegial, flat-structured environment. 

    We operate at the forefront of technology to help Australia and New Zealand’s largest enterprise organisations explore possibilities and solve their greatest challenges, so you will never run out of interesting new challenges and opportunities. 

    We want Datacom to be an inclusive and welcoming workplace for everyone and take pride in the steps we have taken and continue to take to make our environment fun and friendly, and our people feel supported. 

    See more jobs at Datacom

    Apply for this job