nosql Remote Jobs

170 Results

+30d

Full Stack Developer

Default Portal · Hickory, NC - Remote
redis · nosql · postgres · sql · Design · mobile · docker · kubernetes · linux · python · javascript

Default Portal is hiring a Remote Full Stack Developer

About Matrix Retail

We invent, design, build, and implement workforce optimization solutions for specialty retailers.

We believe that service matters, payroll is an investment, and employees are essential to brand profitability. We build long-term, win-win partnerships to help our clients deliver their brand promise.

We think like retailers because we are retailers. Matrix was founded in 2005 by experienced retailers who have built tools that have written millions and millions of schedules. Our retail experience has helped us develop solutions and approaches that address the unique challenges and opportunities of specialty retail, while also differentiating us in the marketplace.

About the Role

We are looking for a Full Stack Developer, with a heavier emphasis on front-end development, to produce scalable software solutions. You’ll be part of a cross-functional team that’s responsible for the full software development life cycle, from conception to deployment.

As a Full Stack Developer, you should be comfortable around both front-end and back-end coding languages, development frameworks and third-party libraries.

You should be able to write clean code and ensure your programs run properly. We also expect you to be passionate about building software and perform well working in a team.

Because we are a small team, you will have big responsibility, and will be expected to have a knack for visual design and utility.

Responsibilities

  • Work with development teams and product managers to ideate software solutions
  • Design client-side and server-side architecture
  • Build the front-end of applications through appealing visual design
  • Develop and manage well-functioning databases and applications
  • Write effective APIs
  • Test software to ensure responsiveness and efficiency
  • Troubleshoot, debug and upgrade software
  • Create security and data protection settings
  • Build features and applications with a mobile responsive design
  • Write technical documentation
  • Keep up to date with emerging technologies and industry trends and apply them to operations and activities
  • Maintain, expand, and scale our application as assigned
  • Monitor and address operational and security concerns in production systems

Requirements

  • Proven experience as a Full Stack Developer or in a similar role (5+ years of experience)
  • Ability to work and thrive in a fast-paced environment, learn rapidly, and master diverse web technologies and techniques
  • Experience developing desktop and mobile applications
  • Familiarity with Linux servers and deployment strategies, including Docker and Kubernetes
  • Significant experience working with modern JavaScript
  • Knowledge of front-end languages and libraries (React, Ember.js, SCSS)
  • Knowledge of Python-backed APIs and libraries (Flask, Sanic)
  • Familiarity with SQL databases and NoSQL databases (e.g. Postgres, Redis)
  • Understanding of technology compliance and systems security
  • Strong problem-solving skills with a creative approach
  • Solid grasp of UI/UX best practices and techniques
  • Excellent communication and teamwork skills
  • Great attention to detail
  • Organizational skills
  • An analytical and curious mind
  • BS in Computer Science or relevant field and experience

See more jobs at Default Portal

Apply for this job

+30d

Full Stack Developer

Dapi · San Francisco, CA - Remote
golang · nosql · sql · Design · api · java · c++ · typescript · backend · frontend

Dapi is hiring a Remote Full Stack Developer

Dapi is a fintech infrastructure company that enables real-time bank payments through our API. We enable cheap electronic payments in the US and beyond, disintermediating current payment processing platforms.

We are looking to hire a passionate full-stack engineer who can contribute to the design, development, maintenance and implementation of our API services. You will be joining our talented team of engineers whilst working alongside our designers and product managers. Your primary focus will be developing, maintaining and supporting our novel in-house scraping automation engine.

Our ideal candidate is one that can take up the challenge of maintaining Dapi's backend whilst contributing to the creation of new features on our existing system. As such, candidates who are independent, highly motivated and can react well under changing business requirements are an ideal match.

Responsibilities:

  • Design, develop, maintain and implement features of our API services
  • Develop, maintain, and provide support for our in-house scraping automation engine
  • Collect, analyze and address technical and design requirements
  • Help to create reusable code and libraries for future use
  • Maintain and expand upon a large existing codebase
  • Adapt well and react quickly to changing business requirements

Requirements:

  • At least 1 year of experience in a software engineering role
  • Advanced proficiency in any strongly typed language (such as C#, Java, C++, etc.)
  • Comfortable with SQL and NoSQL databases
  • Excellent problem-solving skills
  • Knowledge of professional software engineering best practices, including code reviews, build tooling, and documentation
  • Ability to understand, contribute to, and expand upon a complex existing codebase
  • Excellent understanding of current web technologies and the inner workings of frontend frameworks

Bonus:

  • Proficiency in Golang and TypeScript
  • Experience using Puppeteer

See more jobs at Dapi

Apply for this job

+30d

Senior Full-Stack Engineer

Crover Ltd · Taguig, Philippines, Remote
DevOps · Django · S3 · EC2 · Lambda · redis · nosql · Design · mongodb · azure · git · ruby · java · docker · postgresql · MySQL · css · kubernetes · angular · python · AWS · javascript · Node.js · PHP

Crover Ltd is hiring a Remote Senior Full-Stack Engineer

Job Description

Key Responsibilities: 

  • Design, develop, and maintain both front-end and back-end components of web applications.

  • Architect and design scalable and robust systems.

  • Work closely with product managers, designers, and other engineers to deliver high-quality products.

  • Write clean, maintainable, and efficient code, adhering to best practices and coding standards.

  • Optimize application performance for maximum speed and scalability.

  • Implement and maintain automated testing frameworks to ensure the reliability and quality of the codebase.

  • Manage the deployment process, including setting up CI/CD pipelines and ensuring smooth releases.

  • Identify and resolve complex technical issues across the stack

  • Keep up with the latest trends, tools, and technologies to continuously improve front-end and back-end architecture and deliver cutting-edge solutions

Qualifications

Requirements:

  • Proven experience in front-end and back-end development, with a strong portfolio of successful projects

  • Strong proficiency in HTML, CSS, JavaScript, and modern front-end frameworks/libraries such as React, Angular, and/or Django

  • Experience with relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Redis)

  • Proficiency in server-side languages (e.g., Java, Python, PHP, Ruby, Node.js) and frameworks (e.g., Django)

  • Expertise in designing and developing RESTful APIs

  • Experience with AWS and Azure services, including EC2, S3, RDS, Lambda, Azure VMs, and Azure Functions

  • Familiarity with DevOps practices and tools (e.g., Docker, Kubernetes, CI/CD pipelines)

  • Experience with version control systems, particularly Git

  • Familiarity with front-end and back-end testing frameworks and tools

  • Knowledge of performance optimization techniques and tools (e.g., Lighthouse, Webpack)

  • Deep understanding of security best practices and experience implementing secure front-end and back-end code

  • A keen eye for design and a passion for creating beautiful and functional user interfaces

  • Strong understanding of SEO principles and web accessibility

  • Excellent problem-solving skills and the ability to troubleshoot complex issues

  • Strong communication and collaboration skills, with the ability to work effectively in a team environment

  • Excellent attention to detail and organizational skills

  • Experience working with international teams and in global markets

  • Ability to work in a fast-paced environment and adapt to changing project requirements

  • Demonstrated eagerness to learn new skills and take on diverse tasks, fostering both personal and company growth

See more jobs at Crover Ltd

Apply for this job

+30d

Full Stack Java/React Developer

Mid Level · Full Time · PWA · NextJS · agile · nosql · tailwind · Design · mobile · graphql · sass · ui · scrum · api · UX · java · typescript · AWS · javascript

Portland Webworks is hiring a Remote Full Stack Java/React Developer

See more jobs at Portland Webworks

Apply for this job

+30d

Data Architect - 100% Remote (REF1574W)

Citizant · Chantilly, VA, Remote
agile · Master’s Degree · nosql · sql · Design · azure · AWS

Citizant is hiring a Remote Data Architect - 100% Remote (REF1574W)

Job Description

Join our remote team as a Data Architect where you can share inventive ways to support federal clients modernizing their data practices. You will help guide the development of processes supporting data architecture, data management, and data governance.

Position Duties:

  • Collaborate with stakeholders to understand business requirements and translate them into data architecture solutions.
  • Design, develop, and maintain scalable and efficient data models, ensuring data integrity and consistency.
  • Define data standards, best practices, and guidelines to ensure the quality and security of the data architecture.
  • Collaborate with cross-functional teams to integrate data solutions into business processes and applications.
  • Stay informed about industry trends, emerging technologies, and best practices in data architecture.
  • Create data models, data flow diagrams, and data dictionaries to document the data architecture and ensure data integrity and consistency.
  • Analyze and suggest improvements to an organization’s data governance and data management processes.
  • Collaborate with business users and analysts to understand data requirements and translate them into architectural designs.
  • Provide guidance and expertise on process adoption and stakeholder collaboration, specifically around data management.
  • Develop reports/presentations and present to management and business users on data architecture and data governance topics.

Qualifications

Required Skillset:

  • Minimum of 10 years of related experience in fields of data management and data architecture
  • Strong proficiency in SQL and experience with relational and NoSQL databases.
  • Knowledge of data integration techniques and tools.
  • Excellent problem-solving and analytical skills.
  • Strong communication and interpersonal skills to effectively collaborate with both technical and non-technical stakeholders.
  • Experience in data governance, data quality, and metadata management.
  • Strong knowledge of data modeling techniques, data governance concepts, and database design principles

Desired Skillset:

  • Experience working with the Federal Government.
  • Familiarity with cloud-based data platforms (e.g., AWS, Azure, Google Cloud).
  • Experience working with Agile Teams utilizing Scrum.

Education:

  • Master’s Degree in Computer Science or other technical field of study.

Clearance Requirement:

  • Ability to obtain a Public Trust Clearance.
  • U.S. Citizenship Required

See more jobs at Citizant

Apply for this job

+30d

Data Engineer

Legalist · Remote
agile · nosql · sql · Design · c++ · docker · kubernetes · AWS

Legalist is hiring a Remote Data Engineer

Intro description:

Legalist is an institutional alternative asset management firm. Founded in 2016 and incubated at Y Combinator, the firm uses data-driven technology to invest in credit assets at scale. We are always looking for talented people to join our team.

As a highly collaborative organization, our data engineers work cross-functionally with software engineering, data science, and product management to optimize growth and strategy of our data pipeline. In this position, you will be joining the data engineering team in an effort to take our data pipeline to the next level.

Where you come in:

  • Design and develop scalable data pipelines to collect, process, and analyze large volumes of data efficiently.
  • Collaborate with cross-functional teams including data scientists, software engineers, and product managers to understand data requirements and deliver solutions that meet business needs.
  • Develop ELT processes to transform raw data into actionable insights, leveraging tools and frameworks such as Airbyte, BigQuery, Dagster, DBT or similar technologies.
  • Participate in agile development processes, including sprint planning, daily stand-ups, and retrospective meetings, to deliver iterative improvements and drive continuous innovation.
  • Apply best practices in data modeling and schema design to ensure data integrity, consistency, and efficiency.
  • Continuously monitor and optimize data pipelines and systems for performance, availability, scalability, and cost-effectiveness.

What you’ll be bringing to the team:

  • Bachelor’s degree (BA or BS) or equivalent.
  • A minimum of 2 years of work experience in data engineering or similar role.
  • Advanced SQL knowledge and experience working with a variety of databases (SQL, NoSQL, Graph, Multi-model).
  • A minimum of 2 years of professional experience with ETL/ELT, data modeling, and Python.
  • Familiarity with cloud environments like GCP, AWS, as well as cloud solutions like Kubernetes, Docker, BigQuery, etc.
  • You have a pragmatic, data-driven mindset and are not dogmatic or overly idealistic about technology choices and trade-offs.
  • You have an aptitude for learning new things quickly and have the confidence and humility to ask clarifying questions.

Even better if you have, but not necessary:

  • Experience with one or more of the following: data processing automation, data quality, data warehousing, data governance, business intelligence, data visualization.
  • Experience working with TB scale data.

See more jobs at Legalist

Apply for this job

+30d

Staff Software Engineer

Unqork · United States (Remote)
remote-first · nosql · Design · graphql · qa · c++ · javascript · reactjs · Node.js

Unqork is hiring a Remote Staff Software Engineer

Unqork is the first Enterprise App Cloud solution, reshaping how organizations create, secure, and manage the entire lifecycle of their applications in the cloud—all with zero code. Unqork’s Enterprise App Cloud represents the next evolution of the application cloud layer, empowering enterprises to unleash business agility while removing the burden of technical debt. Unqork serves enterprises of all sizes, providing industry-tailored solutions for customers in financial services, insurance, government, and healthcare. Its customers include Goldman Sachs, Marsh, BlackRock, and the U.S. Department of Health and Human Services.

At Unqork, we value inclusive and innovative thinkers who boldly challenge the status quo. We encourage you to apply! 

The Impact U will make:

  • Your team will help customers organize, navigate and govern their applications on the Unqork platform
  • You'll help guide your team in the creation of new products, work to influence how we operate and provide value to customers, and get exposure to learning an industry-changing platform
  • Foster a supportive environment that gives your engineers space to make a difference
  • Help establish organizational metrics and roadmaps
  • Introduce and refine processes to improve the collaboration and productivity of your team
  • Partner with engineers, product managers, QA and Design to identify opportunities, and provide solutions that positively improve the end-user experience
  • Advocate for technical excellence and provide technical guidance on your team

What U bring:

  • 7+ years of experience in software engineering 
  • Experience growing your engineers into organizational contributors
  • Deep knowledge of systems design and architecture
  • The ability to identify operational gaps and help solve them by gaining agreement among your peers and teams
  • Experience helping establish organizational metrics and roadmaps within your organization
  • Exposure to working with distributed systems as well as scalable solutions
  • Experience in Node.js, modern JavaScript frameworks (like ReactJS), GraphQL or NoSQL databases

Compensation, Benefits, & Perks

  • Work from home with a remote-first community
  • Unlimited PTO (and the encouragement to use it)
  • Student loan payback program
  • 100% employer-covered medical, dental, and vision options available to you and your dependents
  • Flexible Spending Account (FSA)
  • Monthly stipend toward your WFH setup, vacation, development and more
  • Employer-sponsored 401(k) with contribution match
  • Robust DEI Program that compensates ERSG leaders for their efforts
  • Subsidized ClassPass Membership
  • Generous Paid Parental Leave
  • Join Aerodei at Unqork, where we track and report on diversity, equity, and inclusion efforts

Hiring Ranges:

  • Tier 1: $172,800-$230,000 base salary
  • Tier 2: $155,500-$207,000 base salary

Unqork employs a market-driven approach to establish compensation ranges. In addition to a base salary, employees may also be eligible to receive a target incentive and company equity in the form of stock options.

An employee’s compensation within the range provided above depends on a variety of factors including, but not limited to, their location, role, skillset, level of experience, and similar peer salaries.

As a remote-first company, Unqork incorporates a geographic differential into our compensation structure, depending on the candidate’s location. We utilize a tiered system—Tier 1 and Tier 2—to accurately reflect local market rates and ensure our compensation packages are both fair and competitive.

Our geographic tiers are defined as follows:

  • Tier 1: New York Metro, Seattle Metro, San Francisco Bay Area, Southern California, and Washington, D.C. Metro
  • Tier 2: All other US and US territory locations

Unqork is an equal opportunity employer, and proud to be committed to diversity and inclusiveness. We will consider all qualified applicants without regard to race, color, nationality, gender, gender identity or expression, sexual orientation, religion, disability or age.

See more jobs at Unqork

Apply for this job

+30d

Python Developer

Glints · Remote
gRPC · Full Time · nosql · RabbitMQ · Design · mongodb · api · git · docker · kubernetes · python

Glints is hiring a Remote Python Developer

See more jobs at Glints

Apply for this job

+30d

Senior Data Engineer (Portfolio Companies)

IFS · Colombo, Sri Lanka, Remote
S3 · EC2 · golang · 6 years of experience · agile · nosql · airflow · sql · Design · mongodb · docker · elasticsearch · jenkins · AWS

IFS is hiring a Remote Senior Data Engineer (Portfolio Companies)

Job Description

  • Design, develop, and maintain a generic ingestion framework capable of processing various types of data (structured, semi-structured, unstructured) from customer sources.
  • Implement and optimize ETL (Extract, Transform, Load) pipelines to ensure data integrity, quality, and reliability as it flows into the centralized datastore like Elasticsearch.
  • Ensure the ingestion framework is scalable, secure, efficient and capable of handling large volumes of data in real-time or batch processes.
  • Continuously monitor and enhance the data ingestion process to improve performance, reduce latency, and handle new data sources and formats.
  • Develop automated testing and monitoring tools to ensure the framework operates smoothly and can quickly adapt to changes in data sources or requirements.
  • Provide documentation, support, and training to other team members and stakeholders on using the ingestion framework.
  • Implement large-scale near real-time streaming data processing pipelines.
  • Design, support and continuously enhance the project code base, continuous integration pipeline, etc.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into key business performance metrics.
  • Perform POCs and evaluate different technologies and continue to improve the overall architecture.

Qualifications

  • Experience building and optimizing big data pipelines, architectures, and data sets.
  • Strong proficiency in Elasticsearch, its architecture, and optimal querying of data.
  • Strong analytic skills related to working with unstructured datasets.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ systems.
  • One or more years of experience contributing to the architecture and design (architecture, design patterns, reliability, and scaling) of new and current systems.
  • Candidates must have 4 to 6 years of experience in a Data Engineer role with a Bachelor's or Master's (preferred) in Computer Science, Information Systems, or an equivalent field. Candidates should have knowledge of the following technologies/tools:
    • Experience working on big data processing systems like Hadoop, Spark, Spark Streaming, or Flink Streaming.
    • Experience with SQL systems like Snowflake or Redshift
    • Direct, hands-on experience in two or more of these integration technologies: Java/Python, React, Golang, SQL, NoSQL (Mongo), RESTful APIs.
    • Versed in Agile, APIs, microservices, containerization, etc.
    • Experience with CI/CD pipelines running on GitHub, Jenkins, Docker, EKS.
    • Knowledge of at least one distributed datastore such as MongoDB, DynamoDB, or HBase.
    • Experience using batch scheduling frameworks like Airflow (preferred), Luigi, Azkaban, etc. is a plus.
    • Experience with AWS cloud services: EC2, S3, DynamoDB, Elasticsearch

See more jobs at IFS

Apply for this job

+30d

Senior Manager, Data Engineering

ATPCO1 · Herndon, VA, Remote
ML · S3 · EC2 · Lambda · scala · nosql · sql · Design · azure · api · docker · kubernetes · python · AWS

ATPCO1 is hiring a Remote Senior Manager, Data Engineering

Job Description

Position Summary 

As the Senior Manager of Data Engineering, you will be accountable for delivering impactful data solutions that align with company objectives. This includes strong collaboration with product and analytics teams, with a focus on data-driven impact and scalability. In this role, you will lead the data engineering team, ensuring seamless execution of data roadmaps and fostering engineering excellence. As a leader of leaders, you will mentor managers, team leads, and architects, empowering them to build high-performing teams dedicated to creating valuable data products.

What will you do: 

  • Work with Product and Analytics teams to define and implement a strategic data roadmap that includes advanced analytics and AI/ML capabilities. Champion data-mesh principles to enable decentralized ownership and accessibility across teams. 

  • Design and oversee scalable, robust data architectures and ETL pipelines, leveraging AWS services like Redshift, Glue, and Lambda, ensuring support for machine learning, advanced analytics, and governance. 

  • Partner closely with data scientists and analytics teams to create data infrastructures that support real-time analytics and ML model deployment, delivering actionable insights that drive business decisions. 

  • Drive initiatives to enhance data security, stability, and governance, establishing frameworks that uphold quality, consistency, and regulatory compliance across ATPCO’s data ecosystem. 

  • Mentor and lead multiple teams of data engineers and analytics professionals, fostering technical growth and a shared commitment to data quality. Encourage innovative approaches to data automation and performance tuning. 

  • Continuously refine data engineering practices for efficiency and resilience, optimizing workflows and monitoring costs within the AWS environment, with a focus on scalable, automated data solutions. 

  • Foster a culture that values innovation and experimentation, empowering teams to explore new techniques in data, analytics, and AI, including generative AI models. 

  • Ensure the operational health and performance of data products in cloud environments, maintaining high availability, fault tolerance, and performance optimization across all services. 

  • Establish a data lake, data mesh, and advanced data modeling-first approach, creating well-structured data practices with an API-driven methodology that improves usability and governance. 

  • Own and lead internal data analytics platforms, collaborating with finance and operations to identify cost optimization opportunities, manage data product P&L, and establish benchmarks for data-driven outcomes.

What will make you a great fit: 

  • 7+ years of experience in data engineering roles focused on high-volume, complex data applications, with 5+ years in people leadership. 

  • Proven ability to build and manage high-quality data products and infrastructure, with experience leading cross-functional teams in complex data environments. 

  • Technical expertise in a wide range of AWS services, including Redshift, Glue, S3, Lambda, EMR, Kinesis, EC2, API Gateway, serverless technologies, and container services such as ECS and EKS, integrated with data orchestration tools. Proficiency in Python, SQL, or Scala, plus experience with ML frameworks. 

  • Experience with database management (RDBMS, NoSQL, graph databases) and big data frameworks, including Apache Spark and Hadoop, showcasing your ability to design and optimize complex data architectures and process large-scale data efficiently. 

  • Strong communication skills to align cross-functional teams and convey technical concepts to technical and non-technical stakeholders. 

  • Familiarity with airline or travel industry data needs is advantageous. 

  • Track record of fostering inclusive environments that encourage innovation, teamwork, and diverse perspectives.

Other Preferred Qualifications: 

  • Proficiency in cloud-native data services (AWS, Azure, GCP) and container tools (Docker, Kubernetes) for scalable data infrastructure. 

  • Familiarity with cloud security best practices, ensuring the secure deployment and maintenance of data solutions. 

  • Commitment to advancing data engineering and analytics practices, ensuring cutting-edge solutions. 

Salary Range: USD $163,900 to $193,000

*The disclosed range estimate has not been adjusted for applicable geographic differential associated with the location*

See more jobs at ATPCO1

Apply for this job

+30d

Senior Software Engineer

phData · US - Remote
kotlin · nosql · Design · UI/UX design · vue · java · c++ · angular · AWS · frontend

phData is hiring a Remote Senior Software Engineer

Here at phData, our Product Engineering team develops a suite of tools, known as the Toolkit (https://toolkit.phdata.io), to automate and simplify essential data engineering and AI/ML engineering tasks. We focus on producing high-quality, reliable, and effective tools. Our tools are used both internally by our own engineers, and externally by our clients.

We are looking for a full-stack engineer with a focus on front-end development. In this position you will own the front-end experience and development…

See more jobs at phData

Apply for this job

+30d

Staff Software Engineer

Expert Institute · Milwaukee, WI, Remote
DevOps · Django · NextJS · terraform · nosql · postgres · sql · tailwind · Design · azure · api · git · java · c++ · css · angular · python · AWS

Expert Institute is hiring a Remote Staff Software Engineer

Job Description

The Role 

As a Staff Software Engineer on the Expert iQ development team, you’ll play a pivotal role in building and optimizing web applications that enhance the efficiency of connecting legal teams with expert witnesses. You will collaborate with cross-functional teams, designing and implementing innovative solutions that align with business objectives while ensuring a seamless user experience. You’ll be expected to contribute to both new and existing projects, develop and maintain secure APIs, and drive the adoption of infrastructure-as-code using tools like Terraform or CloudFormation. Your ability to navigate cloud platforms, optimize performance, and provide technical leadership will be crucial in tackling complex challenges. You will also play a key role in fostering a culture of continuous learning and innovation by embracing new technologies and frameworks.

Why You’ll Love This Role:

  • High-impact projects: Your work will directly influence our platform, helping legal professionals solve complex cases with expert data.

  • Innovation at its core: We foster a culture of continuous learning and creativity, encouraging you to work with the latest technologies like AI, cloud infrastructure, and more.

  • Collaborative environment: Be part of a cross-functional team where your voice matters and your contributions are highly valued.

  • Career growth: Grow your technical and leadership skills as you work on complex challenges in a fast-growing company.

Key Responsibilities

  • Collaborate with the engineering team to enhance web applications, system designs, and best practices.

  • Work with cross-functional teams to deliver innovative solutions aligned with business goals.

  • Participate in both existing projects and new developments, always focusing on the end user's needs.

  • Develop and maintain APIs for web applications, ensuring robust security and authentication.

  • Solve complex technical challenges, balancing requirements, design considerations, and trade-offs. You should feel comfortable with in-place re-architecture.

  • Embrace new technologies and frameworks, fostering a culture of innovation.

  • Implement infrastructure-as-code using tools like Terraform or CloudFormation.

  • Engage with cloud technologies and platforms to optimize performance and efficiency.

  • Deliver high-quality solutions, ensuring unit, integration, and end-to-end testing.

 

Qualifications

Qualifications 

  • 6+ years of software engineering experience.

  • Bachelor of Science degree (or equivalent) in computer science, engineering, or relevant field.

  • Demonstrates understanding and usage of software development principles [SOLID, DRY, SOC] and architectural principles.

  • Proficiency with fundamental front-end languages JavaScript/TypeScript [Angular, React], HTML, SCSS, CSS [Tailwind, Bootstrap], NX

  • Proficiency with server-side languages and frameworks such as NodeJS/Express [NestJS, NextJS] Python [Flask, Django], C# / Java [Spring Boot]

  • Hands-on experience with SQL [Postgres], NoSQL databases, and domain modeling.

  • Hands-on experience developing RESTful APIs and an interest in API design, microservices, and event-driven architectures.

  • Proficiency with Git and Bitbucket/Github workflows.

  • Familiarity with cloud platforms like AWS, Azure, or GCP.

  • Knowledge of infrastructure-as-code tools (Terraform, CloudFormation) and DevOps practices.

  • Familiarity with authentication standards like OAuth, SAML, or OIDC is a plus.

  • Bonus: Experience integrating with AI services like Gemini and ChatGPT.

  • Ability to explain complex business and technical concepts to all audiences.

Why Join Expert Institute?

This is a unique opportunity to join a talented team that is punching above its weight in a novel and growing niche. At Expert Institute, you will be part of an innovative environment that values creativity and impact. You will play a pivotal role in our journey to become a leading legal technology provider.

See more jobs at Expert Institute

Apply for this job

+30d

Software Engineer

Expert Institute · Milwaukee, WI, Remote
DevOps · terraform · nosql · sql · Design · azure · api · git · AWS

Expert Institute is hiring a Remote Software Engineer

Job Description

The Role 

As a Software Engineer on the Data Central team, you will have the opportunity to work closely with all the members of the content engineering team and the content research teams, to advance the automation of content curation and collection, and build a content delivery platform initially using API solutions to enable delivery of expert data to Expert Institute’s clients. 

Why You’ll Love This Role:

  • High-impact projects: Your work will directly influence our platform, helping legal professionals solve complex cases with expert data.

  • Innovation at its core: We foster a culture of continuous learning and creativity, encouraging you to work with the latest technologies like AI, cloud infrastructure, and more.

  • Collaborative environment: Be part of a cross-functional team where your voice matters and your contributions are highly valued.

  • Career growth: Grow your technical and leadership skills as you work on complex challenges in a fast-growing company.

Key Responsibilities

  • Collaborate with the engineering team to contribute to improving system, database designs, and best practices.

  • Work with cross-functional teams to deliver innovative solutions that align with business objectives.

  • Participate in both existing projects and new developments, always keeping the end user’s needs in mind.

  • Support the development and maintenance of APIs for web applications, focusing on ensuring robust security and authentication.

  • Assist in solving complex technical challenges by understanding requirements, designs, and trade-offs.

  • Learn and work with new technologies and frameworks, helping foster a culture of innovation.

  • Actively contribute to implementing infrastructure-as-code using tools such as Terraform or CloudFormation.

  • Stay engaged with cloud technologies, platforms, and tools, contributing to optimizing performance and efficiency.

  • Develop quality solutions, understanding the importance of unit, integration, and end-to-end testing.

 

Qualifications

Qualifications 

  • 2+ years of software engineering experience.

  • Bachelor of Science degree (or equivalent) in computer science, engineering, or relevant field.

  • Understanding of object-oriented programming concepts and SOLID principles, with experience working in JavaScript/Node.js or Python.

  • Hands-on experience with SQL, NoSQL databases, and domain modeling.

  • Hands-on experience developing RESTful APIs and an interest in API design, microservices, and event-driven architectures.

  • Proficiency with Git and Bitbucket/Github workflows.

  • Familiarity with cloud platforms like AWS, Azure, or GCP.

  • Knowledge of infrastructure-as-code tools (Terraform, CloudFormation) and DevOps practices.

  • Familiarity with authentication standards like OAuth, SAML, or OIDC is a plus.

  • Bonus: Experience integrating with AI services like Gemini and ChatGPT.

  • Ability to explain complex business and technical concepts to all audiences.

Why Join Expert Institute?

This is a unique opportunity to join a talented team that is punching above its weight in a novel and growing niche. At Expert Institute, you will be part of an innovative environment that values creativity and impact. You will play a pivotal role in our journey to become a leading legal technology provider.

See more jobs at Expert Institute

Apply for this job

+30d

Sr Ruby on Rails Developer

Solvative · India - Remote
TDD · nosql · Design · jquery · scrum · ruby · css · javascript · backend

Solvative is hiring a Remote Sr Ruby on Rails Developer

Position: Senior Ruby on Rails Developer

Location: Development Center, Ahmedabad, Gujarat, India

Headquarters: Dallas, USA

Company Overview:

At Solvative, we specialize in developing robust, scalable, and innovative digital solutions that solve forward for our clients. With headquarters in Dallas and a development center in Ahmedabad, our global team works closely to create high-quality software solutions. We’re currently looking for a skilled Senior Ruby on Rails Developer to join our team, bringing expertise in backend development and a commitment to maintaining best practices.

Role Summary:
The Senior Ruby on Rails Developer will be responsible for designing, building, and maintaining efficient, reusable, and reliable Ruby code, supporting our web applications. This role involves working closely with cross-functional teams, implementing robust and scalable features, and ensuring seamless integration of front-end and back-end elements. The ideal candidate has at least five years of Ruby on Rails experience and is passionate about clean, maintainable code.

Timing: 8 am to 5 pm PST

Key Responsibilities:

  • Feature Development: Design, develop, and maintain robust, scalable, and secure application features within Ruby on Rails.
  • Code Quality: Write clean, maintainable, and efficient code, ensuring adherence to best practices and code standards.
  • Lifecycle Involvement: Participate in all phases of the development lifecycle, from planning and design to testing and deployment.
  • Best Practices: Embrace test-driven development (TDD), continuous integration (CI), SCRUM, refactoring, and code standards.
  • Innovation: Stay updated with relevant new technologies and advocate for their integration where appropriate.
  • Collaboration: Work closely with front-end developers to integrate user-facing elements with server-side logic.

Qualifications:

  • Minimum of 5 years’ experience in Ruby on Rails development.
  • Strong proficiency in front-end technologies, including JavaScript, HTML, CSS, JQuery, and ReactJS.
  • Proven experience in developing highly interactive, user-friendly applications.
  • Expertise with both relational and NoSQL databases.
  • In-depth understanding of object-oriented design and programming principles.
  • Strong skills in writing clean, efficient, and maintainable code.
  • Bachelor’s degree in Computer Science, Engineering, or a related field.

Why Join Solvative?

  • Be part of a team that values continuous learning, collaboration, and innovation.
  • Engage in a dynamic work environment with opportunities for career growth.
  • Competitive compensation package and benefits tailored to support your personal and professional development.

See more jobs at Solvative

Apply for this job

+30d

Associate Data Engineer (5476)

MetroStar Systems · Hybrid - local to the D.C. metro area
Bachelor's degree · nosql · sql · azure · api · c++ · python · backend · frontend

MetroStar Systems is hiring a Remote Associate Data Engineer (5476)

As an Associate Data Engineer, you’ll support business analytics projects by providing data-driven insights to HQ, analyzing the data gathered to inform decisions, improve operations, and strategize for future initiatives. The successful candidate will be responsible for advising the Enterprise Team and its stakeholders and for developing strategies that drive operational improvements and support future business endeavors.

We know that you can’t have great technology services without amazing people. At MetroStar, we are obsessed with our people and have led a two-decade legacy of building the best and brightest teams. Because we know our future relies on our deep understanding and relentless focus on our people, we live by our mission: A passion for our people. Value for our customers.

If you think you can see yourself delivering our mission and pursuing our goals with us, then check out the job description below!

What you’ll do:

  • Designing, developing, and maintaining scalable data pipelines and building out new API integrations to support increasing data volume and complexity. 
  • Collaborating with analytics and business teams to improve data models that feed business intelligence tools, enhancing data accessibility and fostering data-driven decision-making across the organization. 
  • Implementing processes and systems to monitor data quality, ensuring production data is accurate and available for key stakeholders and business processes. 
  • Performing data analysis to troubleshoot and resolve data-related issues. 
  • Collaborating closely with a team of frontend and backend engineers, product managers, data analysts, and business analysts. 
  • Defining company data assets (data models) and creating Spark (using PySpark) and SQL jobs to populate data models. 
  • Writing unit/integration tests, contributing to the engineering wiki, and documenting work. 

What you’ll need to succeed:

  • Willing and able to be onsite in Reston, VA at least 3 days/week
  • Bachelor’s degree 
  • Ability to obtain and maintain a Secret security clearance
  • 0-2 years of relevant experience; 4-6 years of additional experience in lieu of degree 
  • Proficiency in Data Warehousing, ETL (Extract, Transform, Load) Processes, SQL and NoSQL Databases, Data Modeling, Data Integration, and Data Quality Assurance. 
  • Proficiency in programming languages including Python and C# is required. 
  • Experience with Microsoft Azure Cloud Platform such as Azure Data Factory, Azure Synapse Analytics, Azure Databricks (using PySpark) and Azure SQL Database. 
  • Strong problem-solving, communication, and collaboration skills. 

Like we said, we are big fans of our people. That’s why we offer a generous benefits package, professional growth, and valuable time to recharge. Learn more about our company culture code and benefits. Plus, check out our accolades.

Don’t meet every single requirement? 

Studies have shown that women, people of color and the LGBTQ+ community are less likely to apply to jobs unless they meet every single qualification.  At MetroStar we are dedicated to building a diverse, inclusive, and authentic culture, so, if you’re excited about this role, but your previous experience doesn’t align perfectly with every qualification in the job description, we encourage you to go ahead and apply.  We pride ourselves on making great matches, and you may be the perfect match for this role or another one we have. Best of luck! – The MetroStar People & Culture Team

What we want you to know:

In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification form upon hire.

MetroStar Systems is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status. The statements herein are intended to describe the general nature and level of work being performed by employees and are not to be construed as an exhaustive list of responsibilities, duties, and skills required of personnel so classified. Furthermore, they do not establish a contract for employment and are subject to change at the discretion of MetroStar Systems.

Not ready to apply now?

Sign up to join our newsletter here.

"EEO IS THE LAW MetroStar Systems, LLC (MetroStar) invites any employee and/or applicant to review the Company’s Affirmative Action Plan. This plan is available for inspection upon request by emailing msshr@metrostar.com."

See more jobs at MetroStar Systems

Apply for this job

+30d

Software Engineer - Java

Egnyte · Remote, India
Full Time · golang · Master’s Degree · nosql · sql · Design · qa · java · python

Egnyte is hiring a Remote Software Engineer - Java

Description

EGNYTE YOUR CAREER. SPARK YOUR PASSION.

Egnyte is a place where we spark opportunities for amazing people. We believe that every role has meaning, and every Egnyter should be respected. With 17,000 customers worldwide and growing, you can make an impact by protecting their valuable data. When joining Egnyte, you’re not just landing a new career, you become part of a team of Egnyters: doers, thinkers, and collaborators who embrace and live by our values:

Invested Relationships

Fiscal Prudence

Candid Conversations

 

ABOUT EGNYTE

Egnyte is the secure multi-cloud platform for content security and governance that enables organizations to better protect and collaborate on their most valuable content. Established in 2008, Egnyte has democratized cloud content security for more than 22,000 organizations, helping customers improve data security, maintain compliance, prevent and detect ransomware threats, and boost employee productivity on any app, any cloud, anywhere. For more information, visit www.egnyte.com.

Our Engineering department is responsible for developing large-scale distributed components and services that power Egnyte's Cloud Platform. Our systems handle billions of requests daily, delivering sub-second latency in a fault-tolerant environment. We process and analyse millions of files and events each day. Key areas of responsibility include Egnyte's Cloud File System, Content Classification, Lifecycle Management, User Behaviour Analysis, Object Store, Metadata Stores, Search Systems, Recommendation Systems, Synchronization, and intelligent caching of multi-petabyte datasets.

We seek candidates who are passionate about building large-scale distributed systems and excited by the challenges of scaling across multiple orders of magnitude.

WHAT YOU’LL DO: 

  • Design and develop highly scalable, elastic cloud architectures that integrate seamlessly with on-premises systems.
  • Identify and pursue technical opportunities to enhance the efficiency of Egnyte’s cloud platform.
  • Collaborate with multicultural, geographically distributed teams, and coordinate effectively with cross-functional teams across multiple time zones.
  • Stay ahead of industry trends to drive innovation and contribute to the development of new technologies.
  • Conceptualize, design, and implement changes to ensure key systems remain reliable, fully utilized, and well-supported.
  • Take full ownership of critical software projects, managing all phases from design and implementation to QA, deployment, and ongoing monitoring.

 

YOUR QUALIFICATIONS:

  • Bachelor’s or Master’s degree in Computer Science or a related field.
  • 3+ years of relevant industry experience.
  • Proven track record in designing and developing complex systems.
  • Expertise in designing and building highly scalable and resilient cloud architectures, with experience in GCP or similar cloud platforms.
  • Advanced proficiency in Java; experience with Python and Golang is a plus.
  • Hands-on experience with both SQL and NoSQL databases.
  • Comprehensive experience managing all phases of software development, from design to implementation, QA, and maintenance.
  • Strong data-driven decision-making process

Bonus points:

  • Proven success in designing and developing large-scale, complex systems.
  • Expertise in multi-tenant, highly complex cloud solutions; experience with hybrid or on-premises solutions is a plus.
  • Experience in designing and developing distributed SaaS applications for a large customer base

COMMITMENT TO DIVERSITY, EQUITY, AND INCLUSION:

At Egnyte, we celebrate our differences and thrive on our diversity for our employees, our products, our customers, our investors, and our communities. Egnyters are encouraged to bring their whole selves to work and to appreciate the many differences that collectively make Egnyte a higher-performing company and a great place to be.

 

See more jobs at Egnyte

Apply for this job

+30d

Senior Backend Engineer OTC (m/w/d) - REF2072E

Deutsche Telekom IT Solutions · Budapest, Debrecen, Szeged, Pécs, Hungary, Remote
agile · nosql · openstack · linux · backend

Deutsche Telekom IT Solutions is hiring a Remote Senior Backend Engineer OTC (m/w/d) - REF2072E

Job Description

The Public Cloud Portfolio Unit operates on a national and international level, for medium-sized and large companies. We develop, market and operate agile, cloud-native, forward-looking products and services for the digital world. We see ourselves as innovation drivers and make our customers' business fit for the digital future. Our mission: Together with our customer, shaping the safest, easiest and most efficient transformation to a digitized and cloud-native future.

 

Your Department

We run Open Telekom Cloud! Open Telekom Cloud is a public cloud standard product based on open source community software and driven by principles of DevSecOps. Lean structures, agile methods, highly motivated teams and an extremely dynamic business environment determine our actions. With this customer-oriented and agile orientation, we are the anchor point for the Public Cloud business in Deutsche Telekom Group.

We are measured by delivering a secure, stable and innovative platform. We work jointly with our platform partner and other partners out of the OpenStack ecosystem to create a highly innovative public cloud product based on European security and data protection standards.

We are looking for people who are professionals and evangelists with a great deal of enthusiasm for cloud technology and who are up to the challenges created by the development and operation of a hyper-scale public cloud.

We offer a unique insight into how a large public cloud works under the hood, intercultural teamwork, flat hierarchies, and an independent working-style.

Your Tasks

As "Backend Engineer OTC" you understand the latest developments in cloud and container technology. You will enhance our Open Telekom Cloud Database Services in a customer-oriented manner.

Do you like to:
  • Solve complex problems in the daily operation of a hyper-scaler's cloud backend.
  • Work hardware-oriented at the console and use both command line and web console.
  • Develop and operate monitoring and quality assurance tools for Database Services
  • Consistently automate with common automation frameworks.
  • Work in a team of specialists where everyone helps each other in an open and trusting manner.

Qualifications

Your Profile

  • Completed studies in a technical, engineering or scientific subject or comparable professional training.
  • 5-7 years of professional experience in IT with a focus on modern cloud technologies.
  • Very good knowledge of database solutions in the cloud (both relational and NoSQL)
  • Good knowledge of infrastructure, network, hardware, storage, IaaS, PaaS, SaaS.
  • Extensive knowledge in infrastructure automation.
  • Strong experience in Linux and network related services.
  • High level of customer focus.
  • Driving new feature deployment and problem solving to enable higher customer satisfaction
  • Knowledge of agile development processes.
  • Ability to assess technical solutions and come up with creative approaches.
  • OpenStack and programming experience in open source projects is a plus.
  • Fluency in written and spoken English.

You will be working in the European Union to meet our customers' data security and privacy requirements.

See more jobs at Deutsche Telekom IT Solutions

Apply for this job

+30d

Data Analyst Intern (M/F)

ASI · Nantes, France, Remote
agile · scala · nosql · airflow · mongodb · azure · scrum · java · python

ASI is hiring a Remote Data Analyst Intern (M/F)

Job Description

In the interest of accessibility and clarity, the masculine terms used refer to both the feminine and masculine genders.

To meet our clients' challenges, and to continue developing our Data expertise, we are looking for a Data Analyst intern.

As part of the Nantes Data team, you will join a project under the supervision of an expert. On a day-to-day basis:

  • You prepare and integrate the data needed to build dataviz reports
  • You clean and format the data to make it available to your reports
  • You model and present your data using the Power Platform suite: Power Automate, Power BI, etc.
  • You develop, test, and deliver a set of Power BI reports to support decision-making
  • You highlight, share, and explain the results obtained
  • You get to grips with the Agile Scrum and W-cycle methodologies
  • You build skills in one or more of the following technology environments:
    • The Data ecosystem: Spark, Hive, Kafka, Hadoop, Microsoft, etc.
    • Languages: Scala, Java, Python, DAX, Power Query, etc.
    • NoSQL databases: MongoDB, Cassandra, CosmosDB, etc.
    • Cloud storage: Azure and its associated building blocks, including Power BI
    • Market ETL/orchestration tools: Airflow, Data Factory, Talend, etc.

By joining ASI,

  • You will work in a company soon to be recognized as a Société à mission, Team GreenCaring rather than GreenWashing, with a CSR approach that has been embodied and actively driven for more than 10 years (dedicated CSR team, sustainable mobility allowance agreement, etc.)
  • You will join ASI's various expert communities to share best practices and take part in continuous improvement initiatives.

Qualifications

You are completing a higher-education degree (Master's level, engineering school or university) in computer science, mathematics, or a Big Data specialization, and are looking for a final-year internship of 4 to 6 months.

  • Respect and commitment are an integral part of your values.
  • Passionate about data, you are rigorous, and your interpersonal skills allow you to integrate easily into the team.

The internship is intended to lead to a concrete offer of a permanent position (CDI).

Looking to join a company that reflects who you are, you will recognize yourself in our values of trust, listening, enjoyment, and commitment.

With equal qualifications, this position is open to people with disabilities.

See more jobs at ASI

Apply for this job

+30d

Data Engineer (M/F)

ASI · Nantes, France, Remote
S3 · agile · nosql · airflow · sql · azure · api · java · c++

ASI is hiring a Remote Data Engineer (M/F)

Job Description

In the interest of accessibility and clarity, the masculine terms used refer to both the feminine and masculine genders.

Simon, head of the Nantes Data team, is looking for a Data Engineer to set up, integrate, develop, and optimize data pipeline solutions in cloud and on-premises environments for our client projects.

Within a dedicated team, and mostly in an agile context:

  • You help write technical and functional specifications
  • You master structured and unstructured data formats and know how to manipulate them
  • You connect an ETL/ELT solution to a data source
  • You design and build a data transformation and enrichment pipeline, and schedule its execution
  • You take charge of data mediation developments
  • You ensure that data pipelines are secured
  • You design and build APIs that use the enriched data
  • You design and implement BI solutions
  • You help write the functional and technical specifications for data flows
  • You define test and integration plans
  • You handle evolutionary and corrective maintenance
  • You address data quality issues

Depending on your skills and interests, you work with one or more of the following technologies:

  • The data ecosystem, notably Microsoft Azure
  • Languages: SQL, Java
  • SQL and NoSQL databases
  • Cloud storage: S3, Azure Blob Storage, etc.
  • ETL/ESB and other tools: Talend, Spark, Kafka, NiFi, Matillion, Airflow, Data Factory, Glue, etc.

By joining ASI,

  • You will work in a company with flexible internal working arrangements, backed by an attentive HR policy (3-days-a-week remote-work agreement, career-break leave agreement, etc.)
  • You can take part in (or host, if you feel like it) our many rituals and our internal events (midi geek, dej'tech) and external events (DevFest, Camping des Speakers, etc.)
  • You will work in a company soon to be recognized as a Société à mission, Team GreenCaring rather than GreenWashing, with a CSR approach that has been embodied and actively driven for more than 10 years (dedicated CSR team, sustainable mobility allowance agreement, etc.)

Qualifications

You have a higher-education background in computer science, mathematics, or a Big Data specialization, with at least 3 years of experience in data engineering and successful hands-on experience building structured and unstructured data pipelines.

  • Committed to the quality of what you deliver, you are rigorous and organized in carrying out your work.
  • With a solid technology culture, you regularly keep up with the field to refresh your knowledge.
  • A good level of English, both written and spoken, is recommended.

Looking to join a company that reflects who you are, you will recognize yourself in our values of trust, listening, enjoyment, and commitment.

The salary offered for this position is between €36,000 and €40,000, depending on experience and skills, while maintaining pay equity within the team.

With equal qualifications, this position is open to people with disabilities.

See more jobs at ASI

Apply for this job

+30d

Data Engineer Intern (M/F)

ASI · Nantes, France, Remote
S3 · agile · scala · nosql · airflow · mongodb · azure · scrum · java · python

ASI is hiring a Remote Data Engineer Intern (M/F)

Job Description

In the interest of accessibility and clarity, the masculine terms used refer to both the feminine and masculine genders.

To meet our clients' challenges, and to continue developing our Data expertise, we are looking for a Data Engineer intern.

As part of the Nantes Data team, you will join a project under the supervision of an expert. On a day-to-day basis:

  • You have a dedicated mentor to follow your progress
  • You contribute to the development of an end-to-end information processing chain
  • You work on descriptive/inferential or predictive analysis
  • You contribute to the technical specifications
  • You get to grips with the Agile Scrum and W-cycle methodologies
  • You build skills in one or more of the following technology environments:
    • The Data ecosystem: Spark, Hive, Kafka, Hadoop, etc.
    • Languages: Scala, Java, Python, etc.
    • NoSQL databases: MongoDB, Cassandra, etc.
    • Cloud storage: S3, Azure, etc.
    • Market ETL/orchestration tools: Airflow, Data Factory, Talend, etc.

By joining ASI,

  • You will work in a company soon to be recognized as a Société à mission, Team GreenCaring rather than GreenWashing, with a CSR approach that has been embodied and actively driven for more than 10 years (dedicated CSR team, sustainable mobility allowance agreement, etc.)
  • You will join ASI's various expert communities to share best practices and take part in continuous improvement initiatives.

Qualifications

You are completing a higher-education degree (Master's level, engineering school or university) in computer science, mathematics, or a Big Data specialization, and are looking for a final-year internship of 4 to 6 months.

  • Respect and commitment are an integral part of your values.
  • Passionate about data, you are rigorous, and your interpersonal skills allow you to integrate easily into the team.

The internship is intended to lead to a concrete offer of a permanent position (CDI).

Looking to join a company that reflects who you are, you will recognize yourself in our values of trust, listening, enjoyment, and commitment.

With equal qualifications, this position is open to people with disabilities.

See more jobs at ASI

Apply for this job