DevOps Engineer (AWS + Data)
Location: 100% remote
Tasks
Enabling reliable data pipelines
Supporting analytics workloads
Maintaining cloud‑native data environments (Databricks/Snowflake) with strong CI/CD and Infrastructure as Code practices
Requirements
Strong DevOps experience in AWS with Infrastructure as Code using Terraform, supporting scalable data platforms (Databricks preferred, Snowflake acceptable)
Hands-on experience building and managing CI/CD pipelines using Jenkins and/or GitHub Actions for infrastructure and data workloads
Working knowledge of Kubernetes (EKS) and containerization (Docker) for deploying data services
Monitoring and observability using CloudWatch and Grafana for data pipeline reliability and performance
Experience supporting dbt pipelines and orchestrating workflows using Airflow (or similar tools like Prefect/Dagster)
Beginner to intermediate hands-on experience in Python, PySpark, and SQL to collaborate effectively with Data Engineering teams
Experience working closely with data teams to manage and optimise data infrastructure, lakehouse/warehouse environments, and ETL/ELT workflows
Offer
Multisport card
Private healthcare
Access to an e‑learning platform
Group life insurance