Senior Data Engineer
✅ Your responsibilities:
Design, build, and deliver customer data ingestion and data warehousing solutions.
Implement scalable, repeatable ETL/ELT pipelines following data engineering best practices.
Build and maintain data transformation workflows using dbt.
Model, optimize, and maintain analytical datasets in Google BigQuery.
Orchestrate, schedule, and monitor data pipelines using Apache Airflow (Google Cloud Composer).
Work closely with stakeholders from data discovery through implementation and data quality validation.
Define and maintain data validation and QA/QC test suites.
Support production environments and ensure pipeline reliability, including on-call rotations.
🧠 Our requirements:
4+ years of experience in Data Engineering (mid/senior level).
Strong proficiency in Python for data pipelines and supporting services.
Hands-on experience with dbt and ELT-based data modeling.
Solid knowledge of Google BigQuery or comparable cloud data warehouses.
Practical experience with Google Cloud Storage for data persistence and exchange.
Experience with workflow orchestration tools such as Apache Airflow.
Good understanding of scalable data architectures, data quality, and reliability concepts.
Experience working in cloud-based environments (preferably GCP).
🌟 Preferred Qualifications:
Experience with Kafka or other event streaming platforms.
Familiarity with Google Dataflow for batch and streaming data processing.
Experience with Docker and containerized workloads.
Knowledge of Terraform and infrastructure as code.
Exposure to Google Kubernetes Engine (GKE).
Experience supporting production systems with on-call tooling (e.g., PagerDuty).
🌟 What we offer:
100% remote work (EU time zones).
Long-term, stable engagement on a mature data platform.
Work with a modern, cloud-native data stack on GCP.
High-impact role focused on data pipelines, quality, and scalability.
Collaborative, experienced engineering team with clear ownership.