Data Engineer / DataOps Engineer

Centrum, Warsaw

emagine Polska

Full-time
Any
Mid
Remote

Job description

🌍Remote work: fully remote.

📑Assignment type: B2B.

📕Project language: English.

Project length: over 12 months, with possible extensions.

Start: ASAP / 1 month.

💻Workload: full time.

⚙️Recruitment process: 2 interviews with the client.

💼 Industry: IT Services / Digital Consulting

🔍Additional information: After receiving the offer, a background check is carried out (references, criminal record check, etc.).

Summary: The Data Engineer / DataOps Engineer develops and manages data pipelines and workflows, strengthening the organization's data platform capabilities. The role ensures that data is processed reliably and efficiently, enabling better decision-making across the company.

Responsibilities:

  • Develop and maintain end-to-end data pipelines using Snowflake as the core data platform.

  • Build ELT workflows using dbt and manage orchestration with Airflow.

  • Implement and support DataOps processes, including CI/CD automation, monitoring, and workload deployment on Kubernetes.

  • Optimize Snowflake performance, including warehouses, storage usage, and query efficiency.

  • Ensure data reliability through data validation, testing, and monitoring practices.

  • Integrate various data sources and manage ingestion processes into Snowflake.

  • Collaborate with cross-functional teams to deliver reliable, production-ready data solutions.

  • Follow engineering best practices, maintain coding standards, and support continuous improvement.

  • Support team knowledge sharing and mentor junior developers when needed.

Key Requirements:

  • 5+ years of professional practice in data engineering.

  • Strong, practical experience with Snowflake (views, tables, performance tuning, orchestrated ELT processes).

  • Solid expertise using dbt for SQL-based transformations.

  • Hands-on experience with Airflow for workflow scheduling and automation.

  • Experience deploying and maintaining containerized workloads on Kubernetes.

  • Familiarity with cloud environments, with strong understanding of Microsoft Azure services.

  • Practical experience building ETL/ELT pipelines and maintaining production data workflows.

  • Good understanding of Git-based development, CI/CD pipelines, and general DevOps principles.

  • Analytical mindset and ability to troubleshoot issues in complex systems.

Nice to Have:

  • Experience with event streaming or messaging systems.

  • Familiarity with data quality tools.

  • Exposure to observability or platform engineering tooling.

  • Understanding of MLOps concepts or ML workflow integration.

Tech stack

  • English: B1
  • Microsoft Platform: advanced
  • Machine Learning (ML): advanced
  • Git: advanced
  • DataStage (ETL): advanced
  • SQL: advanced
  • Testing: advanced
  • Cloud: advanced
  • ETL: advanced
  • Microsoft Azure: advanced
  • CI/CD: advanced

Office location

Centrum, Warsaw
