
Data Engineer

38 - 43 USD/h net per hour (B2B)
Type of work: Full-time
Experience: Mid
Employment Type: B2B
Operating mode: Hybrid

Tech stack

  • Python – regular

  • Azure Data Factory – regular

  • ADLS – regular

  • SQL – regular

  • ETL/ELT – regular

  • CI/CD – junior

  • Docker – nice to have

  • Kubernetes – nice to have

Job description

Online interview

Project information:

  • Industry: Insurance and IT services

  • Rate: up to 160 zł/h net + VAT

  • Location: Warsaw (office visits once a week for the first 2-3 months, then occasional)

  • Project language: Polish, English


Summary

The Data Engineer will be responsible for designing, building, and maintaining Data Hubs that integrate multiple data sources for efficient analytics and operational purposes, with a focus on real-time data processing.


Main Responsibilities

  • Data Hub Development – Design and implement scalable Data Hubs to support enterprise-wide data needs.

  • Data Pipeline Engineering – Build and optimize ETL/ELT pipelines for efficient data ingestion, transformation, and storage.

  • Logical Data Modeling – Structure Data Hubs to ensure efficient access patterns and support diverse use cases.

  • Real-Time Analytics – Enable real-time data ingestion and model updates.

  • Data Quality & Monitoring – Develop monitoring features to ensure high data reliability.

  • Performance Optimization – Optimize data processing for large-scale datasets.

  • Automation & CI/CD – Implement CI/CD pipelines for automating data workflows.

  • Collaboration – Align data solutions with enterprise needs through teamwork.

  • Monitoring & Maintenance – Continuously improve data infrastructure reliability.

  • Agile Practices – Participate in Scrum/Agile methodologies.

  • Documentation – Create and maintain clear documentation for data models and pipelines.


Key Requirements

  • Strong Python skills (or another relevant language)

  • Experience with Azure Data Factory, ADLS, and Azure SQL

  • Hands-on experience in building ETL/ELT pipelines (see the illustrative sketch after this list)

  • Experience with real-time data processing

  • Understanding of data preparation for AI/ML applications

  • Experience in building data validation and monitoring features

  • Proficiency in SQL for data transformation

  • Familiarity with CI/CD and infrastructure-as-code principles

  • Understanding of data security and compliance best practices

  • Proficient in English (B2 level minimum)
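
For illustration only (not part of the original posting): a minimal Python sketch of the kind of ETL-style transformation and data-quality check referred to above. All column names and helper functions are hypothetical, and the only assumption is that pandas is available.

    # Illustrative sketch only: a tiny transform + validation step of the kind
    # an ETL/ELT pipeline might run before loading data downstream.
    # Column names (policy_id, premium_monthly) are hypothetical.
    import pandas as pd

    def transform_policies(raw: pd.DataFrame) -> pd.DataFrame:
        """Normalize column names and derive a yearly premium column."""
        df = raw.rename(columns=str.lower)
        df["premium_yearly"] = df["premium_monthly"] * 12
        return df

    def validate(df: pd.DataFrame) -> None:
        """Fail fast on basic data-quality rules."""
        if df["policy_id"].isna().any():
            raise ValueError("policy_id contains nulls")
        if (df["premium_yearly"] < 0).any():
            raise ValueError("negative premiums detected")

    if __name__ == "__main__":
        raw = pd.DataFrame({"POLICY_ID": [101, 102], "PREMIUM_MONTHLY": [120.0, 95.5]})
        clean = transform_policies(raw)
        validate(clean)
        print(clean)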


Nice to Have

  • Data Governance knowledge

  • Experience with containerization technologies (Docker, Kubernetes/AKS)

  • Agile collaboration experience

  • Ability to produce high-quality technical documentation

