Data Modeller

Wrocław
332 - 415 USD/day (net per day, B2B)
Type of work: Full-time
Experience: Senior
Employment type: B2B
Operating mode: Remote

Tech stack

    Polish: C2
    ELT: master
    Data Warehouse: master
    Azure Databricks: master
    Python: master
    SQL: advanced

Job description

Online interview

Join Us and Build a Cutting-Edge Data Platform!


As a Data Modeller, you will work for our client in the debt collection sector, helping to build and maintain a robust, cloud-native data platform. The role centres on data modelling: translating conceptual business needs into logical and physical data models, defining data contracts, and implementing scalable ELT pipelines on Azure Databricks.


Your main responsibilities: 

  • Design and maintain logical and physical data models based on DDD (Domain-Driven Design) principles

  • Translate conceptual models and business glossaries into technical data structures for the Data Warehouse

  • Perform data mapping and create data contracts between the Data Platform and source systems

  • Collaborate with source system owners to define data contract requirements

  • Work on data ingestion processes from source systems using various methods: direct database queries (bulk reads/CDC), API communication, and event streaming

  • Implement ELT processes across the Bronze, Silver, and Gold layers in Azure Databricks (a sketch follows this list)

  • Ensure alignment of data models with business and analytical requirements
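
Pipeline code varies by team; as a rough illustration only, a Bronze -> Silver -> Gold step in PySpark on Databricks could look like the sketch below. All table names, columns, and paths (bronze.payments, /mnt/raw/payments/, gold.daily_collections) are hypothetical, not the client's actual model.

    # A minimal medallion-architecture sketch; all names are illustrative only.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # provided as `spark` on Databricks

    # Bronze: land raw source data as-is, stamped with load metadata.
    bronze = (
        spark.read.format("json").load("/mnt/raw/payments/")  # hypothetical path
        .withColumn("_ingested_at", F.current_timestamp())
    )
    bronze.write.format("delta").mode("append").saveAsTable("bronze.payments")

    # Silver: cast to the physical model's types, deduplicate, enforce keys.
    silver = (
        spark.table("bronze.payments")
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
        .dropDuplicates(["payment_id"])
        .filter(F.col("payment_id").isNotNull())
    )
    silver.write.format("delta").mode("overwrite").saveAsTable("silver.payments")

    # Gold: aggregate into analytics-ready Data Warehouse structures.
    gold = (
        spark.table("silver.payments")
        .groupBy(F.to_date("paid_at").alias("payment_date"))
        .agg(F.sum("amount").alias("total_collected"))
    )
    gold.write.format("delta").mode("overwrite").saveAsTable("gold.daily_collections")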


You're ideal for this role if you have:

  • Strong experience in Data Modelling (logical & physical), preferably in DDD-based environments

  • Proven ability to work with Data Governance inputs: glossaries, conceptual models, and high-level/low-level design (HLD/LLD) documentation

  • Experience preparing and maintaining data contracts (see the sketch after this list)

  • Solid knowledge of data ingestion techniques and working with source systems

  • Experience with Azure Databricks (or similar cloud data platforms, e.g. on GCP)

  • Ability to develop and maintain ELT pipelines in cloud-native environments
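
What a data contract looks like differs between organisations; as one possible sketch, it could be expressed as a machine-readable field specification checked at ingestion time. The schema below (payment_id, amount, paid_at) is an invented example, not taken from the posting.

    # A minimal, hypothetical data-contract check between a source system
    # and the platform's landing zone.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class FieldSpec:
        name: str
        dtype: str      # expected physical type in the landing zone
        nullable: bool

    PAYMENTS_CONTRACT = [
        FieldSpec("payment_id", "string", nullable=False),
        FieldSpec("amount", "decimal(18,2)", nullable=False),
        FieldSpec("paid_at", "timestamp", nullable=True),
    ]

    def validate(observed: dict, contract: list) -> list:
        """Return human-readable violations of the contract by an observed schema."""
        issues = []
        for field in contract:
            actual = observed.get(field.name)
            if actual is None:
                issues.append(f"missing field: {field.name}")
            elif actual != field.dtype:
                issues.append(f"{field.name}: expected {field.dtype}, got {actual}")
        return issues

    # Example: the source extract has drifted from the agreed contract.
    print(validate({"payment_id": "string", "amount": "double"}, PAYMENTS_CONTRACT))
    # -> ['amount: expected decimal(18,2), got double', 'missing field: paid_at']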


Nice to have:

  • Experience in writing clear technical documentation (e.g. data contracts, field definitions, extraction rules)

  • Background in mapping source data to target DWH structures

  • Ability to interpret and work with ERDs and relational models

  • Knowledge of master data management practices

  • Familiarity with dbdiagram.io

  • Awareness of Data Quality, Data Lineage, and metadata management concepts

  • Experience using tools like Azure Purview or other metadata management platforms


