Senior Data Engineer

Data

Warszawa

Experis Manpower Group

Full-time
B2B
Senior
Remote
43-46 USD net per hour (B2B)

Job description

Location: 100% remote work (Poland)  
Start Date: ASAP / within 1 month

 

About the Role


This position supports a major transformation initiative: migrating a large-scale SQL Server environment to Databricks / Delta Lake within an enterprise Exposure platform. The work centres on robust data engineering and software development across a long-lived, mission-critical codebase.

 

Responsibilities

 

  • Transform complex SQL stored procedures and business logic into clean, scalable Python / PySpark code

  • Redesign and implement SQL logic within Databricks following software engineering best practices

  • Develop production-grade transformation components, packages, and reusable modules

  • Design and evolve data models across Bronze, Silver, and Gold layers within a Medallion Architecture

  • Work with extremely large data volumes and highly parallel, event-driven processing

  • Collaborate in code reviews and technical design discussions

  • Ensure maintainability, modularity, and long-term code quality
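
For candidates less familiar with the Medallion Architecture mentioned above, here is a minimal sketch of the Bronze → Silver → Gold flow. It is illustrative only: plain Python rather than PySpark / Delta Lake, with hypothetical table and column names, not code from the actual platform.

```python
# Illustrative sketch of the Medallion pattern (Bronze -> Silver -> Gold).
# In the role itself this logic would be expressed as PySpark DataFrame
# transformations over Delta Lake tables; names here are hypothetical.
from collections import defaultdict


def to_silver(bronze_rows):
    """Clean raw Bronze records: drop incomplete rows, normalise types."""
    silver = []
    for row in bronze_rows:
        if row.get("policy_id") is None or row.get("exposure") is None:
            continue  # discard records that fail basic quality checks
        silver.append({
            "policy_id": str(row["policy_id"]).strip(),
            "region": (row.get("region") or "UNKNOWN").upper(),
            "exposure": float(row["exposure"]),
        })
    return silver


def to_gold(silver_rows):
    """Aggregate curated Silver data into an analytics-ready Gold view."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["region"]] += row["exposure"]
    return dict(totals)


bronze = [
    {"policy_id": 1, "region": "emea", "exposure": "100.5"},
    {"policy_id": 2, "region": None, "exposure": "50"},
    {"policy_id": None, "region": "apac", "exposure": "10"},  # dropped in Silver
]
gold = to_gold(to_silver(bronze))  # {"EMEA": 100.5, "UNKNOWN": 50.0}
```

The same shape carries over to PySpark: Bronze ingestion stays raw, Silver applies validation and typing, and Gold holds aggregated, consumer-facing models.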

Nice-to-Have Responsibilities

 

  • Support orchestration workflows in tools such as Azure Data Factory

  • Contribute to CI/CD pipelines and DevOps processes

  • Provide light support around analytics or reporting tools (non-core)

 

Requirements

 

  • Very strong Python and PySpark programming skills

  • Hands-on experience building on Databricks and Delta Lake

  • Experience working in large, shared enterprise codebases (beyond notebook-level development)

  • Strong SQL capabilities with the ability to interpret complex logic

  • Solid grounding in object-oriented programming and clean code principles

  • Experience with enterprise software development workflows, modularization, and refactoring

  • Strong data modeling background for both transactional and analytical systems

  • Familiarity with layered data architectures (Bronze / Silver / Gold) and model redesign during platform migrations

  • Ability to explain and reason through code line by line

 

Nice-to-Have Skills

 

  • Azure Data Factory

  • Azure DevOps, Git, CI/CD

  • Power BI or similar analytics tools

  • Infrastructure or DevOps knowledge

 

What This Role Is Not

 

  • Not a Data Analyst position

  • Not focused on Power BI or reporting

  • Not a notebook-only or exploratory data science role

 

Interview Expectations

 

  • Ability to walk through real production-grade Python / PySpark code

  • Clearly explain how the code works, why it was implemented that way, and how it could be improved

  • Demonstrate strong software engineering thinking

  • Show confidence working in large, long-term data engineering environments

 

Offer

 

  • B2B contract via Experis

  • Multisport card

  • Private healthcare (Medicover)

  • Access to an e-learning platform

  • Group life insurance 

Tech stack

  • English: C1
  • PySpark: regular
  • Databricks: regular
  • Delta Lake: regular
  • SQL: regular
  • Python: regular
  • CI/CD: nice to have
  • Power BI: nice to have
  • Azure Data Factory: nice to have
  • Azure DevOps: nice to have
  • German: nice to have
