Senior Data Engineer

Data

rondo Ignacego Daszyńskiego 2b, Warsaw

Eucloid Data Solutions

Full-time
Permanent
Senior
Office
5 029.97 - 5 588.86 USD
Gross per month - Permanent

Job description

Role: Senior Data Engineer

Duration: 12 months, with the possibility of extension

Location: Warsaw, Poland

Visa/Work Permit: Candidates from countries outside Poland must have their own arrangements


About Eucloid


At Eucloid, innovation meets impact. As a leader in AI and Data Science, we create solutions that redefine industries—from Hi-tech and D2C to Healthcare and SaaS. With partnerships with giants like Databricks, Google Cloud, and Adobe, we’re pushing boundaries and building next-gen technology.

Join our talented team of engineers, scientists, and visionaries from top institutes like IITs, IIMs, and NITs. At Eucloid, growth is a promise.


What You’ll Do

  • Design, build, and optimize scalable data pipelines supporting enterprise banking and financial services use cases

  • Develop and maintain Databricks-based data solutions using Apache Spark and Delta Lake

  • Build, monitor, and troubleshoot ETL / ELT workflows, including failure handling and recovery mechanisms

  • Perform performance tuning, capacity planning, and reliability testing for production data pipelines

  • Collaborate with solution architects, analysts, and cross-functional engineering teams to deliver end-to-end data solutions

  • Investigate data quality issues, identify root causes, and implement long-term fixes

  • Create and maintain technical documentation for data pipelines and platform components

  • Ensure all data solutions follow cloud-first, security-aware, and governance-aligned principles

  • Contribute to the migration of legacy data warehouse platforms (on-prem or cloud DWH) to Databricks-based lakehouse architectures, ensuring data consistency, reliability, and minimal business disruption

  • Design and operate data pipelines aligned to non-functional requirements (NFRs), including high availability (HA), disaster recovery (DR), and defined RTO/RPO objectives


What Makes You a Fit

Academic Background:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, Statistics, or a related discipline


Technical Expertise

  • 3–5 years of hands-on experience in data engineering roles

  • Strong proficiency in SQL for analytical queries and data transformations

  • Advanced experience with Python and Apache Spark

  • Hands-on experience with Databricks Lakehouse architecture, including Apache Spark and Delta Lake

  • Experience working with at least one major cloud platform (AWS, Azure, or GCP)

  • Solid understanding of distributed systems and large-scale data processing architectures

  • Familiarity with modern data stack tools such as Airflow, dbt, Terraform, or similar orchestration and transformation tools

  • Familiarity with Databricks Unity Catalog for data governance, access control, and lineage management

  • Awareness of Databricks cost governance and optimization practices in large-scale, multi-workspace environments


Domain & Delivery Experience

  • Experience delivering data platforms in regulated BFSI / banking environments, supporting requirements such as BCBS 239, GDPR, and internal data governance standards

  • Exposure to regulated data platforms, including data quality, access controls and audit requirements

  • Exposure to risk, finance, AML, or regulatory reporting data domains within financial services

  • Understanding of data lineage, auditability, reconciliation, and data quality controls required for banking-grade data platforms

  • Exposure to operating data pipelines with strict SLAs and enterprise reliability expectations


Additional Skills

  • Strong debugging, troubleshooting, and problem-solving skills in complex, production-grade data environments

  • Ability to work independently in complex, multi-team environments

  • Experience contributing to architecture governance, technical standards, and design reviews

  • Ability to mentor junior engineers and provide technical guidance within delivery teams


Nice to Have

  • Exposure to Data Mesh or domain-oriented data platform designs

  • Experience supporting AI/ML or advanced analytics use cases on top of data platforms

  • Prior exposure to UK or European banking environments

  • Experience working on long-running, multi-year data transformation programs


Engagement Details

  • Employment Type: Full-time, fixed-term (12 months)

  • Location: Poland

  • Start Date: Early February 2026

  • Extension: Possible based on performance and program continuity


About Our Leadership

  • Anuj Gupta – Former Amazon leader with over 22 years of experience in building and managing large engineering teams. (B.Tech, IIT Delhi; MBA, ISB Hyderabad)

  • Raghvendra Kushwah – Business consulting expert with 21+ years at Accenture and Cognizant (B.Tech, IIT Delhi; MBA, IIM Lucknow)


Equal Opportunity Statement

Eucloid is an equal-opportunity employer. We celebrate diversity and are committed to creating an inclusive environment.


Submit your resume to saurabh.bhaumik@eucloid.com with the subject line “Application: Senior Data Engineer Poland”.

Tech stack

  • English: B2
  • SQL: advanced
  • ETL: regular
  • Databricks: regular
  • Delta Lake: regular
  • Python: regular
  • Data Engineering: regular
  • Apache Spark: regular
  • data governance: nice to have
  • Amazon AWS: nice to have
  • Apache Airflow: nice to have
Published: 25.12.2025