Senior Data Engineer with Databricks

Data | emagine Polska | Poland (Remote)
Full-time | B2B | Senior | Remote
51 USD net per hour (B2B)
Job description

We are looking for an experienced Senior Data Engineer / Data Architect with a strong background in designing and implementing modern data platforms using Databricks Lakehouse, Data Fabric, and large-scale distributed data systems. The ideal candidate has extensive hands-on experience with enterprise-grade data environments, advanced data modelling, scalable ETL/ELT pipelines, and cloud-native architectures.

Main Responsibilities

  • Design and develop end-to-end data architectures based on Databricks Lakehouse and Data Fabric principles.

  • Build scalable batch and streaming data pipelines using Spark (Structured Streaming, PySpark, SQL, Scala).

  • Implement medallion architecture (Bronze/Silver/Gold) and optimize compute workloads using Delta Lake, Z-Ordering, cluster tuning, and performance best practices (an illustrative sketch follows this list).

  • Define and implement data governance, lineage, and access control using Unity Catalog, RBAC, and enterprise security standards.

  • Integrate diverse data sources using Airbyte, Kafka, REST APIs, and CDC frameworks (e.g., Debezium).

  • Collaborate with stakeholders to translate business requirements into high-quality data products.

  • Drive adoption of data engineering best practices, coding standards, CI/CD automation, and data quality frameworks (e.g., Great Expectations).

  • Mentor team members and contribute to architectural decisions, roadmaps, and long-term platform strategy.
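
To make the pipeline and workload optimization responsibilities above more concrete, here is a minimal, illustrative PySpark sketch of a Bronze-to-Silver Structured Streaming step on Databricks. It is not taken from the role's actual codebase; the catalog, table, and column names and the checkpoint path are hypothetical.

# Illustrative only: Bronze -> Silver Structured Streaming step on Databricks.
# Catalog/table/column names and the checkpoint path are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided implicitly on Databricks

# Read the raw Bronze Delta table as a stream (Unity Catalog three-part name).
bronze = spark.readStream.table("lakehouse.bronze.events")

# Basic cleansing plus a simple quality gate; a framework such as
# Great Expectations could replace the inline filter in a real pipeline.
silver = (
    bronze
    .filter(F.col("event_id").isNotNull())
    .withColumn("event_ts", F.to_timestamp("event_ts_raw"))
    .withWatermark("event_ts", "1 hour")
    .dropDuplicates(["event_id", "event_ts"])
)

# Append into the Silver Delta table as an incremental (available-now) run.
(
    silver.writeStream
    .format("delta")
    .option("checkpointLocation", "/Volumes/lakehouse/silver/_checkpoints/events")
    .outputMode("append")
    .trigger(availableNow=True)
    .toTable("lakehouse.silver.events")
)

Downstream Gold tables would typically also be kept performant with Delta maintenance commands (e.g. OPTIMIZE ... ZORDER BY) scheduled as Databricks workflows.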

Key Requirements

  • Strong expertise in Databricks (clusters, workflows, DLT, SQL, notebooks, Unity Catalog).

  • Hands-on experience with Spark and distributed processing at scale.

  • Deep understanding of modern data architectures: Lakehouse, Data Fabric, Data Mesh, event-driven workflows.

  • Proficiency in building ETL/ELT pipelines using Python, SQL, and/or Scala.

  • Knowledge of data modelling, metadata management, data cataloging, and domain-oriented design.

  • Experience with cloud platforms (Azure, GCP, or AWS) and object storage systems.

  • Familiarity with DevOps practices, Git-based workflows, CI/CD, Infrastructure as Code (Terraform), and data testing (see the test sketch after this list).

  • Strong analytical and problem-solving skills, with the ability to operate in complex enterprise environments.
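
As a purely illustrative sketch of the data-testing point above (hypothetical function and column names, not the team's actual setup), transformation logic can be factored into plain functions and exercised locally with pytest as part of CI:

# Illustrative only: unit-testing transformation logic locally with pytest,
# so the same check can run in a CI pipeline. All names are hypothetical.
import pytest
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F


def cleanse_events(df: DataFrame) -> DataFrame:
    """Drop records without an event_id and normalize the raw timestamp."""
    return (
        df.filter(F.col("event_id").isNotNull())
        .withColumn("event_ts", F.to_timestamp("event_ts_raw"))
        .drop("event_ts_raw")
    )


@pytest.fixture(scope="session")
def spark():
    # A small local session is enough for CI-level data tests.
    return SparkSession.builder.master("local[2]").appName("data-tests").getOrCreate()


def test_cleanse_events_drops_null_ids(spark):
    raw = spark.createDataFrame(
        [("e1", "2025-01-01 10:00:00"), (None, "2025-01-01 11:00:00")],
        ["event_id", "event_ts_raw"],
    )
    result = cleanse_events(raw)
    assert result.count() == 1
    assert set(result.columns) == {"event_id", "event_ts"}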

Nice to Have

  • Experience with streaming platforms: Kafka, Pub/Sub, Event Hubs.

  • Exposure to dbt for analytics engineering.

  • Knowledge of MLOps concepts or integration with ML pipelines.

Tech stack

  • Databricks: master
  • Python: advanced
  • SQL: advanced
  • PySpark: regular
Published: 15.12.2025
