#1 Job Board for tech industry in Europe

    Data Architect

    Location: Warszawa
    Type of work: Full-time
    Experience: Senior
    Employment type: B2B
    Operating mode: Hybrid

    Tech stack

    • Databricks: advanced
    • PySpark: advanced

    Job description

    Online interview
    Friendly offer

    Kevin Edward is looking for a highly skilled Senior Data Engineer with deep expertise in Databricks and Lakehouse architecture. This role is ideal for a data professional with strong experience in PySpark, Delta Lake, Delta Live Tables, and Unity Catalog, who can design, develop, and optimize data workflows in a high-performance environment.


    Key Responsibilities:

    • Establish and implement Databricks Lakehouse architecture for enterprise solutions.
    • Ingest and transform batch and streaming data using Databricks Lakehouse Platform.
    • Design and maintain Delta Live Tables, Delta Sharing, and Unity Catalog for seamless data governance and security.
    • Leverage Databricks Workflows to create and manage end-to-end data pipelines and ensure efficient data processing.
    • Develop and optimize Delta Lake tables for monitoring data quality, reliability, and performance metrics.
    • Orchestrate diverse workloads using PySpark, Delta Live Tables, and Databricks tools.
    • Implement security best practices for data access, governance, and compliance.
    • Size and optimize Databricks clusters for performance and cost efficiency.
    • Collaborate with cross-functional teams in an Agile environment to understand business requirements and improve data workflows.
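
    To give candidates a feel for the kind of work the bullets above describe, here is a minimal, illustrative Delta Live Tables pipeline sketch. It is not part of the offer: the source path and table names are invented, and the `dlt` module is only available when the code runs inside a Databricks DLT pipeline.

    ```python
    # Illustrative bronze/silver Delta Live Tables pipeline.
    # Runs only inside a Databricks DLT pipeline, where `spark` is predefined.
    # The landing path and table names below are hypothetical.
    import dlt
    from pyspark.sql import functions as F

    @dlt.table(comment="Raw events ingested incrementally with Auto Loader.")
    def bronze_events():
        # Auto Loader ("cloudFiles") picks up new files as they arrive.
        return (
            spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "json")
            .load("/mnt/raw/events/")  # hypothetical landing path
        )

    @dlt.table(comment="Cleaned events with a basic quality expectation.")
    @dlt.expect_or_drop("valid_id", "event_id IS NOT NULL")
    def silver_events():
        # Rows failing the expectation are dropped; metrics are recorded
        # automatically, which is how DLT supports data-quality monitoring.
        return (
            dlt.read_stream("bronze_events")
            .withColumn("ingested_at", F.current_timestamp())
        )
    ```

    The same bronze/silver layering extends naturally to gold aggregate tables, with Unity Catalog governing access at each layer.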

    Required Skills & Experience:

    • 10+ years of experience in data engineering and working with large datasets.
    • Must-have skills:
      • Databricks (Lakehouse, Workflows, Unity Catalog, Delta Sharing)
      • Delta Lake, PySpark, Lakeflow, Delta Live Tables
      • Data modelling & cluster sizing
      • Security implementation in Databricks environments
    • Good-to-have skills:
      • Experience with Azure Data Factory (ADF) or Fivetran for data ingestion.
      • Exposure to Power BI for data visualization.
    • Strong ability to analyse, optimize, and troubleshoot large-scale data pipelines.
    • Excellent communication and customer-facing skills for working with stakeholders.
    • Databricks Champion Certification is preferred.


    Undisclosed Salary (B2B)

    Kevin Edward Consultancy

    Check similar offers

    Stibo STEP MDM Consultant
    7N
    8.9K - 12.5K USD/month
    Warszawa, Fully remote
    Stibo STEP, MDM, Data