Data Engineer (Azure / Databricks)

    Kraków
    6 113 - 8 239 USD/month (net, B2B)
    Type of work: Full-time
    Experience: Senior
    Employment type: B2B
    Operating mode: Hybrid

    Tech stack

      • Microsoft Azure: advanced
      • Python: advanced
      • SQL: advanced
      • ETL: advanced
      • Azure Databricks: regular

    Job description

    Online interview

    Join us, and transform complex queries into elegant solutions!


    Kraków-based opportunity with a hybrid work model (6 days per month in the office).


    As a Data Engineer, you will work for our client, a global financial institution modernizing its financial IT systems to meet critical regulatory requirements. The project involves preparing core finance applications for seamless operation on new cloud infrastructure. You will support the migration and transformation of large-scale ETL workflows and complex SQL logic from one cloud provider to another, with a strong emphasis on automation, performance optimization, and quality assurance. You will work alongside cross-functional teams to ensure the project's success in a dynamic, fast-paced environment.


    Your main responsibilities:

    • Migrating complex BigQuery SQL transformations to Azure Spark SQL
    • Building and executing ETL workflows using Azure Databricks
    • Creating automation tools for data and code migration between cloud platforms
    • Analyzing existing SQL logic and transforming it for new cloud environments
    • Writing Python scripts to support migration utilities and ETL automation
    • Documenting processes to support production readiness and handover
    • Collaborating with developers, product owners, and technical leads
    • Identifying and resolving performance bottlenecks in SQL workflows
    • Supporting the CI/CD process by integrating SQL and ETL components
    • Participating in Agile ceremonies and contributing to team planning
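To give a flavour of the migration-utility work described above, here is a minimal, hypothetical sketch of a Python helper that rewrites a few BigQuery SQL constructs into their Spark SQL equivalents. The rewrite table and function names are illustrative assumptions, not the client's actual tooling; a production migration would use a proper SQL parser rather than regular expressions.

```python
import re

# Hypothetical rewrite table: a few BigQuery SQL functions and their
# Spark SQL equivalents. Real migrations cover far more constructs.
REWRITES = [
    (re.compile(r"\bSAFE_CAST\b", re.IGNORECASE), "TRY_CAST"),
    (re.compile(r"\bCURRENT_DATETIME\(\)", re.IGNORECASE), "CURRENT_TIMESTAMP()"),
]

def bq_to_spark_sql(query: str) -> str:
    """Apply naive token-level rewrites to a BigQuery SQL string."""
    for pattern, replacement in REWRITES:
        query = pattern.sub(replacement, query)
    return query

print(bq_to_spark_sql("SELECT SAFE_CAST(amount AS INT64) FROM t"))
# → SELECT TRY_CAST(amount AS INT64) FROM t
```

In practice, utilities like this would be paired with automated result-comparison tests so that migrated queries can be validated against the original BigQuery output before cutover.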


    You're ideal for this role if you have:

    • Experience working with at least one cloud provider, preferably Microsoft Azure
    • Strong expertise in SQL, particularly Spark SQL or BigQuery SQL
    • Hands-on experience building and maintaining complex ETL pipelines
    • Proficiency in Python programming
    • Understanding of SQL coding standards and performance optimization techniques
    • Familiarity with CI/CD pipelines and automation tools
    • Strong problem-solving skills and adaptability
    • Ability to manage time effectively under tight deadlines
    • Experience working within Agile development teams
    • Excellent communication and documentation skills


    It is a strong plus if you have:

    • Prior experience with Azure Databricks
    • Background in financial IT systems or regulatory projects
    • Knowledge of data migration tools and strategies
    • Experience with version control tools like Git
    • Familiarity with big data tools and ecosystems
    • Exposure to production support and post-deployment processes


