
Data Engineer (Azure Databricks or BigQuery)

Category: Data

5 350 - 6 688 USD/month (net per month, B2B)
Type of work: Full-time
Experience: Senior
Employment Type: B2B
Operating mode: Remote

Tech stack

    Polish: C1
    English: B2
    ELT/ETL: advanced
    SQL: advanced
    Python: advanced
    Azure Databricks: regular
    BigQuery: regular
    IaC: junior
    Terraform: junior

Job description

Online interview

Join Us and Build a Cutting-Edge Data Platform!


As a Data Platform Engineer, you will work for our client, a leader in the finance sector, developing a cloud-based data platform that enhances its data processing and analytics capabilities. This greenfield project focuses on building an ELT pipeline using Azure Databricks or BigQuery on GCP, leveraging native cloud services for data ingestion, transformation, and orchestration. You will play a key role in designing and optimizing the data infrastructure, ensuring scalability, performance, and seamless integration with business processes.
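For a sense of the day-to-day work, here is a minimal sketch of one such ELT step on the Databricks side, written in PySpark. The storage path, column names, and target table are hypothetical placeholders, not details of the client's platform.

```python
from pyspark.sql import SparkSession, functions as F

# Assumption: running as a Databricks job, where a SparkSession is available;
# the ADLS Gen2 path and table names below are illustrative placeholders.
spark = SparkSession.builder.getOrCreate()

# Extract/Load: read raw JSON landed by a cloud-native ingestion service
raw = spark.read.json("abfss://landing@examplestorage.dfs.core.windows.net/transactions/")

# Transform: light typing, deduplication, and audit metadata before publishing
curated = (
    raw.withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("ingested_at", F.current_timestamp())
       .dropDuplicates(["transaction_id"])
)

# Publish to the curated layer as a Delta table for downstream analytics
curated.write.format("delta").mode("append").saveAsTable("curated.transactions")
```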


Your main responsibilities:

  • Design and implement scalable ELT pipelines on Azure Databricks or BigQuery

  • Develop and maintain cloud-native data ingestion, transformation, and orchestration workflows
  • Optimize database structures and data models for performance and scalability
  • Ensure data quality, governance, and compliance with industry standards (see the quality-check sketch after this list)
  • Collaborate with data scientists and analysts to enable advanced analytics
  • Automate deployment and monitoring of data pipelines
  • Troubleshoot and resolve data infrastructure issues
  • Implement security best practices for data processing and storage
  • Work in an Agile environment, contributing to continuous improvement
  • Document processes, architectures, and best practices
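As one illustration of the quality and monitoring work above, here is a minimal sketch of a validation step that could run as a Databricks job task before a table is exposed to analysts. The table name, checks, and thresholds are hypothetical, not taken from the client's pipeline.

```python
from pyspark.sql import SparkSession, functions as F

# Assumption: executed as a separate Databricks job task after the load step;
# the table and the specific checks are illustrative only.
spark = SparkSession.builder.getOrCreate()

df = spark.table("curated.transactions")

checks = {
    "no_null_keys": df.filter(F.col("transaction_id").isNull()).count() == 0,
    "non_empty_load": df.count() > 0,
    "no_negative_amounts": df.filter(F.col("amount") < 0).count() == 0,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # Failing the task surfaces the problem in the orchestrator's monitoring and alerting
    raise ValueError(f"Data quality checks failed: {failed}")
```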



You're ideal for this role if you have:

  • Experience with cloud-based data platforms (Azure Databricks or BigQuery) - see the BigQuery-side sketch after this list
  • Strong knowledge of ELT/ETL processes and data pipeline development
  • Hands-on experience with data ingestion, transformation, and orchestration tools
  • Proficiency in SQL and data modeling techniques
  • Familiarity with programming languages such as Python or Scala
  • Understanding of data governance, security, and compliance requirements
  • Experience with infrastructure as code (Terraform, ARM templates, or similar)
  • Ability to troubleshoot and optimize large-scale data workflows
  • Strong problem-solving skills and the ability to work in a cross-functional team
  • Good English, both written and spoken
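To complement the Databricks example above, here is what an equivalent transform-in-place step might look like on BigQuery, using the official google-cloud-bigquery Python client. The project, dataset, table, and column names are illustrative placeholders.

```python
from google.cloud import bigquery

# Assumption: credentials resolved via Application Default Credentials;
# "example-project" and the raw/curated datasets are placeholders.
client = bigquery.Client(project="example-project")

# ELT on BigQuery: raw data is landed first, then transformed in place with SQL
transform_sql = """
CREATE OR REPLACE TABLE curated.transactions AS
SELECT
  transaction_id,
  CAST(amount AS NUMERIC) AS amount,
  CURRENT_TIMESTAMP() AS loaded_at
FROM raw.transactions
QUALIFY ROW_NUMBER() OVER (PARTITION BY transaction_id ORDER BY event_time DESC) = 1
"""

job = client.query(transform_sql)  # starts the query job asynchronously
job.result()                       # block until the transformation completes
```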


