Data ETL Engineer

35 - 43 USD net per hour - B2B
Data

Kraków

Crestt

Freelance
B2B
Mid
Hybrid
35 - 43 USD
Net per hour - B2B

Job description

Technical Requirements

Must have:

  • 3+ years of experience in SQL development and query optimization, particularly in BigQuery environments.

  • Experience designing and implementing ETL/ELT pipelines and data transformation processes.

  • Hands-on experience with GCP data services such as BigQuery, Data Fusion, Cloud Composer/Airflow, or similar tools.

  • Practical experience with Data Vault modeling.

  • Programming experience in Python and familiarity with Terraform.

  • Experience with CI/CD pipelines and DevOps tools (e.g., Git, Jenkins, Ansible).

  • Experience working in Agile environments and DataOps practices.

  • Strong analytical and problem-solving skills.

  • Important: The client requires on-site presence in Kraków.

Nice to have:

  • Experience designing data ingestion pipelines for formats such as CSV, JSON, and XML.

  • Experience integrating data from REST or SOAP APIs, SFTP servers, and enterprise data sources.

  • Knowledge of data contract best practices.

  • Experience with Java development or building custom plugins for data integration tools.

  • Experience with continuous testing and delivery for cloud-based data platforms.

  • Strong communication and collaboration skills.

  • Ability to work independently and manage multiple tasks.

  • Proactive mindset with a strong problem-solving approach.

  • Willingness to learn and continuously improve technical skills.

  • Team-oriented attitude and ability to work effectively in cross-functional teams.

Required Technical Skills

SQL ● BigQuery ● ETL & Data Management Tools ● CI/CD ● Python ● Terraform ● Agile

Main Responsibilities

  • Design, build, test, and deploy data models and transformations in BigQuery using SQL and related technologies.

  • Develop and maintain ETL/ELT pipelines to transform raw and unstructured data into structured datasets using Data Vault modeling.

  • Integrate data from multiple sources, including on-premise systems, APIs, and cloud-based platforms.

  • Monitor and troubleshoot data pipelines for performance issues, failures, or data inconsistencies.

  • Optimize ETL/ELT processes for performance, scalability, and cost efficiency.

  • Review and implement business and technical requirements in data transformation processes.

  • Ensure solutions meet non-functional requirements, including security, reliability, scalability, and compliance with IT standards.

  • Manage code repositories and CI/CD pipelines using tools such as Git and Jenkins.

  • Collaborate with DevOps and data teams to enable automated deployment, testing, and monitoring.

  • Provide bug fixes, enhancements, and technical documentation, and support knowledge transfer to operational teams.

Location Requirements

Hybrid from Kraków, 2 days per week in the office

Contract: 8-10 months, or longer

Languages

    Polish: C1
    English: B2

Tech stack

    SQL: advanced
    BigQuery: regular
    ETL: regular
    ELT: regular
    CI/CD: regular
    DevOps tools: regular
    Python: regular
    Agile: regular
    Data Vault modeling: regular
    Terraform: junior

Application deadline: 30.04.2026 (28 days left)