Senior Data Engineer (GCP)

Data

Gdańsk +4 Locations

DS STREAM

Full-time
B2B
Senior
Remote
41 - 49 USD
Net per hour - B2B

Tech stack

    English: C1
    CI/CD: advanced
    GCP: advanced
    Airflow: regular
    REST API: regular
    SQL: regular
    Python: regular
    Composer: nice to have
    Pub/Sub: nice to have

Job description

We are looking for experienced Data Engineers to join our team at DS STREAM!


Role Overview


We are looking for a data engineer to support ongoing data platform and pipeline initiatives. This role focuses on assisting with the development, testing, and maintenance of data workflows, ensuring data quality, and supporting integration with external APIs. The engineer will gain exposure to complex programmatic advertising data challenges while contributing to the stability and scalability of core data infrastructure.


Key Responsibilities

  • Support the design, development, and maintenance of ETL/ELT workflows in Airflow (Composer); an illustrative sketch follows this list.

  • Write and maintain Python scripts for data ingestion, transformation, and validation.

  • Assist in integrating data from external REST APIs and potentially GraphQL APIs.

  • Contribute to data quality by refining test coverage, adding validation rules, and handling new file types.

  • Help manage CI/CD processes for data pipelines.

  • Perform exploratory data work, prepare queries against external data sources, and assist with schema evolution.

  • Document workflows, processes, and test cases to support ongoing knowledge transfer.
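
To give a concrete sense of the day-to-day work described above, here is a minimal, illustrative sketch of such a pipeline: an Airflow 2.x DAG with a Python ingestion task pulling from a REST endpoint and a simple data-quality gate. The endpoint, DAG id, and task names are placeholders, not part of the actual project.

    # Illustrative sketch only -- a minimal Airflow 2.x DAG; endpoint and names are hypothetical.
    from datetime import datetime

    import requests
    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def ingest_from_api():
        """Pull one batch of records from a hypothetical REST endpoint."""
        resp = requests.get("https://api.example.com/v1/impressions", timeout=30)
        resp.raise_for_status()
        records = resp.json()
        return len(records)  # the return value is pushed to XCom automatically


    def validate_batch(ti):
        """Simple data-quality gate: fail the run if the upstream batch is empty."""
        count = ti.xcom_pull(task_ids="ingest_from_api")
        if not count:
            raise ValueError("Empty batch received from upstream API")


    with DAG(
        dag_id="example_ingest_dag",
        start_date=datetime(2025, 1, 1),
        schedule="@daily",  # Airflow 2.4+; use schedule_interval on older versions
        catchup=False,
    ) as dag:
        ingest = PythonOperator(task_id="ingest_from_api", python_callable=ingest_from_api)
        check = PythonOperator(task_id="validate_batch", python_callable=validate_batch)
        ingest >> check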


Preferred Skills & Experience

  • 5+ years of experience in a data engineering role.

  • Strong knowledge of Python (including testing frameworks).

  • Hands-on experience with Airflow (Composer preferred) for orchestrating data pipelines.

  • Familiarity with Google Cloud Platform (GCP).

  • Experience working with REST APIs.

  • SQL knowledge, especially for querying and modeling data in analytical warehouses.

  • Understanding of CI/CD processes for data workflows.

  • Strong debugging and problem-solving skills, with attention to detail in data quality.


Soft Skills

  • Strong English language communication and documentation habits.

  • Eagerness to learn new technologies and frameworks in a fast-evolving environment.


You will get

  • 100% remote work with the option to use the office in Warsaw

  • Flexible working hours

  • Company events

  • Broad access to webinars, workshops and certificates

  • Reimbursement for passed certifications

  • Opportunity to take advantage of the Microsoft Azure and Google Cloud Platform training packages available only to partners of these providers

  • Possibility to use the in-house incubator to develop your own ventures

  • A supportive community of 50+ consultants with 10+ years’ experience in Data Warehouse and Big Data

  • and much more!

Published: 26.11.2025

About the company

DS STREAM

DS STREAM is a fast-growing AI & Data consulting company focused on analytics and data management for global brands. Our team of 150+ senior experts transforms know-how into the architecture of effective Data Science, Ma...
