
Cloud Data Engineer
Data

Type of work: Full-time
Experience: Senior
Employment Type: B2B
Operating mode: Remote
in4ge sp. z o.o.

We connect IT specialists with projects that make sense. We partner with technologically aware companies and build teams that truly work. We focus on #RightPeople.

Tech stack

  • English – B2

  • Spanish – C1

  • Airflow – advanced

  • DBT – advanced

  • Data – advanced

  • Spark – advanced

  • BigQuery – advanced

  • SQL – advanced

  • Git – regular

  • GCP – regular

  • Programming – junior

  • Data Lake – junior

Job description

Online interview

We are currently seeking a highly skilled Cloud Data Engineer to join the Data & Analytics team. This role is ideal for an experienced professional with deep expertise in Google Cloud Platform (GCP), data engineering, and cloud-based data solutions. The successful candidate will play a key role in designing and implementing scalable data architectures, optimizing data workflows, and driving best practices in cloud data engineering.


Responsibilities

  • Design, develop, and optimize data pipelines and architectures within the GCP ecosystem.

  • Build efficient, high-performing data models and queries in BigQuery, including advanced partitioning, clustering, and performance tuning.

  • Implement and maintain data transformation workflows using dbt and orchestration tools such as Apache Airflow (Composer).

  • Develop data processing solutions leveraging Dataproc (Apache Spark).

  • Apply best practices in data warehousing, data lakes, and data governance.

  • Collaborate with data analysts, architects, and business stakeholders to understand requirements and deliver robust data solutions.

  • Ensure code quality and maintainability through version control tools such as Git.


Requirements

  • 4+ years of experience in data engineering, including at least 2 years with GCP data services.

  • Strong SQL skills and a track record of optimizing large datasets.

  • Expertise in BigQuery, dbt, Spark, and Airflow, enabling the development and automation of efficient, scalable data pipelines and transformations to support data-driven decision making.

  • Proficiency in Python, Java, or Scala for data engineering tasks.

  • Solid understanding of data warehousing and data lake best practices.

  • Experience with Git for version control.

  • Certifications: Professional Cloud Architect or Professional Data Engineer.

  • Language skills: Spanish – fluent, English – good (minimum B2).

Nice to have:

  • Experience with Dataflow (Apache Beam).

  • Knowledge of CI/CD for data workflows.

  • Familiarity with data security and governance in the cloud.


We offer

  • Fully remote work with flexible working hours.

  • Long-term collaboration on a B2B contract.

  • Opportunity to work on complex cloud projects for international clients.

  • Professional growth in a highly skilled and supportive team.

  • Collaborative and open working culture.



💡 Don’t miss out on tailored opportunities!

We have many ongoing recruitments, and new projects are constantly coming in. If you consent to the processing of your data for future recruitment processes, we'll be able to invite you to roles that match your experience and expectations!

PS: We’ll only reach out to you when we have projects that might genuinely interest you — without your consent, we won’t be able to do that.


Our recruitment process is transparent and focused on finding the right candidate for our clients. When you apply, you can count on our objectivity, respect, and full professionalism.

We look forward to receiving your CV.


We connect you with the right people.

Undisclosed salary · B2B contract