
GCP Cloud Data Engineer

Type of work: Full-time
Experience: Mid
Employment type: B2B
Operating mode: Remote
Salary: Undisclosed
in4ge sp. z o.o.

We connect IT specialists with projects that make sense. We work with technologically aware companies and build teams that really deliver. We bet on #RightPeople.

Tech stack

    English: B2

    Italian: C1

    Data: advanced

    GCP: advanced

    SQL: advanced

    BigQuery: advanced

    Cloudera: regular

    Hadoop: regular

    Apache Airflow: regular

    Git: regular

    Programming: junior

    CI/CD: nice to have

Job description

Online interview

We are actively seeking a highly skilled Senior Cloud Data Engineer with proven expertise in Google Cloud Platform (GCP) and mandatory hands-on experience with Cloudera, Hadoop, and Apache Airflow. Fluency in Italian and good command of English are essential for this role.


Responsibilities

  • Design, implement, and optimize scalable data pipelines and architectures on GCP.

  • Develop and tune complex SQL queries and data models in BigQuery, applying advanced techniques such as partitioning and clustering to optimize performance (a short sketch follows this list).

  • Lead data workflow orchestration and automation using Apache Airflow (Composer).

  • Work extensively with Cloudera and Hadoop (Spark/Dataproc) ecosystems to process and analyze large datasets.

  • Apply best practices in data warehousing, data lakes, and cloud data governance.

  • Collaborate with cross-functional teams to deliver high-quality, scalable data solutions aligned with business needs.

  • Maintain code quality and version control standards using Git.


Requirements

  • Minimum 4 years of hands-on experience as a Data Engineer, with at least 2 years dedicated to Google Cloud Platform data services.

  • Proven, hands-on experience with Cloudera, Hadoop, and Apache Airflow; all three are mandatory.

  • Advanced proficiency in SQL, including schema design and query optimization for large datasets.

  • Deep expertise in BigQuery, including performance tuning and advanced SQL.

  • Experience with at least one GCP data processing service: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow); a minimal DAG sketch follows this list.

  • Proficiency in at least one programming language such as Python, Java, or Scala for data pipeline development.

  • Strong understanding of data warehousing and data lake concepts and best practices.

  • Experience with version control tools, preferably Git.

  • Google Cloud certification: Professional Cloud Architect or Professional Data Engineer.

  • Language skills: fluent Italian and a good command of English (minimum B2).

Nice to have

  • Experience with CI/CD pipelines and automated deployment for data engineering workflows.

  • Familiarity with cloud security, data privacy, and compliance standards.


We offer

  • Fully remote work with flexible working hours.

  • Long-term collaboration on a B2B contract.

  • Opportunity to work on complex cloud projects for international clients.

  • Professional growth in a highly skilled and supportive team.

  • Collaborative and open working culture.



💡 Don’t miss out on tailored opportunities!

We have many ongoing recruitments, and new projects come in constantly. By consenting to the processing of your data for future recruitment, we'll be able to invite you to roles that match your experience and expectations!

PS: We'll only reach out when we have projects that might genuinely interest you; without your consent, we won't be able to do that.


Our recruitment process is transparent and focused on finding the right candidate for our clients. When you apply, you can count on our objectivity, respect, and full professionalism.

We look forward to receiving your CV.


We connect you with the right people.
