
Data Engineer Hadoop

49 - 60 USD/h net per hour (B2B)
Type of work: Full-time
Experience: Senior
Employment type: B2B
Operating mode: Hybrid

Tech stack

  • Hadoop – advanced
  • Hive – advanced
  • Scala – advanced
  • Apache Spark – advanced
  • Apache – regular
  • Dataflow – regular
  • BigQuery – regular
  • HDFS – regular
  • SQL – regular
  • Dataproc – regular

Job description

Hadoop Data Engineer (GCP, Spark, Scala) – Kraków / Hybrid

We are looking for an experienced Hadoop Data Engineer to join a global data platform project built on Google Cloud Platform (GCP). This is a great opportunity to work with distributed systems, cloud-native data solutions, and a modern tech stack. The position is based in Kraków (hybrid model – 2 days per week in the office).



Your responsibilities:

  • Design and build large-scale, distributed data processing pipelines using Hadoop, Spark, and GCP (a minimal sketch of such a pipeline follows this list)
  • Develop and maintain ETL/ELT workflows using Apache Hive, Apache Airflow (Cloud Composer), Dataflow, and Dataproc
  • Work with structured and semi-structured data using BigQuery, PostgreSQL, and Cloud Storage
  • Manage and optimize HDFS-based environments and integrate them with GCP components
  • Participate in cloud data migrations and real-time data processing projects
  • Automate deployment, testing, and monitoring of pipelines (CI/CD using Jenkins, GitHub, and Ansible)
  • Collaborate with architects, analysts, and product teams in an Agile/Scrum setup
  • Troubleshoot and debug complex data logic at the code and architecture level
  • Contribute to cloud architecture patterns and data modeling decisions
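
For orientation only, below is a minimal sketch of the kind of Spark/Scala batch job this role involves: reading a Hive table on the Hadoop side and publishing curated results to Cloud Storage for consumption in BigQuery. All table, bucket, and application names are hypothetical placeholders, not details of the actual project.

    import org.apache.spark.sql.{SparkSession, functions => F}

    // Minimal illustrative pipeline; table, bucket, and app names are hypothetical.
    object DailyEventsPipeline {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("daily-events-pipeline")
          .enableHiveSupport()                 // read Hive tables backed by HDFS
          .getOrCreate()

        // Read raw events from a (hypothetical) Hive table
        val raw = spark.table("raw_db.events")

        // Aggregate events per user per day
        val daily = raw
          .filter(F.col("event_ts").isNotNull)
          .groupBy(F.to_date(F.col("event_ts")).as("event_date"), F.col("user_id"))
          .agg(F.count("*").as("event_count"))

        // Write partitioned Parquet to Cloud Storage; BigQuery can then read it
        // via an external table or a downstream load job
        daily.write
          .mode("overwrite")
          .partitionBy("event_date")
          .parquet("gs://example-bucket/curated/daily_events")

        spark.stop()
      }
    }

When such a job runs on Dataproc, the preinstalled Cloud Storage connector resolves the gs:// path; outside Dataproc the connector would need to be added to the Spark classpath.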


Must-have qualifications:

  • Minimum 5 years of experience as a Data Engineer / Big Data Engineer
  • Hands-on expertise in Hadoop, Hive, HDFS, Apache Spark, Scala, and SQL
  • Solid experience with GCP services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Composer (Airflow)
  • Experience with CI/CD processes and DevOps tools: Jenkins, GitHub, Ansible
  • Strong data architecture and data engineering skills in large-scale environments
  • Experience working in enterprise environments and with external stakeholders
  • Familiarity with Agile methodologies such as Scrum or Kanban
  • Ability to debug and analyze application-level logic and performance


Nice to have:

  • Google Cloud certification (e.g., Professional Data Engineer)
  • Experience with Tableau, Cloud Dataprep, or Ansible
  • Knowledge of cloud design patterns and modern data architectures


Work model:

  • Hybrid – 2 days per week from the Kraków office, the remaining days remote
  • Opportunity to join an international team and contribute to global-scale projects


To learn more about Antal, please visit www.antal.pl

 
