
    Data Engineer Hadoop

    Kraków
48 - 58 USD/month (net, B2B)
Type of work: Full-time
Experience: Senior
Employment type: B2B
Operating mode: Hybrid

    Tech stack

• Hadoop: advanced
• Hive: advanced
• Apache Spark: advanced
• Scala: advanced

    Job description

    Hadoop Data Engineer (GCP, Spark, Scala) – Kraków / Hybrid

    We are looking for an experienced Hadoop Data Engineer to join a global data platform project built in the Google Cloud Platform (GCP) environment. This is a great opportunity to work with distributed systems, cloud-native data solutions, and a modern tech stack. The position is based in Kraków (hybrid model – 2 days per week in the office).

    Your responsibilities:

• Design and build large-scale, distributed data processing pipelines using Hadoop, Spark, and GCP (see the sketch after this list)

• Develop and maintain ETL/ELT workflows using Apache Hive, Apache Airflow (Cloud Composer), Dataflow, and Dataproc

    • Work with structured and semi-structured data using BigQuery, PostgreSQL, Cloud Storage

    • Manage and optimize HDFS-based environments and integrate with GCP components

    • Participate in cloud data migrations and real-time data processing projects

    • Automate deployment, testing, and monitoring pipelines (CI/CD using Jenkins, GitHub, Ansible)

• Collaborate with architects, analysts, and product teams in an Agile/Scrum setup

    • Troubleshoot and debug complex data logic at the code and architecture level

    • Contribute to cloud architecture patterns and data modeling decisions
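
For a flavour of the day-to-day work, below is a minimal sketch of the kind of batch pipeline described above: Spark (Scala) reading a Hive table and writing partitioned Parquet to Cloud Storage, as it might run on Dataproc. All table, column, and bucket names are hypothetical.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EventsDailyAggregate {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport lets Spark resolve managed Hive tables by name
    val spark = SparkSession.builder()
      .appName("events-daily-aggregate")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical Hive source table with one timestamped event per row
    val events = spark.table("analytics.events")

    // Daily event counts per user
    val daily = events
      .withColumn("event_date", to_date(col("event_ts")))
      .groupBy(col("event_date"), col("user_id"))
      .agg(count(lit(1)).as("event_count"))

    // Hypothetical bucket; on Dataproc the GCS connector resolves gs:// paths
    daily.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("gs://example-bucket/curated/events_daily")

    spark.stop()
  }
}
```

In practice a job like this would be submitted to a Dataproc cluster and orchestrated from Cloud Composer (Airflow), with deployment automated via the Jenkins/GitHub/Ansible toolchain mentioned above.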

    Must-have qualifications:

    • Minimum 5 years of experience as a Data Engineer / Big Data Engineer

    • Hands-on expertise in Hadoop, Hive, HDFS, Apache Spark, Scala, SQL

• Solid experience with GCP services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Composer (Airflow)

    • Experience with CI/CD processes and DevOps tools: Jenkins, GitHub, Ansible

    • Strong data architecture and data engineering skills in large-scale environments

    • Experience working in enterprise environments and with external stakeholders

    • Familiarity with Agile methodologies such as Scrum or Kanban

    • Ability to debug and analyze application-level logic and performance
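
On the last point: a common first step when analysing Spark performance is to inspect the query plans before changing any code. A minimal sketch, again with hypothetical table names (passing a mode string to explain requires Spark 3.x):

```scala
import org.apache.spark.sql.SparkSession

object JoinPlanCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("join-plan-check")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical Hive tables
    val orders    = spark.table("sales.orders")
    val customers = spark.table("sales.customers")

    val enriched = orders.join(customers, Seq("customer_id"))

    // Prints the parsed, analysed, optimised, and physical plans;
    // useful for spotting unexpected shuffles or a missed broadcast join
    enriched.explain("extended")

    spark.stop()
  }
}
```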

    Nice to have:

    • Google Cloud certification (e.g., Professional Data Engineer)

• Experience with Tableau, Cloud Dataprep, or Ansible

    • Knowledge of cloud design patterns and modern data architectures

    Work model:

• Hybrid – 2 days per week from the Kraków office (the rest remote)

    • Opportunity to join an international team and contribute to global-scale projects

    To learn more about Antal, please visit www.antal.pl

     
