    Data Engineer (Hadoop)

    Type of work: Full-time
    Experience: Senior
    Employment type: B2B
    Operating mode: Remote

    Tech stack

      Polish: C1
      English: C1
      Hadoop: master
      Hive: master
      HDFS: master
      Apache Spark: master
      Scala: regular
      Google Cloud Platform: regular
      SQL: regular
      CI/CD: regular
      BigQuery: nice to have
      Cloud Dataflow: nice to have

    Job description

    Interview: online

    Industry: banking

    Location: fully remote (candidates must be based in Poland)

    Languages: fluent Polish and English

    Contract: B2B


    The Hadoop Data Engineer plays a critical role in enhancing the organization's data processing capabilities, leveraging cloud technologies for efficient data handling and migration. The primary objective is to build and maintain robust data processing architectures that deliver information and insights at scale.


    Main Responsibilities:

    • Develop and maintain data processing systems using Hadoop, Apache Spark, and Scala (see the illustrative sketch after this list).
    • Design and implement data migration processes on Google Cloud Platform.
    • Create solutions for data handling and transformation using SQL and other relevant tools.
    • Collaborate with stakeholders to ensure the data architecture aligns with business needs.
    • Engage in automated testing and integration to ensure smooth deployment processes.
    • Debug code issues and communicate findings to the development team.
    • Apply big data modeling techniques for effective data representation.
    • Adapt to dynamic environments and embrace a proactive learning attitude.
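
    The sketch below is purely illustrative of the kind of Spark work these bullets describe: read a Hive table backed by HDFS, aggregate it with Spark SQL functions in Scala, and stage the result on Cloud Storage for loading into BigQuery. The table raw.transactions, its columns, and the bucket gs://example-curated-bucket are hypothetical placeholders, not details taken from this offer.

      import org.apache.spark.sql.SparkSession
      import org.apache.spark.sql.functions.{col, count, lit, sum, to_date}

      object DailyTransactionAggregates {
        def main(args: Array[String]): Unit = {
          // Spark session with Hive support, so tables registered in the
          // Hive metastore (stored on HDFS) can be read directly.
          val spark = SparkSession.builder()
            .appName("daily-transaction-aggregates")
            .enableHiveSupport()
            .getOrCreate()

          // Aggregate raw events per customer and day (hypothetical schema).
          val daily = spark.table("raw.transactions")
            .groupBy(col("customer_id"), to_date(col("event_ts")).as("day"))
            .agg(sum(col("amount")).as("total_amount"), count(lit(1)).as("tx_count"))

          // Write curated output to Cloud Storage as Parquet; from there a
          // BigQuery load job (or the spark-bigquery connector) can pick it up.
          daily.write
            .mode("overwrite")
            .parquet("gs://example-curated-bucket/transactions_daily")

          spark.stop()
        }
      }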


    Key Requirements:

    • 5+ years of experience with Hadoop, Hive, HDFS, and Apache Spark.
    • Proficiency in Scala programming.
    • Hands-on experience with Google Cloud Platform, especially BigQuery and Cloud Dataflow.
    • Strong understanding of SQL and relational database technologies.
    • Experience with version control tools (Git, GitHub) and CI/CD processes.
    • Ability to design large-scale distributed data processing systems.
    • Strong interpersonal skills and teamwork abilities.
    • Experience with Enterprise Data Warehouse technologies.
    • Exposure to Agile project methodologies (Scrum, Kanban).
    • Google Cloud certification (nice to have).
    • Experience in customer-facing roles in enterprise settings.
    • Exposure to cloud design patterns.


    Salary: undisclosed (B2B)

