
Hadoop Administrator

Category: Data
Location: Łucka, Warszawa
Company: Altimetrik Poland
Employment type: Full-time, B2B
Experience level: Mid
Work mode: Hybrid
Salary: 5 476 - 7 119 USD net per month (B2B)

Tech stack

    English: C1
    Hadoop: advanced
    Python: regular
    Apache Spark: regular
    Big Data: regular
    Hive: regular
    AWS: regular
    GCP: regular
    SQL: regular
    Docker: regular
    Kubernetes: regular

Job description

2 days per week from the office in Warsaw

Working hours: 10:00 - 18:00


Altimetrik Poland is a digital enablement company. We deliver bite-size outcomes to enterprises and startups from all industries in an agile way to help them scale and accelerate their businesses. We are unique in Poland's IT market. Our differentiators are an innovation-first approach, a strong focus on core development, and an ability to attack the challenging and complex problems of the biggest companies in the world.


As a Hadoop Administrator, you will be part of a team that maintains and supports the Data Platform and provides support for key cloud-based Big Data services. You will be responsible for driving innovation for our partners and clients, both locally and globally. You will work on open-source Big Data clusters with a focus on the cloud, ensuring their availability, performance, and reliability, and improving operational efficiency.


Responsibilities:

  • Familiarity with big data tools (Hadoop, Spark, etc.) and frameworks (HDFS, MapReduce, Hive, YARN, etc.).

  • Design, build and manage Big Data infrastructure.

  • Manage and optimize Apache Hadoop clusters for high performance, reliability, and scalability.

  • Develop tools and processes to monitor and analyze system performance and to identify potential issues.

  • Collaborate with other teams to design and implement solutions that improve the reliability and efficiency of the Big Data cloud platforms.

  • Ensure security and compliance of the platforms within organizational guidelines.

  • Perform effective root cause analysis of major production incidents and develop learning documentation (identify and implement high-availability solutions for services with a single point of failure).

  • Plan and perform capacity expansions and upgrades in a timely manner to avoid scaling issues and bugs, including automating repetitive tasks to reduce manual effort and prevent human error.

  • Tune alerting and set up observability to proactively identify issues and performance problems.

  • Review new use cases and cluster hardening techniques to build robust and reliable platforms.

  • Create standard operating procedure documents and guidelines on effectively managing and utilizing the platforms.

  • Leverage DevOps tools, disciplines (incident, problem, and change management), and standards in day-to-day operations.

  • Perform security remediation, automation, and self-healing as required.

  • Develop automations and reports to minimize manual effort, using tools such as shell scripting, Ansible, or Python, or any other programming language (a minimal sketch follows this list).
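
Much of the automation described above comes down to small scripts. The following is a minimal, hypothetical sketch of such a health check in Python: it polls a Hadoop NameNode's JMX endpoint and flags high capacity usage or dead DataNodes. The hostname, port, JMX bean and attribute names, and thresholds are illustrative assumptions to be verified against the actual cluster, not part of this offer; the third-party requests library is assumed to be available.

# Hypothetical health-check sketch: poll the NameNode JMX endpoint and flag issues.
# The host, port (9870 is the Hadoop 3.x web UI default), bean and attribute names,
# and thresholds are illustrative assumptions, not a specification.
import requests

NAMENODE_URL = "http://namenode.example.internal:9870"  # assumed hostname
CAPACITY_ALERT_THRESHOLD = 0.85  # warn when the cluster is more than 85% full


def fetch_fsnamesystem_state(base_url: str) -> dict:
    """Return the FSNamesystemState JMX bean as a dict."""
    resp = requests.get(
        f"{base_url}/jmx",
        params={"qry": "Hadoop:service=NameNode,name=FSNamesystemState"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["beans"][0]


def check_cluster(base_url: str = NAMENODE_URL) -> list:
    """Collect human-readable warnings about capacity and dead DataNodes."""
    state = fetch_fsnamesystem_state(base_url)
    warnings = []
    used_ratio = state["CapacityUsed"] / state["CapacityTotal"]
    if used_ratio > CAPACITY_ALERT_THRESHOLD:
        warnings.append(f"HDFS capacity at {used_ratio:.0%}")
    if state["NumDeadDataNodes"] > 0:
        warnings.append(f"{state['NumDeadDataNodes']} dead DataNode(s)")
    return warnings


if __name__ == "__main__":
    for warning in check_cluster():
        print("ALERT:", warning)  # in practice, push to the alerting stack instead

In practice a script like this would be scheduled (cron, a systemd timer, or an orchestrator) and its warnings forwarded to the alerting stack rather than printed.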


And if you possess...

  • Experience with managing and optimizing Big Data clusters.

  • Demonstrated experience with AWS or GCP cloud platforms.

  • Proficiency in scripting languages (Python, Bash) and SQL.

  • Familiarity with big data tools (Hadoop, Spark, etc.) and frameworks (HDFS, MapReduce, Hive, YARN, etc.).

  • Strong knowledge of system architecture and design patterns for high-performance computing.

  • Good understanding of data security and privacy concerns.

  • Experience with infrastructure automation technologies such as Docker, Kubernetes, Ansible, and Terraform is a plus.

  • Excellent problem-solving and troubleshooting skills.

  • Strong communication and collaboration skills.

  • Observability: knowledge of observability tools like Grafana, Opera, and Splunk.

  • Understanding of Linux, networking, CPU, memory, and storage (see the sketch after this list).

  • Ability to code or program in Java or Python.

  • Excellent interpersonal skills, along with superior verbal and written communication abilities.
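
To make the Linux and scripting expectations above concrete, here is a small, hypothetical Python sketch (standard library only) of the kind of resource check an administrator might script; the mount point and thresholds are illustrative assumptions.

# Hypothetical Linux resource check using only the Python standard library.
# The mount point and thresholds below are illustrative assumptions.
import os
import shutil

DATA_MOUNT = "/data"          # assumed HDFS data directory mount point
DISK_ALERT_THRESHOLD = 0.90   # warn above 90% disk usage


def disk_usage_ratio(path: str) -> float:
    """Fraction of the filesystem at `path` that is in use."""
    usage = shutil.disk_usage(path)
    return usage.used / usage.total


def load_per_cpu() -> float:
    """1-minute load average normalized by CPU count (Unix only)."""
    one_minute, _, _ = os.getloadavg()
    return one_minute / (os.cpu_count() or 1)


if __name__ == "__main__":
    if disk_usage_ratio(DATA_MOUNT) > DISK_ALERT_THRESHOLD:
        print(f"WARNING: {DATA_MOUNT} above {DISK_ALERT_THRESHOLD:.0%} usage")
    if load_per_cpu() > 1.0:
        print("WARNING: 1-minute load average exceeds CPU core count")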


🔥We grow fast.

🤓We learn a lot.

🤹We prefer to do things instead of just talking about them.


If you would like to work in an environment that values trust and empowerment... don't hesitate, just apply!




Published: 30.10.2025
