
Big Data Developer (Python/Scala)

Madiff Sp. z o.o.
Warszawa
Type of work: Undetermined
Experience: Mid
Employment Type: B2B
Operating mode: Office

Tech stack

    Python: regular
    Scala: regular
    Spark: regular
    Hadoop: regular
    SQL: junior
    Linux: junior
    Kafka: nice to have
    ETL tools: nice to have

Job description

Online interview
MADIFF is an Innovation & IT Engineering Consulting Company that delivers unique value in the following sectors: banking, insurance, telecom, industry, life sciences, energy, automotive, and railway & infrastructure.

MADIFF's philosophy is to form one team with the Client in order to exceed the agreed objectives and the Client's expectations. We are driven by a creative and innovative consulting approach strongly oriented towards results: we "MAKE THE DIFFERENCE".

We build and support software and applications in the following areas: Robotics, AI, Blockchain, Mobile/Data Analytics, RegTech, EventTech, AdTech, Big Data, Drones, eHealth, FinTech.

Big Data Developer
Location: Warszawa

Description of project:

  • you will become a member of a team that supports sales activities by providing the business with powerful analytics tools; these tools are built on top of the Hortonworks Data Platform and can be exposed in Docker containers;
  • the team you will join will also build a new Data Warehouse in one of the biggest banks in the world;
  • you will be responsible for implementing Data Flows that ingest data from different sources (e.g., Kafka topics, SAP, S3) and store it in Hadoop for subsequent processing by Spark jobs;
  • as a Big Data Engineer, you will work closely with the Data Science Team on the implementation of analytical models and machine learning algorithms;
  • during the project you will work closely with the local team and experts from across Europe;
  • you will use Slack or Skype to communicate.

Tech Requirements: 
  • programming skills in Python/Scala;
  • knowledge of Big Data Tech Stack: Spark, Hadoop (HDFS, Sqoop);
  • Kafka - nice to have;
  • knowledge of SQL and relational databases;
  • knowledge of Linux environment;
  • knowledge of ETL tools - nice to have;
  • Apache Airflow - nice to have;
  • ability to work within the Scrum methodology.