You’ll join an international team, where you will get the chance to work with some of the newest technologies and methods on the market within the Hadoop Ecosystem platform.
What you’ll be doing:
- Take responsibility for the successful design and implementation of a high-performing, flexible, robust, scalable and easily maintainable global reporting solution for the Bank
- Deliver legacy reports to all 10M customers
- Continuously develop the application
- Solve complex problems on a daily basis
The role is based in Warsaw/Gdynia. Welcome to a team where you will be working with people passionate about Big Data.
Who you are
We imagine that you enjoy learning and are excited about bringing your ideas to the table. You’re dependable, willing to speak up – even when it’s difficult – and committed to empowering others.
Your profile and background:
- Knowledge of and experience with Scala, Apache Spark and Apache Hadoop (at least 6 months)
- Commercial experience with, and understanding of, distributed systems
- Knowledge of Linux Shell Script, Hive, Kafka, SQL
- Basic knowledge of functional programming
- Good spoken and written English