We are looking for a candidate who will join an agile squad (working in Scrum) at our Client from the banking industry. The successful candidate will develop end-to-end ETL processes with Spark and other Big Data technologies, including transferring data to and from the data lake, technical validation, and business logic (see the illustrative sketch below).
Warszawa, al. Jerozolimskie 93
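For illustration only, here is a minimal PySpark sketch of the kind of end-to-end ETL step described above: extract raw data from the data lake, run a technical validation, apply a piece of business logic, and load the result back. All paths, column names, and rules below are hypothetical assumptions, not details of the actual project.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("datalake-etl-sketch").getOrCreate()

# Extract: read raw records from the data lake (hypothetical path and format).
raw = spark.read.parquet("/datalake/raw/transactions")

# Technical validation: keep only rows passing basic integrity checks.
valid = raw.filter(F.col("transaction_id").isNotNull() & (F.col("amount") > 0))

# Business logic: aggregate daily totals per account (illustrative rule).
daily = (valid
         .groupBy("account_id", F.to_date("booked_at").alias("booking_date"))
         .agg(F.sum("amount").alias("daily_total")))

# Load: write the curated result back to the data lake.
daily.write.mode("overwrite").partitionBy("booking_date").parquet("/datalake/curated/daily_totals")

spark.stop()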
Key Accountabilities:
- Understanding all aspects of the Big Data ecosystem so that informed choices can be made on everything from network and hardware, through operating system and configuration, to end-user tooling and provisioning.
- Implementing and maintaining infrastructure, as designed by the Big Data Architects, in accordance with the group's operational and security standards and policies.
- Managing core ecosystem components: Hive, Unix, Bash, Spark.
- Ensuring standard operational requirements are met by implementing infrastructure for monitoring, contingency, and user provisioning.
What does the recruitment process look like?
1. After you submit your application, we will carefully review your resume. If it meets the requirements, a recruiter will contact you to arrange a phone interview and get to know you better.
2. In the next stage, you will meet with a Tech Lead during a technical interview.
3. Finally, you can expect either an offer or phone feedback thanking you for participating in the recruitment process.