🌍Location: hybrid (one day per week in the Gdańsk office).
⏰Start date for assignment: ASAP / within 1 month.
💰Rate: 150-170 zł/h.
⏳Duration of assignment: 12-month initial contract (extension expected).
📕Language: English + Polish.
⚙️Industry: banking.
💻Workload: Full time.
Responsibilities:
- Develop Scala/Spark programs, scripts, and macros for data extraction, transformation and analysis.
- Design and implement solutions to meet business requirements.
- Support and maintain existing Hadoop applications and related technologies.
- Develop and maintain metadata, user access and security controls.
- Develop and maintain technical documentation, including data models, process flows and system diagrams.
Requirements:
- Minimum 3-5 years of experience in Scala/Spark-related projects and/or engagements.
- Create Scala/Spark jobs for data transformation and aggregation according to complex business requirements.
- Should be able to work in a challenging and agile environment with quick turnaround times and strict deadlines.
- Perform Unit tests of the Scala code.
- Raise PR, trigger build and release JAR versions for deployment via Jenkins pipeline.
- Should be familiar with CI/CD concepts and the processes.
- Peer review the code.
- Perform root-cause analysis (RCA) of reported bugs.
- Should have excellent understanding of Hadoop ecosystem.
- Should be well versed in the technologies below:
  - Jenkins.
  - HQL (Hive queries).
  - Oozie.
  - Shell scripting.
  - Git.
  - Splunk.
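To give candidates a concrete flavour of the transformation-and-aggregation work listed above, here is a minimal, hypothetical sketch in plain Scala (all names and fields are invented for illustration; in the role itself the same groupBy/aggregate/filter pattern would typically be expressed with Spark's Dataset/DataFrame API over Hadoop-scale data):

```scala
// Hypothetical sketch: sum transaction amounts per account and keep
// only accounts whose total exceeds a threshold. In Spark the same
// logic would be a groupBy/agg/filter on a Dataset or DataFrame.
case class Txn(account: String, amount: BigDecimal)

object TxnAggregation {
  // Group by account, sum each group, drop totals at or below `minTotal`.
  def totalsAbove(txns: Seq[Txn], minTotal: BigDecimal): Map[String, BigDecimal] =
    txns
      .groupBy(_.account)
      .map { case (acct, ts) => acct -> ts.map(_.amount).sum }
      .filter { case (_, total) => total > minTotal }

  def main(args: Array[String]): Unit = {
    val txns = Seq(
      Txn("PL01", BigDecimal(100)),
      Txn("PL01", BigDecimal(250)),
      Txn("PL02", BigDecimal(40))
    )
    // PL01 totals 350 and passes the 200 threshold; PL02 (40) is dropped.
    println(totalsAbove(txns, BigDecimal(200)))
  }
}
```

A unit test of such logic (as required above) would assert on the returned map for known inputs, independent of any cluster.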
Nice to have:
- Relevant certifications (e.g., Scala, Spark, Hadoop, Performance).
- Knowledge of other programming languages (e.g., Python, R).
- Insight into cloud-based solutions such as Snowflake.
- Experience in financial services, preferably in the credit risk domain.