Join us, and bring your expertise to a world-class tech initiative.
Krakow-based opportunity with the possibility of working 75% remotely.
As a Big Data Engineer, you will work for our client, a global financial institution. The team enhances security and compliance by building data processing and storage solutions that monitor and prevent illegal or prohibited activity across various channels. You will collaborate with cross-functional teams, develop scalable components, and ensure that all code is continuously integrated and tested to a high standard, helping the team achieve its strategic objectives in a truly agile and dynamic environment.
Your main responsibilities:
- Collaborating with Product Owners and IT Analysts to understand stakeholder requirements
- Building and managing data processing components and storage technologies
- Onboarding new data sources and constructing data pipelines for application developers
- Working with Data Scientists to integrate developed analytics into production
- Ensuring code quality by adhering to continuous integration and test-driven development principles
- Validating designs with Architects and ensuring adherence to agreed technical approaches
- Carrying out unit testing to ensure high-quality component delivery
- Contributing to a cross-functional, agile team focused on delivering incremental value
- Participating in the ongoing improvement of surveillance technology solutions
- Monitoring production systems to ensure smooth and uninterrupted operation
You're ideal for this role if you have:
- Extensive experience with ETL and Big Data technologies
- Practical experience with Apache Spark
- Proficiency in programming languages such as Scala and Java
- Hands-on experience with Apache Hadoop (YARN, HDFS)
- Strong understanding of agile development methodologies
- Solid experience working with cloud platforms, ideally Google Cloud
- Familiarity with the Spring framework
- Ability to write unit tests and ensure code quality standards
- Excellent communication skills, both written and oral, in a global team setting
- A passion for continuous improvement and technical innovation
It is a strong plus if you have:
- Experience with the ELK stack
- Experience deploying services on Google Cloud Platform
- Knowledge of Kubernetes and Airflow
- Practical experience with Maven, Git, Jenkins, Linux, and Bash scripting
- Basic knowledge of Python
Internal number: #6600