Join us and transform massive data into meaningful insights!
Kraków-based opportunity with a hybrid work model (2 days/week in the office).
As a Hadoop Data Engineer, you will be working for our client, a globally recognized financial institution driving innovation in digital infrastructure and data intelligence. You will contribute to the development and deployment of large-scale, cloud-based data solutions that support business-critical analytics and data processing. Working in a hybrid model within cross-functional teams, you will play a key role in transforming traditional data platforms into scalable, automated, and high-performance environments on a modern tech stack including GCP, Hadoop, Spark, and Scala.
Your main responsibilities:
- Designing and deploying distributed data processing systems using Hadoop and Spark
- Developing and optimizing data pipelines using Airflow and Scala
- Integrating data platforms with GCP services such as BigQuery and Dataflow
- Collaborating with DevOps teams to implement CI/CD pipelines using Jenkins
- Building automated workflows to support data ingestion, transformation, and storage
- Writing and maintaining high-performance SQL queries and scripts
- Monitoring data pipeline performance and troubleshooting issues proactively
- Supporting data migration projects from on-prem to cloud-based environments
- Applying best practices for code versioning, testing, and deployment
- Working closely with stakeholders to understand data requirements and propose scalable solutions
You're ideal for this role if you have:
- 5+ years of hands-on experience with the Hadoop ecosystem (HDFS, Hive)
- Strong programming skills in Scala and experience with Apache Spark
- Proven experience working with Google Cloud Platform services
- Proficiency in building and scheduling workflows using Apache Airflow
- Expertise in writing complex SQL queries and data transformation logic
- Familiarity with DevOps tools such as Jenkins, Git, and GitHub
- Understanding of distributed data architectures and big data modeling techniques
- Experience with automated unit and integration testing practices
- Strong debugging skills and the ability to analyze complex code at scale
- Excellent communication and teamwork skills in a global environment
It is a strong plus if you have:
- Hands-on experience with Google Cloud tools such as Cloud Dataflow, Dataprep, and Cloud Composer
- Background in data visualization tools such as Tableau
- Exposure to Enterprise Data Warehouse technologies
- Experience in customer-facing roles and stakeholder engagement
- Familiarity with agile methodologies such as Scrum or Kanban
We offer you:
ITDS Business Consultants is involved in a wide range of innovative and professional IT projects for international companies in the financial industry in Europe. We offer an environment for professional, ambitious and driven people. The offer includes:
- Stable and long-term cooperation with very good conditions
- Opportunities to enhance your skills and develop expertise in the financial industry
- Work on the most strategic projects available in the market
- Define your career roadmap and develop quickly by delivering strategic projects for different ITDS clients over several years
- Participation in social events, training, and work in an international environment
- Access to an attractive Medical Package
- Access to the Multisport Program
#GETREADY
Internal job ID #6965
You can report violations in accordance with ITDS’s Whistleblower Procedure available here.