Join us and architect the future of financial data intelligence!
Kraków-based opportunity with a hybrid work model (2 days/week in the office).
As a Data Engineer, you will be working for our client, a leading global financial institution, on an ambitious data transformation project aimed at enhancing ESG (Environmental, Social, and Governance) reporting and analytics. The client is modernizing their data infrastructure to support large-scale data processing and migration to the cloud, with a strong focus on ensuring data accuracy, availability, and performance across distributed systems. You will be part of a hybrid team contributing to the delivery of robust, scalable data pipelines and architecture, combining cutting-edge technologies in cloud computing and big data engineering.
Your main responsibilities:
- Developing and maintaining scalable data pipelines using Spark and Scala
- Designing and implementing workflows in Apache Airflow
- Managing large datasets using Hadoop ecosystem tools like HDFS and Hive
- Migrating and processing data across Google Cloud components such as BigQuery and Dataflow
- Collaborating with data architects and analysts to build efficient data models
- Automating data integration and deployment processes using Jenkins and Git
- Monitoring and troubleshooting data pipelines to ensure high reliability
- Conducting code reviews and ensuring best practices in data engineering
- Working closely with stakeholders to gather data requirements and support reporting needs
- Participating in Agile ceremonies and contributing to sprint deliverables
You're ideal for this role if you have:
- 5+ years of experience in data engineering or a related field
- Strong knowledge of Apache Spark and Scala
- Hands-on experience with Hadoop, Hive, and HDFS
- Experience working with Google Cloud Platform, especially BigQuery and Dataflow
- Proficiency with SQL and data modeling techniques
- Familiarity with CI/CD tools such as Jenkins and version control with Git
- Experience with Apache Airflow or similar orchestration tools
- Understanding of DevOps practices and Agile methodologies
- Strong debugging and problem-solving skills
- Excellent communication and interpersonal skills
It is a strong plus if you have:
- Experience with Cloud DataProc, Cloud PubSub, and Cloud Composer
- Hands-on experience with Tableau or other data visualization tools
- Familiarity with Enterprise Data Warehouse technologies
- Exposure to customer-facing roles or working with enterprise clients
- Experience with automated testing frameworks for data pipelines
- Knowledge of Cloud design patterns and architecture
- Google Cloud certification
- Experience using Jira for project tracking
- Understanding of both relational and non-relational big data modeling
- Prior involvement in ESG or sustainability data projects
We offer you:
ITDS Business Consultants is involved in a wide range of innovative, professional IT projects for international companies in the European financial industry. We offer an environment for professional, ambitious, and driven people. The offer includes:
- Stable, long-term cooperation on very good terms
- Opportunities to enhance your skills and develop your expertise in the financial industry
- Work on the most strategic projects available in the market
- A clearly defined career roadmap: grow quickly and effectively by delivering strategic projects for different ITDS clients over several years
- Participation in social events, training, and work in an international environment
- Access to an attractive Medical Package
- Access to Multisport Program
#GETREADY
Internal job ID #6922
📌 You can report violations in accordance with ITDS’s Whistleblower Procedure available here.