Senior Data Engineer
Kapelanka 42A, Kraków
ITDS
Join us, and engineer robust systems for millions of users!
Kraków-based opportunity with a hybrid work model (6 days/month in the office).
As a Senior Data Engineer, you will be working for our client, a global digital-first bank focused on delivering innovative financial solutions at scale. You will join a dynamic engineering team responsible for building and enhancing data solutions that support critical business applications used by millions of customers. Your role will involve developing robust and fault-tolerant data pipelines, automating processes, and supporting cloud and on-premise deployments. You will collaborate with engineers, data analysts, and business stakeholders to ensure data solutions are efficient, scalable, and aligned with the bank’s digital and data transformation initiatives.
Your main responsibilities:
Designing, developing, and maintaining end-to-end data pipelines across cloud and on-premise systems
Implementing robust ETL/ELT processes using PySpark, Hadoop, Hive, and Spark SQL
Collaborating with engineers and analysts to translate requirements into scalable data solutions
Automating workflows and optimizing data engineering processes for efficiency and reliability
Ensuring data quality, accuracy, and consistency across pipelines and applications
Migrating on-premise data solutions to cloud platforms such as GCP, AWS, or Azure
Participating in code reviews, promoting development standards, and sharing knowledge with peers
Supporting production environments, troubleshooting issues, and monitoring performance and scale
Contributing to system architecture, design discussions, and Agile development processes
You're ideal for this role if you have:
Strong experience in PySpark, Scala, or similar data engineering languages
Hands-on experience building production data pipelines using Hadoop, Spark, and Hive
Knowledge of cloud platforms and migrating on-premise solutions to the cloud
Experience with scheduling tools such as Airflow and workflow orchestration
Strong SQL skills and experience with data modelling and warehousing principles
Familiarity with Unix/Linux platforms and big data distributed systems
Experience with version control tools such as Git and CI/CD pipelines (Jenkins, GitHub Actions)
Understanding of ETL/ELT frameworks and data formats (Parquet, ORC, Avro)
Proven ability to troubleshoot, debug, and optimize data processing workflows
Experience working in Agile environments and collaborating across global teams
It is a strong plus if you have:
Experience with near real-time event streaming tools (Kafka, Spark Streaming, Apache Flink)
Exposure to MLOps or running machine learning models in production
Knowledge of DevOps practices, containerization, and cloud design patterns
Experience developing in Java or other programming languages
Familiarity with Elasticsearch and ingestion pipelines
We offer you:
ITDS Business Consultants is involved in a wide range of innovative, professional IT projects for international companies in the financial industry across Europe. We offer an environment for professional, ambitious, and driven people. The offer includes:
Stable, long-term cooperation on very good terms
Opportunities to enhance your skills and develop your expertise in the financial industry
Work on the most strategic projects available on the market
The chance to define your career roadmap and develop in the best and fastest way possible by delivering strategic projects for different ITDS clients over several years
Participation in Social Events, training, and work in an international environment
Access to an attractive Medical Package
Access to the Multisport Program
#GETREADY
Internal job ID #7594
📌 You can report violations in accordance with ITDS’s Whistleblower Procedure available here.