Big Data Engineer
Join us, and build data solutions that drive global innovation!
Kraków-based opportunity with a hybrid work model (2 days/week in the office).
As a Big Data Engineer, you will be working for our client, a leading global financial institution, contributing to the design and development of cutting-edge data solutions for risk management and analytics. The client is undergoing a strategic digital transformation, focusing on scalable, cloud-based big data platforms that support advanced analytics and regulatory compliance. You will be part of a high-performing Agile team, collaborating closely with business stakeholders and technical teams to build and maintain robust distributed systems that process large volumes of data efficiently.
Your main responsibilities:
Designing and developing distributed big data solutions using Spark
Implementing microservices and APIs for data ingestion and analytics
Managing cloud-native deployments primarily on GCP
Writing and maintaining test automation frameworks using tools like JUnit, Cucumber, or Karate
Collaborating with cross-functional teams to translate business requirements into technical specifications
Developing and scheduling data workflows using Apache Airflow
Maintaining and optimizing existing big data pipelines
Utilizing DevOps tools such as Jenkins and Ansible for CI/CD automation
Participating in Agile ceremonies and contributing to sprint planning and retrospectives
Monitoring, troubleshooting, and improving data systems and services
You're ideal for this role if you have:
A degree in Computer Science, IT, or a related discipline
Proven experience in designing and developing big data systems
Hands-on experience with Spark and distributed computing
Solid Java, Python, and Groovy development skills
Strong knowledge of the Spring ecosystem (Boot, Batch, Cloud)
Familiarity with REST APIs, Web Services, and API Gateway technologies
Practical experience in DevOps tooling like Jenkins and Ansible
Proficiency in using RDBMS, especially PostgreSQL
Hands-on experience with public cloud platforms, particularly GCP
Excellent communication skills in English
It is a strong plus if you have:
Experience with streaming technologies like Apache Beam or Flink
Knowledge of OLAP solutions and data modeling
Background in financial risk management or the banking industry
Exposure to container technologies such as Docker and Kubernetes
Familiarity with Traded Risk domain concepts
Experience with RPC frameworks like gRPC
Knowledge of data lakehouse tools like Dremio or Trino
Hands-on experience with BI or UI development
We offer you:
ITDS Business Consultants is involved in a wide variety of innovative, professional IT projects for international companies in the financial industry across Europe. We offer an environment for professional, ambitious, and driven people. The offer includes:
Stable and long-term cooperation with very good conditions
Opportunities to enhance your skills and develop your expertise in the financial industry
Work on the most strategic projects available in the market
The chance to define your career roadmap and develop as quickly as possible by delivering strategic projects for different ITDS clients over several years
Participation in social events and training, and work in an international environment
Access to an attractive medical package
Access to the Multisport program
Access to Pluralsight
Flexible hours & remote work
Internal job number: #7225
You can report violations in accordance with ITDS's Whistleblower Procedure, available here.