GCP Data Engineer
Join us and code the backbone of financial intelligence!
Kraków-based opportunity with a hybrid work model (2 days/week in the office).
As a GCP Data Engineer, you will be working for our client, a global financial institution developing a cloud-based risk management platform used to generate and deliver risk factor definitions, historical market data, and scenarios for advanced financial modeling. The project involves building and optimizing scalable data pipelines, microservices, and integration layers that process high volumes of real-time and historical market data. You will be joining an international team of engineers focused on innovation, automation, and delivering measurable business value in a highly regulated environment.
Your main responsibilities:
Translating business requirements into secure, scalable, and performant data solutions
Integrating internal systems with an emphasis on fast data processing and cost optimization
Developing and documenting data ingestion blueprints for market data pipelines
Reviewing data solutions created by other team members
Assessing and modernizing existing data pipelines and microservices
Collaborating with engineers, analysts, and stakeholders to align technical solutions with business needs
Implementing consistent logging, monitoring, error handling, and automated recovery
Promoting automated unit and regression testing through test-centric development
Designing and implementing performant REST APIs
Applying industry-standard integration frameworks and patterns
You're ideal for this role if you have:
Strong knowledge of Java
Solid understanding of software design principles such as KISS, SOLID, and DRY
Proficiency with Spring Boot and its ecosystem
Experience building performant data processing pipelines
Familiarity with Apache Beam or similar technologies
Experience working with relational and NoSQL databases, such as PostgreSQL and Bigtable
Basic understanding of DevOps practices and CI/CD tooling such as Jenkins with Groovy-based pipelines
Ability to design and implement RESTful APIs
Excellent problem-solving and analytical skills
Strong communication and team collaboration abilities
Experience with GCP services such as GKE, Cloud SQL, Dataflow, and Bigtable
It is a strong plus if you have:
Knowledge of monitoring tools such as OpenTelemetry, Prometheus, and Grafana
Familiarity with Kubernetes and Docker
Exposure to Terraform for infrastructure-as-code
Experience with messaging and streaming platforms like Kafka
We offer you:
ITDS Business Consultants delivers a wide range of innovative IT projects for international companies in the European financial industry. We offer an environment for professional, ambitious, and driven people. The offer includes:
Stable and long-term cooperation with very good conditions
Opportunities to enhance your skills and develop expertise in the financial industry
Work on the most strategic projects available in the market
A clearly defined career roadmap and fast professional development through strategic projects delivered for different ITDS clients over several years
Participate in Social Events, training, and work in an international environment
Access to attractive Medical Package
Access to Multisport Program
Access to Pluralsight
Flexible hours & remote work
Internal job number #6923
You can report violations in accordance with ITDS’s Whistleblower Procedure available here.
Salary: net per month (B2B)