GCP Data Engineer

43 - 59 USD/month net (B2B)
Type of work: Full-time
Experience: Senior
Employment Type: B2B
Operating mode: Hybrid

Tech stack

  • Big Data - advanced
  • Java - advanced
  • SOLID Principles - advanced
  • PostgreSQL - advanced
  • GCP - nice to have

Job description

GCP Data Engineer – STAR Platform

Location: Kraków / Hybrid (2 days per week in the office)

Are you ready to build impactful solutions on a global scale? Join a forward-thinking team that powers critical risk calculations in one of the world's leading financial institutions.


About the Role

We are looking for a talented GCP Data Engineer with a strong Java background to join the STAR platform team. STAR is HSBC’s strategic cloud-native platform designed to generate and deliver risk factor definitions, historical market data, and scenarios for Value at Risk (VaR) and Expected Shortfall (ES) calculations.

The platform leverages data pipelines and microservices, combining both real-time and batch processing to handle large-scale datasets. You’ll be joining a global team of developers within the Global Traded Risk Technology department, working in an open, inclusive, and innovation-driven environment.


What You’ll Do

  • Translate complex business requirements into secure, scalable, and high-performance data solutions
  • Design and implement performant data processing pipelines, both batch and streaming (a minimal sketch follows this list)
  • Develop REST APIs and data ingestion patterns in a cloud-native architecture
  • Integrate internal systems with a focus on cost optimization and fast data processing
  • Modernize and enhance existing pipelines and microservices
  • Create and maintain solution blueprints and documentation
  • Conduct peer code reviews and provide constructive feedback
  • Promote test-centric development practices including unit and regression tests
  • Ensure consistent logging, monitoring, error handling, and automated recovery aligned with industry standards
  • Collaborate closely with engineers, analysts, and stakeholders across regions
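
To make the pipeline work concrete, here is a minimal sketch of a Beam-style batch pipeline in Java. It is illustrative only: the class name, bucket paths, and transform names are invented, and it assumes the Apache Beam Java SDK; the actual STAR pipelines are naturally more involved.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Filter;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // Minimal batch pipeline: read raw market-data lines, drop blanks,
    // normalise the delimiter, and write the cleaned output back out.
    public class MarketDataCleaner {
        public static void main(String[] args) {
            PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
            Pipeline p = Pipeline.create(options);

            p.apply("ReadRaw", TextIO.read().from("gs://example-bucket/raw/*.csv"))
             .apply("DropBlankLines", Filter.by(line -> !line.trim().isEmpty()))
             .apply("NormaliseDelimiter", MapElements
                     .into(TypeDescriptors.strings())
                     .via(line -> line.replace(';', ',')))
             .apply("WriteCleaned", TextIO.write().to("gs://example-bucket/cleaned/part"));

            // Runs on the Direct runner by default; pass --runner=DataflowRunner
            // (plus project/region options) to execute the same code on GCP Dataflow.
            p.run().waitUntilFinish();
        }
    }

Because Beam uses a unified model, the same pipeline code can be re-pointed at an unbounded source such as Pub/Sub to run as a streaming job, which is one reason it suits mixed batch/real-time platforms like this one.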


Required Skills & Experience

  • Strong proficiency in Java and Spring Boot
  • Understanding of key software design principles: KISS, SOLID, DRY
  • Hands-on experience building data processing pipelines (preferably with Apache Beam)
  • Experience designing and building RESTful APIs (see the sketch after this list)
  • Familiarity with relational and NoSQL databases, especially PostgreSQL and Bigtable
  • Basic knowledge of DevOps and CI/CD tools, including Jenkins and Groovy scripting
  • Experience with integration frameworks and patterns (e.g., Saga, Lambda)
  • Strong problem-solving and analytical skills
  • Excellent communication skills and ability to thrive in a collaborative team environment
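
For a sense of what the Java/Spring Boot side of the role looks like in practice, below is a minimal sketch of a REST endpoint. Everything in it is hypothetical (the resource name, paths, fields, and hard-coded data are invented for illustration); it assumes only Spring Boot's standard web starter and Java 16+ records.

    import java.util.List;
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RestController;

    @SpringBootApplication
    public class RiskFactorApp {
        public static void main(String[] args) {
            SpringApplication.run(RiskFactorApp.class, args);
        }
    }

    // Immutable response payload (hypothetical fields).
    record RiskFactor(String id, String assetClass, double latestValue) {}

    @RestController
    @RequestMapping("/api/v1/risk-factors")
    class RiskFactorController {

        // GET /api/v1/risk-factors: list all definitions (hard-coded here;
        // a real service would read from PostgreSQL or Bigtable).
        @GetMapping
        public List<RiskFactor> listAll() {
            return List.of(new RiskFactor("IR_USD_3M", "rates", 0.0523));
        }

        // GET /api/v1/risk-factors/{id}: fetch a single definition.
        @GetMapping("/{id}")
        public RiskFactor getById(@PathVariable String id) {
            return new RiskFactor(id, "rates", 0.0523);
        }
    }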


Nice to Have

  • Experience with Google Cloud Platform (GCP) services: GKE, Cloud SQL, Dataflow, Bigtable
  • Familiarity with OpenTelemetry, Prometheus, Grafana
  • Knowledge of Kubernetes, Docker, and Terraform
  • Messaging/streaming experience with Kafka
  • UI experience with Vaadin
  • Exposure to Apache Beam in large-scale data environments

 

To learn more about Antal, please visit www.antal.pl