GCP DevOps Engineer
B2B | Kraków / Remote (on-site 6-8 times per month)
For our client – a global financial institution developing scalable data platforms and cloud solutions for the Capital Markets domain – we are currently looking for a GCP DevOps Engineer to join the CTO Data Technology team within the Corporate & Institutional Banking division.
This role will focus on supporting the development and maintenance of data products and platform components in the Google Cloud environment. The engineer will work on CI/CD tooling, infrastructure automation, cloud services standardization, and the evolution of SRE practices within the organization.
Your responsibilities:
- Develop and maintain CI/CD pipelines and DevOps tooling (Jenkins, GitHub, Nexus, Ansible)
- Contribute to the development of shared services and GCP deployment patterns
- Automate infrastructure tasks related to resilience, compliance, and operations
- Support Infrastructure as Code development using Terraform
- Collaborate with product, platform and data teams to enhance platform performance, reliability and maintainability
- Help shape and implement best practices around Site Reliability Engineering (SRE)
- Contribute to service management, lifecycle automation and security enforcement across environments
Requirements:
- 3+ years of experience in DevOps or Cloud Engineering roles
- Practical experience with CI/CD and automation tools such as Jenkins, GitHub Actions, Nexus, and Ansible
- Hands-on experience with cloud platforms (GCP preferred, AWS or Azure also considered)
- Proficiency in Terraform and Infrastructure as Code (IaC) approaches
- Strong scripting skills in Bash and Python
- Solid knowledge of Linux administration and networking principles
- Understanding of cloud architecture best practices, security, compliance, and cost optimization
- Familiarity with SRE concepts, service lifecycle and production support
- Strong communication skills and the ability to collaborate with cross-functional teams
Nice to have:
- Experience in a highly regulated or enterprise financial environment
- Experience with Kubernetes and container orchestration
- Familiarity with Data Engineering tools such as Airflow or Apache Spark
We offer:
- Long-term project in a well-structured, international environment
- Flexible, remote-first model (within Poland)
- Opportunity to work with modern cloud and automation tools at scale
- Stable, full-time engagement with access to internal learning and development resources
If you're passionate about automation, cloud infrastructure, and reliability – and you want to work on impactful projects in the financial data space – we'd love to hear from you!