👉 Data Engineer (GCP)
🟣You will be:
designing, delivering, and maintaining large-scale, production-grade data platforms,
designing and implementing cloud-native architectures, preferably on Google Cloud Platform,
designing and developing scalable ETL solutions handling diverse data formats,
building and managing complex data workflows using Apache Airflow,
developing and maintaining high-volume batch and streaming data processing pipelines using Apache Spark,
working with large-scale data processing workloads, including billions of events per day,
implementing Infrastructure as Code using Terraform,
building and maintaining automated CI/CD pipelines,
collaborating on production-grade data and ML platforms.
🟣 Your profile:
at least 5 years of professional experience in Data Engineering,
proven experience delivering and maintaining production data platforms,
strong experience with Google Cloud Platform and ability to design cloud native architectures,
expert-level knowledge of Apache Airflow for workflow orchestration,
strong programming skills in Python and SQL,
hands-on experience with Apache Spark for batch and streaming processing,
experience working with large-scale data processing systems,
experience handling multiple data formats,
practical experience with Infrastructure as Code using Terraform,
experience building automated release and deployment pipelines,
hands-on experience with GitHub Actions, Docker, and Kubernetes,
openness to on-site workshops (1–3 per quarter) in Warsaw and Poznań,
practical experience using AI-powered assistants (e.g. Claude Code, GitHub Copilot, Cursor) to improve productivity, quality, or decision-making in software delivery.
Work from the European Union region and a work permit are required.
🟣 Nice to have:
experience in productionizing and deploying Machine Learning models at scale,
experience with Vertex AI,
experience with scaling high-performance data processing systems,
experience with building scalable REST APIs for Machine Learning inference,
strong focus on software development best practices,
experience with Test Driven Development,
ability to write comprehensive unit and integration tests for Spark and Airflow,
professional GCP Data Engineer certification.
🟣 Recruitment Process:
CV review – HR call – Technical Interview – Client Interview I – Client Interview II (potential) – Hiring Manager call – Decision
🎁 Benefits 🎁
✍ Development:
development budgets of up to 6,800 PLN,
we fund certifications, e.g. AWS and Azure,
access to Udemy, O'Reilly (formerly Safari Books Online) and more,
events and technology conferences,
technology Guilds,
internal training,
Xebia Upskill.
🩺 We take care of your health:
private medical healthcare,
subsidised MultiSport card,
mental health support.
🤸‍♂️ We are flexible:
B2B or employment contract,
contract for an indefinite period.

Xebia sp. z o.o.
While Xebia is a global tech company, our journey in CEE started with two Polish companies – PGS Software, known for world-class cloud and software solutions, and GetInData, a pioneer in Big Data. Today, in Poland, we’re...