👉 Senior Data Engineer
🟣 You will be:
developing and automating tools within the Data Platform, including clickstream data processing, building and evolving an ID Graph service, and implementing and enhancing Self-Service Analytics solutions,
designing, building, and maintaining effective streaming data pipelines,
supporting product, engineering, and analytics teams in building data-driven products and services,
solving technical and analytical issues reported via internal communication channels (e.g. Slack, email, incident management tools),
applying software engineering best practices such as Clean Code, TDD, and CI/CD,
participating in an on-call / incident response rotation.
🟣 Your profile:
strong commercial experience as a Senior Data Engineer,
very good programming skills in Python and one of the JVM languages: Java, Kotlin, or Scala,
proven experience in building and maintaining streaming data pipelines,
hands-on experience with Google Cloud Platform, especially BigQuery, Cloud Composer, Dataproc, Dataflow, Cloud Run,
advanced knowledge of SQL and dbt,
solid understanding of software engineering best practices: TDD, Clean Code, CI/CD pipelines,
very good command of Linux/Unix environments,
experience with Docker and daily use of Git or other version control systems,
strong teamwork skills and a proactive, positive attitude,
fluent Polish and English at B2+ level,
readiness to participate in an on-call rotation (24/7 coverage, every 5-8 weeks),
practical experience using AI-powered assistants (e.g. Claude Code, GitHub Copilot, Cursor) to improve productivity, quality, or decision-making in software delivery.
🟣 Nice to have:
experience with Spring Framework,
knowledge of Apache Beam, Scio, and Google Dataflow,
practical experience with Apache Flink,
experience with Infrastructure as Code, especially Terraform,
experience with Spark or PySpark (not necessarily in a Hadoop environment),
experience applying GenAI in a more structured way within the SDLC, including defined workflows, prompt patterns, or tool integrations embedded into daily work,
interest in and familiarity with emerging AI-driven practices (e.g. agent-based workflows, automation patterns, AI-augmented development), with a willingness to explore and experiment beyond standard approaches.
Work from the European Union region and a work permit are required.
🟣 Recruitment Process: CV review – HR Call – Interview – Client Interview – Decision
🎁 Benefits 🎁
✍ Development:
development budget of up to 6,800 PLN,
we fund certifications, e.g. AWS, Azure, ISTQB, PSM,
access to Udemy, Safari Books Online and more,
events and technology conferences,
technology guilds,
internal training,
Xebia Library,
Xebia Upskill.
🩺 We take care of your health:
private medical healthcare,
subsidised MultiSport card,
mental health support.
🤸‍♂️ We are flexible:
flexible working hours,
B2B or permanent contract,
contract for an indefinite period.

Xebia sp. z o.o.
While Xebia is a global tech company, our journey in CEE started with two Polish companies – PGS Software, known for world-class cloud and software solutions, and GetInData, a pioneer in Big Data. Today, in Poland, we’re...