
Big Data Engineer

Data
-, Warszawa

Allegro

Full-time
Permanent
Senior
Hybrid
5 096 - 7 049 USD
Gross per month - Permanent

Job description

Important things for you

 

  • Flexible working hours in an office-first model (4/1) that depend on you and your team. Starting later or finishing earlier? No problem! Work hours keep pace with our lifestyles and can start between 7 a.m. and 10 a.m.

  • The salary range for this position, depending on your skill set, is as follows (contract of employment, tax-deductible costs):

    • Data Engineer: PLN 14 200 - 20 200 

    • Senior Data Engineer: PLN 18 400 - 25 450

    • Annual bonus (depending on your annual assessment and the company's results)

  • Our team is based in Warsaw.


About the team

 

As part of the Data & AI area, we deliver practical data science and artificial intelligence projects on a scale unprecedented in Poland. Data & AI is a group of over 150 experienced engineers organized into more than a dozen teams with various specializations. Some build dedicated tools for creating and running Big Data processes or deploying ML models for the entire organization. Others work closer to the customer and are responsible for the search engine, recommendations, the buyer profile or the experimentation platform. The area also includes research teams whose aim is to solve non-trivial problems that require machine learning.

 

We are looking for Big Data engineers who want to build a highly scalable, fault-tolerant data ingestion platform for millions of Allegro customers. The platform collects 5 billion clickstream events every day (up to 150k/sec) from all Allegro sites and Allegro mobile applications. It is a hybrid solution using a mix of on-premise and Google Cloud Platform (GCP) services such as Spark, Kafka, Beam, BigQuery, Pub/Sub and Dataflow.
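
To give a flavour of the stack (this is only an illustrative sketch, not Allegro's production code; the project, subscription and table names below are invented), a minimal streaming Apache Beam pipeline in Java that reads clickstream events from Pub/Sub and appends them to an existing BigQuery table, runnable on Dataflow, might look like this:

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.options.StreamingOptions;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.SimpleFunction;

    public class ClickstreamIngestion {
      public static void main(String[] args) {
        // Parse --runner=DataflowRunner, --project=..., etc. from the command line.
        StreamingOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().as(StreamingOptions.class);
        options.setStreaming(true);

        Pipeline pipeline = Pipeline.create(options);

        pipeline
            // Read raw clickstream events (JSON strings) from a Pub/Sub subscription.
            .apply("ReadClickstream",
                PubsubIO.readStrings().fromSubscription(
                    "projects/example-project/subscriptions/clickstream-events"))
            // Wrap each raw event in a BigQuery row; a real pipeline would parse and validate here.
            .apply("ToTableRow", MapElements.via(new SimpleFunction<String, TableRow>() {
              @Override
              public TableRow apply(String json) {
                return new TableRow().set("raw_event", json);
              }
            }))
            // Stream the rows into an existing BigQuery table.
            .apply("WriteToBigQuery",
                BigQueryIO.writeTableRows()
                    .to("example-project:clickstream.events")
                    .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
                    .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

        pipeline.run();
      }
    }

A production version of such a pipeline would also handle parsing, windowing, dead-letter routing and schema evolution, and would be parameterised rather than hard-coded.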


We are looking for people who

  • Program in languages such as Scala, Java or Python

  • Have a strong understanding of distributed systems, data storage, and processing frameworks like dbt, Spark or Apache Beam

  • Have knowledge of GCP (especially Dataflow and Composer) or other public cloud environments like Azure or AWS

  • Use good practices (clean code, code review, TDD, CI/CD)

  • Navigate efficiently within Unix/Linux systems

  • Possess a positive attitude and team-working skills

  • Are eager for personal development and keeping their knowledge up to date

  • Know English at B2 level 


What we offer

  • Possibility to learn and work with backend (Spring, Kotlin) and AI technologies within the team.

  • Well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms)

  • A wide selection of varied benefits in a cafeteria plan – you choose what you like (e.g. medical, sports or lunch packages, insurance, purchase vouchers)

  • English classes, paid for by us, related to the specific nature of your job

  • Macbook Pro / Air (depending on the role) or Dell with Windows (if you don't like Macs) and other gadgets that you may need

  • Working in a team you can always count on — we have on board top-class specialists and experts in their areas of expertise

  • A high degree of autonomy in terms of organizing your team’s work; we encourage you to develop continuously and try out new things

  • Hackathons, team tourism, training budget and an internal educational platform (including training courses on work organization, means of communications, motivation to work and various technologies and subject-matter issues)

  • If you want to learn more, check it out


Why is it worth working with us

  • At Allegro, you will be responsible for processing petabytes of data and billions of events daily

  • You will become a participant in one of the largest projects of building a data platform in GCP

  • Your development will align with the latest technological trends based on open source principles (data mesh, data streaming)

  • You will have a real impact on the direction of product development and technology choices. We utilize the latest and best available technologies, as we select them according to our own needs

  • You will have the opportunity to work within a team of experienced engineers and big data specialists who are eager to share their knowledge, including publicly through allegro.tech

  • Once a year (or more often if there is an internal business need), you can take the opportunity to work in a different team, known as team tourism

 

Send in your CV and see why it is #dobrzetubyć (#goodtobehere)

Tech stack

    English: B2
    Polish: C1
    Google Cloud Platform: regular
    Linux / Unix: regular
    Java: regular
    Python: regular
    Scala: regular

Published: 11.12.2025

About the company

Allegro

At Allegro, we build and maintain some of the most distributed and scalable applications in Central Europe. Work with us on e-commerce solutions to be used (and loved) by your friends, family and millions of our customer...

