Data Platform Engineer (with Python)
Allegro
Warszawa
Type of work: Full-time
Experience: Mid
Employment Type: Permanent
Operating mode: Hybrid
Company profile

We’re Poland’s most popular shopping platform and the largest e-commerce player of European origin. With over 250 million offers, we’re now expanding our successful marketplace model across Central and Eastern Europe. Join us!

Tech stack

    Java/Python — regular
    Hadoop — regular
    GCP — regular
    Big Data — nice to have

Job description

The salary range for this position is (contract of employment):

mid: 12 300 - 17 600 PLN gross

senior: 16 100 - 23 200 PLN gross


A hybrid work model that you will agree on with your leader and the team.


As part of the Data & AI area, we deliver practical data science and artificial intelligence applications on a scale unprecedented in Poland. Data & AI is a group of over 150 experienced BigData engineers organized into more than a dozen teams with various specializations. Some build dedicated tools for creating and launching BigData processes or for deploying ML models across the entire organization. Others work closer to the customer and are responsible for implementing the search engine, creating recommendations, building buyer profiles, or developing an experimentation platform. The area also includes research teams whose aim is to solve non-trivial problems that require machine learning.


We are looking for BigData engineers who want to build a data platform and solutions for millions of Allegro customers, e.g.:

  • An integrated platform of tools for internal BigData developers that automates the tedious tasks of developing, deploying, and maintaining data analysis solutions on Google Cloud
  • Solutions enabling Generative AI across the organization, based on Google Cloud and Microsoft Azure tooling
  • Governance of internal data platform costs
  • Keeping an eye on news from cloud providers, and testing and deploying new features
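
The cost-governance work mentioned above can be illustrated with a minimal, self-contained sketch: aggregating per-team on-demand query costs from usage records, the kind of summary a platform team might surface to its internal users. All names, the record shape, and the pricing constant are illustrative assumptions, not Allegro’s actual tooling or GCP’s current prices.

```python
from collections import defaultdict

# Hypothetical on-demand query price (USD per TiB scanned).
# Real BigQuery pricing varies by region and edition; illustration only.
PRICE_PER_TIB_USD = 6.25

def cost_per_team(usage_records):
    """Aggregate bytes scanned into a per-team cost estimate.

    usage_records: iterable of (team, bytes_scanned) pairs, e.g. as
    exported from query audit logs (hypothetical shape).
    """
    totals = defaultdict(int)
    for team, bytes_scanned in usage_records:
        totals[team] += bytes_scanned
    tib = 1024 ** 4
    return {team: round(b / tib * PRICE_PER_TIB_USD, 2)
            for team, b in totals.items()}

# Example usage with made-up numbers: "search" scanned 4 TiB in total,
# "recommendations" scanned 1 TiB.
records = [("search", 3 * 1024 ** 4),
           ("recommendations", 1024 ** 4),
           ("search", 1024 ** 4)]
print(cost_per_team(records))
```

In practice such a summary would be fed by billing exports or audit logs rather than in-memory tuples, but the aggregation step looks much the same.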


We are looking for people who:

  • Program in languages such as Python or Java
  • Have experience with the Big Data ecosystem, e.g. Hadoop, Spark, Airflow
  • Know GCP or other public cloud environments (Dataproc, Composer, BigQuery)
  • Have experience with Terraform or other infrastructure-as-code tools
  • Follow good practices (clean code, code review, TDD, CI/CD)
  • Navigate efficiently within Unix/Linux systems
  • Are interested in applications of ML/AI
  • Have a positive attitude and teamwork skills
  • Are eager for personal development and keep their knowledge up to date
  • Know English at B2 level and Polish at minimum C1 level


What we offer:

  • A hybrid work model that you will agree on with your leader and the team. We have well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms)
  • Annual bonus of up to 10% of your gross annual salary (depending on your annual assessment and the company's results)
  • A wide selection of fringe benefits in a cafeteria plan – you choose what you like (e.g. medical, sports or lunch packages, insurance, purchase vouchers)
  • English classes, paid for by us, tailored to the specific nature of your job
  • A 16" or 14" MacBook Pro with an M1 processor and 32 GB RAM, or a corresponding Dell with Windows (if you don’t like Macs), plus any other gadgets you may need
  • Working in a team you can always count on — we have top-class specialists and experts in their fields on board
  • A high degree of autonomy in terms of organizing your team’s work; we encourage you to develop continuously and try out new things
  • Hackathons, team tourism, training budget and an internal educational platform, MindUp (including training courses on work organization, means of communications, motivation to work and various technologies and subject-matter issues)


Why is it worth working with us?

  • At Allegro, you will be responsible for processing petabytes of data and billions of events daily
  • You will take part in one of the largest projects building a data platform on GCP
  • Your development will align with the latest technological trends, based on open-source principles (data mesh, data streaming)
  • You will have a real impact on the direction of product development and technology choices; we use the latest and best available technologies, chosen to match our own needs
  • You will have the opportunity to work within a team of experienced engineers and big data specialists who are eager to share their knowledge, including publicly through allegro.tech
  • Once a year, you can take advantage of the opportunity to work in a different team (known as team tourism)