
Data Engineer - Delivery Experience
Warszawa
3 799 - 5 267 USD/month (gross) - Permanent
Type of work: Full-time
Experience: Mid
Employment type: Permanent
Operating mode: Hybrid
Allegro

At Allegro, we build and maintain some of the most distributed and scalable applications in Central Europe. Work with us on e-commerce solutions to be used (and loved) by your friends, family and millions of our customers.

Tech stack

- Polish - B2
- English - B2
- SQL - regular
- Google Cloud Platform - regular
- ETL - regular
- ELT - regular
- Python - regular
- PySpark - regular
- Airflow - regular
- CI/CD - regular
- DWH - regular

Job description

About the team

The salary range for this position (contract of employment) is 14 200 - 19 690 PLN gross per month.

Our hybrid work model requires one day a week in the office.

 

In the Delivery Experience area, we are building technology that makes Allegro's deliveries easy, cost-effective, fast and predictable. Our team takes care of critical services along the Allegro shopping journey: predicting delivery times using statistical algorithms and machine learning, selecting the best delivery methods for each customer, and integrating with carrier companies. Delivery Experience is also one of our fastest-growing areas, where we take on new, complex projects to enhance logistics and warehousing processes.

We are looking for a Mid/Senior Data Engineer to focus on data processing and preparation, as well as the deployment and maintenance of our data projects. Join our team to sharpen your skills in deploying data-driven processes and DataOps practices, and share that knowledge within the team.


Your main responsibilities:

- You will be actively responsible for developing and maintaining processes for handling large volumes of data

- You will be streamlining and developing the data architecture that powers our analytical products, working alongside a team of experienced analysts

- You will be monitoring and enhancing the quality and integrity of our data

- You will manage and optimize costs related to our data infrastructure and data processing on GCP


This is the right job for you if:

- You have at least 3 years of experience as a Data Engineer working with large datasets.

- You have experience with cloud providers (GCP preferred).

- You are highly proficient in SQL.

- You have a strong understanding of data modeling and cloud DWH architecture.

- You have experience in designing and maintaining ETL/ELT processes.

- You are capable of optimizing cost and efficiency of data processing.

- You are proficient in Python for working with large data sets (using PySpark or Airflow).

- You use good practices (clean code, code review, CI/CD).

- You have a high degree of autonomy and take responsibility for developed solutions.

- You have English proficiency on at least B2 level.

- You like to share knowledge with other team members.

 

Nice to have:

- Experience with Azure and cross-cloud data transfers and multi-cloud architecture


What we offer:

- Big Data is not an empty slogan for us, but a reality - you will be working on truly big datasets (petabytes of data).

- You will have a real impact on the direction of product development and technology choices. We use the latest and best available technologies, selected according to our own needs.

- Our tech stack includes: GCP, BigQuery, (Py)Spark, Airflow.

- We are a close-knit team that works well together.

- You will have the opportunity to work with a team of experienced engineers and big data specialists who are eager to share their knowledge, including publicly through allegro.tech.

 

Apply to Allegro and see why it is #dobrzetubyć (#goodtobehere)
