    Data Engineer - A/B Testing Platform Team - Data&AI

    Location: Warszawa
    Salary: 14 200 - 19 690 PLN gross/month
    Type of work: Full-time
    Experience: Mid
    Employment type: Permanent
    Operating mode: Hybrid
    Company: Allegro

    We’re Poland’s most popular shopping platform and the largest e-commerce player of European origin. With over 250 million offers, we’re now expanding our successful marketplace model across Central and Eastern Europe. Join us!

    Tech stack

      • English: advanced
      • Data: regular
      • SQL: regular

    Job description


    The salary range for this position is (contract of employment):

    mid: 14 200 - 19 690 PLN in gross terms

    A hybrid work model requires 1 day a week in the office.

    We are seeking a passionate Data Engineer to join the newly forming A/B Testing Platform team in the Data Science Hub, where we apply analytical techniques, mathematics, and machine learning to solve a wide range of business problems.


    About the team

    The A/B Testing Platform team is a multidisciplinary group of product analysts, software engineers, and data engineers. Our mission is to strategically enhance our A/B testing platform, a critical tool that empowers data-driven decision-making on the rollout of new features by assessing their potential impact through user behavior analysis. Through this work, the team plays a pivotal role in shaping the overall user experience on Allegro, one of the world's largest eCommerce platforms.
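    As a toy sketch of the statistics such a platform automates, a two-sided two-proportion z-test for comparing conversion rates between two variants can be written with the Python standard library alone. The conversion counts below are hypothetical, not real Allegro data:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test: does variant B convert differently from A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error under H0
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF, via erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: control converts 500/10000, treatment 590/10000
z, p = two_proportion_z(500, 10_000, 590, 10_000)
```

    A production platform would layer sequential testing, multiple-comparison corrections, and variance-reduction techniques on top of a basic test like this.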


    We are looking for people who:

    • Have a Bachelor's or Master's degree in Computer Science, Mathematics, or a related field.
    • Know English at a minimum B2 level.
    • Have proven experience as a Data Engineer or in a similar role.
    • Possess the necessary data-related skill set, meaning:
    • Are able to work fluently with SQL, preferably GCP BigQuery.
    • Have knowledge of Big Data tools in Google Cloud Platform, AWS, or Azure.
    • Have experience with message broker systems and streaming data processing, e.g. Pub/Sub, Apache Beam.
    • Are familiar with data pipeline orchestration tools such as Apache Airflow.
    • Have experience in Python programming and are familiar with software engineering best practices (PEP8, clean architecture, code review, CI/CD, etc.).
    • Have experience with Infrastructure as Code tools (Terraform is welcome).
    • Have proven commercial experience in DevOps and CI/CD practices.
    • Have strong communication skills and can convey complex ideas clearly and concisely.
    • Are detail-oriented and capable of working in a fast-paced, dynamic environment.
    • Have a positive attitude and the ability to work in a team.
    • Are eager to constantly develop and broaden their knowledge.
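    For a flavor of the streaming-processing skills listed above, the windowed aggregation at the heart of tools like Apache Beam can be sketched in miniature with plain Python. This is a toy, in-memory stand-in with made-up event data, not how the actual platform is built:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_sec=60):
    """Group (timestamp, key) events into fixed tumbling windows and count per key."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_sec) * window_sec  # align to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical click events: (unix_timestamp, experiment_variant)
events = [(0, "A"), (10, "A"), (30, "B"), (65, "A"), (70, "B"), (119, "B")]
counts = tumbling_window_counts(events, window_sec=60)
# events at 0-59s fall in window 0, events at 60-119s in window 60
```

    A real streaming engine adds the hard parts this sketch ignores: unbounded input, out-of-order events, watermarks, and exactly-once state.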


    In your daily work you will handle the following tasks:

    • Designing, developing, and maintaining robust, scalable data pipelines.
    • Collaborating closely with product managers, UX designers, data analysts, and software engineers to understand their requirements and deliver high-quality, prepared data that enables their work.
    • Building, testing, and maintaining data systems for accuracy and for readiness to feed a larger pipeline with streaming data flow.
    • Designing and implementing data schemas, data models, message brokers, and SQL/NoSQL databases.
    • Optimizing data systems and building them from the ground up to deliver insights to data analytical systems.
    • Implementing data pipelines and automated workflows required for the A/B testing platform.
    • Ensuring data privacy and compliance standards across all projects.
    • Operating across multiple platforms and technologies such as Google Cloud Platform, Azure Cloud, and Allegro Data Centers.
    • Delivering solutions for multiple markets.
    • Balancing project work with ad-hoc support for requests from Product Managers and Data Analysts.
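    As a minimal sketch of the schema-design work above, an experiment exposure record with basic validation might look like the following. The field names and allowed variants are illustrative assumptions, not Allegro's actual data model:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ExposureEvent:
    """One record of a user being exposed to an A/B experiment variant (hypothetical schema)."""
    user_id: str
    experiment_id: str
    variant: str          # e.g. "control" or "treatment"
    exposed_at: datetime

    def __post_init__(self):
        # Basic schema checks a pipeline might enforce before loading downstream
        if not self.user_id or not self.experiment_id:
            raise ValueError("user_id and experiment_id must be non-empty")
        if self.variant not in {"control", "treatment"}:
            raise ValueError(f"unknown variant: {self.variant!r}")

event = ExposureEvent("u123", "exp42", "treatment",
                      datetime(2024, 1, 1, tzinfo=timezone.utc))
```

    Validating records at ingestion like this keeps malformed events out of downstream BigQuery tables and experiment analyses.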


    Why it is worth working with us:

    • Data plays a key role in the operation of Allegro - we are a data-driven technology company, and through the models and analyses you provide, you will have a significant impact on one of the largest eCommerce platforms in the world.
    • Gain invaluable experience and deepen your skills through continuous learning and development opportunities.  
    • Collaborate with a network of industry experts, enhancing your professional growth and knowledge sharing.  
    • We are happy to share our knowledge. You can meet our speakers at hundreds of technology conferences such as the Data Science Summit and the Big Data Technology Warsaw Summit. We also publish content on the allegro.tech blog.
    • We use, depending on teams and their needs, the latest versions of Java, Scala, Kotlin, Groovy, Go, Python, Spring, Reactive Programming, Spark, Kubernetes, TensorFlow.  
    • Microservices – a few thousand microservices and 1.8m+ rps on our business data bus.  
    • In the Data&AI area you would be part of a team of over 200 data, ML, and product specialists that oversees dozens of products and a few hundred production ML models, and governs all data in Allegro (several dozen petabytes in scale).
    • We practice Code Review, Continuous Integration, Scrum/Kanban, Domain Driven Design, Test Driven Development, Pair Programming depending on the team.  
    • GenAI tools (e.g., Copilot, internal LLM bots) support our everyday work.  
    • Our internal ecosystem is based on self-service and widely used tools, such as Kubernetes, Docker, GitHub (including CI/CD). This will allow you, from day one, to develop software using any language, architecture and scale, restricted only by your creativity and imagination.
    • We actively participate in the life of the biggest user groups in Poland centered around technologies we use at work (Java, Python, DevOps).  
    • Technological autonomy: you get to choose which technology solves the problem at hand (no need for management's consent), and you are responsible for what you create.
    • Once a year (or more often if there's an internal business need), you can take advantage of the opportunity to work in a different team (known as team tourism).


    What we offer:

    • A hybrid work model that you will agree on with your leader and the team. We have well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms).  
    • Annual bonus up to 10% of the annual salary gross (depending on your annual assessment and the company's results).  
    • A wide selection of fringe benefits in a cafeteria plan – you choose what you like (e.g., medical, sports or lunch packages, insurance, purchase vouchers).  
    • English classes, paid for by us, related to the specific nature of your job.
    • 16" or 14" MacBook Pro with an M1 processor and 32GB RAM, or a corresponding Dell with Windows (if you don’t like Macs), and other gadgets that you may need.
    • Working in a team you can always count on — we have on board top-class specialists and experts in their areas of expertise.  
    • A high degree of autonomy in terms of organizing your team’s work. We encourage you to develop continuously and try out new things.  
    • Hackathons, team tourism, training budget and an internal educational platform, MindUp (including training courses on work organization, means of communication, motivation to work and various technologies and subject-matter issues).  
    • If you want to learn more, check out this webpage or listen to the Allegro Tech Podcast Episode about recent projects in the Data Science Hub.  


    Apply to Allegro and see why it is #dobrzetubyć (#goodtobehere)


