
    Database Engineer 2

    Location: Poznań
    Type of work: Full-time
    Experience: Mid
    Employment type: Permanent
    Operating mode: Hybrid
    Allegro

    We’re Poland’s most popular shopping platform and the largest e-commerce player of European origin. With over 250 million offers, we’re now expanding our successful marketplace model across Central and Eastern Europe. Join us!


    Tech stack

    • SQL: regular
    • Linux: regular
    • MongoDB: nice to have

    Job description

    The hybrid work model requires a minimum of one day per week of in-office work (Poznań).

    As part of the Technical Platform Teams, we provide internal customers with NoSQL databases. We have built our own DBaaS based on Kubernetes. We support development teams with database management, application data modeling, performance troubleshooting, and backup & restore solutions.
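As an aside for readers unfamiliar with running databases on Kubernetes: a self-built DBaaS like the one described typically relies on StatefulSets for stable pod identity and per-pod storage. Purely as an illustration (these names and values are hypothetical, not Allegro's actual manifests), a minimal sketch for a three-member MongoDB replica set could look like:

```yaml
# Hypothetical sketch only — not Allegro's real DBaaS configuration.
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: mongodb
spec:
  serviceName: mongodb        # headless Service gives each pod a stable DNS name
  replicas: 3                 # three members form a MongoDB replica set
  selector:
    matchLabels:
      app: mongodb
  template:
    metadata:
      labels:
        app: mongodb
    spec:
      containers:
        - name: mongod
          image: mongo:7.0
          args: ["--replSet", "rs0"]   # all pods join replica set "rs0"
          ports:
            - containerPort: 27017
          volumeMounts:
            - name: data
              mountPath: /data/db
  volumeClaimTemplates:       # each pod gets its own persistent volume
    - metadata:
        name: data
      spec:
        accessModes: ["ReadWriteOnce"]
        resources:
          requests:
            storage: 100Gi
```

In practice the replica set would still need initialization (rs.initiate()), monitoring, and backup tooling around it; that surrounding platform work is exactly what the team describes providing to internal customers.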


    With us, you will work with MongoDB clusters at scale:

    • Hundreds of databases in production
    • Thousands of data nodes
    • Hundreds of terabytes of data
    • Approximately 1 million operations in aggregate database traffic
    • Databases hosted on Kubernetes
    • Two datacenters and a cloud environment


    We are looking for people who:

    • Have professional experience with any NoSQL or SQL database
    • Are eager to learn MongoDB database administration
    • Have knowledge of data modeling, with a focus on optimal data storage and high performance
    • Have practical expertise in identifying the causes of performance problems, performing load tests, tuning queries, and selecting optimal indexes
    • Use Linux with ease
    • Know English at a minimum B2 level
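To make the query-tuning expectation concrete, here is a minimal, hypothetical sketch of the workflow described above (spot a full table scan, add an index, confirm the planner uses it). It uses SQLite purely for illustration, since it ships with Python; the table and index names are invented for the example:

```python
import sqlite3

# Illustrative only: the tuning loop of "find the slow access path,
# add an index, re-check the plan", shown on an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT total FROM orders WHERE customer_id = ?"

# Before indexing: the planner has no choice but a full table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
print(plan_before)  # plan detail contains a SCAN of the orders table

# Add an index on the filtered column, then re-check the plan.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
print(plan_after)  # plan detail now references idx_orders_customer
```

The same habit of reading the optimizer's plan before and after a change carries over directly to MongoDB, where explain() plays the role of EXPLAIN QUERY PLAN.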


    Nice to have:

    • Experience with MongoDB
    • Knowledge of Kubernetes and public cloud environments
    • Understanding of backup and disaster recovery processes
    • Focus on solving issues at the interface between applications and databases


    What we offer:

    • A hybrid work model that you can agree upon with your leader and the team. We have well-located offices with fully equipped kitchens and bicycle parking facilities, as well as excellent work tools (height-adjustable desks, interactive conference rooms)
    • An annual bonus of up to 10% of the annual gross salary (depending on your annual assessment and the company’s results)
    • A wide selection of fringe benefits through a cafeteria plan – you choose what you like (e.g., medical, sports, or lunch packages, insurance, purchase vouchers)
    • Company-covered English classes tailored to the specific nature of your job
    • A 16" or 14" MacBook Pro with M1 processor and 36GB RAM, or a comparable Dell with Windows (if you don’t prefer Macs)
    • Working in a team you can count on — we have top-class specialists and experts in various fields
    • High autonomy in organizing your team’s work; we encourage continuous development and trying new things
    • Hackathons, team-building activities, a training budget, and an internal educational platform, MindUp (covering work organization, communication, motivation, technologies, and other topics)


    Why is it worth working with us?

    • The IT team includes over 1,700 members who share knowledge at conferences such as Devoxx, Geecon, and Confitura, and contribute to our blog: allegro.tech
    • Microservices – a few thousand microservices and over 1.8 million RPS on our business data bus
    • Big Data – several petabytes of data and Machine Learning used in production
    • We practice Code Review, Continuous Integration, Scrum/Kanban, Domain-Driven Design, Test-Driven Development, and Pair Programming, depending on the team
    • Our internal ecosystem is based on self-service and widely used tools, such as Kubernetes, Docker, Consul, GitHub, and GitHub Actions, enabling you to start developing software using any language, architecture, and scale, limited only by your creativity and imagination
    • To match the scale, we also build entire platforms of tools and technologies that accelerate and facilitate development and ensure an excellent Developer Experience
    • Technological autonomy: you choose the best technology to solve each problem (no need for management’s consent) — you’re responsible for what you create
    • Our deployment environment combines private data centers (tens of thousands of servers) and public clouds (Google Cloud and Microsoft Azure)
    • Over 100 original open-source projects and a few thousand stars on GitHub
    • We organize Allegro Tech Live, a fully remote version of our Allegro Tech Talks meetups, and we are often invited to communities like Warsaw AI, JUG (Poznań, Łódź, Lublin, Wrocław), WG .Net, Dare IT, and the Women in Tech Summit
    • We focus on development as well. We organize hackathons and internal conferences (e.g., the annual Allegro Tech Meeting). If you want to keep growing and share your knowledge, we will support you


    This may also interest you:

    Allegro Tech Podcast → https://podcast.allegro.tech/

    Send us your CV and see why it’s #goodtobehere

    Check similar offers

    • Data Software Engineer (Upskilling position for Python Developers) at EPAM Systems, Poznań, fully remote, undisclosed salary (Python, AWS, Azure)
    • ETL Developer at PKO BP Finat, Warszawa, fully remote, undisclosed salary (ETL, Informatica PowerCenter, Oracle)
    • SQL Administrator - IT Software Engineering at KPMG, Poznań, fully remote, undisclosed salary (T-SQL, SQL, HA)
    • Snowflake Lead Engineer at Comscore (via CC), Wrocław, fully remote, 6.08K-7.3K USD (Python, Snowflake, Java)
    • Data Engineer at Dataplace.ai, Warszawa, fully remote, undisclosed salary (PySpark, SQL, ETL)