(Senior) Data Engineer (fully remote possible)

    Wrocław
    5 523 USD net/month (B2B)
    Type of work: Full-time
    Experience: Senior
    Employment Type: B2B
    Operating mode: Remote
    Limango Polska

    At limango, we've been in e-commerce for 17 years. We're part of the OTTO Group, one of Europe's top e-commerce companies, and we run platforms in Poland, the Netherlands, and Germany. We're the shopping platform with the biggest selection of products for the whole family! We work and play together. We value work-life balance and create a culture of respect, trust, and equality. If you're looking for a company that shares these values, we'd love to have you on board.


    Tech stack

    • AWS: regular
    • Databricks: regular
    • PySpark: regular
    • SparkSQL: regular

    Job description

    Join the limango IT team!

    • In limango IT, you will have the chance to contribute your own ideas and know-how to maintain and develop our high-traffic, self-built online shop for our markets in Poland, Germany, Austria, and the Netherlands
    • Together with our experienced, international development teams, you will take our shop sites to the next level: contribute your expertise to our Access, Shop, and Checkout teams!
    • You will work with a modern tech stack: React frontend, cloud-based and virtualized infrastructure, service-oriented architecture, and event-based communication
    • The following technologies await you: AWS, Docker, Kubernetes, microservices, Go, React, JSS, Jest, ES6/TypeScript
    • We work agile: you will become part of our well-coordinated agile teams of versatile, experienced web developers, a dedicated Product Owner, and a Scrum Master


    Tasks and Role

    • Developing a Central Lakehouse Data Platform leveraging the AWS/Databricks ecosystem.
    • Managing Data Infrastructure using Terraform (AWS/Databricks) and ETL code (PySpark, Delta Live Tables, SparkSQL); an illustrative sketch follows this list.
    • Implementing System Enhancements, including monitoring, data quality tests, unit testing, and automated alerts.
    • Refactoring Legacy AWS Solutions (AWS Glue, Redshift) into a modern Lakehouse ecosystem with CI/CD pipelines, data quality testing, and robust monitoring.
    • Supporting Medallion Lakehouse Architecture Development for machine learning and analytics data, introducing data-mesh concepts.
    • Collaborating with Business Analytics and ML Teams to optimize and extend data platform capabilities.
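
    For illustration only, here is a minimal PySpark sketch of the kind of ETL step described above: promoting raw "bronze" events to a cleaned "silver" Delta table behind a simple data quality gate. All table and column names are hypothetical, not part of the actual platform.

        # Illustrative sketch only - table and column names are hypothetical.
        from pyspark.sql import SparkSession, functions as F

        spark = SparkSession.builder.appName("orders-bronze-to-silver").getOrCreate()

        # Read raw ingested events from the bronze layer.
        bronze = spark.read.table("bronze.order_events")

        # Clean, type, and deduplicate the data for the silver layer.
        silver = (
            bronze
            .filter(F.col("order_id").isNotNull())
            .withColumn("order_ts", F.to_timestamp("order_ts"))
            .dropDuplicates(["order_id"])
        )

        # Simple data quality gate before publishing the table.
        if silver.filter(F.col("amount") < 0).count() > 0:
            raise ValueError("Negative order amounts found; aborting publish")

        silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")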


    Candidate's Profile

    Must-Haves:

    • At least basic-level experience with Terraform and PySpark.
    • A degree in Information Systems, Computer Science, Mathematics/Physics, Engineering, or a related field.
    • Several years of experience building data engineering solutions (preferably in E-commerce).
    • Strong proficiency in Python programming, including application building, automated testing, packaging, and deployment (see the test sketch after this list).
    • Hands-on experience with PySpark for data processing and a solid understanding of Spark processing engine architecture.
    • Advanced SQL processing skills.
    • Familiarity with data lakes architecture and data modeling paradigms.
    • Proven experience with Terraform infrastructure-as-code and CI/CD deployments.
    • Background in building and deploying data engineering ETL applications, including CI/CD pipelines in Git environments.
    • Fluency in English, as we work in an international environment.
    • A team player mindset with a willingness to learn modern technologies and frameworks.
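
    To make the testing expectation concrete, here is a minimal, hypothetical pytest-style unit test for a small PySpark transformation; the function, columns, and VAT rate are made up for illustration.

        # Illustrative sketch only - add_gross_amount and its columns are hypothetical.
        import pytest
        from pyspark.sql import SparkSession, functions as F


        def add_gross_amount(df, vat_rate=0.23):
            """Add a VAT-inclusive amount column to an orders DataFrame."""
            return df.withColumn("gross_amount", F.round(F.col("net_amount") * (1 + vat_rate), 2))


        def test_add_gross_amount():
            spark = SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()
            df = spark.createDataFrame([(1, 100.0)], ["order_id", "net_amount"])

            result = add_gross_amount(df).collect()[0]

            assert result["gross_amount"] == pytest.approx(123.0)
            spark.stop()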


    Nice-to-Have Skills: 

    • Familiarity with AWS Cloud architecture and the Databricks platform.
    • Experience with AWS Glue and Redshift warehousing.
    • Knowledge of CloudFormation templates.
    • Experience with AWS Kinesis and Spark streaming processing.
    • Good knowledge of Scala programming, functional programming, and application development.
    • Experience in MLOps environments (e.g., MLFlow).


    What you can count on:

    • Exciting challenges and a huge influence on our project and business: you'll be responsible for bringing our new projects live.
    • Flat hierarchies and direct communication in a friendly, collegial atmosphere.
    • A steep learning curve in a dynamic, internationally oriented company.
    • Flexible working hours and your choice of a B2B contract or an employment contract.
    • Stability even in pandemic times: a growing, profitable business and proper safety rules.
    • Opportunities to develop your skills through training and cooperation with international experts.


    Benefits

    • Private healthcare: we provide access to the best specialists for you and your loved ones.
    • Language classes: English and German lessons in small groups, tailored to your skills.
    • Remote work and flexible working hours: possibility of partial remote work, as well as adjusting working hours to your daily schedule.
    • Office in the center of Wrocław: nearby cinema, fitness club, and a large selection of lunch places.

    Check similar offers

    • Senior Data Scientist, IP Partner, 4.32K - 7.45K USD, Warszawa (fully remote): Azure Services, Azure Databricks, Real World Data
    • Senior Data Analyst, hubQuest, 4.8K - 6.24K USD, Katowice (fully remote): SQL, Python, Power BI and/or Tableau
    • Data Engineer, Decerto, 5.76K - 7.69K USD, Lublin (fully remote): PySpark, Data, SQL
    • Data Engineer with GCP, Holisticon Insight, 6K - 7.45K USD, Wrocław (fully remote): SQL, Terraform, GCP
    • Senior Data Engineer, Idego, 6.48K - 7.69K USD, Kraków (fully remote): Snowflake, DBT, Fivetran