    Data Engineer

    Addepto
    Warszawa
    Type of work: Full-time
    Experience: Senior
    Employment Type: B2B
    Operating mode: Remote

    Tech stack

    • AWS – advanced
    • Spark – advanced
    • SQL – advanced
    • Python – advanced
    • Data Integration – advanced
    • ETL tools – regular
    • Hadoop – regular
    • NiFi – nice to have
    • GCP – nice to have
    • Kafka – nice to have

    Job description

    Online interview

    We are Addepto, where you can feel a startup atmosphere! We believe that the only constant in life is change, so we keep developing and improving to become better at what we do every day. We think outside the box to create and deliver the best solutions in the areas of Big Data, AI, and Business Intelligence.


    Our team, based in Warsaw and working remotely, is looking for a Senior Data Engineer who will focus mainly on designing and constructing data processing architecture.


    Some of our recent Big Data projects:

    • Data lakes that store terabytes of data and process machine learning tasks for a large telecom company
    • Streaming applications that serve data analytics in real time for manufacturing companies
    • Systems that support the decision-making process and help analyze data in a unified format for controlling and operations departments
    • Systems that support real-time machine learning predictions on massive datasets, preventing losses for pharmaceutical companies
    • And more!


    What we offer:

    • Work in a well-coordinated team of passionate enthusiasts of Big Data & Artificial Intelligence
    • Fast career path and the opportunity to develop your qualifications thanks to sponsored training, conferences, and many other development opportunities in various areas
    • Challenging international projects for global clients and innovative start-ups
    • Friendly atmosphere, outstanding people and great culture – autonomy and supportive work environment are crucial for us
    • Flexible working hours – you can adjust your schedule to better fit your daily routine
    • Possibility of both remote and office-based work – modern office space available in Warsaw, Cracow, Wroclaw, and Bialystok, or a coworking space anywhere in Poland if needed
    • Any form of employment – we offer B2B, an employment contract, or a contract of mandate
    • Paid vacation – 20 fully paid days off if you choose B2B or a contract of mandate
    • Other benefits – e.g. great team-building events, language classes, trainings & workshops, knowledge sharing sessions, medical & sports package, and others


    Responsibilities:

    • Design and construct scalable data processing architecture
    • Design, build, and deploy effective data ingestion pipelines/streams in StreamSets Data Collector or Kafka
    • Build applications that aggregate, process, and analyze data from various sources
    • Cooperate with the Data Science department on Machine Learning projects (including text/image analysis and building predictive models)
    • Use Big Data and BI technologies (e.g. Spark, Kafka, Hadoop, SQL) – a minimal PySpark sketch of this kind of pipeline follows this list
    • Manage distributed database systems such as ClickHouse, BQ, Teradata, Oracle Exadata, and PostgreSQL + Citus; data modelling with star and snowflake schemas
    • Develop and organize data transformations in DBT and Apache Airflow – see the Airflow sketch after this list
    • Gather requirements from the business and translate them into technical code
    • Ensure the best possible performance and quality of the delivered packages
    • Manage business users’ expectations
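
    To give a flavour of the ingestion and aggregation work described above, here is a minimal, hypothetical PySpark sketch – not Addepto's production code. The Kafka topic, event schema, broker address, and output paths are invented for illustration, and the Spark Kafka connector package must be available at runtime.

    ```python
    # A minimal, hypothetical sketch of a streaming ingestion + aggregation job
    # of the kind described in the responsibilities above; it is NOT Addepto's
    # code. Topic, schema, broker, and paths are invented.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import (DoubleType, StringType, StructField,
                                   StructType, TimestampType)

    spark = SparkSession.builder.appName("ingestion-aggregation-sketch").getOrCreate()

    # Hypothetical schema for JSON sensor events.
    schema = StructType([
        StructField("device_id", StringType()),
        StructField("metric", StringType()),
        StructField("value", DoubleType()),
        StructField("event_time", TimestampType()),
    ])

    # Ingest raw events from an assumed Kafka topic.
    raw = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")
        .option("subscribe", "sensor-events")
        .load()
    )

    # Parse the Kafka message value into typed columns.
    events = (
        raw.select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
           .select("e.*")
    )

    # Aggregate: average value per device and metric over 10-minute windows,
    # with a watermark so late data stays bounded.
    aggregated = (
        events.withWatermark("event_time", "15 minutes")
              .groupBy(F.window("event_time", "10 minutes"), "device_id", "metric")
              .agg(F.avg("value").alias("avg_value"), F.count("*").alias("event_count"))
    )

    # Write the aggregates to Parquet for downstream analytics (placeholder paths).
    query = (
        aggregated.writeStream.outputMode("append")
                  .format("parquet")
                  .option("path", "s3a://example-bucket/aggregates/")
                  .option("checkpointLocation", "s3a://example-bucket/checkpoints/aggregates/")
                  .start()
    )
    query.awaitTermination()
    ```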
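
    Similarly, a minimal, hypothetical Apache Airflow sketch of how a DBT run could be orchestrated after an ingestion step. The DAG id, schedule, and dbt project path are placeholders, not details taken from this posting.

    ```python
    # A minimal, hypothetical Airflow DAG: run dbt models after an ingestion step.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_transformations_sketch",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Placeholder ingestion step; in practice this might trigger a Spark,
        # StreamSets, or NiFi job.
        ingest = BashOperator(
            task_id="ingest_raw_data",
            bash_command="echo 'ingest raw data (placeholder)'",
        )

        # Run dbt models against the warehouse (project path is hypothetical).
        run_dbt = BashOperator(
            task_id="run_dbt_models",
            bash_command="dbt run --project-dir /opt/dbt/analytics",
        )

        ingest >> run_dbt
    ```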


    Requirements:

    • Higher education in a technical or mathematical field (or being in the last year of studies)
    • Commercial experience in the implementation, development, or maintenance of Business Intelligence or Big Data systems
    • Knowledge of Python (or Java/Scala), SQL and Spark
    • Hands-on experience with Big Data technologies
    • Good command of the English language (min. B2+)
    • Experience with cloud services (preferably AWS)
    • Independence and responsibility for delivering a solution
    • Excellent knowledge of dimensional data modelling
    • Good communication and soft skills
    • Ability to lead discussions and requirement sessions, and to comprehend, summarize, and finalize requirements
    • Familiarity with NiFi, Docker, Kafka, Airflow, Splunk, DBT, Dagster, Hadoop, Databricks


    Are you interested in Addepto and would you like to join us?

    Get in touch! We are looking forward to receiving your application. Would you like to know more about us?

    Visit our website (career page) and social media (Facebook, LinkedIn).