    Data Engineer

    Datumo
3 530 - 5 044 USD net/month - B2B
Type of work: Full-time
Experience: Mid
Employment type: B2B
Operating mode: Remote

Tech stack

• Cloud - advanced
• Python - advanced
• Big Data - advanced
• Snowflake - regular
• Databricks - regular
• Scala - regular
• Apache Spark - regular
• Airflow - junior
• Docker - junior
• Kafka - junior

    Job description


Datumo specializes in providing Big Data and Cloud consulting services to clients from all over the world, primarily in Western Europe, Poland and the USA. The core industries we support include e-commerce, telecommunications and life sciences. Our team consists of exceptional people whose commitment allows us to deliver highly demanding projects.


    Our team members tend to stick around for more than 3 years, and when a project wraps up, we don't let them go - we embark on a journey to discover exciting new challenges for them. It's not just a workplace; it's a community that grows together! 



    What we expect: 


    Must-have: 

• at least 3 years of commercial experience in Big Data
• proven track record with a major cloud provider (GCP preferred; Azure or AWS also welcome)
• good knowledge of a JVM language - Scala, Java or Kotlin
• good knowledge of Python
• good knowledge of SQL
• understanding of Apache Spark or a similar distributed data processing framework
• experience with BigQuery, Snowflake, Hive or a similar distributed datastore
• experience designing and implementing Big Data systems following best practices
• ensuring solution quality through automated tests, CI/CD and code reviews
• proven collaboration with business stakeholders
• English proficiency at B2 level and communicative Polish


    Nice to have:

• experience with the Snowflake or Databricks platforms
• familiarity with Apache Airflow or a similar pipeline orchestrator
• knowledge of Apache Kafka, Docker and Kubernetes
• knowledge of real-time data processing
• experience in Machine Learning projects
• experience with Apache Flink
• willingness to share knowledge (conferences, articles, open-source projects)



    What’s on offer:

• 100% remote work, with the option of workation
• 20 paid days off
• onboarding with a dedicated mentor
• possibility to switch projects after a certain period
• individual budget for training and conferences
• benefits: Medicover private medical care, co-financing of the Medicover Sport card
• opportunity to learn English with a native speaker
• regular company trips and informal get-togethers


    Development opportunities in Datumo:

• participation in industry conferences
• contributing to Datumo's online brand presence
• support in obtaining certifications (e.g. GCP, Azure, Snowflake)
• involvement in internal initiatives, like building technological roadmaps
• a dedicated training budget
• access to internal technical training repositories



Discover some of our example projects:


IoT data ingestion to the cloud

The project integrates data from edge devices into the cloud using Azure services. The platform supports data streaming either through the IoT Edge environment with Java or Python modules, or through a direct connection to Event Hubs using the Kafka protocol. It also facilitates batch data transmission to ADLS. Data transformation from raw telemetry into structured tables is done through Spark jobs in Databricks, or through data connections and update policies in Azure Data Explorer.
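As a rough illustration of the streaming leg described above, here is a minimal Spark Structured Streaming sketch (PySpark, as it might run in Databricks) that reads telemetry from Event Hubs over the Kafka protocol and lands it as a structured Delta table. The namespace, topic, telemetry schema and storage paths are invented placeholders, not details of the actual project.

```python
# Minimal sketch: Event Hubs (Kafka protocol) -> structured Delta table.
# All names below (<namespace>, topic, schema, paths) are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("iot-telemetry-ingest").getOrCreate()

# Hypothetical shape of one raw telemetry event.
telemetry_schema = StructType([
    StructField("deviceId", StringType()),
    StructField("temperature", DoubleType()),
    StructField("eventTime", TimestampType()),
])

raw = (spark.readStream
    .format("kafka")
    # Event Hubs exposes a Kafka-compatible endpoint on port 9093.
    .option("kafka.bootstrap.servers", "<namespace>.servicebus.windows.net:9093")
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option("kafka.sasl.jaas.config",
            'org.apache.kafka.common.security.plain.PlainLoginModule required '
            'username="$ConnectionString" password="<event-hubs-connection-string>";')
    .option("subscribe", "telemetry")
    .load())

# Raw telemetry -> structured table, mirroring the Databricks transformation step.
structured = (raw
    .select(F.from_json(F.col("value").cast("string"), telemetry_schema).alias("e"))
    .select("e.*"))

(structured.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/telemetry")
    .start("/mnt/tables/telemetry"))
```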


    Petabyte-scale data platform migration to Google Cloud

The goal of the project is to improve the scalability and performance of the data platform by transitioning over a thousand active pipelines to GCP. The main focus is on rearchitecting the existing Spark applications to either Cloud Dataproc or BigQuery SQL, depending on the client's requirements, and automating them with Cloud Composer.
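For the orchestration side, a migrated pipeline could be expressed as a Cloud Composer (Airflow) DAG along the lines of the sketch below: a Spark job submitted to Dataproc followed by a BigQuery SQL step. The project, cluster, jar and table names are hypothetical placeholders, not taken from the actual migration.

```python
# Illustrative Composer/Airflow DAG: Dataproc Spark step, then a BigQuery SQL step.
# Project, region, cluster, jar and table identifiers are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="migrated_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    spark_step = DataprocSubmitJobOperator(
        task_id="spark_transform",
        project_id="<project>",
        region="europe-west1",
        job={
            "placement": {"cluster_name": "<cluster>"},
            "spark_job": {
                "main_class": "com.example.Transform",  # hypothetical job class
                "jar_file_uris": ["gs://<bucket>/jobs/transform.jar"],
            },
        },
    )

    bq_step = BigQueryInsertJobOperator(
        task_id="bq_aggregate",
        configuration={
            "query": {
                "query": "SELECT device_id, AVG(temperature) AS avg_temp "
                         "FROM `<project>.staging.telemetry` GROUP BY device_id",
                "useLegacySql": False,
            }
        },
    )

    spark_step >> bq_step
```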


    Data analytics platform for investing company

The project centers on developing and overseeing a data platform for an asset management company focused on ESG investing, with Databricks as the central component. The platform, built on the Azure cloud, integrates various Azure services for diverse functionalities. The primary task involves implementing and extending complex ETL processes that enrich investment data, using Spark jobs written in Scala. Integrations with external data providers, as well as solutions for improving data quality and optimizing cloud resources, have also been implemented.
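The enrichment jobs in this project are written in Scala; purely to keep these sketches in a single language, here is a PySpark analogue of one hypothetical enrichment step that joins investment positions with ESG scores from an external provider. The table paths and column names (isin, esg_score) are invented for illustration.

```python
# Hypothetical enrichment step: join investment positions with external ESG scores.
# Paths and columns are invented; the production jobs are Scala Spark, not PySpark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("esg-enrichment").getOrCreate()

positions = spark.read.format("delta").load("/mnt/silver/positions")
esg_scores = spark.read.format("delta").load("/mnt/silver/esg_scores")

enriched = (positions
    .join(esg_scores, on="isin", how="left")
    # Flag holdings without an ESG score so downstream data-quality checks catch them.
    .withColumn("esg_missing", F.col("esg_score").isNull()))

enriched.write.format("delta").mode("overwrite").save("/mnt/gold/positions_enriched")
```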


Real-time Consumer Data Platform

The initiative involves building a consumer data platform (CDP) for a major Polish retail company. Datumo has been actively involved since the project's start, contributing to the planning of the platform's architecture. The CDP is built on Google Cloud Platform (GCP), using services such as Pub/Sub, Dataflow and BigQuery. Open-source tools, including a Kubernetes cluster running Apache Kafka, Apache Airflow and Apache Flink, are used to meet specific requirements. This combination pairs managed GCP services with the flexibility of open-source tooling.
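To make the GCP-plus-open-source mix concrete, the sketch below shows a minimal Apache Beam pipeline (as it would run on Dataflow) reading consumer events from Pub/Sub and streaming them into BigQuery. The topic, table and schema are hypothetical placeholders.

```python
# Minimal streaming sketch: Pub/Sub -> Dataflow (Apache Beam) -> BigQuery.
# Topic, table and schema are illustrative placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (p
     | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/<project>/topics/events")
     | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
     | "WriteToBQ" >> beam.io.WriteToBigQuery(
         "<project>:cdp.events",
         schema="user_id:STRING,event_type:STRING,ts:TIMESTAMP",
         write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
```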


    Recruitment process:

    • Quiz - 15 minutes
    • Soft skills interview - 30 minutes
    • Technical interview - 60 minutes


    Find out more by visiting our website - https://www.datumo.io


If you like what we do and dream about creating this world with us - don't wait, apply now!
