Java Developer – Learn and Grow into Big Data
    Salary: 2 677 – 3 894 USD net/month (B2B)
    Type of work: Full-time
    Experience: Mid
    Employment type: B2B
    Operating mode: Remote

    Tech stack

    • JVM – advanced
    • Java – advanced
    • SQL – advanced
    • Cloud – regular
    • Docker – regular
    • Kafka – regular
    • Elasticsearch – regular
    • Airflow – nice to have
    • Spark – nice to have
    • Python – nice to have

    Job description



    Datumo specializes in providing Big Data and Cloud consulting services to clients from all over the world, primarily in Western Europe, Poland and the USA. Core industries we support include e-commerce, telecommunications and life science. Our team consists of exceptional people whose commitment allows us to take on highly demanding projects.


    Our team members tend to stick around for more than 3 years, and when a project wraps up, we don't let them go; we find exciting new challenges for them. It's not just a workplace; it's a community that grows together!


    What we expect: 


    Must-have: 

    • Hands-on experience with a major cloud provider (GCP, Azure, or AWS) – this is mandatory
    • At least 2 years of commercial experience in software development with a strong focus on Java or Scala 
    • Readiness to learn and transition into the Big Data domain. 
    • Good knowledge of SQL and experience with an RDBMS (e.g., MariaDB, Oracle).
    • Experience working with Kubernetes/Docker. 
    • Passion for writing clean code and using established design patterns.
    • Understanding of concepts like domain-driven design, test patterns, and common programming principles. 
    • Experience in using CI/CD tools. 
    • Contribution to internal projects. 
    • English at B2 level and communicative Polish (at least B2).
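As a purely illustrative sketch of the "clean code and established design patterns" expectation (all names and values here are invented for the example, not taken from Datumo's codebase), a small immutable value object built with the Builder pattern might look like this in Java:

```java
// Illustrative only: an immutable value object constructed via the
// Builder pattern, a common way to keep object creation readable
// as the number of optional fields grows.
public final class JobApplication {
    private final String candidate;
    private final String primaryLanguage;
    private final int yearsOfExperience;

    private JobApplication(Builder b) {
        this.candidate = b.candidate;
        this.primaryLanguage = b.primaryLanguage;
        this.yearsOfExperience = b.yearsOfExperience;
    }

    public String summary() {
        return candidate + ": " + yearsOfExperience + "y " + primaryLanguage;
    }

    public static final class Builder {
        private String candidate;
        private String primaryLanguage = "Java"; // sensible default
        private int yearsOfExperience;

        public Builder candidate(String name) { this.candidate = name; return this; }
        public Builder primaryLanguage(String lang) { this.primaryLanguage = lang; return this; }
        public Builder yearsOfExperience(int years) { this.yearsOfExperience = years; return this; }
        public JobApplication build() { return new JobApplication(this); }
    }

    public static void main(String[] args) {
        JobApplication app = new JobApplication.Builder()
                .candidate("Jane Doe")
                .yearsOfExperience(3)
                .build();
        System.out.println(app.summary()); // Jane Doe: 3y Java
    }
}
```

The point of the pattern here is that construction reads like prose and the resulting object is immutable, which is exactly the kind of deliberate design choice the requirement refers to.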


    Nice to have:

    • Familiarity with testing tools (JUnit, AssertJ, Mockito), event processing (Kafka, RabbitMQ), and data stores (Elasticsearch, Redis).
    • Experience with data warehouses like Google BigQuery, Databricks, Snowflake, etc.
    • Familiarity with distributed data processing frameworks running on JVMs (e.g., Apache Spark, Flink). 
    • ETL design skills. 
    • Experience with Apache Airflow or similar pipeline orchestrators. 
    • Experience in Machine Learning projects. 
    • Contribution to open-source Big Data tools built with Java. 
    • Sharing knowledge by writing articles for the company blog or contributing to open-source projects. 


    What’s on offer: 

    • 100% remote work, with workation opportunity 
    • 20 paid days off
    • onboarding with a dedicated mentor 
    • project switching possible after a certain period 
    • individual budget for training and conferences 
    • benefits: Medicover private medical care, co-financing of the Medicover Sport card
    • opportunity to learn English with a native speaker 
    • regular company trips and informal get-togethers 


    Development opportunities in Datumo: 

    • participation in industry conferences 
    • establishing Datumo's online brand presence 
    • support in obtaining certifications (e.g. GCP, Azure, Snowflake) 
    • involvement in internal initiatives, like building technological roadmaps
    • training budget 
    • access to internal technological training repositories 


    Discover our exemplary projects: 

    IoT data ingestion to cloud 

    The project integrates data from edge devices into the cloud using Azure services. The platform supports data streaming either via the IoT Edge environment with Java or Python modules, or via a direct connection to Event Hubs using the Kafka protocol. It also facilitates batch data transmission to ADLS. Data transformation from raw telemetry to structured tables is done through Spark jobs in Databricks, or through data connections and update policies in Azure Data Explorer.
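For context, the "Kafka protocol to Event Hubs" path mentioned above works with standard Kafka clients pointed at the Event Hubs Kafka endpoint. A hedged configuration sketch (the namespace name and connection string below are placeholders, not values from the project):

```properties
# Illustrative sketch only – <namespace> and the connection string are placeholders.
# Event Hubs exposes a Kafka-compatible endpoint on port 9093 over SASL_SSL.
bootstrap.servers=<namespace>.servicebus.windows.net:9093
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="$ConnectionString" \
  password="<event-hubs-connection-string>";
```

With this configuration, an ordinary Kafka producer or consumer can talk to Event Hubs without any Azure-specific client library.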


    Petabyte-scale data platform migration to Google Cloud 

    The goal of the project is to improve the scalability and performance of the data platform by transitioning over a thousand active pipelines to GCP. The main focus is on rearchitecting existing Spark applications to either Cloud Dataproc or BigQuery SQL, depending on the client's requirements, and automating the pipelines with Cloud Composer.


    Data analytics platform for investing company 

    The project centers on developing and overseeing a data platform for an asset management company focused on ESG investing. Databricks is the central component. The platform, built on Azure cloud, integrates various Azure services for diverse functionalities. The primary task involves implementing and extending complex ETL processes that enrich investment data, using Spark jobs in Scala. Integrations with external data providers, as well as solutions for improving data quality and optimizing cloud resources, have been implemented. 


    Real-time Consumer Data Platform

    The initiative involves constructing a consumer data platform (CDP) for a major Polish retail company. Datumo actively participates from the project’s start, contributing to planning the platform’s architecture. The CDP is built on Google Cloud Platform (GCP), utilizing services like Pub/Sub, Dataflow and BigQuery. Open-source tools, including a Kubernetes cluster with Apache Kafka, Apache Airflow and Apache Flink, are used to meet specific requirements. This combination offers significant possibilities for the platform. 


    Recruitment process: 

    • Programming task 
    • Soft skills interview - 30 minutes 
    • Technical interview - 60 minutes 


    Find out more by visiting our website: https://www.datumo.io


    If you like what we do and dream about creating this world with us, don't wait – apply now!
