#1 Job Board for tech industry in Europe

Junior Data Engineer
Data
AI Clearing

Warszawa
Type of work
Full-time
Experience
Junior
Employment Type
Mandate, B2B
Operating mode
Hybrid

Tech stack

  • PostgreSQL (regular)
  • Bash (regular)
  • SQL (regular)
  • Python (junior)
  • Docker (nice to have)
  • Kubernetes (nice to have)
  • QGIS (nice to have)

Job description

At AI Clearing, we disrupt the construction industry using AI! 


We are a Warsaw-based startup with global ambitions that applies AI to supervise huge construction sites.

Our SaaS platform is acclaimed by some of the biggest companies in the construction industry, including Skanska, Eurovia, PCL, and Neom, and we already monitor dozens of mega construction sites across six continents.


We are now entering a hyper-growth stage and investing heavily in new features and technologies to expand our platform.


We are a fast-paced company with a flat organizational structure, making it an ideal place for ambitious people to accelerate their careers.

We are very open to new ideas and there are lots of opportunities to grow!


We have a mature development process with version control, code reviews, and automated CI pipelines.

We invest in the latest technologies: we operate in the cloud on a Kubernetes-native infrastructure and strive to follow the best engineering practices, so there are lots of things to learn and implement!

 

As a Data Engineer, you will join the Data team, whose main focus is to develop the databases that are the heart of our solution.


Requirements:

  • >1 year of internship or professional experience
  • Solid knowledge of SQL
  • Experience with Bash, Python, Git
  • Experience with Docker would be a plus
  • Communicative command of English (both written and spoken)
  • Willingness to work 3 days a week in our Warsaw office
  • Responsibility and sense of ownership


Nice to have:

  • Experience in developing ETL pipelines
  • Basic knowledge of Kubernetes and Argo Workflows
  • Familiarity with geospatial tools (QGIS, PostGIS)


Responsibilities:

  • Implement novel data structures for processing geospatial data
  • Ensure the smooth operation of existing data pipelines
  • Deploy code to production environments
  • Actively participate in defining requirements and planning
  • Cooperate with a team consisting of developers and DevOps


What do we offer?

  • Competitive salary
  • Stock option plan (ESOP) - benefit from the company's success
  • Flexible working hours
  • Hybrid work (3 days in the office)
  • Multi-sport card
  • Fruits & cookies in the office - whatever you prefer
  • A lot of team-building events and parties


Join our team and let's build a unicorn startup together!