Data Engineer
Python
Inuits

Kraków
Type of work: Undetermined
Experience: Senior
Employment Type: B2B, Permanent
Operating mode: Remote
Company profile

Inuits

We specialize in designing and building augmented teams of highly-skilled professionals for long-term collaborations. Our approach is rooted in human-to-human interactions and thoughtful actions, resulting in customer intimacy and the organic development of success stories within our extended team.
Tech stack

    Python: regular
    ETL: regular
    ETL tools: regular
    Data: regular
    NoSQL: regular
    Airflow: junior
    Big Data: nice to have

Job description

Online interview
Friendly offer
For a project with our client 1010data, we are looking for several Data Engineers.

1010data travels at the speed of thought to make Big Data discovery easy; they power sub-second responses to analyses run on billions of rows of data. 1010data is defining the way the world interacts with data. With more than 30 trillion rows of data in its private cloud, 1010data is designed to scale to the largest volumes of granular data, the most disparate and varied data sets, and the most complex advanced analytics, all while delivering lightning-quick system performance.

As a Data Engineer at 1010data, you will design, build, maintain and optimize large-scale automated ELT processes. You will work closely with our data scientists and analysts to create efficient and reliable jobs that support 1010data's data products and our clients' data warehousing needs. Your toolbox: industry-standard data orchestration tools including Apache Airflow; proprietary scheduling and automation tools; proprietary query transformation language and your own Python and ELT design skills. As we incorporate more cloud technologies into our processes, you will be at the forefront of exploring and defining best practices, and helping us transition our products to be more scalable.

What you will take on:
  • Coordinating with the systems, core, data science, and analytics teams to build and maintain data products and custom solutions for our clients;
  • Designing and writing automated scripts to preprocess terabytes of data from our partners/clients;
  • Designing and writing new enterprise-scale ELT/ETL workflows from scratch in Python using Airflow, Docker, Kubernetes, AWS, etc.;
  • Ensuring quality, reliability and uptime for critical automated processes;
  • Migrating our products and ELT/ETL processes into the cloud while drastically reducing our in-house data center footprint.
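To give candidates a feel for the kind of work the responsibilities above describe, here is a minimal plain-Python sketch of an automated extract/transform/load preprocessing job. All names, data shapes, and quality rules are illustrative assumptions, not 1010data's actual tooling (which, per the description, includes Airflow and proprietary scheduling and transformation tools):

```python
# Illustrative ELT-style preprocessing sketch; field names and the
# "quality gate" rule are hypothetical, chosen only for demonstration.
import json
from datetime import date

def extract(raw_lines):
    """Parse raw partner records (here: JSON lines) into dicts."""
    return [json.loads(line) for line in raw_lines if line.strip()]

def transform(records):
    """Normalize fields and drop rows that fail a basic quality check."""
    cleaned = []
    for rec in records:
        if rec.get("amount") is None:
            continue  # quality gate: skip incomplete rows
        cleaned.append({
            "client_id": str(rec["client_id"]),
            "amount": round(float(rec["amount"]), 2),
            "loaded_on": date.today().isoformat(),
        })
    return cleaned

def load(records, sink):
    """Append cleaned records to an in-memory sink (a stand-in for a warehouse)."""
    sink.extend(records)
    return len(records)

raw = ['{"client_id": 1, "amount": "19.99"}',
       '{"client_id": 2, "amount": null}']
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded)  # 1 row passes the quality gate
```

In a production setting, each of these steps would typically become a task in an orchestrator such as Airflow, with retries, scheduling, and monitoring handled by the framework rather than hand-rolled scripts.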

What you will bring:
  • At least 1-2 years of professional experience programming in Python;
  • Exposure to ETL/ELT pipeline automation;
  • Exposure to basic database concepts.

Preferred Skills:
  • Good understanding of Data Engineering, NoSQL databases and database design, distributed systems and/or information retrieval;
  • Experience with Airflow and Cloud Technologies;
  • Familiarity with functional/vector programming;
  • Experience with organizing data and creating efficient data transformation processes for optimized client data analysis;
  • Strong communication skills, ability to work seamlessly with team members with different areas of technical expertise.

Education:
  • STEM Bachelor’s preferred; a Master’s degree or a Math/Statistics background is a big plus;
  • Data Engineering Bootcamp or equivalent.

In exchange for your skills we offer:
  • Friendly work atmosphere in a young, international team;
  • Dog-friendly office in the center of historical Kraków;
  • Attractive compensation and benefits, including MultiKafeteria, Generali, Luxmed, …;
  • Sport and other events, including weekly running, squash, and team lunches;
  • Free tea, coffee and all-you-can-eat fruits in the office.

As our client is based in the US, we normally start at 11 AM.