
Senior Data Engineer

Location: Bydgoszcz
Type of work: Full-time
Experience: Senior
Employment Type: B2B, Permanent
Operating mode: Remote

Tech stack

    Polish: C1
    English: B2
    Databricks: advanced
    Delta Lake: advanced
    SQL Server: advanced
    PostgreSQL: advanced
    Git: advanced
    CI/CD: advanced
    Azure: regular
    GCP: regular
    AWS: regular
    Python: regular

Job description

Online interview

As a Data Engineer, you will support the development of ETL processes, collaborate with BI developers and data scientists on data & analytics solutions, and manage data ingestion and processing within a data lake. You will be responsible for ensuring data quality, implementing best practices, and applying modern ways of working such as version control and CI/CD. With a strong foundation in data engineering and familiarity with cloud platforms and Databricks, you will play a key role in enhancing our data infrastructure and optimizing cloud costs. If you have a passion for data engineering and want to accelerate your professional development, we would love to hear from you.


Your responsibilities:


  • Supporting the development of data infrastructure to extract, transform, and load (ETL/ELT) data,

  • Supporting BI developers and data scientists in building data & analytics solutions,

  • Managing the ingestion and processing of data in the data lake,

  • Running periodic data refreshes through data pipelines (scheduled jobs),

  • Implementing data management practices to improve data quality and metadata in the data lake,

  • Applying software engineering best practices such as version control (Git) and CI/CD (DevOps),

  • Continuously strengthening the data foundation by updating the data infrastructure,

  • Monitoring the costs associated with the cloud environment, data processing, and data computation.


Our requirements:


  • 5+ years of experience in data engineering or a similar role,

  • At minimum, a bachelor’s degree in computer science, information technology, or a related field,

  • Familiarity with cloud platforms (e.g., Azure, GCP, AWS),

  • Working knowledge of Databricks and Delta Lake,

  • Experience with relational databases (e.g., SQL Server, PostgreSQL),

  • Proficiency in coding (e.g., Python, PySpark, and SQL),

  • Experience with version control (Git) and CI/CD,

  • Very good knowledge of English (minimum B2 level),

  • Ability to translate business requirements to code.


Nice to have:

  • Experience in building solutions using a Data Mesh approach,

  • Working knowledge of dbt,

  • Experience with the Snowflake database,

  • Familiarity with Airflow.


We offer:

  • A challenging role within a company that creates innovative solutions,

  • Work in an international environment on demanding projects,

  • Remote work model,

  • Subsidized private medical care, life insurance, and a Multisport card,

  • Integration meetings,

  • Employee referral program.


  Sound like a good fit? Don’t wait – send in your CV and let’s talk!

Undisclosed Salary
