Data Engineer (Orchestration and Processing)

120 - 150 PLN Net per hour - B2B

Daszyńskiego, Warszawa (+4 locations)

Link Group

Full-time
B2B
Mid
Remote

Tech stack

  • English: B2

  • Python: regular

  • Cloud: regular

  • Apache Airflow: regular

  • Spark: regular

  • Data: regular

Job description

About the Role:

We are seeking a skilled Data Engineer to join our team and help design, build, and optimize scalable data pipelines and platforms. The ideal candidate will have strong Python coding skills, experience with AWS cloud services, and expertise in orchestration tools such as Apache Airflow. You will be responsible for integrating automation solutions, optimizing data workflows, and ensuring smooth operations across data engineering processes.


Tech Stack:

  • Programming: Python

  • Cloud: AWS (e.g., S3, Glue, Lambda, EC2, Redshift)

  • Data Orchestration: Apache Airflow

  • Data Processing: Spark, Glue

  • Databases: Relational & NoSQL databases

  • Other: CI/CD pipelines, automation tools, Git


Key Responsibilities:

  • Design, develop, and maintain robust, scalable data pipelines and ETL workflows.

  • Leverage Apache Airflow for orchestration and automation of complex data processes.

  • Work with AWS services to manage and optimize cloud-based data solutions.

  • Integrate automation tools to enhance platform efficiency and reliability.

  • Collaborate closely with data scientists, analysts, and other engineers to deliver high-quality data solutions.

  • Troubleshoot and optimize existing data workflows for performance and scalability.

  • Support the implementation of Spark- and Glue-based solutions for big data processing.


Requirements:

  • Proven experience as a Data Engineer or in a similar data-focused role.

  • Strong Python programming skills with hands-on experience in data engineering.

  • Solid knowledge of AWS cloud services (S3, Glue, Redshift, Lambda, etc.).

  • Expertise in Apache Airflow or similar orchestration tools.

  • Experience working with Spark and Glue for data processing.

  • Familiarity with CI/CD, automation frameworks, and version control (Git).

  • Strong problem-solving skills and the ability to work in cross-functional teams.

  • Excellent communication skills in English.


Nice to Have:

  • Experience with Kubernetes, Docker, or other container orchestration tools.

  • Knowledge of data governance and security best practices.

  • Exposure to streaming platforms (e.g., Kafka).

Published: 27.08.2025

Meet the company

Link Group

Hundreds of IT opportunities are waiting for you—let’s make it happen! Since 2016, our team of tech enthusiasts has been building exceptional IT teams for Fortune 500 companies and startups worldwide. Join impactful projects in BFSI, CPG, Industrial, and Life Sciences & Healthcare industries. Work with cutting-edge technologies like Cloud, Business Intelligence, Data, and SAP. Unlock your potential, grow your skills, and collaborate with top global clients. Ready for your next big career move? Let’s link with us!
