
    Senior Data Engineer

    Location: Warszawa
    Salary: 45 - 52 USD/h net per hour (B2B)
    Type of work: Full-time
    Experience: Senior
    Employment type: B2B
    Operating mode: Hybrid

    Spyrosoft

    Spyrosoft is an authentic, cutting-edge software engineering company established in 2016. We have been included in the Financial Times ranking of the 1000 fastest-growing companies for three consecutive years: 2021, 2022 and 2023.


    Tech stack

    • English - B2
    • Python - master
    • PySpark - master
    • Apache Airflow - master
    • Kafka - advanced
    • Airflow - advanced
    • AWS - advanced
    • Kinesis - advanced
    • Terraform - advanced
    • Redshift - regular
    • Ansible - regular

    Job description

    Online interview

    Join our team in Warsaw, where we’re collaborating on a cutting-edge fintech venture with a global industry leader. Together with our Partner – Klarna, we’re building an IT hub designed to drive innovation in digital payment solutions. We’re on the lookout for top-tier engineers who thrive in dynamic, forward-thinking environments. Spyrosoft is leading the recruitment process, facilitating a seamless experience for candidates who are ready to shape the future of online shopping and payments.

    This opportunity is ideal for engineers who value independence, proactiveness, and flexibility. Our engagement begins with a B2B contract through Spyrosoft, transitioning to a direct contract with our Partner.

    We offer a hybrid work model in Warsaw’s vibrant Wola district. English fluency and eligibility to work in Poland are essential, as is the successful completion of a background check to meet the rigorous standards of the financial domain.


    Our process:

    • CV selection
    • Initial recruitment screening
    • Technical interview
    • Online logic test
    • Cultural fit interview


    Project description:

    This project focuses on building scalable, high-performance data pipelines and infrastructure to support real-time analytics and decision-making across the organization. You’ll work alongside top-tier professionals in a dynamic, cloud-native environment, leveraging modern data tools and practices.


    Tech stack:

    • Python, PySpark (essential for data pipeline development; see the sketch after this list)
    • Apache Airflow, AWS Glue, Kafka, Redshift
    • Git, Airflow
    • Cloud & DevOps: AWS stack (Lambda, S3, CloudWatch, SNS/SQS), Kinesis
    • Terraform, Ansible
    • Automated testing, deployment, and version control processes
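
    To make the PySpark item above concrete, here is a minimal batch-transformation sketch. It is not part of the posting; the S3 paths, table, and column names (event_ts, user_id, amount) are invented purely for illustration.

    # Minimal PySpark batch transformation - illustrative only; bucket,
    # table and column names are hypothetical, not taken from the posting.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_payment_rollup").getOrCreate()

    # Read raw events landed in S3 (hypothetical path).
    events = spark.read.parquet("s3://example-bucket/raw/payments/")

    # Aggregate per user and day, then write back partitioned by date.
    daily = (
        events
        .withColumn("event_date", F.to_date("event_ts"))
        .groupBy("user_id", "event_date")
        .agg(
            F.sum("amount").alias("total_amount"),
            F.count("*").alias("event_count"),
        )
    )

    daily.write.mode("overwrite").partitionBy("event_date") \
        .parquet("s3://example-bucket/curated/daily_payments/")

    Partitioning the output by date is one common layout for downstream batch consumers such as Redshift or Athena-style readers; the actual project layout may of course differ.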


    Requirements:

    • A degree in Computer Science, Information Technology, or a related technical field.
    • Proficiency in SQL, PySpark, and Python for building scalable data pipelines and transformations.
    • Experience with Apache Airflow for orchestration and pipeline scheduling (a minimal DAG sketch follows this list).
    • Familiarity with AWS Glue, Kafka, and Redshift for both batch and real-time data processing.
    • Git for version control and collaborative development.
    • Airflow for creating, managing, and monitoring ETL/ELT workflows.
    • Hands-on experience with the AWS stack: Lambda, S3, CloudWatch, SNS/SQS.
    • Working knowledge of Kinesis for handling streaming data in a cloud-native environment.
    • Proficiency in Terraform and Ansible for automating infrastructure provisioning and management.
    • Ability to monitor, debug, and maintain ETL pipelines to ensure performance, reliability, and data quality.
    • Experience with continuous integration and delivery processes, including automated testing, deployment, and versioning.
    • Fluent English (written and spoken) is essential for effective communication and collaboration in a multi-team environment.
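
    As a hedged illustration of the Airflow requirement above (not taken from the role itself), a bare-bones DAG wiring an extract-transform-load sequence might look like this; the dag_id, schedule, and callables are hypothetical.

    # Bare-bones Airflow 2.x DAG - illustrative only; dag_id, schedule and
    # the callables are hypothetical, not part of the job posting.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull raw data from the source system")

    def transform():
        print("run the PySpark job / SQL transformations")

    def load():
        print("publish curated tables to Redshift")

    with DAG(
        dag_id="example_daily_etl",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        t_extract >> t_transform >> t_load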


    Main responsibilities:

    • Designing, building, and maintaining scalable ETL/ELT data pipelines
    • Orchestrating workflows and managing dependencies using Apache Airflow
    • Handling batch and real-time data processing using Kafka and AWS services
    • Building data transformations and analytics models with SQL, Python, and PySpark
    • Ensuring performance, reliability, and observability of data pipelines
    • Managing infrastructure as code with Terraform and Ansible
    • Collaborating using Git and following best practices in CI/CD pipelines
    • Monitoring, debugging, and optimizing data workflows in AWS (Lambda, S3, CloudWatch, Kinesis, SNS/SQS) 
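
    The monitoring responsibility above is stated only at a high level. One common pattern, sketched below under the assumption that boto3 and CloudWatch custom metrics are used, is to publish a pipeline volume metric that dashboards and alarms can track; the namespace and metric name are invented.

    # Publishing a custom pipeline health metric to CloudWatch - a sketch of
    # one common monitoring pattern; namespace and metric name are hypothetical.
    import boto3

    cloudwatch = boto3.client("cloudwatch", region_name="eu-west-1")

    def report_rows_processed(pipeline_name: str, row_count: int) -> None:
        """Emit a custom metric so dashboards and alarms can track pipeline volume."""
        cloudwatch.put_metric_data(
            Namespace="ExampleDataPlatform",
            MetricData=[{
                "MetricName": "RowsProcessed",
                "Dimensions": [{"Name": "Pipeline", "Value": pipeline_name}],
                "Value": float(row_count),
                "Unit": "Count",
            }],
        )

    report_rows_processed("daily_payment_rollup", 125_000)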


