Join our team in Warsaw, where we’re collaborating on a cutting-edge fintech venture with a global industry leader. Together with our Partner, Klarna, we’re building an IT hub designed to drive innovation in digital payment solutions. We’re on the lookout for top-tier engineers who thrive in dynamic, forward-thinking environments. Spyrosoft is leading the recruitment process, ensuring a seamless experience for candidates who are ready to shape the future of online shopping and payments.
This opportunity is ideal for engineers who value independence, proactivity, and flexibility. Our engagement begins with a B2B contract through Spyrosoft, transitioning to a direct contract with our Partner.
We offer a hybrid work model in Warsaw’s vibrant Wola district. English fluency and eligibility to work in Poland are essential, as is the successful completion of a background check to meet the rigorous standards of the financial domain.
- CV selection
- Initial recruitment screening
- Technical interview
- Online logic test
- Cultural fit interview
This project focuses on building scalable, high-performance data pipelines and infrastructure to support real-time analytics and decision-making across the organization. You’ll work alongside top-tier professionals in a dynamic, cloud-native environment leveraging modern data tools and practices.
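To give candidates a concrete feel for this kind of work, here is a minimal sketch of a daily Airflow pipeline of the type the project runs; the DAG, task, and data names are illustrative assumptions, not details of the actual project.

```python
# A minimal illustrative sketch, not project code: one daily Airflow DAG that
# aggregates the previous day's order events. All names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def aggregate_daily_orders(ds: str, **_) -> None:
    # Placeholder body: in a real pipeline this step would typically submit
    # a PySpark job (for example via AWS Glue or EMR) rather than run in-process.
    print(f"Aggregating orders for {ds}")


with DAG(
    dag_id="daily_order_aggregation",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="aggregate_daily_orders",
        python_callable=aggregate_daily_orders,
    )
```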
- Python, PySpark (essential for data pipeline development)
- Apache Airflow, AWS Glue, Kafka, Redshift
- Git
- AWS stack (Lambda, S3, CloudWatch, SNS/SQS), Kinesis
- Terraform, Ansible
- Automated testing, deployment, and version control processes
- A degree in Computer Science, Information Technology, or a related technical field.
- Proficiency in SQL, PySpark, and Python for building scalable data pipelines and transformations.
- Experience with Apache Airflow for orchestration and pipeline scheduling.
- Familiarity with AWS Glue, Kafka, and Redshift for both batch and real-time data processing.
- Git for version control and collaborative development.
- Airflow for creating, managing, and monitoring ETL/ELT workflows.
- Hands-on experience with the AWS stack: Lambda, S3, CloudWatch, SNS/SQS.
- Working knowledge of Kinesis for handling streaming data in a cloud-native environment.
- Proficiency in Terraform and Ansible for automating infrastructure provisioning and management.
- Ability to monitor, debug, and maintain ETL pipelines to ensure performance, reliability, and data quality.
- Experience with continuous integration and delivery processes, including automated testing, deployment, and versioning.
- Fluent English (written and spoken) is essential for effective communication and collaboration in a multi-team environment.
- Designing, building, and maintaining scalable ETL/ELT data pipelines
- Orchestrating workflows and managing dependencies using Apache Airflow
- Handling batch and real-time data processing using Kafka and AWS services
- Building data transformations and analytics models with SQL, Python, and PySpark
- Ensuring performance, reliability, and observability of data pipelines
- Managing infrastructure as code with Terraform and Ansible
- Collaborating using Git and following best practices in CI/CD pipelines
- Monitoring, debugging, and optimizing data workflows in AWS (Lambda, S3, CloudWatch, Kinesis, SNS/SQS)
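As a rough illustration of the transformation work listed above, here is a minimal PySpark sketch; the bucket paths and column names are hypothetical examples rather than project specifics.

```python
# A minimal PySpark batch-transformation sketch (illustrative only):
# read one day of raw order events from S3, aggregate per merchant,
# and write the summary back for analytics. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("order-aggregation-example").getOrCreate()

# Read one day of raw order events (Parquet on S3 is assumed here).
orders = spark.read.parquet("s3://example-raw-bucket/orders/date=2024-01-01/")

# Aggregate order count and value per merchant.
daily_summary = (
    orders
    .groupBy("merchant_id")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("order_amount").alias("total_order_amount"),
    )
)

# Write the aggregated result to the analytics layer.
daily_summary.write.mode("overwrite").parquet(
    "s3://example-analytics-bucket/daily_merchant_summary/date=2024-01-01/"
)
```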