Data Engineer - Spark, PySpark, Databricks (Financial Services)

Kraków

Caspian One

Full-time
B2B
Mid
Hybrid
8 211 - 10 948 USD
Net per month - B2B

Tech stack

  • Databricks (regular)

  • Python (regular)

  • PySpark (regular)

  • Apache Spark (regular)

  • Cloud (regular)

  • Apache Kafka (regular)

  • Azure (nice to have)

  • Java (nice to have)

Job description

Location: Krakow (Hybrid)

Rate: Up to 2000 PLN/day

Duration: 6-month rolling contract

 

Overview:

  • Excellent opportunity to work with a highly reputable financial services company in Poland!

  • We’re seeking a skilled Spark Software Engineer to join a dynamic agile team focused on building the strategic backbone between Dealstores and Operations/Regulatory systems.

  • This role is part of a long-term initiative to modernise infrastructure and leverage cloud technologies for enhanced performance and scalability.

 

Key Responsibilities:

  • Design and develop a strategic platform enabling trade executions to flow seamlessly between systems.

  • Translate epics and features into robust, scalable functionality.

  • Collaborate closely with agile pod members during sprints to deliver product requirements.

  • Work directly with the product team to understand and implement required features.


About the Team:

You’ll be part of the Digital Operations stream within the Investment Bank, a technology group driving transformation through simplification and innovation. The team supports regulatory and data clients by delivering cutting-edge solutions using the latest technologies. With a global footprint across London, New York, Singapore, Zurich, Hong Kong, Poland, and India, the team values diversity, autonomy, and continuous learning.

 

Required Skills & Experience:

  • Bachelor’s degree in Computer Science or relevant certification

  • Strong hands-on experience with Spark and PySpark for scalable data processing

  • Strong experience with Azure Databricks

  • Proven experience with Kafka for real-time and batch data workflows

  • Solid understanding of cloud architecture, preferably Azure (AWS or GCP also considered)

  • Proficient in CI/CD pipelines (GitLab, ADO, or GitHub)

  • Skilled in test-driven development and software design principles

  • Experience with Java backend development (desirable)

  • Knowledge of Kubernetes and modern data infrastructure (desirable)


Interested?

If you're passionate about data engineering and cloud technologies and want to be part of a forward-thinking, agile team, we'd love to hear from you.



Published: 25.11.2025
