Job Title: Full Stack Engineer (Big Data)
Location: Kraków
You will work as part of a newly established engineering team in Kraków, responsible for the development, enhancement, and support of high-volume data processing systems and OLAP solutions used in global traded risk management.
Responsibilities:
- Design, develop, test, and deploy scalable IT systems to meet business objectives
- Build data processing and calculation services integrated with risk analytics components
- Collaborate with BAs, business users, vendors, and IT teams across regions
- Integrate with analytical libraries and contribute to overall architecture decisions
- Apply DevOps and Agile methodologies, focusing on test-driven development
- Provide production support, manage incidents, and ensure platform stability
- Contribute to both functional and non-functional aspects of delivery
Requirements:
- Degree in Computer Science, IT, or a related field
- Fluent in English, with strong communication and problem-solving skills
- Hands-on experience with big data solutions and distributed systems (e.g., Apache Spark)
- Strong backend development skills in Java 11+, Python, and Groovy
- Experience in building REST APIs, microservices, and integrating with API gateways
- Exposure to public cloud platforms, especially GCP or AWS
- Familiarity with Spring (Boot, Batch, Cloud), Git, Maven, Unix/Linux
- Experience with RDBMS (e.g., PostgreSQL) and data orchestration tools (e.g., Apache Airflow)
- Solid understanding of test automation tools such as JUnit, Cucumber, Karate, and REST Assured
Nice to have:
- Knowledge of financial or traded risk systems
- Experience with UI/BI tools and streaming solutions
- OLAP and distributed computation platforms such as ClickHouse, Druid, or Pinot
- Familiarity with data lakehouse technologies (e.g., Dremio, Trino, Delta Lake, Iceberg)
- Exposure to technologies such as Apache Flink, Beam, Samza, Redis, and Hazelcast
- Containerization and orchestration tools: Docker, Kubernetes
- Certifications: Scrum Master, PMP, FRM, or CFA
- Knowledge of RPC frameworks (e.g., gRPC)