Senior Data Engineer

Data
Full-time
Permanent
Senior
Hybrid

Job description


Some careers shine brighter than others.

If you’re looking for a career that will help you stand out, join HSBC, and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.


Your career opportunity


HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. 


We are currently seeking an experienced professional to join our team in the role of Senior Data Engineer.


What you’ll do


  • Design, develop, and implement ETL processes using PySpark and Python (an illustrative sketch follows this list).
  • Own and review the release process.
  • Establish best practices for data engineering and ensure team adherence.
  • Perform code testing and conduct reviews with team members to ensure high-quality, efficient, and well-tuned code.
  • Design and implement data-driven solutions in collaboration with stakeholders.
  • Develop and maintain data pipelines and CI/CD pipelines, optimizing data integration processes for accuracy and efficiency.
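
For illustration, the kind of PySpark ETL work described in the first responsibility could look roughly like the sketch below. This is a minimal, hypothetical example; the paths, table names, and columns are assumptions for illustration only, not HSBC specifics.

    # Minimal PySpark ETL sketch: extract raw CSVs, cleanse and aggregate, load Parquet.
    # All paths and column names below are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily_transactions_etl").getOrCreate()

    # Extract: read raw files landed by an upstream process.
    raw = spark.read.option("header", True).csv("/data/landing/transactions/")

    # Transform: cast types, drop incomplete records, aggregate per account and day.
    cleaned = (
        raw.withColumn("amount", F.col("amount").cast("double"))
           .withColumn("booking_date", F.to_date("booking_date", "yyyy-MM-dd"))
           .dropna(subset=["account_id", "amount", "booking_date"])
    )
    daily_totals = cleaned.groupBy("account_id", "booking_date").agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("txn_count"),
    )

    # Load: write partitioned Parquet for downstream consumers.
    daily_totals.write.mode("overwrite").partitionBy("booking_date").parquet(
        "/data/curated/daily_transaction_totals/"
    )

    spark.stop()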



What you need to have to succeed in this role


  • Strong experience with database technologies (SQL, NoSQL), data warehousing solutions, and big data technologies (Hadoop, Spark).
  • Expertise in designing and implementing ETL/ELT frameworks for complex data warehouses.
  • Proficiency in Python programming and shell scripting (e.g., PowerShell, Bash).
  • Deep understanding of ETL processes and data pipeline orchestration tools (e.g., Apache Airflow; see the sketch after this list), plus a solid understanding of IT infrastructure, including networking, operating systems, and security principles.
  • Experience with tools such as GitHub, Jenkins, Ansible Automation, JIRA, Confluence, and ServiceNow.
  • Strong analytical, problem-solving, communication, and collaboration skills, including the ability to communicate effectively upwards (to the business), downwards (to IT teams), and laterally (to peers, vendors, and client-side staff).
  • Familiarity with technology concepts, roles, and terminology, and the ability to work closely with application architects and modelers. Experience with API design and microservice architectures.
  • The ability to participate actively in architecture groups and forums, communicating solution strategy and design in conversations, documentation, and presentations.
  • Strong knowledge of the SDLC and of Agile and DevOps operating models.
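
As a rough illustration of the orchestration point above, a pipeline like the one described could be scheduled with Apache Airflow along these lines. The DAG id, schedule, and task bodies are placeholder assumptions (Airflow 2.4+ syntax), not a description of HSBC's actual pipelines.

    # Minimal Airflow DAG sketch wiring an extract -> transform -> load sequence.
    # Task bodies are placeholders; a real pipeline would call out to Spark jobs, etc.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("extract step")  # placeholder for real extraction logic

    def transform():
        print("transform step")  # placeholder for real transformation logic

    def load():
        print("load step")  # placeholder for real load logic

    with DAG(
        dag_id="daily_transactions_etl",
        start_date=datetime(2025, 1, 1),
        schedule="@daily",  # Airflow >= 2.4; older versions use schedule_interval
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        t_extract >> t_transform >> t_load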


What we offer


  • Competitive salary
  • Annual performance-based bonus
  • Additional bonuses for recognition awards
  • Multisport card
  • Private medical care
  • Life insurance
  • One-time reimbursement of home office set-up (up to 800 PLN).
  • Corporate parties & events
  • CSR initiatives
  • Nursery discounts
  • Financial support for training and education
  • Social fund
  • Flexible working hours 
  • Free parking


If your CV meets our criteria, you should expect the following steps in the recruitment process:


  • Online behavioural test 
  • Telephone screen 
  • Interview with the hiring manager 

Tech stack

  • SQL: advanced
  • ETL: advanced
  • Data: advanced
  • PySpark: regular
  • CI/CD: regular

Published: 27.05.2025