Senior Data Engineer
22082
Some careers shine brighter than others.
If you’re looking for a career that will help you stand out, join HSBC, and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.
Your career opportunity
HSBC Markets and Securities is an emerging markets-led and financing-focused investment banking and trading business that provides tailored financial solutions to major government, corporate and institutional clients worldwide.
In IT we provide HSBC with a genuine competitive advantage across the globe. Global Business Insights (GBI) provide critical metrics and reports to Markets and Securities Services Operations to enable them to monitor the health of their business and make data-driven decisions.
The GBI Transformation is a large and complex data integration program spanning all of MSS Ops globally. We serve a diverse audience of users and data visualisation requirements, from Exco down, and integrate over 80 data sources in multiple time-zones across Middle Office, Post-Trade and Securities Services IT and elsewhere. We are a critical enabler for the Rubix 2025 Strategy and the MSS control agenda, providing operational KPI and KRI metrics which allow senior management to measure the success of their BAU and CTB investment dollars.
We are looking for a GCP developer who can design, develop, test and deploy ETL/SQL pipelines connected to a variety of on-premises and cloud data sources - both data stores and files. We will mainly be using GCP technologies such as Cloud Storage, BigQuery and Data Fusion.
You will also need to work with our DevOps tooling to deliver continuous integration/deployment capabilities, automated testing, security, and IT compliance.
The role will also be responsible for providing subject matter expertise to support the Enterprise Risk Management (ERM) Leadership Team (LT) and ERM Assurance teams in discharging their responsibilities in relation to operational risk and resilience risk steward delivery across all service areas, the delivery of assurance activities, the embedding of assurance practices, and the embedding of stewardship activities and the service catalogue in the respective GB/GF/Specialist team.
What you’ll do
Design, build, test and deploy Google Cloud data models and transformations in a BigQuery environment (e.g. SQL, stored procedures, indexes, clusters, partitions, triggers); see the sketch after this list
Create and manage ETL/ELT data pipelines that model raw/unstructured data into a Data Vault universal model and that enrich, transform and optimize raw data into forms suitable for end consumers
Review, refine, interpret and implement business and technical requirements
Deliver non-functional requirements, IT standards and developer and support tools to ensure our applications are secure, compliant, scalable, reliable and cost-effective
Monitor data pipelines for failures or performance issues and implement fixes or improvements as needed
Optimize ETL/ELT processes for performance and scalability, ensuring they can handle large volumes of data efficiently
Integrate data from multiple sources, ensuring consistency and accuracy
Manage code artefacts and CI/CD using tools like Git, Jenkins, Google Secret Manager, etc.
Fix defects and provide enhancements during the development period, and hand over knowledge, expertise, code and support responsibilities to the support team
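For illustration only, a minimal sketch of the kind of partitioned, clustered BigQuery model this role would design, using the Python BigQuery client; the project, dataset, table and field names are hypothetical and not taken from the GBI platform:

```python
# Minimal sketch: project, dataset and field names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # assumed project ID

# A simple fact table for operational KPI events.
table = bigquery.Table(
    "example-project.gbi_demo.kpi_events",
    schema=[
        bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("event_date", "DATE", mode="REQUIRED"),
        bigquery.SchemaField("desk", "STRING"),
        bigquery.SchemaField("metric_value", "NUMERIC"),
    ],
)

# Partition by date and cluster by desk to keep query cost and latency down.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_date",
)
table.clustering_fields = ["desk"]

client.create_table(table, exists_ok=True)
```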
What you need to have to succeed in this role
Proven (3+ years) hands-on experience in SQL querying and optimization of complex queries/transformations in BigQuery, with a focus on cost- and time-effective SQL coding and on concurrency/data integrity.
Proven (3+ years) hands-on experience in developing, testing and implementing SQL data transformation/ETL/ELT pipelines, ideally in GCP Data Fusion.
Proven experience in Data Vault modelling and usage.
Hands-on experience with Cloud Composer/Airflow, Cloud Run and Pub/Sub, and with development in Python and Terraform (see the sketch after this list).
Proficiency in Git usage for version control and collaboration.
Proficiency in designing, creating and maintaining CI/CD processes/pipelines with DevOps tools such as Ansible and Jenkins for cloud-based applications (ideally GCP).
Experience working in a DataOps model and in an Agile environment and toolset.
Strong problem-solving and analytical skills.
Enthusiastic willingness to learn and develop technical and soft skills rapidly and independently as needs require.
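Purely as an illustrative sketch of the Cloud Composer/Airflow work mentioned above (assuming Airflow 2.4+; the DAG ID, dataset and stored-procedure names are hypothetical):

```python
# Minimal sketch: DAG ID, dataset and stored-procedure names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="gbi_daily_vault_load",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Run a BigQuery stored procedure that loads a (hypothetical) Data Vault hub.
    load_hub = BigQueryInsertJobOperator(
        task_id="load_hub_trade",
        configuration={
            "query": {
                "query": "CALL gbi_demo.sp_load_hub_trade()",
                "useLegacySql": False,
            }
        },
    )
```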
Nice to Have
Experience designing, testing and implementing data ingestion pipelines on GCP Data Fusion, CDAP or similar tools, including ingestion, parsing and wrangling of CSV-, JSON- and XML-formatted data from RESTful and SOAP APIs, SFTP servers, etc.; see the sketch after this list.
Understanding of modern data contract best practices, with experience independently directing, negotiating and documenting best-in-class data contracts.
Java development, testing and deployment skills (ideally custom plugins for Data Fusion)
Proficiency in working with Continuous Integration (CI), Continuous Delivery (CD) and continuous testing tools, ideally for Cloud based Data solutions.
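As a rough sketch of the REST ingestion pattern referred to above (the API URL, bucket and object path are hypothetical), raw JSON could be pulled and landed in Cloud Storage for downstream parsing and wrangling:

```python
# Minimal sketch: the API URL, bucket and object names are hypothetical.
import json

import requests
from google.cloud import storage

resp = requests.get("https://api.example.com/v1/positions", timeout=30)
resp.raise_for_status()
records = resp.json()  # assume the endpoint returns a JSON array

# Land the raw payload in a Cloud Storage bucket for downstream processing.
bucket = storage.Client().bucket("gbi-demo-landing")
blob = bucket.blob("rest/positions/2024-01-01.json")
blob.upload_from_string(json.dumps(records), content_type="application/json")
```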
What we offer
Competitive salary
Annual performance-based bonus
Additional bonuses for recognition awards
Multisport card
Private medical care
Life insurance
One-time reimbursement of home office set-up (up to 800 PLN).
Corporate parties & events
CSR initiatives
Nursery and kindergarten discounts
Financial support with trainings and education
Social fund
Flexible working hours
Free parking
If your CV meets our criteria, you should expect the following steps in the recruitment process:
Online behavioural test (for external candidates only)
Telephone screen (for external candidates only)
Job Interviews with the hiring manager
We are looking to hire as soon as possible, so don't wait and apply now!
You'll achieve more when you join HSBC.