

    Lead Data Engineer

    HSBC Service Delivery
    Kraków
    Type of work: Full-time
    Experience: Senior
    Employment Type: Permanent
    Operating mode: Hybrid

    Tech stack

      • Data: advanced
      • Python: advanced
      • .NET C#: advanced
      • Java: advanced
      • SRE: advanced
      • Azure DevOps: advanced

    Job description

    ABOUT THE ROLE

    Are you looking for an opportunity to advance your tech career? Interested in creating next-generation cybersecurity and analytics capabilities? If you are, then apply to join HSBC’s Cybersecurity Science & Analytics (CSA) team. We use our award-winning advanced analytics platform to develop innovative products for securing one of the largest technology estates in the world.    


    CSA are a unique and multi-skilled team of cybersecurity scientists, data and analytics professionals, and engineers. Our mission is to harness the power of data, analytics, AI/ML, and cybersecurity science to innovate and advance HSBC’s cybersecurity capabilities.


    The Lead Data Engineer will form part of the CSA Platform & Data Engineering Team, joining a global team of data technology professionals to deliver critical analytics engineering requirements for the strategic cybersecurity data lake and analytics platform.


    The position is a mid-senior technical, hands-on delivery role, requiring knowledge of data engineering, cloud infrastructure and platform engineering, platform operations and production support.



    YOUR RESPONSIBILITIES

    • Ingest and provision raw datasets, enriched tables, and/or curated, re-usable data assets to enable Cybersecurity use cases.
    • Drive improvements in the reliability and frequency of data ingestion including increasing real-time coverage.
    • Support and enhance data ingestion infrastructure and pipelines. 
    • Design and implement data pipelines that collect data from disparate sources across the enterprise and from external sources, transport it, and deliver it to our data platform.
    • Build Extract, Transform, and Load (ETL) workflows, using both advanced data-manipulation tools and programmatic transformations throughout our data flows, ensuring data is available at each stage of the flow and in the form needed by each system, service, and customer along it.
    • Identify and onboard data sources using existing schemas and, where required, conducting exploratory data analysis to investigate and determine new schemas.
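    The responsibilities above centre on ingesting, transforming, and delivering data through pipelines. A minimal, purely illustrative Python sketch of that extract/transform/load shape (all function names and sample records are invented for illustration, not HSBC's actual stack):

    ```python
    # Minimal ETL sketch: pull raw records from several sources, normalise
    # them, and load them into a target store. In a real platform the
    # sources and sink would be cloud connectors, not in-memory lists.

    def extract(sources):
        """Collect raw records from disparate sources (here: lists of dicts)."""
        for source in sources:
            yield from source

    def transform(records):
        """Normalise field values and skip records missing required keys."""
        for rec in records:
            if "host" not in rec or "event" not in rec:
                continue  # quarantine/skip malformed input
            yield {"host": rec["host"].lower(), "event": rec["event"].strip()}

    def load(records, sink):
        """Deliver cleaned records to the target store (here: a list)."""
        for rec in records:
            sink.append(rec)
        return len(sink)

    firewall = [{"host": "GW-01", "event": " deny tcp "}]
    endpoint = [{"host": "WS-77", "event": "logon"}, {"bad": "row"}]
    sink = []
    load(transform(extract([firewall, endpoint])), sink)
    ```

    Because each stage is a generator, records stream through one at a time rather than being materialised in full between stages.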



    SKILLS & EXPERIENCE WE REQUIRE

    • Data Engineering & Data Acquisition experience. Cloud-based Data Pipelines (Azure preferred); Data Transport and Data Cleaning; Data Engineering pipeline automation, productionisation, and optimisation; Designing, building, and maintaining data pipelines and ETL workflows across disparate datasets; Cloud Cost optimisation; Dataset and Data Asset Curation; Data Modelling and Cataloguing; Database Architecture and Design; Data Warehousing and Data Integration; Real-Time Analytics Deployment for Large-Scale Datasets.
    • Programming skills. Ability to script (Bash/PowerShell, Azure CLI), code (Python, C#, Java), and query (SQL, Kusto Query Language), coupled with experience with software version control systems (e.g., GitHub) and CI/CD systems; Programming experience in the following: PowerShell, Terraform, Python, the Windows command prompt, and object-oriented programming languages.
    • Software & Network Principles knowledge. Experience with SRE and Azure DevOps; Demonstrable experience of Linux administration and scripting (preferably Red Hat Systems); Understanding of hardware and software principles and storage technologies (SSD, HDD, NVMe), CPU architectures, and Memory & Operating system principles (especially network stack fundamentals); Understanding of network protocols and network design.
    • Azure technology services and Cloud & Big Data Technologies knowledge (Identity, Networking, Compute, Storage, Web, Containers, Databases).
    • Experience with server, operating system, and infrastructure technologies such as Nginx/Apache, CosmosDB, Linux, Bash, PowerShell, Prometheus, Grafana, and Elasticsearch.
    • Experience with Infrastructure-as-Code and Automation tools such as Terraform, Chef, Ansible, CloudFormation/Azure Resource Manager (ARM).
    • Knowledge of streaming platforms such as Azure Event Hubs or Kafka, and stream processing services such as Spark streaming.
    • Experience with Security Information & Event Management (SIEM) and Security Orchestration, Automation & Response (SOAR) technologies, especially cloud based, is a significant asset.
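    The streaming requirement above (Azure Event Hubs or Kafka, with Spark-style stream processing) boils down to consuming events in windows and aggregating each window. A hedged stdlib sketch of that micro-batch pattern (the event feed, field names, and batch size are invented for illustration; a real deployment would use an Event Hubs or Kafka client):

    ```python
    # Micro-batch stream processing sketch: group an event feed into
    # fixed-size windows and aggregate each window, mimicking the shape
    # of a Spark Streaming job over a security-event topic.
    from collections import Counter
    from itertools import islice

    def micro_batches(events, batch_size):
        """Yield fixed-size batches from an event iterator."""
        it = iter(events)
        while batch := list(islice(it, batch_size)):
            yield batch

    def count_by_type(batch):
        """Aggregate one window: count events per type."""
        return Counter(e["type"] for e in batch)

    events = [{"type": "auth_fail"}, {"type": "auth_ok"},
              {"type": "auth_fail"}, {"type": "auth_fail"}]
    summaries = [count_by_type(b) for b in micro_batches(events, 2)]
    ```

    The same window-then-aggregate structure carries over when the in-memory list is replaced by a partitioned consumer, which is why stream-processing experience transfers between Event Hubs, Kafka, and Spark.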



    WHAT WE OFFER

    • Competitive salary
    • Annual performance-based bonus
    • Additional bonuses for recognition awards
    • Multisport card
    • Private medical care
    • Life insurance
    • One-time reimbursement of home office set-up (up to 800 PLN).
    • Corporate parties & events
    • CSR initiatives
    • Nursery discounts
    • Financial support for training and education
    • Social fund
    • Flexible working hours 
    • Free parking (Cracow office)