#1 Job Board for the tech industry in Europe


    Data Engineer - Financial Markets / Fixed Income

    120 - 160 PLN/h (Net per hour, B2B)

    Type of work: Full-time
    Experience: Mid
    Employment Type: B2B
    Operating mode: Remote
    Company: Link Group

    Hundreds of IT opportunities are waiting for you—let’s make it happen! Since 2016, our team of tech enthusiasts has been building exceptional IT teams for Fortune 500 companies and startups worldwide. Join impactful projects in the BFSI, CPG, Industrial, and Life Sciences & Healthcare industries. Work with cutting-edge technologies like Cloud, Business Intelligence, Data, and SAP. Unlock your potential, grow your skills, and collaborate with top global clients. Ready for your next big career move? Link up with us!


    Tech stack

      • ETL (regular)
      • Python (regular)
      • SQL (regular)
      • RedHat (regular)
      • DevOps (regular)
      • Azure DevOps (regular)
      • Docker (regular)
      • PostgreSQL (nice to have)
      • Podman (nice to have)
      • Apache Airflow (nice to have)

    Job description

    We are looking for a Data Engineer with a strong background in ETL pipelines, database management, and data integrations. The ideal candidate will have hands-on experience with Python and SQL to develop and maintain data processing workflows, ensuring data integrity, performance, and security.

    This role involves building integrations with both external APIs and internal systems, managing data storage solutions such as data warehouses and relational databases, and conducting data quality checks. The candidate will collaborate closely with data scientists and IT teams to ensure seamless and secure integration of data infrastructure within the organization.


    Project Description

    The project focuses on developing and optimizing data pipelines to support business intelligence, analytics, and machine learning initiatives. The Data Engineer will be responsible for integrating data from various sources, ensuring data consistency, and maintaining high-performance storage solutions.

    The role requires collaboration with data scientists and IT teams to align data architecture with business objectives while maintaining security and compliance standards. Additionally, the engineer will monitor system performance, troubleshoot issues, and contribute to the automation and scalability of data workflows.

    This is an excellent opportunity to work with cutting-edge data technologies in a dynamic environment, contributing to data-driven decision-making and business transformation.


    Key Responsibilities

    • Design, develop, and maintain ETL data pipelines using Python and SQL.
    • Implement and manage data storage solutions, including relational databases and data warehouses.
    • Build integrations with external APIs and internal systems to enable efficient data exchange.
    • Perform data analysis and quality checks, ensuring data accuracy and reliability.
    • Monitor data infrastructure to optimize performance, security, and scalability.
    • Work closely with data scientists and IT teams to align data workflows with business needs.
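To illustrate the kind of work the responsibilities above describe, here is a minimal ETL sketch in Python and SQL. It is a hypothetical example rather than project code: sqlite3 stands in for a production relational database such as PostgreSQL, and the table, fields, and sample records are invented.

```python
# Minimal extract-transform-load sketch with a basic data quality check.
# sqlite3 stands in for a production database (e.g. PostgreSQL);
# all table and field names below are hypothetical.
import sqlite3

def extract():
    # In practice this step would call an external API or read from an
    # internal source system; here we return hard-coded sample records.
    return [
        {"isin": "XS1234567890", "price": 101.25, "currency": "EUR"},
        {"isin": "XS0987654321", "price": None, "currency": "USD"},  # bad record
    ]

def transform(records):
    # Quality check: drop records with missing prices before loading.
    return [r for r in records if r["price"] is not None]

def load(conn, records):
    # Idempotent load into a relational table keyed by ISIN.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS quotes "
        "(isin TEXT PRIMARY KEY, price REAL, currency TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO quotes (isin, price, currency) "
        "VALUES (:isin, :price, :currency)",
        records,
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(conn, transform(extract()))
    print(conn.execute("SELECT COUNT(*) FROM quotes").fetchone()[0])
```

In a real pipeline the extract step would be an API client or database reader, and a scheduler such as Apache Airflow (listed under nice-to-haves) would orchestrate the three stages.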


    Skills & Qualifications

    • 3+ years of experience developing ETL processing pipelines using Python and SQL.
    • Strong data modeling and database management skills, particularly with relational databases (preferably PostgreSQL).
    • Proficiency in Linux environments (especially RedHat distributions) and version control systems (Git).
    • Familiarity with DevOps pipelines, particularly Azure DevOps Services.
    • Understanding of containerization technologies like Docker or Podman.
    • Strong problem-solving skills and willingness to learn new technologies.
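Since the role lists Docker and Podman, a container image for a pipeline like this might be defined as follows. This is an illustrative sketch only; the paths, module name, and requirements file are invented, and the same file works as a Containerfile under Podman.

```dockerfile
# Hypothetical Containerfile for packaging an ETL job (Docker or Podman).
FROM python:3.12-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY etl/ ./etl/
CMD ["python", "-m", "etl.pipeline"]
```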


    Nice to Have

    • Experience with Apache Airflow or other data orchestration tools.
    • Familiarity with Databricks, Dataiku, MLflow, or similar Machine Learning platforms.
    • Knowledge of kdb+/q and distributed processing frameworks like Hadoop and Spark/PySpark.
    • Understanding of financial markets, especially fixed income instruments.



    Check similar offers

    ETL/SQL Developer (Programista ETL/SQL)

    Aplikacje Krytyczne
    11K - 15K PLN/month
    Warszawa, Fully remote
    SSIS, Oracle, SQL

    Data Engineer

    Acaisoft
    18K - 28K PLN/month
    Warszawa, Fully remote
    Airflow, PostgreSQL, Kafka

    Data Engineer (Databricks)

    Addepto
    15.1K - 21K PLN/month
    Wrocław, Fully remote
    Azure, Python, Databricks

    Data Engineer (Azure Data Factory)

    7N
    150 - 165 PLN/h
    Gdańsk, Fully remote
    Azure Data Factory, Snowflake

    Mid DevOps Data Engineer (+ Azure Data Factory)

    1dea
    110 - 120 PLN/h
    Kraków, Fully remote
    Azure, ETL, ETL tools