Job Description
We are seeking a skilled ETRM Data Engineer to join our data engineering team and support critical initiatives in the Energy Trading and Risk Management (ETRM) domain. This role involves building robust data pipelines, integrating trading systems, and ensuring data quality across platforms such as Azure Data Factory, Databricks, and Snowflake.
You will collaborate closely with traders, analysts, and IT teams to design and implement scalable, high-performance data solutions that power decision-making in fast-paced trading environments.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines and ETRM systems
- Lead data integration projects within the energy trading ecosystem
- Integrate data from ETRM platforms such as Allegro, RightAngle, and Endur
- Build and optimize data storage solutions using Data Lake and Snowflake
- Develop and orchestrate ETL/ELT workflows using Azure Data Factory and Databricks
- Write efficient, production-grade Python/PySpark code for data processing and analytics
- Build and expose APIs using FastAPI for data services
- Ensure data quality, consistency, and reliability across complex systems
- Work closely with stakeholders to translate business requirements into technical data solutions
- Optimize and enhance data architecture for scalability and performance
Mandatory Skills
- Strong experience with Azure Data Factory (ADF)
- Proficiency in Data Lake architecture and best practices
- Hands-on expertise with Snowflake and SQL
- Solid experience in Python and PySpark
- Knowledge of FastAPI for building scalable APIs
- Proven work with Databricks in production environments
Nice to Have
- Domain experience in ETRM / energy trading systems
- Familiarity with Streamlit for internal dashboards
- Experience integrating with Allegro, RightAngle, or Endur trading platforms