Data Engineer
General info:
Rate: 190 PLN net + VAT / hour
Type of contract: B2B, long-term cooperation
Model of work: hybrid, 1 day per week in the office (Gdańsk / Warsaw / Łódź)
Introduction & Summary
We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines, with a specific focus on supporting corporate lending operations. The ideal candidate is proficient in Python, PySpark, and SQL and has experience with ETL/ELT pipeline development. This role requires a strong analytical mindset and the ability to transform data insights into actionable recommendations for stakeholders.
Main Responsibilities
Design, build, and maintain scalable data pipelines for internal banking systems and external data providers.
Develop and optimize data models for corporate lending domains, including loan origination and repayment history.
Implement data quality checks, validation rules, and data lineage.
Manage and evolve the data warehouse and lake for analytics and reporting.
Ensure compliance with regulatory and data governance requirements.
Collaborate with product managers and domain experts in an agile setting.
Document data pipelines, models, and key design decisions.
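For a flavor of the data quality checks and validation rules mentioned above, a minimal Python sketch is shown below. The record fields and rules are hypothetical illustrations, not the team's actual schema:

```python
from datetime import date

# Hypothetical validation rules for a corporate lending record -- a sketch of
# the kind of data quality checks described above, not an actual schema.
RULES = {
    "loan_id": lambda v: isinstance(v, str) and v != "",
    "amount": lambda v: isinstance(v, (int, float)) and v > 0,
    "origination_date": lambda v: isinstance(v, date) and v <= date.today(),
}

def validate(record: dict) -> list:
    """Return the names of fields that fail their validation rule."""
    return [field for field, rule in RULES.items()
            if not rule(record.get(field))]

good = {"loan_id": "L-001", "amount": 250_000,
        "origination_date": date(2023, 5, 4)}
bad = {"loan_id": "L-002", "amount": -5}

print(validate(good))  # an empty list means every check passed
print(validate(bad))   # lists the failing fields
```

In practice such rules would typically run inside the pipeline itself (e.g. as PySpark column expressions), with failures routed to monitoring rather than printed.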
Key Requirements
Proficiency in Python, PySpark, and SQL
Experience with ETL/ELT pipeline development
Cloud data warehouse/lake experience
Experience with AWS and its services (e.g. Amazon SageMaker AI, Amazon EventBridge)
Data modeling, quality, and lineage expertise
Experience with ML model development and deployment
Nice to Have
Corporate lending or financial services background
Knowledge of credit decision processes
Experience with LLM/GenAI integration
Agentic AI implementations
Understanding of model explainability