Join us, and transform complex queries into elegant solutions!
Kraków-based opportunity with a hybrid work model (6 days per month in the office).
As a Data Engineer, you will work for our client, a global financial institution that is modernizing its finance IT systems to meet critical regulatory requirements. The project prepares core finance applications for seamless operation on a new cloud infrastructure. You will support the migration and transformation of large-scale ETL workflows and complex SQL logic from one cloud provider to another, with a strong emphasis on automation, performance optimization, and quality assurance, working alongside cross-functional teams in a dynamic, fast-paced environment.
Your main responsibilities:
- Migrating complex BigQuery SQL transformations to Azure Spark SQL
- Building and executing ETL workflows using Azure Databricks
- Creating automation tools for data and code migration between cloud platforms
- Analyzing existing SQL logic and transforming it for new cloud environments
- Writing Python scripts to support migration utilities and ETL automation
- Documenting processes to support production readiness and handover
- Collaborating with developers, product owners, and technical leads
- Identifying and resolving performance bottlenecks in SQL workflows
- Supporting the CI/CD process by integrating SQL and ETL components
- Participating in Agile ceremonies and contributing to team planning
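To give a flavor of the migration-automation work above, here is a minimal, illustrative sketch of a Python utility that rewrites a few BigQuery-specific SQL constructs (such as `SAFE_CAST` and the `INT64` type) into their Spark SQL equivalents (`TRY_CAST`, `BIGINT`). The function name and the mapping table are hypothetical; a production tool would use a real SQL parser rather than regular expressions.

```python
import re

# Hypothetical mapping of BigQuery constructs to Spark SQL equivalents.
# Real migrations need dialect-aware parsing; regex is only for illustration.
DIALECT_MAP = {
    r"\bSAFE_CAST\s*\(": "TRY_CAST(",      # BigQuery SAFE_CAST -> Spark TRY_CAST
    r"\bSAFE_DIVIDE\s*\(": "TRY_DIVIDE(",  # BigQuery SAFE_DIVIDE -> Spark TRY_DIVIDE
    r"\bINT64\b": "BIGINT",                # BigQuery INT64 -> Spark BIGINT
}

def bq_to_spark(sql: str) -> str:
    """Rewrite a handful of BigQuery-specific constructs into Spark SQL."""
    for pattern, replacement in DIALECT_MAP.items():
        sql = re.sub(pattern, replacement, sql, flags=re.IGNORECASE)
    return sql

print(bq_to_spark("SELECT SAFE_CAST(amount AS INT64) FROM t"))
# -> SELECT TRY_CAST(amount AS BIGINT) FROM t
```

In practice, scripts like this are paired with automated result reconciliation (row counts, checksums) so that translated queries can be verified against the original BigQuery output before cutover.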
You're ideal for this role if you have:
- Experience working with at least one cloud provider, preferably Microsoft Azure
- Strong expertise in SQL, particularly Spark SQL or BigQuery SQL
- Hands-on experience building and maintaining complex ETL pipelines
- Proficiency in Python programming
- Understanding of SQL coding standards and performance optimization techniques
- Familiarity with CI/CD pipelines and automation tools
- Strong problem-solving skills and adaptability
- Ability to manage time effectively under tight deadlines
- Experience working within Agile development teams
- Excellent communication and documentation skills
It is a strong plus if you have:
- Prior experience with Azure Databricks
- Background in financial IT systems or regulatory projects
- Knowledge of data migration tools and strategies
- Experience with version control tools like Git
- Familiarity with big data tools and ecosystems
- Exposure to production support and post-deployment processes