Project information:
Industry: Consulting (HoReCa, Financial, e-commerce)
Location: Wrocław, 100% remote
Type of employment: B2B or employment contract
Budget: 140-170 net/h on a B2B contract
Project language: Polish, English
Start date: ASAP
Project scope:
Creating, configuring, implementing, and integrating Data Lake, Data Warehouse, or Lakehouse solutions using Azure technologies (Azure Data Factory, Azure SQL, Synapse Analytics, Databricks)
Processing and transforming large datasets
Designing and implementing data pipelines, data models, and data integration solutions
Performance tuning and cost optimization
Building ELT and ETL processes (an illustrative sketch follows this list)
Collecting and monitoring performance metrics and tuning process performance
Identifying ways to improve data quality and efficiency
Refactoring code in existing solutions
Proposing solution architectures based on cloud services
Partnering with BI teams to define the best approach to data ingestion and data structures
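Purely for illustration (not part of the requirements): a minimal sketch of the kind of ELT step this role involves, assuming PySpark on a Databricks cluster where `spark` is preconfigured; the storage path, table, and column names are hypothetical.

```python
# Minimal ELT sketch for illustration only; assumes a Databricks notebook
# where `spark` is already available. All paths and names are hypothetical.
from pyspark.sql import functions as F

RAW_PATH = "abfss://raw@examplelake.dfs.core.windows.net/sales/"  # hypothetical landing zone
CURATED_TABLE = "curated.sales_daily"                             # hypothetical Delta table

# Extract: load raw CSV files from the data lake
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv(RAW_PATH))

# Transform: basic cleansing and a daily aggregation
daily = (raw
         .dropDuplicates(["order_id"])
         .withColumn("order_date", F.to_date("order_timestamp"))
         .groupBy("order_date", "store_id")
         .agg(F.sum("amount").alias("total_amount"),
              F.count("order_id").alias("order_count")))

# Load: write the result as a Delta table for downstream BI consumption
(daily.write
      .format("delta")
      .mode("overwrite")
      .saveAsTable(CURATED_TABLE))
```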
Requirements:
5+ years of experience in Data Engineering in the Azure environment
Fluency in the Microsoft Azure data ecosystem (e.g., Azure Data Factory, Azure SQL, Synapse Analytics, and Databricks)
At least one completed end-to-end data warehouse implementation project
Experience with data modelling and data integration
Knowledge of SQL (T-SQL or any other dialect)
Problem-solving, analytical, and critical thinking skills
Experience with Agile and Scrum methodologies
Azure DevOps experience (Repos, Pipelines)
Good command of both written and spoken English (B2 level)
We offer:
Stable cooperation based on a B2B contract or an employment contract
Fully remote work
Flexible working hours
Healthcare and Multisport benefits
Training budget
Integration meetings