Start date: ASAP / 1 month / flexible
Duration: Long term (36 months with further extensions)
Work model: hybrid, min. 2 days per week from the Wrocław office
Type of cooperation: B2B
Overview
We are looking for a skilled Data Engineer to join a team focused on transforming industrial data into valuable insights. The team processes data from machines and factories to deliver customized data products to clients worldwide. The goal is to help organizations become fully data-driven by unlocking the potential of their data. You’ll work with diverse data sources and technologies in a dynamic and collaborative environment.
Responsibilities
Design, develop, and maintain data pipelines using Azure Databricks and Informatica PowerCenter
Work with large-scale datasets using PySpark and Spark SQL
Collaborate with stakeholders from both IT and business to understand data needs and deliver solutions
Implement and manage data workflows, jobs, and cataloging using Unity Catalog
Ensure data quality, security, and governance using Azure Key Vault and DevOps practices
Contribute to continuous improvement of data engineering practices and tools
Requirements
Proficiency in Informatica PowerCenter
Strong experience with Azure Databricks (PySpark, Spark SQL, Unity Catalog, Jobs/Workflows)
Advanced SQL skills
Hands-on experience with at least one relational DBMS (SQL Server, Oracle, or PostgreSQL)
Familiarity with Azure DevOps (Repos, Pipelines, YAML)
Knowledge of Azure Key Vault
Optional: experience with Azure Data Factory and dbt
Soft skills: open-minded, engaged, flexible, proactive, and collaborative attitude
Bonus: experience working in a data mesh environment
We offer
B2B contract via Experis
Access to Medicover healthcare
Multisport card
E-learning platform for continuous development
Group insurance