Role: Data Engineer
We are looking to hire a Data Engineer on behalf of our client, a global leader in the beverage industry.
If you’re interested and meet the qualifications, please send your CV to Alina Pchelnikova at alina.pchelnikova@dcvtechnologies.co.uk.
Location: Remote from Poland
Contract Type: B2B Contract
Industry: Global leader in the beverage industry
Main Skills: Azure, Spark, Scala
- 7-8 years of total experience, with a minimum of 6-7 years in data engineering using Scala, Spark, and Azure.
- Monitor and provide ongoing support and maintenance for data pipelines that ingest and transform data using Scala and SQL on Spark (see the Spark sketch at the end of this section).
- Investigate and remediate data pipeline errors and performance issues. Identify and resolve data discrepancies, ambiguities, and inconsistencies.
- Provide technical support for data analysis.
- Source and version-control code and configuration artifacts using GitHub.
- Deploy code artifacts using GitHub Actions workflows.
- Technical Oversight: Provide technical leadership and hands-on oversight in developing data processing applications on Spark using Scala, focusing on the Microsoft Azure Synapse Spark runtime.
- Data Pipeline Optimization: Design and optimize data pipelines that process data through the various zones of a Medallion architecture using Azure Synapse pipelines.
- Data Ingestion and Quality: Manage data ingestion, ensure data quality checks with tools like DQ, and handle data validation and error management.
- Configuration Management: Develop and manage configuration settings using JSON files read by classes such as ApplicationConfig and TableConfig for the various zones (see the configuration sketch at the end of this section).
- Cross-Functional Collaboration: Collaborate with data scientists, analysts, and cross-functional teams to ensure seamless integration and alignment of data engineering practices with marketing strategies.
- Logging and Auditing: Oversee logging, auditing, and error handling processes to track and ensure data processing integrity. Knowledge of Azure Log Analytics and KQL queries is a plus.
- Testing and Validation: Implement unit testing with tools like ScalaTest and maintain data quality checks for reliable data processing outcomes (see the ScalaTest sketch at the end of this section).
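
To give candidates a feel for the pipeline work described above, here is a minimal, purely illustrative Scala-and-SQL-on-Spark sketch; the paths, table, and column names are invented for illustration and are not taken from the client's codebase.

```scala
// Illustrative only: ingest raw files with Scala, transform with Spark SQL,
// and write a curated output. All paths and column names are hypothetical.
import org.apache.spark.sql.SparkSession

object SalesIngestJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sales-ingest")
      .getOrCreate()

    // Ingest raw files from a hypothetical landing path and expose them to Spark SQL.
    val raw = spark.read.format("parquet").load("/landing/sales")
    raw.createOrReplaceTempView("sales_raw")

    // Transform with SQL: drop null keys, deduplicate, and standardise the timestamp.
    val cleaned = spark.sql(
      """
        |SELECT DISTINCT order_id,
        |       CAST(order_ts AS TIMESTAMP) AS order_ts,
        |       amount
        |FROM sales_raw
        |WHERE order_id IS NOT NULL
        |""".stripMargin)

    // Write the curated output to a hypothetical target path.
    cleaned.write.mode("overwrite").format("parquet").save("/curated/sales")

    spark.stop()
  }
}
```

In a Medallion-style setup, a job of this shape would typically read from a bronze location and write to a silver one, with the concrete paths driven by configuration rather than hard-coded as they are here.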
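The Configuration Management item mentions classes such as ApplicationConfig and TableConfig that read zone settings from JSON. The sketch below only shows an assumed shape for such classes; the field names and the choice of the circe library for JSON parsing are assumptions made for illustration, not the client's actual schema.

```scala
// Assumed shape of JSON-backed configuration classes; field names are hypothetical.
import io.circe.generic.auto._
import io.circe.parser.decode

final case class TableConfig(name: String, zone: String, path: String)
final case class ApplicationConfig(environment: String, tables: List[TableConfig])

object ConfigLoader {
  // Parse an application config from a JSON string; returns Left on malformed input.
  def load(json: String): Either[io.circe.Error, ApplicationConfig] =
    decode[ApplicationConfig](json)
}

object ConfigLoaderExample {
  def main(args: Array[String]): Unit = {
    val json =
      """{
        |  "environment": "dev",
        |  "tables": [
        |    { "name": "sales", "zone": "bronze", "path": "/bronze/sales" }
        |  ]
        |}""".stripMargin

    // Prints either a decoding error or the parsed ApplicationConfig.
    println(ConfigLoader.load(json))
  }
}
```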
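For the Testing and Validation item, a minimal ScalaTest (AnyFunSuite style) sketch over a small helper might look like the following; the normalizeCountry function is invented purely to have something to test.

```scala
// Minimal ScalaTest sketch; the helper under test is hypothetical.
import org.scalatest.funsuite.AnyFunSuite

object Transformations {
  // Trim and upper-case a country code; treat null or blank input as missing.
  def normalizeCountry(raw: String): Option[String] = {
    val trimmed = Option(raw).map(_.trim).getOrElse("")
    if (trimmed.isEmpty) None else Some(trimmed.toUpperCase)
  }
}

class TransformationsSpec extends AnyFunSuite {
  test("normalizeCountry upper-cases and trims valid input") {
    assert(Transformations.normalizeCountry("  pl ") == Some("PL"))
  }

  test("normalizeCountry returns None for blank or null input") {
    assert(Transformations.normalizeCountry("   ").isEmpty)
    assert(Transformations.normalizeCountry(null).isEmpty)
  }
}
```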