Kevin Edward is looking for a highly skilled Senior Data Engineer with deep expertise in Databricks and Lakehouse architecture. This role is ideal for a data professional with strong experience in PySpark, Delta Lake, Delta Live Tables, and Unity Catalog who can design, develop, and optimize data workflows in a high-performance environment.

Key responsibilities:
- Design and implement Databricks Lakehouse architecture for enterprise solutions.
- Ingest and transform batch and streaming data using the Databricks Lakehouse Platform (a minimal pipeline sketch follows this list).
- Design and maintain Delta Live Tables, Delta Sharing, and Unity Catalog to provide consistent data governance and security.
- Leverage Databricks Workflows to create and manage end-to-end data pipelines and ensure efficient data processing.
- Develop and optimize Delta Lake tables, and monitor their data quality, reliability, and performance.
- Orchestrate diverse workloads using PySpark, Delta Live Tables, and Databricks tools.
- Implement security best practices for data access, governance, and compliance (see the illustrative Unity Catalog grants after this list).
- Size and optimize Databricks clusters for performance and cost efficiency (an example cluster spec follows this list).
- Collaborate with cross-functional teams in an Agile environment to understand business requirements and improve data workflows.
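Below is a minimal sketch of the kind of pipeline this role involves: a Delta Live Tables pipeline that ingests raw JSON with Auto Loader, applies a data-quality expectation, and publishes a cleaned table. The landing path, table names, and the event_id/ts columns are illustrative assumptions, not details from this posting.

```python
# Minimal Delta Live Tables sketch: incremental ingestion plus a quality gate.
# All paths, table names, and columns are placeholders.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw events ingested incrementally with Auto Loader.")
def raw_events():
    return (
        spark.readStream.format("cloudFiles")      # Auto Loader source
        .option("cloudFiles.format", "json")
        .load("/mnt/raw/events")                   # hypothetical landing path
    )

@dlt.table(comment="Cleaned events ready for downstream consumption.")
@dlt.expect_or_drop("valid_event_id", "event_id IS NOT NULL")  # drop bad rows
def clean_events():
    return (
        dlt.read_stream("raw_events")
        .withColumn("event_date", F.to_date("ts"))  # assumes a `ts` column
    )
```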
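On the governance side, Unity Catalog access control is expressed as SQL grants, which can be issued from a notebook. A hedged sketch; the catalog, schema, and group names below are made up:

```python
# Illustrative Unity Catalog setup and grants; every name is a placeholder.
spark.sql("CREATE CATALOG IF NOT EXISTS analytics")
spark.sql("CREATE SCHEMA IF NOT EXISTS analytics.sales")

# Engineers can create tables; report consumers get read-only access.
spark.sql("GRANT USE CATALOG ON CATALOG analytics TO `data-engineers`")
spark.sql("GRANT USE SCHEMA, CREATE TABLE ON SCHEMA analytics.sales TO `data-engineers`")
spark.sql("GRANT USE SCHEMA, SELECT ON SCHEMA analytics.sales TO `bi-readers`")
```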
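For cluster sizing, a common starting point is an autoscaling job cluster whose bounds are tuned against workload and cost. One illustrative spec for the Databricks Jobs/Clusters API, with every value a placeholder to adjust per workload:

```python
# Illustrative autoscaling cluster spec (Jobs/Clusters API payload fragment).
# Runtime version, node type, and worker bounds are assumptions, not recommendations.
job_cluster_spec = {
    "spark_version": "15.4.x-scala2.12",                # assumed LTS runtime
    "node_type_id": "Standard_DS3_v2",                  # assumed Azure node type
    "autoscale": {"min_workers": 2, "max_workers": 8},  # scale with load, cap cost
    "spark_conf": {"spark.databricks.delta.optimizeWrite.enabled": "true"},
}
```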
Requirements:
- 10+ years of experience in data engineering and working with large datasets.
- Must-have skills:
- Databricks (Lakehouse, Workflows, Unity Catalog, Delta Sharing)
  - Delta Lake, PySpark, Lakeflow, Delta Live Tables
- Data modelling & cluster sizing
- Security implementation in Databricks environments
- Good-to-have skills:
- Experience with Azure Data Factory (ADF) or Fivetran for data ingestion.
- Exposure to Power BI for data visualization.
- Strong ability to analyse, optimize, and troubleshoot large-scale data pipelines.
- Excellent communication and stakeholder-management skills.
- Databricks Champions Certification is preferred.