Databricks Data Engineer
Requirements
Minimum 5 years of experience in data engineering
Minimum 2 years of experience with Databricks
Very good knowledge of SQL, PySpark, and Python
Experience with data warehousing, ETL, distributed data processing, and data modeling
Strong analytical problem-solving skills in a Big Data environment
Experience working with structured, semi-structured, and unstructured data
Experience with at least one public cloud platform (Azure, AWS, or GCP)
Knowledge of designing relational and non-relational databases
Knowledge of Data Mart, Data Warehouse, Data Lake, and Data Mesh concepts
Very good command of English
Experience working with Agile methodologies (Scrum, Kanban) and knowledge of DevOps and CI/CD principles
Responsibilities
Designing and developing data engineering solutions
Building and maintaining ETL/ELT processes
Working with large volumes of data in a distributed environment
Data modeling and processing for analytical purposes
Collaborating with technical and business teams
Participating in the design of cloud-based data architectures