PROJECT INFORMATION:
Industry: Banking
Location: Remote
Type of assignment: B2B
RESPONSIBILITIES:
As a Data Engineer, your core responsibility will be managing data artifacts for integration.
- Develop Component Data Artifacts (CDAs).
- Break down Master Data Artifacts (MDAs) and Global Data Artifacts (GDAs) into CDAs for integration.
- Focus on data modeling, data reuse, and understanding data domains.
- Implement data normalization concepts.
- Query data efficiently and prepare it for ingestion into the data lake via Juniper pipelines.
REQUIREMENTS:
- Strong technical understanding of Google Cloud Platform (GCP).
- Experience with Tableau or Looker.
- Experience with Apache Airflow or Hadoop.
- Proficiency in data modeling concepts.
- Experience with data normalization techniques.
- Knowledge of data integration methodologies.
NICE TO HAVE:
- Understanding of Juniper pipelines.
- Knowledge of refinery data processes.
OTHER DETAILS:
- The Data Engineer will collaborate closely with GCP engineers and platform teams.
- The role centers on effective data modeling and data reuse across platforms.