Data ETL Engineer
We are looking for a Data ETL Engineer to design, build, and maintain scalable BigQuery data models and ETL/ELT pipelines. You will turn raw data into structured, business-ready datasets, optimize performance, ensure data quality, and support secure, cost-efficient cloud solutions.
Responsibilities
Design, build, test, and deploy BigQuery data models and transformations
Develop and maintain ETL/ELT pipelines based on Data Vault principles
Integrate data from multiple sources while ensuring accuracy and consistency
Optimize SQL queries and pipelines for performance, scalability, and cost
Monitor pipelines and troubleshoot failures or performance issues
Implement CI/CD processes and manage code with Git and Jenkins
Collaborate in Agile teams and support knowledge transfer to operations
Requirements
3+ years of experience with SQL and complex data transformations (preferably in BigQuery)
Experience building ETL/ELT pipelines and working with Data Vault models
Hands-on experience with GCP services: Cloud Composer/Airflow, Cloud Run, Pub/Sub
Proficiency in Python and Terraform
Familiarity with CI/CD pipelines and version control (Git)
Experience in Agile and DataOps environments
Strong analytical and problem-solving skills
Nice to Have
Experience with data ingestion pipelines using Data Fusion/CDAP or similar tools
Experience ingesting data from APIs, SFTP, CSV, JSON, or XML
Knowledge of modern data contract practices
Java development experience (e.g., custom plugins for Data Fusion)
Experience with CI/CD for cloud-based data solutions