GCP Data Lake Developer
Warszawa
Shimi Sp. z o.o.
We are looking for GCP Data Lake Developers to join our client’s team.
Requirements:
Experience working with GCP services, specifically BigQuery and Vertex AI.
Proficiency in managing large datasets, optimizing queries, and designing data models in BigQuery.
Solid experience working with Oracle databases, including writing complex queries, data migration, and performance tuning.
Experience with Databricks for collaborative data science and machine learning workflows, as well as building Power BI dashboards and reports for business stakeholders.
English at B2 level.
Nice to Have:
Exposure to building and deploying machine learning models on platforms such as Vertex AI.
Familiarity with Data Lakes and unstructured data management.
Project Tasks:
Data Infrastructure Setup – Design, build, and maintain scalable data pipelines on GCP using tools like BigQuery, Databricks, and Vertex AI to process large datasets.
Data Modeling & Analysis – Create and optimize BigQuery data models to enable efficient querying and analysis.
Machine Learning Integration – Implement machine learning models and pipelines on Vertex AI, automating model training and deployment.
Offer:
Rate: up to 190 PLN/h
Type of contract: UoP (employment contract) or B2B
Work model: remote (with possibility of occasional business trips)