Requirements:
at least 3 years of experience in Big Data or Cloud projects involving the processing and visualization of large and/or unstructured datasets (including at least 1 year of hands-on Snowflake experience)
understanding of Snowflake's pricing model and cost optimization strategies for managing resources efficiently
experience in designing and implementing data transformation pipelines natively in Snowflake or with Snowflake service partners
familiarity with Snowflake’s security model
practical knowledge of at least one public cloud platform in the areas of storage, compute (including serverless), networking, and DevOps, backed by commercial project experience
at least basic knowledge of SQL and one of the following programming languages: Python, Scala, Java, or Bash
very good command of English
Tasks:
design, develop, and maintain Snowflake data pipelines to support various business functions (a minimal pipeline sketch follows this list)
collaborate with cross-functional teams to understand data requirements and implement scalable solutions
optimize data models and schemas for performance and efficiency
ensure data integrity, quality, and security throughout the data lifecycle
implement monitoring and alerting systems to proactively identify and address issues
plan and execute migration from on-prem data warehouses to Snowflake
develop AI, ML, and Generative AI solutions
stay updated on Snowflake best practices and emerging technologies to drive continuous improvement
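For illustration only, below is a minimal sketch of the kind of Snowflake transformation step these pipeline tasks involve, written in Python with the snowflake-connector-python package. The credentials, warehouse, and table names (RAW_EVENTS, DIM_EVENTS) are assumptions used purely as an example, not part of this offer.

```python
# Minimal sketch of a Snowflake transformation step, assuming snowflake-connector-python
# and hypothetical credentials, warehouse, and table names.
import os

import snowflake.connector

# Connection parameters are assumptions; in a real project they would come from a vault or config.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",   # hypothetical warehouse
    database="ANALYTICS",       # hypothetical database
    schema="STAGING",           # hypothetical schema
)

try:
    cur = conn.cursor()
    # Example transformation: upsert cleaned rows from a staging table into a reporting table.
    # RAW_EVENTS and DIM_EVENTS are hypothetical tables used only for illustration.
    cur.execute("""
        MERGE INTO DIM_EVENTS AS tgt
        USING (
            SELECT event_id, TRIM(event_name) AS event_name, event_ts
            FROM RAW_EVENTS
            WHERE event_ts >= DATEADD(day, -1, CURRENT_TIMESTAMP())
        ) AS src
        ON tgt.event_id = src.event_id
        WHEN MATCHED THEN UPDATE SET tgt.event_name = src.event_name, tgt.event_ts = src.event_ts
        WHEN NOT MATCHED THEN INSERT (event_id, event_name, event_ts)
            VALUES (src.event_id, src.event_name, src.event_ts)
    """)
    print(f"Rows affected: {cur.rowcount}")
finally:
    conn.close()
```

In practice such a step would typically be scheduled as a Snowflake task or orchestrated by a partner tool rather than run ad hoc.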
Our offer:
Remote work
Multisport card
Private healthcare system
Life insurance
Net per hour - B2B