
Snowflake Data Engineer

Category: Data
Location: Warszawa
Type of work: Undetermined
Experience: Mid
Employment Type: B2B
Operating mode: Remote

Tech stack

    SQL: regular
    Python: junior
    AWS: nice to have
    Snowflake: nice to have
    Apache Spark: nice to have
    Java: nice to have
    JavaScript: nice to have

Job description

Online interview
As a Snowflake Data Engineer you’ll be part of a project focused on developing data pipelines within a central data platform (data lake, DWH, data applications). The project spans several areas, such as integrating data from multiple sources, applying transformations using Python and Snowflake SQL, and taking care of performance. The candidate needs to be able to effectively translate business requirements into technical specifications and solutions, and has to know how to build scalable data pipelines using Snowflake core services (SQL, Stages, Snowpipe) and the surrounding ecosystem (AWS / Apache Spark / Python Connector).
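A minimal sketch of the kind of ingestion pipeline this role involves, built from the Snowflake core services the posting names (a stage plus Snowpipe). All object names and the S3 URL here are illustrative assumptions, not details from the posting:

```sql
-- Sketch only: events_stage, raw_events, events_pipe and the bucket URL
-- are hypothetical names, not from the job description.
CREATE STAGE events_stage
  URL = 's3://example-bucket/events/'
  FILE_FORMAT = (TYPE = JSON);

-- Land semi-structured payloads into a VARIANT column first.
CREATE TABLE raw_events (payload VARIANT, loaded_at TIMESTAMP_NTZ);

-- Snowpipe: continuous ingestion from the stage into the raw table.
CREATE PIPE events_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_events
  FROM (SELECT $1, CURRENT_TIMESTAMP() FROM @events_stage);
```

Downstream transformations (the Python / Snowflake SQL work described above) would then read from `raw_events` rather than from the files directly.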

Responsibilities:

  • Converting business requirements into effective technical solutions
  • Developing data pipelines using Snowflake (SQL, optionally Java / JavaScript UDFs)
  • Integrating with the Snowflake ecosystem (Python connector / Spark connector / Snowpipe / AWS)
  • Building, testing and debugging ETLs
  • Proposing solutions for data processing applications

Requirements:

  • Good understanding of Snowflake architecture
  • At least 0.5 years of experience with Snowflake, or 1 year of experience with a similar technology (e.g. AWS Redshift, RDBMS)
  • Advanced SQL (GROUP BY, HAVING, analytical queries)
  • Experience in Python or willingness to learn it
  • Ability to perform data manipulations and to load and extract data from several sources
  • Ability to work with multiple file formats (structured and semi-structured data)
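The SQL skills listed above (GROUP BY, HAVING, analytical queries) together with handling semi-structured data can be illustrated in a few lines. This is a self-contained sketch using Python's stdlib `sqlite3` and `json` rather than Snowflake itself; the sample data is invented for the example:

```python
import json
import sqlite3

# Hypothetical semi-structured input, e.g. as landed from a JSON source.
raw = """
[{"user": "a", "amount": 10}, {"user": "a", "amount": 30},
 {"user": "b", "amount": 5},  {"user": "b", "amount": 7},
 {"user": "c", "amount": 50}]
"""
events = json.loads(raw)

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (user TEXT, amount INTEGER)")
con.executemany("INSERT INTO events VALUES (:user, :amount)", events)

# GROUP BY + HAVING: users whose total spend exceeds 20.
totals = con.execute("""
    SELECT user, SUM(amount) AS total
    FROM events
    GROUP BY user
    HAVING SUM(amount) > 20
    ORDER BY user
""").fetchall()
print(totals)  # [('a', 40), ('c', 50)]

# Analytical (window) query: rank users by total spend.
ranked = con.execute("""
    SELECT user, SUM(amount) AS total,
           RANK() OVER (ORDER BY SUM(amount) DESC) AS rnk
    FROM events
    GROUP BY user
    ORDER BY rnk
""").fetchall()
print(ranked)  # [('c', 50, 1), ('a', 40, 2), ('b', 12, 3)]
```

The same GROUP BY / HAVING / window-function constructs carry over directly to Snowflake SQL; only the connection layer differs.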

Nice to have:

  • AWS experience
  • Knowledge of or experience with any DWH modelling style (Data Vault or others)
  • Experience with Apache Airflow (as a DAG developer or maintainer)
  • Experience in Spark / Java / JavaScript
  • Snowflake Certifications
 
We offer:

  • 100% remote work 
  • B2B contract
  • A competitive salary
  • Multiple opportunities to gain new knowledge in the AWS / Data Engineering area at our internal knowledge-sharing meetups
  • Reimbursement of Data Engineering certification expenses
  • Training budget