Data Engineer / dbt Developer (m/f/n)
We are looking for a Data Engineer / dbt Developer for a long-term engagement to join our client's Analytics team and support the development of an end-to-end analytics platform.
The role involves close collaboration with the Data Engineering and Data Product teams, building efficient data models, and creating clear and valuable data visualisations.
The ideal candidate is at a regular/mid level, with strong hands-on experience in dbt and Snowflake.
The client operates in the AdTech domain, offering an advanced marketing platform that supports campaign automation and performance analytics. The company focuses on building scalable data solutions and tools that provide quick access to key business metrics and insights. Their technology stack is modern, cloud-based, and built on best practices in data engineering and analytics.
Responsibilities:
Designing and building efficient analytics pipelines that bridge Data Engineering and Data Product.
Co-creating the end-to-end analytics strategy, from raw data to reports and dashboards.
Developing and maintaining dbt data models using software engineering best practices.
Defining key metrics and designing clear, effective data visualisations.
Optimising the data warehouse for performance, cost, and complexity.
Ensuring data consistency (a “single source of truth”) and supporting a self-service analytics culture.
Monitoring costs and diagnosing potential performance issues.
Technical Requirements – Must Have:
Experience with dbt (dbt Cloud or dbt Core).
Experience with cloud data warehouses: Snowflake (BigQuery/ClickHouse considered a plus).
Very good knowledge of SQL.
Experience building robust, scalable data models (Kimball, OBT, Data Vault).
Understanding of data warehouse principles and ETL/ELT processes.
Experience with data visualisation tools (e.g., Power BI, Tableau, Looker, QuickSight, Superset, ThoughtSpot).
Ability to work with raw data and transform it into business insights.
Strong attention to detail and code quality.
English at minimum B2+ level.
Nice to Have:
Python and command-line experience.
Experience with ThoughtSpot.
Familiarity with GitHub Actions / CI/CD.
Degree in a quantitative or technical field.
Experience in Programmatic / AdTech.
Technologies Used in the Project:
AWS S3 & Athena – raw data storage and querying
Snowflake – data warehouse
dbt Cloud / Airflow – orchestration
dbt – transformations and testing
ThoughtSpot – data visualisation
GitHub – version control
Python – integrations and working with API endpoints
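
To give a flavour of the day-to-day work with this stack, here is a minimal sketch of a dbt model of the kind the role involves. All table, column, and model names below are hypothetical illustrations, not taken from the client's project:

```sql
-- models/marts/fct_campaign_performance.sql
-- Hypothetical example: aggregates staged ad events into daily campaign metrics.
select
    campaign_id,
    event_date,
    count(*)                                    as impressions,
    count(case when clicked then 1 end)         as clicks,
    sum(spend)                                  as total_spend
from {{ ref('stg_ad_events') }}
group by campaign_id, event_date
```

In a real project, such a model would be accompanied by schema tests (e.g. `not_null` and `unique` checks in a `schema.yml`) and documented in line with the software engineering best practices mentioned above.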
Offer:
Collaboration model: 100% remote
Rate: 110 PLN/h net + VAT
B2B contract via SHIMI
Engagement: long term
Start: April/May
Working hours: Standard 8:00–16:00 (occasional later calls possible — flexibility appreciated)