About the team & project
We’re looking for a Data Engineer to join a newly formed team focused on building and maintaining data solutions that support financial and operational reporting. The work is fully in-house – we’re not delivering for an external client – and directly supports key business areas.
The role requires high attention to detail and a strong sense of responsibility. The data you’ll work with is used for financial reporting, so accuracy is critical. You’ll collaborate closely with business stakeholders, analysts, and other engineers, working with tools like SQL, Python, DBT, Airflow, and Snowflake.
What your day will look like
- Investigate and understand data requirements from business users
- Build and maintain data pipelines (ETL) in SQL and Python
- Create and improve data models in Snowflake
- Work with Airflow and DBT for pipeline orchestration and transformations
- Prepare clean datasets for reporting (Power BI, dashboards, analysis)
- Ensure that the data delivered is accurate, consistent, and aligned with business logic
Tech stack
- SQL (advanced)
- Python (for scripting and data processing)
- Airflow, DBT
- Snowflake
- Power BI (used by analysts)
What we’re looking for
- 3+ years of hands-on experience as a Data Engineer or in a similar role
- Strong command of SQL (query building, optimization)
- Experience with Python in data workflows
- Good understanding of data modeling principles
- Ability to work with stakeholders and understand business needs
- Very good English skills (B2+/C1) – daily communication in an international team
Nice to have:
- Experience with Snowflake
- Familiarity with DBT
- Background in financial data or accounting
How we work
- International, in-house team
- Clear and structured scope of work
- Hybrid model
- Daily stand-ups (remote), async communication
- We focus on quality and maintainability of data pipelines