Microsoft Fabric Engineer
We are looking for a Microsoft Fabric Engineer who understands that a data platform is only as good as the pipelines, governance, and architecture behind it — someone who has actually built and maintained Fabric-based solutions in production, not just completed a Microsoft Learn tutorial. You will work on international projects for clients in banking, insurance, manufacturing, and telco (Netherlands, UK, Germany, US), designing the data infrastructure that turns raw data into reliable business intelligence at scale.
About Devapo
At Devapo, we focus on continuous self-development and acquiring new knowledge. If you are a fast learner, want to participate in international projects, are a team player, and can work independently - join us!
We provide our clients with more than just code - we want to equip them with tools that allow their businesses to flourish. Our clients' success is our success, which is why we make sure everyone at Devapo keeps a long-term goal in mind. At Devapo, you'll have the opportunity to discuss your challenges and solutions with our team of experts, including experienced architects who are always ready to share their knowledge and guide you through complex technical decisions.
Key Responsibilities
● Designing and implementing end-to-end data platforms in Microsoft Fabric — Lakehouse, Warehouse, OneLake, and Power BI
● Building and maintaining PySpark/Delta pipelines across medallion architecture layers (bronze, silver, gold)
● Developing semantic models for Power BI using Direct Lake mode for high-performance analytics
● Implementing Git-based CI/CD with Azure DevOps and Fabric deployment pipelines
● Optimizing Fabric capacity usage - monitoring compute consumption, tuning workloads, and keeping costs under control
● Collaborating with client business and technical teams to translate requirements into scalable data solutions
● (Senior) Mentoring junior engineers, leading architecture workshops, and supporting pre-sales engagements
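To give a flavor of the medallion-architecture work above: the flow from raw bronze records through cleansed silver to an aggregated gold view can be sketched as below. This is a deliberately toy, framework-free illustration - the real pipelines in this role would use PySpark DataFrames and Delta tables in Fabric, and all record fields and function names here are hypothetical.

```python
# Toy sketch of medallion-layer refinement (bronze -> silver -> gold).
# Plain Python stands in for PySpark/Delta to keep the example self-contained.

# Bronze: raw ingested records, stored as-is, malformed rows included.
bronze = [
    {"order_id": "1", "amount": "100.0", "country": "NL"},
    {"order_id": "2", "amount": "not-a-number", "country": "UK"},
    {"order_id": "3", "amount": "250.5", "country": "NL"},
]

def to_silver(rows):
    """Silver: cleanse and type-cast; drop rows whose amount cannot be parsed."""
    silver = []
    for row in rows:
        try:
            silver.append({**row, "amount": float(row["amount"])})
        except ValueError:
            continue  # in a real pipeline, quarantine malformed records instead
    return silver

def to_gold(rows):
    """Gold: aggregate to a business-ready view - revenue per country."""
    revenue = {}
    for row in rows:
        revenue[row["country"]] = revenue.get(row["country"], 0.0) + row["amount"]
    return revenue

gold = to_gold(to_silver(bronze))
print(gold)  # {'NL': 350.5}
```

Each layer only reads from the one before it, so bad data is caught at the silver boundary and downstream consumers (e.g. Power BI semantic models) see only the curated gold view.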
Requirements
● Min. 4 years of data engineering experience
● Hands-on experience with Microsoft Fabric: Lakehouse, Warehouse, OneLake, Data Pipelines, Notebooks
● Advanced SQL / T-SQL
● Python with PySpark
● Solid understanding of medallion architecture (bronze / silver / gold)
● Experience with Azure Data Factory / Fabric Pipelines for data ingestion and orchestration
● Power BI fundamentals — semantic models, DAX, Direct Lake mode
● Git + CI/CD with Azure DevOps or GitHub Actions applied to data workflows
● English B2+ — client-facing role, calls and written communication included
Nice to Have
● Migrating legacy ETL workloads from Azure Synapse, Azure Data Factory, Power BI Premium, or on-prem SQL into Fabric and OneLake
● DP-700 (Fabric Data Engineer Associate) and/or DP-600 (Fabric Analytics Engineer Associate) certification
● Databricks (Spark, Unity Catalog, Lakeflow) or Snowflake — cross-platform data engineering
● Real-Time Intelligence experience: Eventhouse, Eventstream, KQL, Data Activator
What We Offer
● Funded certifications and training
● Private medical care (Medicover)
● Multisport card
● English language classes
● Flexible working hours
● Team meetups and integration events
● Referral bonus
