Fiege is seeking an experienced Senior Data Engineer to join our cross-functional, agile Logistics Data Product Team. In this role, you will play a key part in building and scaling modern data architectures on Microsoft Azure, with a focus on Databricks. You will help shape the foundation for our enterprise-wide data and analytics strategy by implementing scalable data pipelines and enabling advanced data-driven solutions.
About the Role
This is an opportunity for a seasoned data engineer to work on advanced cloud-based data solutions, contribute to critical business transformation, and take on a leadership role in our growing data organization.
Responsibilities
- Design, develop, and deploy efficient, scalable data pipelines on Microsoft Azure using Azure Databricks, Azure Data Factory, Azure Data Explorer, Microsoft Fabric, and Azure Synapse Analytics.
- Manage large-scale data processing within our Azure-based Data Lakehouse to support both real-time and historical analytics.
- Implement and maintain a medallion architecture (bronze, silver, gold layers) for structured data management.
- Establish data governance practices, including data quality checks, auditing, and compliance frameworks.
- Collaborate closely with Data Analysts, Data Scientists, Developers, Product Owners, and internal/external stakeholders to translate business requirements into technical solutions.
- Continuously optimize data pipeline performance, scalability, and reliability.
- Define and promote best practices for Databricks cluster management, job scheduling, and cost efficiency.
- Lead code reviews, mentor junior team members, and deliver internal training on data engineering practices and tools.
- Provide expert advice across departments to ensure alignment with enterprise data strategy.
Profile
- Bachelor’s or Master’s degree in Computer Science, Statistics, Physics, Informatics, or a related field.
- At least 7 years of professional experience in backend or data engineering roles, with deep knowledge of Azure services and modern cloud-based data ecosystems.
- Strong programming skills in Python and SQL.
- Experience with business intelligence tools and data visualization, including 3+ years working with Power BI and semantic modeling.
- Proficiency with big data technologies (e.g., Spark, Kafka) and databases (e.g., SQL Server, Cosmos DB, MongoDB).
- Familiarity with version control (e.g., Git), CI/CD practices, infrastructure as code (e.g., Terraform), and container and orchestration technologies such as Docker and Kubernetes.
- Practical experience building ML workflows or analytics pipelines in Databricks.
- Strong analytical and conceptual thinking with excellent communication skills, including the ability to explain complex technical concepts to non-technical stakeholders.
- Demonstrated leadership skills and a proactive approach to problem-solving.
- Experience working in agile teams; knowledge of the logistics domain is a plus.
- Fluent in English, both written and spoken; German language skills are a plus.
- Willingness to travel occasionally.
What We Offer
- Long-term collaboration
- Benefits package (private medical care, sports card co-financing, group life insurance, etc.)
- Flexible working environment with options for remote work
- Opportunities for professional development, training, and certifications
- A collaborative, innovative team culture that emphasizes growth, ownership, and creativity