This company is a leading provider of innovative 3D printing solutions, offering a wide range of software and services to industries such as healthcare, automotive, and aerospace. With a strong focus on healthcare, they develop advanced 3D printed medical devices and surgical guides, significantly enhancing surgical precision and patient outcomes.
- Play a key role in developing and maintaining the Data Lake and data cubes: data ingestion, ETL and deployment pipelines, and dashboarding, using multi-dimensional and tabular models built from various data sources.
- Assist data scientists with data preparation and deployment.
- Contribute to Data Lake applications and facilitate the implementation of new AI applications.
- Provide maintenance and support for production environments, promptly addressing urgent requests.
- Aim to deliver a stable, production-ready system.
- Collaborate on creating reusable Infrastructure as Code (IaC) modules for other data lakes.
- Design, develop, and maintain SSRS reports and Power BI, Grafana, and Amazon QuickSight dashboards.
- Design, develop, and support ETL processes (SSIS) and data warehouses, including cloud-based solutions.
- Perform data modeling, including the design and development of logical and physical layers and ETL processes, ensuring scalability and extensibility (on-premises and in the cloud).
- Conduct technical analysis of existing solutions and participate in technical design.
- Prepare and support technical documentation.
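The ETL work described above can be sketched in miniature. The following is a minimal, illustrative example only (table and field names such as `scans` and `patient_id` are hypothetical, not taken from the company's actual pipelines), using SQLite as a stand-in staging store for an extract-transform-load step:

```python
import sqlite3

def extract(source_rows):
    """Extract: pull raw records from a source system (here, an in-memory list)."""
    return list(source_rows)

def transform(rows):
    """Transform: drop invalid records and normalize units (mm -> cm)."""
    cleaned = []
    for row in rows:
        if row.get("patient_id") is None or row.get("scan_mm") is None:
            continue  # skip incomplete records
        cleaned.append((int(row["patient_id"]), float(row["scan_mm"]) / 10.0))
    return cleaned

def load(con, rows):
    """Load: write cleaned records into a staging table."""
    con.execute("CREATE TABLE IF NOT EXISTS scans (patient_id INTEGER, scan_cm REAL)")
    con.executemany("INSERT INTO scans VALUES (?, ?)", rows)
    con.commit()

# Hypothetical source data, including one invalid record.
source = [
    {"patient_id": 1, "scan_mm": 125},
    {"patient_id": None, "scan_mm": 80},  # invalid: no patient id
    {"patient_id": 2, "scan_mm": 200},
]
con = sqlite3.connect(":memory:")
load(con, transform(extract(source)))
count = con.execute("SELECT COUNT(*) FROM scans").fetchone()[0]
```

In a production pipeline each stage would be a separate, monitored job (e.g. in AWS Glue or SSIS), but the extract/transform/load separation shown here is the same.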
- At least a bachelor's degree in computer science, mathematics, engineering, information technology, or a related technical field.
- Over 5 years of experience with business intelligence solutions and SQL programming (SQL Server 2016/2019/…), plus a solid understanding of SQL code optimization principles.
- Proficiency in Spark, Python, and SQL (T-SQL, SSIS, SSAS, SSRS).
- Practical experience with BI tools such as Power BI (including DAX), Grafana, and Amazon QuickSight.
- Familiarity with Visual Studio and SQL Server Management Studio (SSMS).
- Strong experience with ETL, orchestration, data pipeline, IoT, and data streaming tools.
- Extensive experience with cloud technology, particularly AWS (Glue, S3, Athena, Lambda).
- Experience with Infrastructure as Code (Terraform).
- Experience with Azure DevOps, Azure BI platform, and Git workflows is a plus.
- A solid technical understanding of data architectures and data warehousing principles.
- English: professional working proficiency
- Experience with Databricks is nice to have.
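The SQL optimization principles mentioned above can be illustrated with a small sketch (SQLite standing in for SQL Server; the `orders` table and index name are hypothetical): adding an index on a filtered column turns a full table scan into an index search, which the query planner confirms.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
con.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(10_000)],
)

def plan(query):
    # EXPLAIN QUERY PLAN reports how SQLite will execute the query.
    rows = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()
    return " ".join(r[-1] for r in rows)

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)  # full table scan: no usable index yet

con.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)   # now an index search via idx_orders_customer
```

The same habit applies in T-SQL: check the execution plan before and after an index or query rewrite rather than assuming the optimizer will cope.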
- Inspiring and challenging job with growth potential
- Private medical care and life insurance
- Great career growth opportunities
- Working at a company that supports innovation for people