Work globally, across domains and technologies, to support fast, small-scale analytical prototypes as well as to drive the scaling and industrialization of the respective products
Conceptualize, develop and deploy analytical solutions and products end-to-end, in both on-premises and cloud environments
Translate non-technical and machine-learning-specific requirements into design specifications & reference architectures for data pipelines, CI/CD pipelines and hosting infrastructure
Advise analytical use-case teams on software engineering considerations to enable a smooth scale-up of prototypes into production
Draft and implement operations & lifecycle management concepts for running & decommissioning analytical products and the underlying infrastructure
Collaborate with product line leads and data scientists to manage the entire lifecycle of analytical products
What we expect from you:
Minimum of a master’s degree in Computer Science, Mathematics, Statistics, Physics or a comparable field
Expert knowledge of cloud engineering, preferably AWS and Azure
Advanced knowledge of data & machine learning pipelines and infrastructure, front-end applications and the relevant overarching software design patterns
Advanced competencies in DevOps principles and tooling (e.g. CI/CD, Docker, Kubernetes, Infrastructure as Code, event-driven architecture, ELK Stack)
Advanced knowledge of relational (SQL) databases and data warehouses
Experience with NoSQL databases (e.g. MongoDB) or big data infrastructure (e.g. Hadoop, Spark) is beneficial
Good knowledge of the following fields: Data Science, Statistics, Machine & Deep Learning, Data Visualization
Strong programming skills in Python, JavaScript or comparable languages
Strong conceptual, quantitative and problem-solving skills