Many development opportunities and access to modern technology
Medicover
Multisport Card
Life insurance and many more
Responsibilities:
Massive data: You will source and examine data and engineer data pipelines for gigabytes to terabytes of structured and unstructured data on our platform to create value for customers
Production deployment: You will be responsible for integrating and deploying machine learning pipelines into production, where your ideas will come to life
Linux hacking: You will use the command line masterfully, including editors like vi/emacs and a beyond-the-basics understanding of grep, bash, awk, sed, etc., to dive deep into data, systems, and compute platforms and get the results you are seeking
Pushing the limits: This role sits on the cutting edge of our Data / Machine Learning platform. As we push to solve more of our ML/AI challenges, you will prototype new features, tools, and ideas, innovating at a fast pace to maintain our competitive edge
Standardization: You will be fully involved in optimizing various processes and standardizing approaches to solving common use cases and problems
Collaboration: Coordinate and work with cross-functional teams, sometimes located in different geographic locations
Required:
CS fundamentals: B.S. (M.S./Ph.D. desired) in Computer Science or a related degree, and a strong ethos of continuous learning
Software engineering: 5+ years of professional software development experience using Python and SQL, with version control (Git) for production work and good analytical and debugging skills. Experience with ML packages such as pandas, numpy, scikit-learn, and pyspark
Machine Learning: 2+ years of experience with machine learning modeling, creating prototypes and evolving them into full-fledged products served in production environments. Understanding of core concepts such as classification, regression, time series, NLP, anomaly detection, clustering, and other common problem types
Environments: You have worked in at least one cloud environment such as GCP (preferred), AWS, or another cloud data platform. Understanding of the cloud components used to build ML solutions (Seldon, Kubeflow, Vertex AI, or similar)
Data Modeling: Experience and expertise in SQL is a must. A flair for data, schemas, and data models; knowledge of how to design data models so data can be queried efficiently for analysis; and the ability to understand and develop data validation techniques
Project management: You demonstrate excellent project and time management skills and have exposure to Scrum or other Agile practices in Jira