Industry: Pharmaceutical
Start: ASAP (flexible).
Rate: depending on experience.
Contract: B2B, 6 months + extensions.
Remote: up to 100%.
Location: remote / Warsaw.
Project language: English.
Recruitment process: 2 interviews.
Summary: We are looking for a Platform Engineer to join as an essential member of the KaaS (Kafka as a Service) team, focused on enhancing operations and delivering new features that improve the platform's offering.
Main Responsibilities:
- Develop and maintain the KaaS platform, ensuring high availability, performance, and scalability.
- Design and implement CI/CD pipelines using Azure DevOps (or similar tools) and YAML to automate build, test, and deployment processes.
- Utilize Terraform for infrastructure as code, structuring and optimizing solutions while troubleshooting potential issues.
- Apply platform DevOps principles to enhance the operational efficiency of the KaaS environment, focusing on system reliability and performance metrics.
- Work with AWS services, including S3, DynamoDB, and IAM, to ensure seamless integrations and optimal resource configurations.
- Develop robust and efficient code in languages such as Go or Python, applying software development best practices.
- Leverage containerization technologies such as Docker and orchestration via Kubernetes to improve deployment and scalability of KaaS services.
- Utilize Kafka technologies, including architecture design and implementation, to manage streams of data effectively and build real-time data pipelines.
- Share knowledge and promote best practices within the team and across the organization through excellent communication and collaboration.
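By way of illustration, the key-to-partition mapping at the heart of the Kafka architecture work above can be sketched in a few lines. This is a minimal, hypothetical sketch only: Kafka's actual default partitioner uses murmur2 hashing, and a stdlib MD5 hash stands in for it here so the example is self-contained.

```python
import hashlib

def assign_partition(key: bytes, num_partitions: int) -> int:
    """Map a record key to a partition, mimicking Kafka's default
    partitioner (hash the key, take it modulo the partition count).
    Real Kafka uses murmur2; MD5 stands in here for illustration."""
    digest = hashlib.md5(key).digest()
    # Interpret the first 4 bytes as a non-negative integer.
    h = int.from_bytes(digest[:4], "big") & 0x7FFFFFFF
    return h % num_partitions

# Records with the same key always land in the same partition.
p1 = assign_partition(b"order-123", 6)
p2 = assign_partition(b"order-123", 6)
assert p1 == p2
```

Keeping every record for a given key in one partition is what preserves per-key ordering, the property real-time data pipelines on Kafka rely on.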
Key Requirements:
- Experience in software development and DevOps areas.
- Proficiency in a general-purpose programming language such as Go or Python.
- Proficient in Terraform with a solid understanding of building, structuring, and optimizing solutions.
- Strong grasp of platform DevOps principles and practices.
- Experience building DevOps pipelines.
- Familiarity with Kafka concepts, including topics, partitions, schemas, and overall architecture.
- Understanding of AWS fundamentals, including integrations with services such as S3, DynamoDB, and IAM.
- Knowledge of containerization using Docker and orchestration with Kubernetes.
- Proficiency in source code control using Git.
- Experience working in Agile development teams with strong collaboration and communication skills.
Nice to Have:
- Experience with additional programming languages.
- Knowledge of advanced Kafka functionalities.
- Familiarity with advanced AWS services.
Other Details:
- Team Structure - Agile development environment.
- Tools/Technologies - Kafka, Azure DevOps, YAML, AWS (S3, DynamoDB, IAM), Go/Python, Terraform, Docker, Kubernetes, Git.
WE OFFER:
- Challenging international projects in an international business culture
- Transparently built relations based on trust and fair play
- Medicover and Multisport cards on preferential terms