Work model: remote, with occasional trips to Gdansk or Denmark (once every 3 months)
Contract: B2B, long-term cooperation (12+ months)
Summary:
The Senior Platform Engineer will use cloud technologies and data processing tools to build, manage, and optimise data pipelines, with the primary goal of improving data accessibility and processing efficiency across the organisation.
Responsibilities:
Design and implement scalable data pipelines using cloud services and technologies.
Work with Apache Spark and Databricks for effective data processing.
Utilise object-oriented programming languages, primarily Python and C#, for development tasks.
Collaborate in an agile environment, employing DevOps practices and CI/CD methodologies.
Manage code using Git, implementing effective branching strategies.
Develop infrastructure as code (IaC) using Terraform.
Write SQL queries and work with both relational and NoSQL databases.
Apply data modelling techniques and use data transformation technologies.
Key Requirements:
Proficiency in cloud services, preferably Azure.
Experience with Apache Spark and Databricks.
Strong knowledge of object-oriented programming, preferably in Python or C#.
Hands-on experience with DevOps, agile methodologies, and CI/CD processes.
Experience with Git and branching strategies.
Familiarity with Infrastructure as Code (IaC), preferably Terraform.
Skilled in writing SQL and handling relational and NoSQL databases.
Understanding of data modelling techniques and transformation technologies.
Nice to Have:
Experience with streaming data pipelines.
Knowledge of messaging queues.
Exposure to serverless components.
Experience developing and consuming RESTful API services.