Data Engineer

Type of work: Full-time
Experience: Mid
Employment Type: Permanent, B2B
Operating mode: Remote

Transition Technologies MS

We are a rapidly growing IT company with global reach. We deliver IT outsourcing and implementation projects in flexible cooperation models, providing access to expertise and specialists in technologies ranging from mainstream to cloud. TTMS' greatest strength is its skilled professionals, so people are at the heart of our organisational culture.

Tech stack

  • English: B2

  • Pandas: advanced

  • Python: advanced

  • SQL: advanced

  • ETL: advanced

  • CI/CD: advanced

  • Snowflake: advanced

Job description

Your responsibilities:

  • Technical Skills: Proficiency in SQL, Python, and Pandas for data manipulation, along with experience in dbt for transformations and GitLab for version control and CI/CD.

  • Data Engineering Expertise: Strong understanding of data modelling, ETL processes, and data warehousing concepts.

  • CI/CD Knowledge: Ability to set up and manage automated data pipelines using GitLab CI/CD.

  • Cloud & Infrastructure: Experience with Snowflake's architecture, including warehouses, schemas, and security.

  • Python & Pandas: Ability to write efficient Python scripts for data processing, leveraging Pandas for data wrangling and analysis (see the sketch after this list).

  • Collaboration & Documentation: Strong communication skills to work with analysts and engineers, plus experience documenting data workflows.
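
As a rough illustration of the kind of Pandas-based wrangling and aggregation described above, the sketch below reads a hypothetical CSV of raw events, cleans it and builds a monthly summary; the file name, columns and grouping are invented for the example and are not part of this role's actual codebase:

    import pandas as pd

    # Hypothetical input: a CSV export of raw sales events (file and columns are illustrative).
    raw = pd.read_csv("sales_events.csv", parse_dates=["event_date"])

    # Basic cleaning: drop rows missing key fields and normalise a text column.
    clean = (
        raw.dropna(subset=["customer_id", "amount"])
           .assign(region=lambda df: df["region"].str.strip().str.upper())
    )

    # Aggregate to a monthly summary per region, the sort of table a downstream
    # dbt model or Snowflake load might consume.
    monthly = (
        clean.groupby([pd.Grouper(key="event_date", freq="MS"), "region"])["amount"]
             .sum()
             .reset_index()
    )

    monthly.to_csv("monthly_sales_summary.csv", index=False)

In practice a transformation like this would typically sit behind a dbt model and run automatically from a GitLab CI/CD pipeline rather than as an ad hoc script.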


We are looking for you if you have:

  • 2+ years of experience working with a programming language focused on data pipelines, e.g. Python or R

  • 1+ years of experience working with a cloud platform such as GCP, AWS or Azure (optional)

  • 1+ years of experience in data pipeline maintenance

  • 1+ years of experience with different types of storage (filesystem, relational, MPP, NoSQL) and working with various kinds of data (structured, unstructured, metrics, logs, etc.)

  • 1+ years of experience working with data architecture concepts (in any of the following areas: data modelling, metadata management, workflow management, ETL/ELT, real-time streaming, data quality, distributed systems)

  • 2+ years of experience working with SQL

  • Exposure to open source and proprietary cloud data pipeline tools such as Airflow, Glue and Dataflow (optional)

  • Very good knowledge of relational databases (optional)

  • Very good knowledge of Git, Gitflow and DevOps tools (e.g. Docker, Bamboo, Jenkins, Terraform)

  • Very good knowledge of Unix

  • Good knowledge of Java and/or Scala

  • Knowledge of pharma data formats (SDTM) is a big plus


We offer:

  • Interesting and challenging projects

  • Flexible working hours

  • Friendly, non-corporate atmosphere

  • Stable working conditions (contract of employment or B2B)

  • Opportunities for self-development and promotion within the company

  • Rich benefits package

  • Possibility to work remotely



We reserve the right to contact only selected candidates.

Undisclosed Salary
