Data Cloud Engineer
Transition Technologies MS
Warszawa
Type of work
Full-time
Experience
Senior
Employment Type
Permanent, B2B
Operating mode
Remote
We are a rapidly growing IT company with global reach. We specialise in IT outsourcing and implementation projects delivered in flexible cooperation models, providing access to competencies and experts in technologies ranging from mainstream to cloud. TTMS's greatest strength is its skilled professionals, so people are at the heart of our organisational culture.

Tech stack

    Python or R
    master
    Airflow/Glue/Dataflow
    master
    SQL
    master
    Git/Gitflow/Docker
    master
    Bamboo/Jenkins/Terraform
    master
    JSON/XML/YAML
    master
    ETL/ELT
    advanced

Job description

We offer:

  • Participation in interesting and demanding projects
  • Flexible working hours
  • A great, non-corporate atmosphere
  • Stable employment conditions (contract of employment or B2B contract)
  • Opportunity to develop your career
  • Benefits package
  • Remote work


Your tasks:

  • Lead the design, development, and maintenance of data pipelines using Python or R
  • Utilize SQL for data processing and analysis, ensuring data accuracy and reliability
  • Optimize and manage data pipelines for enhanced efficiency and performance
  • Apply data architecture concepts, including data modeling, metadata management, and ETL/ELT, leading the team in these areas
  • Implement cloud technologies, focusing on data pipelines, and lead the team in utilizing tools like Airflow, Glue, and other cloud solutions
  • Mentor and guide the team in conducting performance analysis, troubleshooting, and remediation (optional)


We are looking for you if you have:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field
  • Minimum of 4 years of experience working with programming languages focused on data pipelines, such as Python or R
  • Minimum of 4 years of experience working with SQL for data processing and analysis
  • Minimum of 3 years of experience in data pipeline maintenance
  • Minimum of 3 years of experience in data architecture concepts, including data modeling, metadata management, workflow management, ETL/ELT, real-time streaming, data quality, and distributed systems
  • Minimum of 3 years of experience in cloud technologies, emphasizing data pipelines using tools like Airflow, Glue, Dataflow, and other cloud solutions
  • Minimum of 1 year of experience in Java and/or Scala
  • Very good knowledge of data serialization languages such as JSON, XML, YAML
  • Excellent knowledge of Git, Gitflow, and DevOps tools (Docker, Bamboo, Jenkins, Terraform)
  • Capability to conduct performance analysis, troubleshooting, and remediation (optional)
  • Excellent knowledge of Unix
  • Familiarity with pharma data formats, especially SDTM, is a big plus


We reserve the right to contact only selected candidates.