Senior DevOps Engineer
Dynatrace
Gdańsk
Type of work: Undetermined
Experience: Senior
Employment type: Permanent
Operating mode: Office
Company profile

Dynatrace exists to make the world’s software work perfectly. Our unified platform combines broad and deep observability and continuous runtime application security with the most advanced AIOps to provide answers and intelligent automation from data at an enormous scale. This enables innovators to modernize and automate cloud operations, deliver software faster and more securely, and ensure flawless digital experiences. That’s why the world’s largest organizations trust the Dynatrace® platform to accelerate digital transformation.

Tech stack

    DevOps: advanced
    SRE: advanced

Job description

Online interview
Data = information = knowledge = power. Do you want to hold the keys to that power? Are you motivated by solving challenging problems, where creativity is as crucial as your ability to write code, deliver solutions, and bring valuable data sets together to answer business questions?
If this sounds like an environment where you will thrive, come and join our Data Engineering team. Interested? Because we certainly are!

At Dynatrace we are all about automation, self-healing, and a NoOps approach. We preach automation wherever possible, and we live by what we preach.

The Data Engineering team, for which we are hiring, provides the data that drives the world-class Application Intelligence platform that is Dynatrace.
As a DevOps Engineer in Data Engineering, you will help us automate our Data Platform, both by providing the necessary tooling and by designing processes.

It is quite a unique situation: Dynatrace delivers one of the best DevOps tools on the market, and in this role you would put your experience to work driving that very product, dogfooding it whenever possible and building a tool for other DevOps engineers as well.
You will build tools to automate installation at scale, accelerating time-to-value and enhancing the reliability of the Data Platform. That includes scripts, but we may also need to integrate with existing mechanisms via APIs or provide means to reconfigure an already deployed product. You will have an impact on how we shape our ETL pipeline and will make sure that deployments of new pipeline builds are automatic, predictable, and transparent. All of this works towards eliminating data downtime and building trust in the data that your fellow Dynatracers, at all levels of seniority, will use to build the product that our customers love.
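As a rough illustration (the posting does not specify the concrete tooling), reconfiguring an already deployed product often comes down to a reconciliation step: compare the desired configuration with what is actually deployed and apply only the difference. A minimal, hypothetical Python sketch, where the `apply` callback stands in for a call to the product's configuration API:

```python
def diff_config(desired: dict, actual: dict) -> dict:
    """Return the settings whose values must change to reach `desired`.

    Keys present in `actual` but absent from `desired` are left alone,
    mirroring a non-destructive reconfiguration of a deployed product.
    """
    return {k: v for k, v in desired.items() if actual.get(k) != v}


def reconcile(desired: dict, actual: dict, apply) -> dict:
    """Apply only the changed settings via the provided `apply` callback."""
    changes = diff_config(desired, actual)
    for key, value in changes.items():
        apply(key, value)  # hypothetical: e.g. a PUT to a config endpoint
    return changes
```

Because only the diff is applied, re-running the step against an already up-to-date deployment is a no-op, which is what makes this kind of automation predictable and safe to repeat.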

This is an exciting opportunity to make a direct, tangible impact on our product and work on our crucial Digital Business Platform.
As a member of the Data Engineering team, you will be at the center of Dynatrace product innovation.

In a company as agile as Dynatrace, you are always encouraged to explore new areas that interest you, move to new positions, and build a career with Dynatrace.
We guarantee plenty of challenges and scope to grow. 

Main responsibilities
  • Creating deployment integrations for cloud platforms, primarily AWS and Azure
  • Automating deployments with Jenkins and Terraform
  • Designing and automating processes for ETL data pipeline(s)
  • Proactively ensuring the continuous and smooth execution of data-related processes
  • Collaborating in international cross-lab teams (mostly in the same time zone, across Europe) on the delivery of current objectives
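In practice, the Jenkins-and-Terraform responsibility above usually means wrapping `terraform plan`/`apply` in a scripted, repeatable pipeline step. A hedged sketch (the workspace layout and var-file names are illustrative, not from this posting):

```python
import shlex
import subprocess


def terraform_cmd(action: str, workspace: str, var_file: str = "",
                  auto_approve: bool = False) -> list[str]:
    """Build a terraform command line for a CI step (illustrative)."""
    cmd = ["terraform", f"-chdir={workspace}", action]
    if var_file:
        cmd.append(f"-var-file={var_file}")
    if auto_approve and action == "apply":
        cmd.append("-auto-approve")  # required for non-interactive CI runs
    return cmd


def run(cmd: list[str]) -> int:
    """Execute the command; in a Jenkins pipeline this would be an `sh` step."""
    print("+", shlex.join(cmd))
    return subprocess.call(cmd)
```

Keeping command construction separate from execution makes the step easy to unit-test and to log verbatim in the Jenkins console, which supports the "automatic, predictable and transparent" deployment goal described above.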

Qualifications
Priority skills & experience:
  • 3+ years of professional experience with process automation, preferably as a DevOps engineer, SRE, or sysadmin
  • 3+ years of experience working with cloud solutions, preferably AWS, on configuration, deployment management, and automation
  • Experience with deployment automation and CI/CD pipelines, preferably using Jenkins
  • Good English communication skills

Desired skills & experience 
  • Experience with Cloud databases, preferably Snowflake 
  • Experience with DB services administration (PostgreSQL, AWS RDS, Aurora, Snowflake) and practical knowledge of SQL
  • Practical knowledge of IaC tools such as CloudFormation and Terraform
  • Mindset focused on monitoring and observability

Nice-to-haves 
  • Experience with CI/CD support for Microsoft Power BI development
  • Experience with data pipeline (ETL/ELT) automation, as well as with supporting Data Engineering and Data Science teams
  • Experience with multiple cloud platforms (AWS, GCP, Azure)
  • Good command of scripting language(s): Python, Shell script, PowerShell.
  • Practical knowledge of IaC and configuration-management tools such as Ansible, Chef, Puppet, PowerShell DSC, SaltStack, CloudFormation, and Terraform
  • Java literacy, experience with other programming languages
  • Familiarity with Docker and Kubernetes.

Additional Information
Please note when submitting your CV that, due to the current COVID-19 health crisis, our Lab in Poland is currently limited in extending offers to non-EU citizens. We are keeping the situation under review and will adjust our position should the restrictive measures be lifted. If this affects your application, we are happy to keep it on file until further notice.