Data Engineer with DevOps experience

Wybrzeże Wyspiańskiego, Wrocław

PeakData

Full-time · B2B · Mid · Hybrid

34.58–40.34 USD net per hour (B2B)

Job description

Data Engineer with DevOps Experience

📍 Remote · Wrocław

💼 B2B: 30–35 €/h (VAT handled on the Swiss side)

⏳ Long-term collaboration · PeakData – AI & Pharma Intelligence

About PeakData

At PeakData, we don’t just process data — we help shape faster, more effective treatments for patients worldwide.

We’re building a modern data platform powering real-world AI/LLM medical insights for global pharma leaders.

If you want to build things that matter — this role is for you.

The Role

We’re looking for a Data Engineer with DevOps experience to help us build, scale and automate our internal data platform that powers global medical & pharmaceutical insights.

This role connects two worlds:

🔹 Data Engineering – pipelines, integration, optimization

🔹 DevOps – CI/CD, observability, infrastructure automation

We're looking for someone who has:

  • Curiosity and willingness to experiment with new tools and technologies

  • Growth mindset, especially in areas like automation, CI/CD and cloud infrastructure

  • Analytical thinking – you like to understand how systems behave and why certain solutions work better than others

  • Ownership and independence – you can organize your work, make technical decisions and deliver results without constant supervision

  • Team player attitude – you collaborate easily, share knowledge and support others in finding the best solutions

  • Clarity in communication – you can explain technical concepts and architectural choices in a clear, structured, and simple way

Must-have

  • Strong Python skills (FastAPI, Flask, asyncio)

  • Minimum 3 years of experience with AWS, including S3, Lambda, IAM, monitoring and Bedrock

  • Proven experience in building and maintaining ETL/ELT pipelines

  • Practical knowledge of Docker and Kubernetes for deployment and scaling

  • Experience with Terraform (IaC) and infrastructure automation

  • Ability to write automated tests and maintain clear technical documentation

Nice to have

  • Experience with CI/CD pipelines (GitHub Actions, GitLab)

  • Experience using LLM-based tools (e.g. GitHub Copilot, Cursor) to support coding, automation or data workflows

  • Familiarity with GCP (BigQuery, Vertex AI, Gemini LLM – focused on data processing)

  • Knowledge of cloud cost optimization, monitoring, and alerting tools (CloudWatch, Prometheus, Grafana)

  • Experience in automating data analysis workflows or building ML-assisted processes

Why PeakData

Here’s what our engineers say keeps them here:

✔ We work with modern technologies (LLMs, Bedrock, Kubernetes, Terraform)

✔ We build things that matter — your work supports real medical decisions

✔ We have autonomy — real engineering ownership, not a ticket factory

✔ Ideas are welcome — you can propose, design and deliver improvements

✔ The atmosphere is genuinely great — supportive, open team

What we offer

  • B2B contract: 30–35 €/h (VAT handled on the Swiss side)

  • Remote, but you're very welcome to join us in our Wrocław office whenever you'd like 😊

  • Stable environment backed by global pharma leaders

  • Access to the latest generation of LLM tools and cloud technologies

  • Flexibility & work–life balance (core hours 9:00–15:00)

  • Space to grow — practical work with LLMs, AI and automation

  • Ownership and autonomy over your work and technical decisions

✨ Join PeakData and help us shape the data backbone of global healthcare.

If you're ready to work with modern tech, meaningful projects and a genuinely good team — we’d love to talk.

Tech stack

  • Polish – B2

  • English – B2

  • Automated Testing – regular

  • AWS – regular

  • Docker – regular

  • Terraform – regular

  • Kubernetes – regular

  • Python – regular

  • LLM – nice to have

  • CI/CD – nice to have

  • GCP – nice to have
