Databricks Data Engineer

Spyrosoft
Postępu 15, Warszawa +2 locations
Category: Data
Senior | Full-time | B2B | Remote
37.96 - 48.80 USD net per hour (B2B)

Job description

Project description:

Join our data engineering team as we develop and scale our enterprise data platform. We are building a high-performance ecosystem designed to manage large-scale datasets, ranging from structured to unstructured formats. In this role, you will help modernize our data infrastructure by implementing cutting-edge storage and processing solutions. You will play a key part in designing how we ingest, process, and govern data to provide reliable insights across the organization.

Tech stack:

  • Databricks (Unity Catalog, Delta Live Tables)

  • Python (PySpark), SQL

  • Azure, AWS, or GCP

  • Data Lakehouse, Data Mesh, Data Marts

  • DevOps, CI/CD Pipelines

  • Agile (Scrum/Kanban)

Requirements:

  • At least 8 years in Data Engineering, with a minimum of 2 years specifically in Big Data environments.

  • 4+ years of hands-on experience with Databricks services, including data pipelines and Unity Catalog.

  • Expert-level skills in Python and SQL.

  • Strong background in Data Warehousing, ETL, and distributed data processing.

  • Deep understanding of Data Lakes, Data Warehouses, and Data Mesh concepts.

  • Experience with at least one public cloud (Azure, AWS, or GCP) and strong design skills for both relational and non-relational storage.

  • Analytical mindset capable of troubleshooting complex issues in a big data landscape.

  • Very good verbal and written English (B2/C1).

  • Experience working in Agile (Scrum/Kanban) environments.

Main responsibilities:

  • Design and maintain robust data pipelines and distributed data processing systems using Databricks.

  • Implement and manage data governance and security frameworks via Unity Catalog.

  • Develop sophisticated data models (Relational and Non-Relational) to support complex analytical requirements.

  • Improve the performance and reliability of Big Data workflows and ETL processes.

  • Work within an Agile environment, integrating DevOps and CI/CD principles into the data lifecycle.

  • Act as a subject matter expert, guiding the team through complex big data challenges and architectural decisions.

Tech stack

  • English: C1
  • PySpark: master
  • Databricks: master
  • SQL: master
  • Python: master
  • Data Marts: advanced
  • Data Lakehouse: advanced
  • Azure, AWS, or GCP: advanced
  • Data Mesh: advanced

About the company

Spyrosoft

Spyrosoft is a leading technology company specializing in software development and IT services. The company provides a wide range of expertise including artificial intelligence, cloud services, cybersecurity, digital pro...

Company profile
