Data Engineer (Lat61)

Data

Warszawa +4 locations

Point Wild

Full-time
B2B
Senior
Remote

Tech stack

    English: B2

    SQL: master

    Databricks: advanced

    AWS: advanced

    Python: advanced

    PyTorch: regular

    Delta Lake: regular

    Parquet: regular

    Lakehouse architectures: regular

Job description

Data Engineer (Lat61)

Remote: Poland, Ukraine, or Romania


Point Wild helps customers monitor, manage, and protect against the risks associated with their identities and personal information in a digital world. Backed by WndrCo, Warburg Pincus and General Catalyst, Point Wild is dedicated to creating the world’s most comprehensive portfolio of industry-leading cybersecurity solutions. Our vision is to become THE go-to resource for every cyber protection need individuals may face - today and in the future. 


Join us for the ride!


Lat61 Mission

The Lat61 platform will power the next generation of cybersecurity and AI-enabled decision-making. As a Data Engineer on this team, you will help deliver:

  • Multi-Modal Data Ingestion: Bringing together logs, telemetry, threat intel, identity data, cryptographic assets, and third-party feeds into a unified lakehouse.

  • AI Agent Enablement: Supporting Retrieval-Augmented Generation (RAG) workflows, embeddings, and feature stores to fuel advanced AI use cases across Point Wild products.

  • Analytics & Decision Systems: Providing real-time insights into risk posture, compliance, and security events through scalable pipelines and APIs.

  • Future-Proofing for Quantum: Laying the groundwork for automated remediation and transition to post-quantum cryptographic standards.


Your work won’t just be about pipelines and data models - it will directly shape how enterprises anticipate, prevent, and respond to cybersecurity risks in an era of quantum disruption.


About the Role:

We are seeking a highly skilled Data Engineer with deep experience in Databricks and modern lakehouse architectures to join the Lat61 platform team. This role is critical in designing, building, and optimizing the pipelines, data structures, and integrations that power Lat61.


You will collaborate closely with data architects, AI engineers, and product leaders to deliver a scalable, resilient, and secure foundation for advanced analytics, machine learning, and cryptographic risk management use cases.


Your Day to Day:

  • Build and optimize data ingestion pipelines on Databricks (batch and streaming) to process structured, semi-structured, and unstructured data (an illustrative sketch follows this list).

  • Implement scalable data models and transformations leveraging Delta Lake and open data formats (Parquet, Delta).

  • Design and manage workflows with Databricks Workflows, Airflow, or equivalent orchestration tools.

  • Implement automated testing, lineage, and monitoring frameworks using tools like Great Expectations and Unity Catalog.

  • Build integrations with enterprise and third-party systems via cloud APIs, Kafka/Kinesis, and connectors into Databricks.

  • Partner with AI/ML teams to provision feature stores, integrate vector databases (Pinecone, Milvus, Weaviate), and support RAG-style architectures.

  • Optimize Spark and SQL workloads for speed and cost efficiency across multi-cloud environments (AWS, Azure, GCP).

  • Apply secure-by-design data engineering practices aligned with Point Wild’s cybersecurity standards and evolving post-quantum cryptographic frameworks.
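
For illustration only, here is a minimal sketch of what the first bullet (a batch ingestion pipeline landing raw events into a Delta table on Databricks) could look like in PySpark. The bucket path, database, and table names are hypothetical placeholders and are not taken from this posting.

    # Minimal batch-ingestion sketch (PySpark on Databricks). Paths, database,
    # and table names are hypothetical placeholders, not Lat61 internals.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # already provided in Databricks notebooks

    raw_path = "s3://example-bucket/raw/security_events/"   # hypothetical landing zone
    target_table = "demo_bronze.security_events"            # hypothetical Delta table

    # Read semi-structured JSON, stamp ingestion time, append to a Delta table.
    events = (
        spark.read.format("json").load(raw_path)
        .withColumn("ingested_at", F.current_timestamp())
    )
    events.write.format("delta").mode("append").saveAsTable(target_table)

A streaming variant of the same step would swap spark.read for spark.readStream and write with writeStream plus a checkpoint location.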


What you bring to the table:

  • At least 5 years in Data Engineering with strong experience building production data systems on Databricks.

  • Expertise in PySpark, SQL, and Python.

  • Strong expertise with various AWS services.

  • Strong knowledge of Delta Lake, Parquet, and lakehouse architectures.

  • Experience with streaming frameworks (Structured Streaming, Kafka, Kinesis, or Pub/Sub).

  • Familiarity with dbt for transformation and analytics workflows.

  • Strong understanding of data governance and security controls (Unity Catalog, IAM).

  • Exposure to AI/ML data workflows (feature stores, embeddings, vector databases).

  • Detail-oriented, collaborative, and comfortable working in a fast-paced, innovation-driven environment.


Bonus Points:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.

  • Data Engineering experience in a B2B SaaS organization.


Why Join Us?

  • Opportunity to build a next-generation lakehouse platform at the intersection of cybersecurity, cryptography, and AI.

  • A role with direct impact on how global enterprises defend against quantum-era risks.

  • Collaborative, mission-driven culture with a focus on innovation and agility.

  • A chance to shape the future of data + AI across the Point Wild portfolio.

Published: 04.09.2025