Data Engineer with Streaming
We are looking for a Data Engineer with strong hands-on experience in batch data processing and practical exposure to streaming solutions.
The ideal candidate understands event-driven architecture and can design scalable, real-time data pipelines on Azure.
The role includes designing reliable near real-time data platforms supporting analytical and operational use cases.
Tasks:
Design and implement scalable batch data pipelines on Azure using modern ETL/ELT approaches.
Develop and optimize data transformations using Apache Spark and SQL, ensuring performance and maintainability.
Design efficient data models for analytical and reporting use cases (data lakes, data warehouses).
Ensure data reliability, quality, monitoring, and performance in distributed environments.
Build and maintain near real-time data pipelines using Apache Spark (Structured Streaming).
Implement Change Data Capture (CDC) pipelines from relational databases when required.
Contribute to event-driven solutions using message brokers where streaming use cases are present.
Engage with stakeholders to refine data requirements and architecture choices.
Perform code reviews, promote best practices, and mentor junior engineers.
Work independently while maintaining effective communication with stakeholders.
Requirements:
4–6 years of experience as a Data Engineer.
Strong SQL and Python skills (production-grade code).
Experience with Apache Spark (including Structured Streaming).
Hands-on experience with Azure (Synapse / ADF / ADLS / Databricks).
Understanding of distributed systems fundamentals.
Experience designing ETL/ELT pipelines.
Experience working in an Agile environment.
English C1.
Nice-to-have skills:
Experience with Apache Kafka (event streaming) and/or Apache Flink (stream processing engine).
Experience with Change Data Capture (CDC) tools (e.g. Debezium, SQL Server CDC, Azure CDC, Kafka Connect).
Experience with schema management (e.g. Avro, Schema Registry).
Experience with Delta Lake / Delta Live Tables.
Knowledge of event-driven architecture patterns.
Experience with CI/CD for data platforms.
We offer:
Stable employment in a well-established organization
Flexible work model – remote or office-based, depending on your preference
Autonomy and trust in how you organize your work
Opportunities to engage in charity and environmental projects
Access to social benefits, including vacation co-financing
Collaboration with experienced professionals and subject matter experts
Initiatives supporting well-being and health
Structured onboarding program with dedicated support
Modern equipment provided or available for remote work
Clear career development paths and internal promotion opportunities
Referral program with bonuses for recommending new team members
Access to online learning platforms and training opportunities from day one
Support for professional certifications and continuous upskilling
Inclusive and values-driven work environment

Lingaro
At Lingaro, we empower global brands to achieve more with data. We lead our clients from strategy through development to operations and adoption, transforming data into opportunities that propel their businesses forward.