🟣 You will be:
- developing and maintaining data pipelines to ensure seamless data flow from the Loyalty system to the data lake and data warehouse,
- collaborating with data engineers to ensure data engineering best practices are integrated into the development process,
- ensuring data integrity, consistency, and availability across all data systems,
- integrating data from various sources, including transactional databases, third-party APIs, and external data sources, into the data lake,
- implementing ETL processes to transform and load data into the data warehouse for analytics and reporting,
- working closely with cross-functional teams including Engineering, Business Analytics, Data Science and Product Management to understand data requirements and deliver solutions,
- optimizing data storage and retrieval to improve performance and scalability,
- monitoring and troubleshooting data pipelines to ensure high reliability and efficiency,
- implementing and enforcing data governance policies to ensure data security, privacy, and compliance,
- developing documentation and standards for data processes and procedures.
🟣 Your profile:
- 7+ years in a data engineering role, with hands-on experience in building data processing pipelines,
- experience in leading the design and implementation of data pipelines and data products,
- proficiency with GCP services for large-scale data processing and optimization,
- extensive experience with Apache Airflow, including DAG creation, triggers, and workflow optimization,
- knowledge of data partitioning, batch configuration, and performance tuning for terabyte-scale processing,
- strong Python proficiency, with expertise in modern data libraries and frameworks (e.g., Databricks, Snowflake, Spark, SQL),
- hands-on experience with ETL tools and processes,
- practical experience with dbt for data transformation,
- deep understanding of relational and NoSQL databases, data modelling, and data warehousing concepts,
- excellent command of oral and written English,
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
Work from the European Union region and a work permit are required.
Candidates must have an active VAT status in the EU VIES registry: https://ec.europa.eu/taxation_customs/vies/
🟣 Nice to have:
- experience with ecommerce systems and their data integration,
- knowledge of data visualization tools (e.g., Tableau, Looker),
- understanding of machine learning and data analytics,
- certification in cloud platforms (AWS Certified Data Analytics, Google Professional Data Engineer, etc.).
🟣 Recruitment Process:
CV review – HR call – Interview (with Live-coding) – Client Interview (with Live-coding) – Hiring Manager Interview – Decision
🎁 Benefits 🎁
✍ Development:
- development budgets of up to 6,800 PLN,
- funded certifications, e.g. AWS and Azure,
- access to Udemy, O'Reilly (formerly Safari Books Online) and more,
- events and technology conferences,
- technology Guilds,
- internal training,
- Xebia Upskill.
🩺 We take care of your health:
- private medical healthcare,
- subsidised MultiSport card,
- Mental Health Support.
🤸‍♂️ We are flexible:
- B2B or employment contract,
- contract for an indefinite period.