🟣 You will be:
- responsible for at-scale infrastructure design, build, and deployment, with a focus on distributed systems,
- building and maintaining architecture patterns for data processing, workflow definitions, and system-to-system integrations using Big Data and Cloud technologies,
- evaluating and translating technical designs into workable technical solutions/code and technical specifications on par with industry standards,
- driving the creation of reusable artifacts,
- establishing scalable, efficient, automated processes for data analysis, data model development, validation, and implementation,
- working closely with analysts/data scientists to understand the impact on downstream data models,
- writing efficient and well-organized software to ship products in an iterative, continual release environment,
- contributing to and promoting good software engineering practices across the team,
- communicating clearly and effectively with technical and non-technical audiences,
- defining data retention policies,
- monitoring performance and advising on any necessary infrastructure changes.
🟣 Your profile:
- 3+ years’ experience with AWS (Glue, Lambda, Redshift, RDS, S3),
- 5+ years’ experience with data engineering or backend/full-stack software development,
- strong SQL skills,
- Python scripting proficiency,
- experience with data transformation tools – Databricks and Spark,
- experience with data manipulation libraries (such as Pandas, NumPy, PySpark),
- experience in structuring and modelling data in both relational and non-relational forms,
- ability to evaluate and propose a relational or non-relational approach,
- knowledge of normalization/denormalization and data warehousing concepts (star and snowflake schemas),
- designing for transactional and analytical operations,
- working knowledge of Git,
- good verbal and written communication skills in English.
Work from the European Union region and a work permit are required.
Candidates must have an active VAT status in the EU VIES registry: https://ec.europa.eu/taxation_customs/vies/
🟣 Nice to have:
- experience with Amazon EMR and Apache Hadoop,
- experience with data modelling tools, preferably dbt,
- experience with Enterprise Data Warehouse solutions, preferably Snowflake,
- familiarity with ETL tools (such as Informatica, Talend, DataStage, Stitch, Fivetran, etc.),
- experience in containerization and orchestration (Docker, Kubernetes, etc.),
- cloud (Azure, AWS, GCP) certification.
🟣 Recruitment Process:
CV review – HR call – Interview (with live coding) – Client Interview (with live coding) – Hiring Manager Interview – Decision
🎁 Benefits 🎁
✍ Development:
- development budgets of up to 6,800 PLN,
- we fund certifications, e.g. AWS, Azure,
- access to Udemy, O'Reilly (formerly Safari Books Online), and more,
- events and technology conferences,
- technology guilds,
- internal training,
- Xebia Upskill.
🩺 We take care of your health:
- private medical care,
- we subsidise a MultiSport card,
- mental health support.
🤸‍♂️ We are flexible:
- B2B or employment contract,
- contract for an indefinite period.