🟣 You will be:
- Working with teams of a globally recognized American apparel brand, symbol of rugged individuality and casual style,
- Responsible for at-scale infrastructure design, build and deployment with a focus on distributed systems,
- Building and maintaining architecture patterns for data processing, workflow definitions, and system to system integrations using Big Data and Cloud technologies,
- Evaluating and translating technical designs into workable technical solutions/code and technical specifications on par with industry standards,
- Driving the creation of reusable artifacts,
- Establishing scalable, efficient, automated processes for data analysis, data model development, validation, and implementation,
- Working closely with analysts/data scientists to understand the impact on downstream data models,
- Writing efficient and well-organized software to ship products in an iterative, continual release environment,
- Contributing to and promoting good software engineering practices across the team,
- Communicating clearly and effectively to technical and non-technical audiences,
- Defining data retention policies,
- Monitoring performance and advising on any necessary infrastructure changes,
- Responsible for dashboard development (Tableau, Power BI, Qlik, etc.),
- Responsible for data analytics model development (R, Python, Spark).
🟣 Your profile:
- Openness to work daily until 18:00-19:00 CET,
- 5+ years’ experience as a software developer/data engineer,
- Experience with Big Data technologies and the AI/ML life cycle,
- University or advanced degree in engineering, computer science, mathematics, or a related field,
- Strong hands-on experience in Databricks using PySpark and Spark SQL (Unity Catalog, workflows, optimization techniques),
- Experience with at least one cloud provider solution - Azure, AWS, GCP (preferred),
- Strong experience working with relational SQL databases,
- Strong experience with an object-oriented/functional scripting language: Python,
- Working knowledge of transformation tools (dbt preferred),
- Ability to work on the Linux platform,
- Strong knowledge of data pipeline and workflow management tools (Airflow preferred),
- Working knowledge of GitHub/Git toolkit,
- Expertise in standard software engineering methodology, e.g. unit testing, code reviews, design documentation,
- Experience creating data pipelines that prepare data appropriately for ingestion and consumption,
- Experience maintaining and optimizing databases/filesystems for production use in reporting and analytics,
- Ability to work in a collaborative environment and interact effectively with both technical and non-technical team members,
- Good verbal and written communication skills (English),
- Experience with e-commerce, retail, or supply chains is welcome.
🟣 Work from the European Union region and a work permit are required.
🟣 Recruitment Process: CV review – HR Call – Interview – Client Interview – Decision
Please note that we are currently looking to expand our talent pool for future opportunities within the IT industry. While we may not have an immediate project for you at the moment, we are proactively recruiting to ensure that we have the right expertise when new projects arise. We will contact you when a potential project matching your skills and experience becomes available. Thank you for your interest in joining our team.
🎁 Benefits 🎁
✍ Development:
- development budgets of up to 6,800 PLN,
- we fund certifications, e.g. AWS, Azure,
- access to Udemy, Safari Books Online and more,
- events and technology conferences,
- technology guilds,
- internal training,
- Xebia Library,
- Xebia Upskill.
🩺 We take care of your health:
- private medical healthcare,
- we subsidise a MultiSport card,
- mental health support.
🤸♂️ We are flexible:
- flexible working hours,
- B2B or permanent contract,
- contract for an indefinite period.