Join a purpose-driven, international technology company that combines innovation with a commitment to meaningful impact. Our client provides cutting-edge digital solutions that empower businesses to make smarter, more responsible decisions. The global team of professionals brings expertise, creativity, and collaboration to every project, working in a supportive culture where your contribution truly matters.
Their mission is to drive progress and sustainability across industries by integrating intelligent systems and insightful analytics into their clients’ operations.
We are looking for a skilled Data Engineer to join their expanding data platform team. The role involves developing and maintaining data architectures and processing solutions that empower data scientists, BI analysts, and other internal stakeholders. Key responsibilities include:
- Design, build, and manage components of the data platform.
- Develop and monitor data processing pipelines to ensure reliability and efficiency.
- Maintain data serving layers, manage permissions, and automate data catalog processes.
- Integrate internal and external data sources, ensuring high data integrity and quality.
- Support business intelligence efforts through efficient data modeling.
- Collaborate cross-functionally with teams including data scientists, DevOps, architects, developers, and QA to bring new features and services into production.
- Continuously improve operational practices and procedures.
What they offer:
- Fully remote work from Poland or hybrid work in Warsaw
- Flexible working hours and supportive team culture
- Competitive salary + annual bonus
- Flexible employment arrangements, including a permanent contract or B2B collaboration
- Private healthcare, life insurance, and wellness support
- Home office allowance (internet and electricity)
- Multisport card, lunch card, and a benefits platform
- E-learning access, language classes, and volunteering days
- Modern, pet-friendly Warsaw office (optional use)
Requirements:
- 2-3 years of experience designing and implementing data processing pipelines using distributed data processing engines.
- Proficiency in Python for data engineering and a solid grasp of common programming principles.
- Experience with data orchestration tools.
- Familiarity with cloud platforms such as AWS, Azure, or GCP.
- Expertise in SQL, database design and structures, and ETL/ELT design patterns.
- Proficiency in English.
- Nice to have: experience with Databricks, the Azure cloud, Azure DevOps, and CI/CD practices.