As a Python Data Engineer, you will build scalable data pipelines that deliver solutions for the logistics and financial services industries. The role focuses on creating, tuning, and monitoring data flows using Python, Docker, and Apache Airflow, built on top of Amazon Web Services infrastructure.
What we offer:
- Learning. Challenge. Diversity. You’ll never get bored!
- A chance to work on an enterprise-scale, data-centered application, with opportunities to learn tools such as Docker, Airflow, Presto, and columnar databases to tackle diverse data challenges;
- The opportunity to learn AWS cloud infrastructure from practitioners, not from books;
- Use of the latest technologies and flexibility to pick up the best tools for the job;
- Predominantly ‘greenfield’: you'll be the application designer and owner;
- Minimal bureaucracy to allow you to focus on technical tasks;
- Start-up environment associated with an industry leader in logistics;
- Competitive salary for the right candidates;
- Comfortable office located in Wroclaw city center.
Job requirements:
- 1+ years of Python experience. Any other programming languages are a plus;
- Open to candidates with European citizenship or a work permit for Poland;
- High level of written and spoken English. Any other languages will be a plus;
- Readiness to occasionally switch projects and/or technologies;
- Experience with Linux, bash, and Git is required;
- Basic understanding of communication protocols;
- An agile and DevOps mindset;
- Knowledge of pandas, Jupyter, Airflow, and/or Docker is a plus;
- Good SQL experience; we don’t expect you to be a database admin, but hands-on experience with any SQL engine is required. Tableau experience is a clear plus.
Note: Our hiring process requires a commitment of several hours, including a take-home programming challenge to assess your technical flexibility.