What we offer:
- B2B contract with a fixed monthly rate, including 20 days of holidays, fully paid bank holidays, and sick leave
- Long-term engagement
- Collaborate with experts on an international project to build a next-gen cloud intelligence system
- Office located near Rondo ONZ
- Danish work culture
- Medical insurance
- Annual performance appraisals
Our client is WhiteAway Group, a successful and rapidly growing e-commerce and retail business.
Through several online shops and the franchise chain Skousen, we sell white goods and premium products for the home to customers across all of Scandinavia. The company was founded on innovation, business acumen, and excellent customer experience.
Culture:
- Cross-functional teams with close collaboration and fast feedback loops across different domains (Development, Business, UX/UI)
- We strive to be a technological lighthouse, with a strong focus on quality, standards, testing, and a modern approach to development
- Close cooperation with the Danish office on multiple levels, e.g. the architecture group
- The team is open to any ideas for improving not only coding practices but also teamwork and workflow
- Knowledge sharing across squads and the whole organization, using theme-oriented guilds and an inner-source model of development
Responsibilities:
- Build and run data pipelines for datasets produced and consumed by our experimentation platform, work with data in large databases such as BigQuery, and build tools that help our employees make data-informed decisions affecting our customers across Scandinavia
- Help build out a new infrastructure offering to support various execution engines running on Kubernetes
- Contribute to modeling, crafting, and maintaining data solutions that improve the usability and accessibility of data
- Expand and streamline our data infrastructure by ensuring high-quality data ingestion and efficient pipelines
- Work with data scientists and engineers to integrate new data sources quickly and efficiently
Your profile:
- Master's degree in computer science, mathematics, statistics, or another quantitative field
- Hands-on experience with data modeling, schema metadata, ETL development, and data storage techniques
- Experience with coding, process automation, and data processing and storage frameworks
- Experience building APIs and libraries in Python
- You care about agile software processes, data-driven development, reliability, and responsible experimentation
- Experience with containerization and orchestration technologies such as Docker and Kubernetes
Technical Requirements:
- Over 3 years of experience with SQL-based databases, plus familiarity with cloud-based data solutions such as Amazon S3 and AWS Database Migration Service
- Highly experienced in writing efficient Python code
- Experience working with AWS big data technologies (S3, Glue, Lake Formation, Redshift)
- Experience with Apache Airflow is a plus