We are looking for a Big Data Developer with at least 2 years of hands-on developer (not administrator) experience on Cloudera or Hortonworks. The project uses Anzo, a fairly niche platform, so knowledge of it will be a big plus; Pentaho experience is also a big plus.
Essential qualifications:
- Project experience with at least one of the following Big Data platforms is a must: Cloudera or Hortonworks (min. 2 years)
- Knowledge of Hadoop ETL and query tools (Sqoop, Impala, Hive, Oozie)
- SQL programming skills
- Bash scripting experience (min. 2 years)
- Working knowledge of at least one of the following languages/frameworks: Python, PySpark, R, Scala, or Java (min. 2 years)
Key responsibilities:
- Work as a Big Data Developer in a self-organizing Scrum team
- Implement data sourcing and transformation code, perform semantic data modelling, and build APIs
- Manage the deployment, maintenance, and L3 user support of the reporting/data access tool(s)
- Create technical documentation of all delivered artifacts
- Perform other duties as assigned
*A contract of employment is also possible.