Are you?
- passionate about technology
- interested in the Functional Programming paradigm
- hungry to work with, and share knowledge with, awesome, proactive people
Do you?
- want to build systems that are flexible, loosely coupled, and scalable
- feel comfortable working remotely in a 100% flat and Agile environment
- want to “Do the right thing”, delivering high-quality tech solutions and great products
Your Responsibilities:
- Designing and building a distributed pipeline for processing large volumes of data
- Improving, transporting, and migrating data to enable visualization, search, and analytics
- Integrating with different systems (Kafka, databases, and so on)
- Working in an international team, sharing knowledge with your colleagues
Skills & Requirements
What we expect from you:
- Experience with the Scala programming language (min. 2 years)
- Experience with test-driven development
- Fluent in English
Preferred skills:
- Knowledge of and experience with Scala, Hadoop, Spark, Spark SQL, Impala, Hive, Sqoop, Oozie, Kafka, Play, Shell, MapReduce
- Experience with Akka, Cats, Scalaz, Clojure, or Haskell
- Familiarity with Scrum/Agile methodology and tools such as JIRA, Bitbucket, GitHub, and Jenkins
About Scalac
Who we are:
Scalac is a software house that, within just four years, has grown to more than 80 developers. Scalac is the Team! This is crucial: we love to work together. We specialize in large-scale systems development based on functional programming languages. Working with Scalac means working with great Scala hAkkers, Frontend engineers, and Data engineers. We develop complex projects for various types of customers, mainly in the Fintech, e-commerce, and health sectors. We believe that truly great products happen when employer and employee go hand in hand. That's why we put a strong emphasis on your well-being and personal development.
Scalac Happiness Recipe:
- Work Hard
- Do the right thing
- Have fun :)