
Big Data Software Engineer
Category: Data

Location: Kraków
Type of work: Undetermined
Experience: Senior
Employment type: B2B, Permanent
Operating mode: Remote

Tech stack

  • Hadoop: advanced
  • Java: nice to have
  • Python: nice to have
  • Ansible: nice to have
  • Puppet: nice to have

Job description

Online interview
Sign a contract with us in January and get an additional 8,000 PLN gross bonus!
WE ARE
SoftServe is a leading technology solutions company specializing in software product and application development and services. Our mission is to be a valuable partner for our clients (from start-ups to large enterprises) across domains such as health care, retail, enterprise, automotive, and education. We measure our success by our clients' success.

Our team is growing, and we are looking for an experienced Cloudera migration expert to join the core SWAT team! As a part of the team, you will get a chance to work with cutting-edge technologies in a large multinational team. Our client is one of the largest banks in Asia.

YOU ARE
Our perfect candidate if you have the following skills and abilities:

  • Experienced in Information Technology and systems architecture
  • Proficient in migrating CDH to CDP, with hands-on experience fine-tuning afterward to achieve strong performance
  • Skilled in architecting large-scale storage, data centers, and globally distributed solutions by properly selecting VMs/servers and storage hardware based on performance, availability, and other requirements
  • Experienced in designing and deploying 3-tier architectures or large-scale Hadoop solutions
  • Able to understand big data use cases and recommend standard design patterns commonly used in Hadoop-based and streaming data deployments
  • Knowledgeable about the data management ecosystem, including concepts of data warehousing, ETL, and data integration
  • Able to understand and translate customer requirements into technical requirements
  • Experienced in implementing data transformation and processing solutions
  • Skilled in designing queries against data in the HDFS environment using tools such as Apache Hive, Kafka, Solr, and YARN
  • Experienced in setting up multi-node Hadoop clusters and facilitating their upgrades
  • Confident configuring security constraints (LDAP/AD, Kerberos/SPNEGO)
  • Proficient in implementing software and/or solutions in an enterprise Linux environment and assessing dependencies for migration purposes
  • Aware of network configuration, devices, protocols, speeds, and optimizations
  • Backed by a solid background in database administration or design
  • Equipped with excellent verbal and written communication skills and a can-do attitude

Nice to have

  • Experience with plain virtualization, Presto, and Superset
  • Familiarity with the Java ecosystem, including debugging, logging, monitoring, and profiling tools
  • Familiarity with scripting and automation tools such as Bash shell scripts, Python and/or Perl, Ansible, Chef, and Puppet
  • Familiarity with Collibra for data governance

YOU WANT TO WORK WITH

  • 4.7 PB of data and 1,700 VMs across different clusters
  • An experienced core SWAT team leading the migration
  • 5 distributed master nodes running on bare-metal machines
  • Defining and building the project roadmap together with the client
  • Establishing and driving a communication and collaboration process
  • Hundreds of YARN nodes within a fully democratized data platform

TOGETHER WE WILL

  • Accomplish great things
  • Get a great deal of learning and development opportunities along with our structured career path
  • Care about your individual initiatives; we are open to them, so come and share your ideas
  • Work directly with customers to implement Big Data solutions at scale
  • Share many other advantages with you, such as an attractive salary, a modern office, a package of benefits, language classes, and more
  • Work hard, play hard and have fun!