Big Data Engineer
GeoComply
Warszawa
Type of work: Undetermined
Experience: Senior
Employment Type: Permanent
Operating mode: Office

Tech stack

  • Spark: advanced
  • English: advanced
  • Big Data: regular
  • Java: regular
  • Kafka: regular

Job description

We are looking for an experienced Big Data Engineer to join the team and help us build a world-class, scalable, and efficient data storage platform that will improve the overall performance of GeoComply products. If you enjoy working with large data sets, finding best-in-class solutions to business requests, and love challenging problems, you are very welcome!
 
As a Big Data Engineer with GeoComply, you will help design and develop a world-class data management platform. You will maintain open communication with your team members and cross-functional stakeholders.

What You Will Be Doing:
  • Selecting and integrating the Big Data tools and frameworks required to provide the requested capabilities;
  • Ingesting, storing, processing, and analyzing large data sets;
  • Implementing ETL processes that move data from OLTP databases to an OLAP database and a data lake, using event streaming platforms such as Kafka;
  • Creating scalable, high-performance web services for tracking data;
  • Developing and transforming large datasets and maintaining robust data pipelines that support various use cases with high performance;
  • Translating complex technical and functional requirements into detailed designs;
  • Monitoring performance and advising on any necessary infrastructure changes;
  • Defining data retention and data governance policies and frameworks.
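To give a rough flavor of the ETL responsibility above, the sketch below flattens a nested OLTP-style event into a denormalized row suitable for an OLAP store. It is a minimal illustration in Python with entirely hypothetical field names (`event_id`, `user.geo.country`, etc.), not GeoComply's schema; in the actual role this transformation would run inside Kafka Streams or Spark rather than plain Python:

```python
import json
from datetime import datetime, timezone

def flatten_event(raw: str) -> dict:
    """Transform a nested OLTP-style JSON event into a flat OLAP row.

    All field names are hypothetical placeholders, chosen only to
    illustrate the nested-to-flat (denormalization) step of an ETL job.
    """
    event = json.loads(raw)
    # Convert the epoch timestamp into date/hour dimensions,
    # which OLAP stores commonly partition and aggregate by.
    ts = datetime.fromtimestamp(event["ts"], tz=timezone.utc)
    return {
        "event_id": event["id"],
        "user_id": event["user"]["id"],
        "country": event["user"]["geo"]["country"],
        "event_date": ts.date().isoformat(),
        "event_hour": ts.hour,
        "event_type": event["type"],
    }

# Example input: one nested event as it might arrive on a Kafka topic.
raw = json.dumps({
    "id": "e-1", "ts": 1700000000, "type": "check",
    "user": {"id": "u-42", "geo": {"country": "PL"}},
})
row = flatten_event(raw)
print(row["country"], row["event_date"])  # → PL 2023-11-14
```

In a streaming deployment, the same per-record function would be applied inside a consumer loop or a `mapValues`-style operator, with the flat rows written to the OLAP database and the data lake.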

About You:
  • 2+ years of experience with Java or Scala;
  • 2+ years of experience with Big Data frameworks such as Java Spring, Kafka Streams, and Spark Streaming;
  • Experience in large-scale deployment and performance tuning;
  • Experience with schema design and dimensional data modelling;
  • Experience with relational and non-relational databases (e.g., MySQL, MongoDB);
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and data sets;
  • Experience with data pipeline and workflow management tools;
  • Fluent in written and spoken English;
  • Strong analytical and problem-solving skills.

Bonus points if you:
  • Have experience with Delta Lake;
  • Have good Docker/Kubernetes knowledge;
  • Have good knowledge of the ELK stack (Elasticsearch, Kibana);
  • Know core Java, Linux, SQL, and any scripting language.

Now what? Send us your resume and a cover letter. We can't wait to meet you.

At GeoComply, we live our value of Act with Integrity. Our workplace is built on mutual respect and inclusion. We know that diversity of experience and thought has led to connection, innovation, and our company’s success. We welcome applicants of all backgrounds, experiences, beliefs, and identities.

We care about your privacy and want you to be informed about your rights. Please read our Applicant Privacy Notice before applying for the position.