
    Senior Big Data Developer

    Brown Brothers Harriman

    Location: Kraków
    Type of work: Full-time
    Experience: Senior
    Employment type: Permanent
    Operating mode: Hybrid

    Tech stack

      • Spark – advanced
      • Snowflake – advanced
      • Java – advanced
      • Python – advanced
      • Scala – advanced
      • Kafka – nice to have

    Job description


    At BBH we value diverse backgrounds, so if your experience looks a little different from what we've outlined and you think you can bring value to the role, we will still welcome your application!



    What You Can Expect At BBH:

    If you join BBH you will find a collaborative environment that enables you to step outside your role to add value wherever you can. You will have direct access to clients, information, and experts across all business areas around the world. BBH will provide you with opportunities to grow your expertise, take on new challenges, and reinvent yourself, all without leaving the firm. We encourage a culture of inclusion that values each employee’s unique perspective. We provide a high-quality benefits program emphasizing good health, financial security, and peace of mind. Ultimately, we want you to have rewarding work with the flexibility to enjoy personal and family experiences at every career stage. Our BBH Cares program offers volunteer opportunities to give back to your community and help transform the lives of others.


    Brown Brothers Harriman is seeking a Senior Big Data Developer with working experience on Cloudera and Snowflake to help develop a new data platform, infoDataFabric. BBH’s data platform serves as the foundation for a key set of offerings running on Oracle Exadata and Cloudera's distribution.


    Key Responsibilities Include:

    • Facilitate the establishment of a secure data platform on BBH’s on-premises Cloudera infrastructure
    • Document and develop ETL logic and data flows, covering both batch and real-time streaming, to make data assets easy to consume (see the sketch after this list)
    • Leverage components of the Cloudera distribution, including but not limited to Sqoop, Hive, Impala, and Spark, to achieve project objectives
    • Practice consistent coding and unit-testing discipline
    • Work with distributed teams
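
    The following is a minimal, illustrative PySpark sketch of that batch-plus-streaming pattern, not BBH's actual pipeline: the Hive table, Kafka brokers, topic, and paths are all hypothetical, and the Kafka source assumes the spark-sql-kafka package is on the classpath.

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        # Hive support lets Spark read tables managed on the Cloudera cluster.
        spark = (
            SparkSession.builder
            .appName("etl-sketch")
            .enableHiveSupport()
            .getOrCreate()
        )

        # Batch path: read a (hypothetical) Hive table, apply ETL logic, write a curated table.
        batch_df = (
            spark.table("raw.trades")  # hypothetical source table
            .filter(F.col("trade_date") == "2024-01-15")
            .withColumn("notional_usd", F.col("quantity") * F.col("price"))
        )
        batch_df.write.mode("overwrite").saveAsTable("curated.trades_daily")  # hypothetical target

        # Streaming path: consume a (hypothetical) Kafka topic with Structured Streaming.
        stream_df = (
            spark.readStream.format("kafka")
            .option("kafka.bootstrap.servers", "broker1:9092")  # hypothetical brokers
            .option("subscribe", "trades-stream")               # hypothetical topic
            .load()
            .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
        )
        query = (
            stream_df.writeStream
            .format("parquet")
            .option("path", "/data/curated/trades_stream")            # hypothetical landing path
            .option("checkpointLocation", "/data/chk/trades_stream")  # required for streaming sinks
            .start()
        )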



    What we offer:

    • 2 additional days added to your holiday calendar for Culture Celebration and Community Service
    • Private medical care for you and your family
    • Life Insurance
    • Hybrid Working Opportunities
    • Professional training and qualification support
    • Thrive Wellbeing Program
    • Online benefit platform
    • Contract for an indefinite period with no probation period



    Qualifications for the role include:

    • Bachelor's degree in Computer Science or related technical field, or equivalent experience
    • 8+ years of experience in IT, primarily in hands-on development
    • Strong knowledge of architectural principles, frameworks, design patterns, and industry best practices for design and development
    • Strong hands-on experience with programming languages such as Java, Scala, or Python
    • 4+ years of real project experience as a data wrangler/engineer across design, development, testing, and production implementation of Big Data projects processing large volumes of structured and unstructured data
    • Strong hands-on experience with Snowflake, Spark, and Kafka
    • Experience with the Oracle database engine, including PL/SQL and performance tuning of SQL queries (an Oracle-to-Snowflake sketch follows this list)
    • Experience designing efficient and robust ETL/ELT workflows and schedulers
    • Strong written and verbal communication skills, plus strong analytical and problem-solving skills
    • Experience working with Git, Jira, and Agile methodologies
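
    As a hedged illustration of the Snowflake, Spark, and Oracle items above (not a prescribed implementation), the sketch below moves one table from Oracle into Snowflake with Spark. Every hostname, credential, and table name is hypothetical, and it assumes the Oracle JDBC driver and the spark-snowflake connector are on the classpath.

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("oracle-to-snowflake-sketch").getOrCreate()

        # Read from Oracle over JDBC; Spark pushes simple filters down to the database.
        oracle_df = (
            spark.read.format("jdbc")
            .option("url", "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB")  # hypothetical host/service
            .option("dbtable", "ETL_USER.POSITIONS")                        # hypothetical source table
            .option("user", "etl_user")                                     # hypothetical credentials
            .option("password", "***")
            .option("driver", "oracle.jdbc.OracleDriver")
            .load()
        )

        # Write to Snowflake via the spark-snowflake connector.
        sf_options = {
            "sfURL": "account.snowflakecomputing.com",  # hypothetical account URL
            "sfUser": "etl_user",
            "sfPassword": "***",
            "sfDatabase": "ANALYTICS",                  # hypothetical database/schema/warehouse
            "sfSchema": "CURATED",
            "sfWarehouse": "ETL_WH",
        }
        (
            oracle_df.write.format("net.snowflake.spark.snowflake")
            .options(**sf_options)
            .option("dbtable", "POSITIONS_DAILY")       # hypothetical target table
            .mode("overwrite")
            .save()
        )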



    Nice To Have:

    • Experience supporting the end-to-end development life cycle and SDLC processes
    • Working experience with data virtualization tools such as Dremio or Denodo
    • Knowledge of machine learning libraries and exposure to data mining
    • Working experience with AWS, Azure, or GCP
    • Working experience in the financial industry


    Check similar offers

    • Data Engineer with GCP | Holisticon Connect | 3.89K - 4.8K USD | Kraków, Fully remote | GCP, NoSQL, SQL
    • Data Scientist / ML Engineer | Addepto | 3.7K - 7.41K USD | Kraków, Fully remote | AI, Machine Learning, Data Science
    • Power BI Expert | Engenious | 7.26K - 8.69K USD | Kraków, Fully remote | Power BI, MS Excel, matplotlib
    • Senior Data Engineer (NLP & LLMs) | Onwelo | 5.45K - 6.61K USD | Kraków, Fully remote | Python, NLP, Langchain
    • Principal / Lead Cartographer (Hybrid/Remote) | HERE Technologies | 6.74K - 9.08K USD | Kraków, Fully remote | GitLab, Mapbox, 3D