Brown Brothers Harriman is seeking a Senior Big Data Developer with working experience in Cloudera and Snowflake to help develop a new data platform, infoDataFabric. BBH's data platform serves as the foundation for a key set of offerings running on Oracle Exadata and Cloudera's distribution.
Key Responsibilities Include:
- Help establish a secure data platform on BBH's on-premises Cloudera infrastructure
- Document and develop ETL logic and data flows to make data assets easy to consume, for both batch and real-time streaming
- Leverage components of the Cloudera distribution, including but not limited to Sqoop, Hive, Impala, and Spark, to achieve project objectives
- Apply consistent coding and unit-testing practices
- Work with distributed teams
What we offer:
- Hybrid model: 3 days per week in the office (2 days for parents of a child up to 4 years old)
- To encourage cultural awareness and philanthropy, BBHers have 1 Culture Celebration Day and 1 Community Service Day in addition to their standard paid vacation allowance
- 24 days of occasional remote work from Poland per calendar year
- A contract for an indefinite period, from day one
- Private medical care
- Life Insurance
- Employee Assistance Program - offering independent and confidential counselling services for you and your family. You can get support for topics including family, marriage and relationships, finances, and legal issues.
- Professional training and qualification support
- Wellbeing Program
- Online Social Fund benefit platform
- Social, sport and integration events
- Onboarding Program for new hires
Qualifications for your role would include:
- Bachelor's degree in Computer Science or related technical field, or equivalent experience
- 8+ years of experience in IT, primarily in hands-on development
- Strong knowledge of architectural principles, frameworks, design patterns, and industry best practices for design and development
- Strong hands-on experience with programming languages such as Java, Scala, or Python
- 4+ years of real-world project experience as a data wrangler/engineer across design, development, testing, and production implementation for Big Data projects processing large volumes of structured/unstructured data
- Strong hands-on experience with Snowflake, Spark and Kafka
- Experience with the Oracle database engine, including PL/SQL and performance tuning of SQL queries
- Experience in designing efficient and robust ETL/ELT workflows and schedulers
- Strong written and verbal communication skills, along with strong analytical and problem-solving skills
- Experience working with Git, Jira, and Agile methodologies
Nice To Have:
- Experience supporting the end-to-end development life cycle and SDLC processes
- Working experience with data virtualization tools such as Dremio or Denodo
- Knowledge of Machine Learning libraries and exposure to Data Mining
- Working experience with AWS/Azure/GCP
- Working experience in the financial industry