We are Sumo Logic, and we are building the Next Generation Log Management and Analytics solution, delivered as a cloud-based service. We have 1,500+ enterprise customers and $235M in funding from the world's leading investors (Accel, Greylock, Sequoia, Sutter Hill, and DFJ Growth), and with our cloud-based machine data analytics platform and an All-Star team, we are reshaping the Big Data landscape.
The proliferation of machine log data has the potential to give organizations unprecedented real-time visibility into their infrastructure and operations. With this opportunity comes tremendous technical challenges around ingesting, managing, and understanding high-volume streams of heterogeneous data.
As a Backend Software Engineer, you’ll help build our elastic big data analytics platform, written in Scala and running in the AWS cloud. Our system is a highly distributed, fault-tolerant, multi-tenant platform that includes bleeding-edge components for storage, messaging, search, and analytics. It ingests and analyzes terabytes of data a day while making petabytes of data available for search and forensic analysis, and is expected to reach substantially larger scale in the near future.
You are a strong software engineer who is passionate about large-scale systems. You care about producing clean, elegant, maintainable, robust, well-tested code, and you do this as a member of a team, helping the group arrive at a better solution than any of you would individually. Ideally, you have experience with the performance, scalability, and reliability demands of 24x7 commercial services.
Responsibilities:
- Design and implement extremely high-volume, fault-tolerant, scalable backend systems that process and manage petabytes of customer data.
- Analyze and improve the efficiency, scalability, and reliability of our backend systems.
- Write robust code; demonstrate its robustness through automated tests.
- Work as a member of a team, helping the team respond quickly and effectively to business needs.
- Help manage exabytes of data using the latest and greatest technologies, such as Kafka, Spark, and Docker!
Requirements:
- B.S., M.S., or Ph.D. in Computer Science or a related discipline.
- 3+ years of industry experience with a proven track record of ownership and delivery.
- Experience in multi-threaded programming and distributed systems.
- Object-oriented programming experience, for example in Java, Scala, Ruby, or C++.
- Understanding of the performance characteristics of commonly used data structures (maps, lists, trees, etc.).
- Desire to learn Scala, an up-and-coming JVM language (scala-lang.org).
Desirable:
- Experience with big data and/or 24x7 commercial services.
- You should be happy working with Unix (Linux, OS X).
- Agile software development experience (test-driven development, iterative and incremental development).
- Enjoy working in an environment where stuffed squirrels and rubber bands are occasionally hurled at you.