SentinelOne is defining the future of cybersecurity through our XDR platform that automatically prevents, detects, and responds to threats in real time. Singularity XDR ingests data and leverages our patented AI models to deliver autonomous protection. With SentinelOne, organizations gain full transparency into everything happening across the network at machine speed – to defeat every attack, at every stage of the threat lifecycle.
We are a values-driven team where names are known, results are rewarded, and friendships are formed. Trust, accountability, relentlessness, ingenuity, and OneSentinel define the pillars of our collaborative and unified global culture. We're looking for people that will drive team success and collaboration across SentinelOne. If you’re enthusiastic about innovative approaches to problem-solving, we would love to speak with you about joining our team!
What will you do?
The Data Platform group is responsible for building the SentinelOne Singularity XDR solution.
SentinelOne Singularity XDR unifies and extends detection and response capability across multiple security layers, providing security teams with centralized, end-to-end enterprise visibility, powerful analytics, and automated response across the complete technology stack. With Singularity XDR, customers get unified, proactive security measures to defend the entire technology stack, making it easier for security analysts to identify and stop attacks in progress before they impact the business.
As part of the Data Platform group, you will work with vast amounts of data. You will take part in the architecture, design, and implementation of a multi-disciplinary, cloud-native software platform that serves thousands of users and processes hundreds of billions of events daily.
Design, develop, implement, test, document, and operate large-scale, high-volume, low-latency applications that handle several million events per second.
Translate business and functional requirements into robust, scalable, operable solutions that work well within the overall data architecture.
Collaborate with cross-functional teams across the organization to understand business requirements and deliver solutions that scale.
Maintain application stability and data integrity by monitoring key metrics and improving the codebase accordingly.
Understand and maintain the existing codebase through regular refactoring and by applying requested fixes and features.
Learn new technologies that can handle our huge data volumes and the challenges they pose.
What experience or knowledge should you bring?
A degree in Computer Science or Software Engineering from a well-regarded university, OR comparable experience from well-renowned companies or military units
4+ years of software development experience in at least one high-level programming language (Go, Java, Scala).
Strong background in distributed data processing and microservices, building high-quality, scalable data products.
Advantages:
Experience working with Apache Kafka – a big advantage
Experience working with distributed data technologies (e.g. Hadoop, Flink, Spark, Hive, etc.) for building efficient, large-scale data pipelines
Experience with cloud computing platforms such as Amazon AWS
Experience with Docker, Helm & Kubernetes