Security analysts want to be absolutely sure that alerts received within the platform are malicious. Too often, other security solutions flood a Security Operations Center (SOC) with alerts, and analysts lose valuable time determining which ones are actually problematic. One of Endgame's answers to this problem was a cloud offering for malware detonation, where analysts could safely dissect a file and even compare it against third-party verifiers.
Once the initial platform was created, Arbiter (the cloud offering) was developed in our San Francisco office. Our team was brought in to conduct research, run user interviews, storyboard, and essentially help bring the product to life. I was the initial lead on this product, but became less hands-on over time as I handed off responsibility to newly hired product designers.
Over the next half-dozen releases, our team held many user interviews, mock scenarios, and A/B tests. Through this research, we narrowed our users to four main groups: Tier 1 Analysts, Tier 3 Analysts, Forensic Hunters, and SOC Managers. Below is a description of the most basic of these roles, the Tier 1 Analyst:
A typical defensive analyst's job is to maintain knowledge of these attacks: what patterns to look for, what to spot, essentially finding the needle in a haystack in a very short amount of time. Once that needle is found, the analyst is then tasked with finding corresponding problem areas, exposing and remediating other parts of the network the attacker could have manipulated. It's a classic cat-and-mouse game that keeps analysts constantly on their toes, searching for and reacting to malicious events.
The Design Process
Our design process followed a macro-level timeline mapped directly to the product team's 3-4 month release cycles. This typically involved initial storyboarding and long-term research engagements. For active development, our process was pared down to the two-week engineering sprint cycle. (I had the team fully prep work at least two sprints out.)
Analysis and Discovery: This included user validation testing, low fidelity wireframe creation, and implementation meetings/feature prioritization.
Development and VD: This included high fidelity visual design and edge-case interactions. Designers would pair daily with developers in feature teams.
Documentation: This included ongoing technical documentation for customers and internal spec documentation for QA, FE, and the Component Library.