Article

Optimal Scheduling of Cybersecurity Analysts for Minimizing Risk

Publisher

ASSOC COMPUTING MACHINERY
DOI: 10.1145/2914795

Keywords

Cybersecurity analysts; optimization; resource allocation; risk mitigation; simulation; scheduling

Funding

  1. Army Research Office [W911NF-13-1-0421, W911NF-15-1-0576, W911NF-13-1-0317]
  2. Office of Naval Research [N00014-13-1-0703, N00014-15-1-2007]

Abstract

Cybersecurity threats are on the rise with the ever-increasing digitization of the information that many day-to-day systems depend upon. The demand for cybersecurity analysts outpaces supply, which calls for optimal management of the analyst resource. Therefore, a key component of the cybersecurity defense system is the optimal scheduling of its analysts. Sensor data is analyzed by automatic processing systems, and alerts are generated. A portion of these alerts is considered significant and requires thorough examination by a cybersecurity analyst. Risk, in this article, is defined as the percentage of significant alerts that are either unanalyzed or not thoroughly analyzed by analysts. The article presents a generalized optimization model for scheduling cybersecurity analysts to minimize risk (i.e., maximize the coverage of significant alerts by analysts) and to keep risk below a predetermined upper bound. The article tests the optimization model and its scalability on a set of given sensors with varying analyst experiences, alert generation rates, system constraints, and system requirements. Results indicate that the optimization model is scalable and can identify both the right mix of analyst expertise in an organization and the sensor-to-analyst allocation needed to keep risk below a given upper bound. Several meta-principles derived from the optimization model are presented, which serve as guiding principles for hiring and scheduling cybersecurity analysts. Simulation studies (validation) of the optimization model outputs indicate that risk varies non-linearly with the analyst-to-sensor ratio and, for a given analyst-to-sensor ratio, is independent of the number of sensors in the system.
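
To make the risk objective concrete, the following is a minimal linear-program sketch of a sensor-to-analyst allocation that minimizes unanalyzed significant alerts. It is an illustration under assumed data and structure (hypothetical alert rates, analyst capacities, and variable names), not the authors' full optimization model, which includes scheduling constraints and analyst-expertise details not reproduced on this page.

```python
import pulp

# Hypothetical data (not from the paper): significant-alert rates per sensor
# and per-analyst capacities (alerts thoroughly analyzable per shift).
alert_rate = {"s1": 40, "s2": 35, "s3": 70}            # sensors
capacity   = {"junior": 30, "mid": 45, "senior": 55}   # analysts

prob = pulp.LpProblem("minimize_risk", pulp.LpMinimize)

# x[a][s]: number of sensor s's significant alerts assigned to analyst a
x = pulp.LpVariable.dicts("x", (capacity, alert_rate), lowBound=0)

covered = pulp.lpSum(x[a][s] for a in capacity for s in alert_rate)
total = sum(alert_rate.values())

# Objective: minimize unanalyzed significant alerts (equivalently, risk)
prob += total - covered

# Each analyst can thoroughly examine at most capacity[a] alerts per shift
for a in capacity:
    prob += pulp.lpSum(x[a][s] for s in alert_rate) <= capacity[a]

# Cannot cover more alerts from a sensor than it generates
for s in alert_rate:
    prob += pulp.lpSum(x[a][s] for a in capacity) <= alert_rate[s]

prob.solve(pulp.PULP_CBC_CMD(msg=False))

# Risk = percentage of significant alerts left unanalyzed
risk = 100 * (total - pulp.value(covered)) / total
print(f"risk = {risk:.1f}% of significant alerts unanalyzed")
```

In this toy instance the total analyst capacity (130 alerts per shift) falls short of the alert load (145), so some residual risk is unavoidable; raising capacities or changing the analyst mix drives the objective toward zero, mirroring the paper's point that the model can identify the staffing mix needed to keep risk below a given bound.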
