A Large-Scale Study of the Time Required To Compromise a Computer System
2014 (English). In: IEEE Transactions on Dependable and Secure Computing, ISSN 1545-5971, E-ISSN 1941-0018, Vol. 11, no. 1, p. 6506084-. Article in journal (Refereed). Published.
A frequent assumption in the domain of cybersecurity is that cyberintrusions follow the properties of a Poisson process, i.e., that the number of intrusions is well modeled by a Poisson distribution and that the time between intrusions is exponentially distributed. This paper studies this property by analyzing all cyberintrusions that were detected across more than 260,000 computer systems over a period of almost three years. The results show that the assumption of a Poisson process model may be suboptimal: the log-normal distribution is a significantly better fit in terms of modeling both the number of detected intrusions and the time between intrusions, and the Pareto distribution is a significantly better fit in terms of modeling the time to first intrusion. The paper also analyzes whether time to compromise (TTC) increases with each successful intrusion of a computer system. The results suggest that time to compromise decreases with the number of intrusions of a system.
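The comparison the abstract describes, testing whether inter-intrusion times are better modeled by an exponential (Poisson-process) or a log-normal distribution, can be sketched as a maximum-likelihood fit of both candidates followed by a log-likelihood comparison. The sketch below uses synthetic data in place of the paper's detection timestamps, so the numbers are purely illustrative:

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for observed inter-intrusion times (hours);
# generated log-normally here purely for illustration.
rng = np.random.default_rng(42)
times = rng.lognormal(mean=2.0, sigma=1.0, size=5000)

# Fit both candidate distributions by maximum likelihood,
# fixing the location parameter at zero.
exp_params = stats.expon.fit(times, floc=0)        # Poisson-process assumption
lognorm_params = stats.lognorm.fit(times, floc=0)  # log-normal alternative

# Compare fits by total log-likelihood (higher = better fit).
ll_exp = stats.expon.logpdf(times, *exp_params).sum()
ll_lognorm = stats.lognorm.logpdf(times, *lognorm_params).sum()

print(f"log-likelihood, exponential: {ll_exp:.1f}")
print(f"log-likelihood, log-normal:  {ll_lognorm:.1f}")
```

The paper reports its comparisons with formal goodness-of-fit significance tests rather than a raw log-likelihood comparison; this sketch only illustrates the kind of model comparison involved.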
Place, publisher, year, edition, pages
IEEE Computer Society, 2014. Vol. 11, no. 1, p. 6506084-.
Keywords: Invasive software (viruses, worms, Trojan horses), Risk management, Network management
Identifiers
URN: urn:nbn:se:kth:diva-129251
DOI: 10.1109/TDSC.2013.21
ISI: 000331301100002
Scopus ID: 2-s2.0-84894561473
OAI: oai:DiVA.org:kth-129251
DiVA: diva2:651157