
The basic technique of cybercrime statistics—measuring the incidence of a given phenomenon (DDoS, trojan, APT) as a percentage of overall population size—had entered the mainstream of cybersecurity thought only in the previous decade. Cybersecurity as a science was still in its infancy, as many of its basic principles had yet to be established.

At the same time, the scientific method rarely intersected with the development and testing of new detection & prevention regimens. When you read through the endless stream of quack cybercures published daily on the Internet and at conferences like RSA, what strikes you most is not that they are all, almost without exception, based on anecdotal evidence or woefully small samples. What’s striking is that they never apologize for the shortcoming. They never pause to say, “Of course, this is all based on anecdotal evidence, but hear me out.” There’s no shame in these claims, no awareness of the imperfection of the methods, precisely because it seems so eminently reasonable that the local observation of a handful of minuscule cases might serve as the silver bullet for cybercrime, if you look hard enough.


But cybercrime couldn’t be studied in isolation. It was as much a product of the internet’s expansion as news and social media, where it was so uselessly anatomized. To understand the beast, you needed to think on the scale of the enterprise, from the hacker’s-eye view. You needed to look at the problem from the perspective of Henry Mayhew’s balloon. And you needed a way to persuade others to join you there.

Sadly, that’s not a modern story. It’s an adapted quote from chapter 4 (pp. 97-98, paperback) of The Ghost Map, by Steven Johnson, a book on the cholera epidemic of 1854.

I won’t spoil the book, nor stretch my analogy any further. Suffice it to say, you should read the book, if you haven’t already, and join me in calling for the John Snow of our cyber-age to arrive.
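The measurement technique the post opens with, counting incidents of a given phenomenon as a rate against the size of the overall population, is simple enough to sketch. The function and the numbers below are purely illustrative assumptions of mine, not figures from the book or from any real dataset:

```python
def incidence_rate(new_cases: int, population: int, per: int = 100_000) -> float:
    """New cases observed in a period, expressed as a rate per `per`
    members of the population (the standard epidemiological form)."""
    if population <= 0:
        raise ValueError("population must be positive")
    return new_cases / population * per

# Hypothetical example: 42 confirmed trojan infections observed this
# quarter across a fleet of 12,000 hosts.
rate = incidence_rate(42, 12_000)
print(f"{rate:.1f} infections per 100,000 hosts")  # 350.0 per 100,000
```

The point of the normalization is comparability: 42 infections means something very different in a fleet of 12,000 hosts than in one of 12 million, which is exactly the kind of base-rate context most vendor claims omit.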

3 Comments

  1. Hi, Bob –

    I think you saw my presentation on the base rate fallacy at SIRACon last year.

    Did you know that Jeff Lowder and I did an extended version at Source Seattle?

    And that I am going to present this version at Source Boston in a couple of weeks?

    We really need data on the prevalence/incidence of many phenomena in InfoSec.

    Best wishes,

    Patrick

  2. Hey Patrick! Looking forward to seeing the updated presentation at SOURCE (unless our speaking times overlap ;-). Definitely agree on the ‘missing data’. What’s even more impressive about the “Snow” story is the huge data-gathering exercise that went on during and after the cholera outbreak. I wonder just what type of breach epidemic will need to occur to spur the same actions in our discipline.

  3. The measurement technique is interesting. Thanks for the presentation.

