Phish tests and (not) getting people to report successful phish attacks

June 1, 2024

One of the most important things in dealing with phish attacks is for people to rapidly self-report successful ones, the attacks that obtained their password or another access token. If you don't know that an access token has been compromised, even briefly, you can't investigate what it may have been used for, mitigate the damage, and so on. And the sooner you know about a compromise, the better.

So called "phish tests" in their current form are basically excuses to explicitly or implicitly blame people who 'fall for' the phish test. Explicit blame is generally obvious, but you might wonder about the implicit blame. If the phish test reports how many people in each unit 'fell for' the phish test message, or requires those people to take additional training, or things like that, it is implicitly blaming those people; they or their managers will be exhorted to 'do better' and maybe required to do extra work.

When you conduct phish tests and blame people who 'fall for' those tests, you're teaching people that falling for phish attacks will get them blamed. You are training them that this is a failure on their part and that there will be consequences for their failure. When people know that they will be blamed for something, some number of them will try to cover it up, delay reporting it, or rationalize that they didn't really fall for it because they changed their password right away, or never actually approved the MFA request, or the like. This is an entirely natural and predictable human reaction to the implicit training that your phish tests have delivered. And, as covered, this reaction is very bad for your organization's ability to handle a real, successful phish attack (which is going to happen sometime).

Much like you want "blameless incident postmortems", my view is that you want "blameless reporting of successful phishes". I'm not sure how you get it, but I'm pretty sure that the current approach to "phish tests" isn't it (beyond the other issues that Google discussed in "On Fire Drills and Phishing Tests"). Instead, I think phish tests most likely create a counterproductive mindset in people subjected to them, one where the security team is the opposition, out to trick people and then punish those who were tricked.

(This is the counterproductive effect I mentioned in my entry on how phish tests aren't like fire drills.)
