Always remember that people make mistakes

December 15, 2010

One very important thing to remember when trying to design practical security systems is that people make mistakes. Always. Even under the best of circumstances and with the best of intentions, sooner or later someone will accidentally do something wrong.

If your security system breaks explosively when people make mistakes, your system is wrong in practice. Regardless of how mathematically pure it is, you have not designed something with real security. Real security needs to cope with things going wrong and people making mistakes, because that's what actually happens.

(There are all sorts of mitigation and coping strategies, depending on what the overall design goals are for your security system.)

You cannot fix this fact. You cannot exhort users to not make mistakes; it doesn't work. You cannot threaten users to get them to not make mistakes; it doesn't work, you can't make it work, and the side effects of trying this are extremely unpleasant. You can't even make it so strongly in people's self-interest to not make mistakes that they won't make mistakes; it still doesn't work. People just make mistakes.

Perhaps you're convinced that your system and environment is an exception. If so, please consider aviation's 'controlled flight into terrain', which is the dry technical term for 'a highly trained pilot with their life on the line spaced out and flew their plane into the ground'. Pilots kill themselves (and other people) in CFIT accidents every year. This happens in basically the best situation possible; commercial pilots are highly trained, they've got pretty much the best motivation possible to not do this, and there are huge support structures and millions of dollars invested in avoiding these accidents. Given that commercial pilots still fly planes into the ground, your system is not going to do better.

PS: obviously this applies to more than just security systems. It's just that security systems are the most common place for people to appeal to shiningly perfect math and dismiss actual reality as an annoying inconvenience. By now, most other computing subfields are willing to acknowledge actual human behavior and design accordingly.

Sidebar: how many mistakes is too many

It's sensible to say that you can't cope with too many mistakes at once, although ideally you will have some modeling to assess how likely this is. Please do not make this merely some handwaving math about low percentages multiplied together; for a start, mistakes are not necessarily independent events.
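As a rough illustration of why multiplied low percentages can mislead (the scenario and numbers here are invented for the sketch, not from the entry), consider two safeguards that each fail 5% of the time. If failures were independent, both failing at once would happen about 0.25% of the time; but if there's a common cause, say a tired operator who makes both mistakes far more often on a bad day, the joint failure rate can be several times higher:

```python
import random

random.seed(0)
TRIALS = 100_000
P_MISTAKE = 0.05   # each safeguard independently fails 5% of the time
P_BAD_DAY = 0.05   # common cause: 5% of the time the operator is tired

def joint_failure_rate(correlated):
    """Estimate how often both safeguards fail in the same trial."""
    failures = 0
    for _ in range(TRIALS):
        if correlated and random.random() < P_BAD_DAY:
            # A bad day raises the chance of *each* mistake to 50%.
            p = 0.5
        else:
            p = P_MISTAKE
        # Both safeguards must fail for a visible incident.
        if random.random() < p and random.random() < p:
            failures += 1
    return failures / TRIALS

independent = joint_failure_rate(correlated=False)  # near P_MISTAKE**2 = 0.0025
correlated = joint_failure_rate(correlated=True)    # dominated by bad days
```

With these made-up numbers, the correlated case comes out roughly five times worse than the naive multiplication predicts, which is the sidebar's point: the handwaving math silently assumes independence.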


Comments on this page:

From 76.113.53.175 at 2010-12-15 20:22:54:

I am sure someone is going to comment how CFIT encompasses all sorts of accidents where either pilots are not trained well or have other externalities occurring. In the spirit you're talking about, they happen rarely. The last one I remember is the captain of an Emirates A-320 getting disoriented on an over-water approach at night, and their CRM sucked, so the F/O did not override him although he disagreed with the control inputs. Before that I remember that 737 at Unalakleet where the captain apparently confused definitions of NOS and Jepp distance circles and descended too early (the cloud deck was basically touching the hills). These accidents do not seem to happen all that often and usually have some sort of poor visibility and high task load attached to them.

By cks at 2010-12-16 18:40:47:

That's a good and fair point; I was a bit too strong in banging on CFIT by commercial pilots. I'd like to argue that CFIT by non-commercial pilots still illustrates my point, but I don't know if that's completely true; on the one hand, my impression is that non-commercial pilots are still generally better trained than computer users and obviously they have their lives on the line, but on the other hand it may be that they're operating under much more stressful and busy situations.


