Wandering Thoughts archives

2007-05-27

Paying for security exploits

Somewhere out on the Internet, there is probably someone waxing indignant right now about how companies generally have to pay relatively substantial bounties for security exploits in their products. After all, why do security researchers now demand payment for their work?

I think that there are two reasons: the obvious reason and the deeper reason.

The obvious reason is that companies are competing for new security exploits with the criminal groups exploiting security vulnerabilities to do various bad things. Said groups pay well for new vulnerabilities, because there is good money in exploiting them to plant various things on people's computers.

(I don't know if the companies pay as much as the underground markets, but they have a competitive advantage in the moral sphere.)

The deeper reason is simple: if you want work done, you need to pay for it. Companies have been unable to come up with non-lame rewards for reporting security vulnerabilities apart from actual cash. Since various companies want the work done, and especially since finding a new security vulnerability these days can be a reasonably large amount of work, those companies have been reduced to paying for new vulnerabilities with cash. In fact, probably as much cash as they would generally pay if they wanted to hire a skilled consultant to do a security audit of their program, which is really what they're doing (except they cleverly don't pay out unless there actually is a problem).

I think that this is the better answer to the whole question, because it does not cast the people taking advantage of the bounties as people who would otherwise sell their discoveries to the underground gangs. Instead it casts them as people who would, without the bounties, simply spend their time doing entirely different and more rewarding things. Which should surprise no one; why do work for a company for more or less free?

(Why the companies have been unable to come up with good non-cash rewards is an interesting question. I suspect that a good part of it is that companies have tried to be cheap, and people do catch on to that sort of thing after the novelty wears off.)

SecurityExploitCost written at 21:12:45

2007-05-25

If you want work done, you need to pay for it

It's not news that if you want work done you usually need to pay people to do it; there are not that many selfless volunteers and sooner or later you stop being able to trick people into doing it for free. Of course, the open source movement is a clear demonstration that you don't have to pay people in money; people find any number of things rewarding. But you do need to give people a reward that they find meaningful.

(You can argue that a certain amount of open source's problems come when its lack of money means that it doesn't have adequate rewards for certain sorts of work. Any time you hear something called 'thankless', it is a good tipoff that no adequate reward for it exists.)

The more time the work will take, the higher the odds are that you will have to pay real money in order to get it done. Setting aside the speed at which you get results (perhaps you're very patient), there's a big difference between how many people are willing to devote an hour of spare time to something and how many people are willing to devote a thousand hours of spare time to something.

PayForWork written at 23:19:40

2007-05-18

The difficulty of throwing things away

One of the interesting issues of working at a university is that it's amazingly hard to throw obsolete equipment away. Which, in its own way, illustrates how universities are peculiar places, since in a company you can at least theoretically get rid of things by throwing them in a dumpster.

Part of the problem is just that relatively scarce money causes people to reflexively cling to working (or theoretically working) equipment long past its best-before date. Even broken equipment gets kept: when you are babying other computers along, you tend to keep as big a spares pool as you can manage, just in case.

(We certainly do this here, and every so often we wind up actually raiding old hardware for bits. Just last week we stole some old combo FC and fibre gigabit Ethernet cards from our old SPARC fileservers so we could move our IMAP server up to gigabit Ethernet, which is only slightly more modern. As you might imagine, an IMAP server that gets all of the mail over NFS is much happier on gigabit Ethernet than it is on 100 Mbit Ethernet.)

However, the core problem is that many things in a typical university were bought with someone else's money; sometimes just government money directly, and sometimes grant funding. Both sources of funding are pretty careful (or neurotic, depending on your perspective) about making sure that you do not buy something on their dime, barely use it for a year, and then pass it to your buddy at a dirt cheap price. Thus, equipment disposal is a huge pain that requires piles of forms and procedures, and most of the pain is externally imposed and thus not something the university can ever do anything about.

(For bonus fun, try to figure out what funding source paid for an ancient piece of equipment that you now want to get rid of, as different funding sources often have different disposal rules.)

The effect is predictable: when it is a huge pain to throw equipment away, people don't. Even when it's broken, it's less of a pain to stick it in a corner than it is to dispose of it properly. And the result of this is a huge clutter of ancient, obsolete, and broken equipment, stuffed into any available corner and kept because people aren't sure if it's still needed or how to dispose of it.

(If you dispose of it but not properly, sooner or later the auditors get you. This is apparently reasonably uncomfortable.)

Unfortunately, we have a lot of corners around here. The result is probably fascinating to a hardware archaeologist, but I'm not one.

The whole situation is a bit sad. We've almost certainly got a number of old machines that various worthy causes could put to good use, but getting to where we could give the machines away takes so much work that no one can afford to do it. And at one point we worked out that trying to sell some hardware we didn't need any more to a used computer broker would actually cost more money than the university could possibly recoup, so of course we sat on the machines until they were completely useless.

(This entry was prompted by recent attempts to clean up our area and maybe finally get rid of some very dead and broken hardware.)

UniversityDisposalProblem written at 14:31:08

2007-05-08

Supporting the real world

Every so often, a vendor's support people tell me something like 'we are not going to investigate your bug report merely because you are running an unsupported configuration'. And by 'unsupported configuration' they do not mean 'something explicitly documented not to work', they mean 'some set of hardware and software not in our narrow list of supported setups'.

(Nor are the products labeled as being so fragile that they work only in certain limited configurations.)

There are two things I hear when vendors say this:

  • that the vendor is more interested in closing my support case than in investigating a situation where their product malfunctions.
  • and, more importantly, that the vendor is not interested in working in real world heterogeneous configurations, only in places that are willing to pay large sums of money for a security blanket.

The narrow list of supported setups that most vendors offer is invariably expensive and out of date. The only people in the real world who use completely supported setups are people with a great deal of money who feel that they cannot afford to take any risks, and so are doing as much as they can to mitigate them.

A vendor that demonstrates not very much interest in improving the overall quality of their product is one that fails to fill me with confidence. A vendor that is not interested in working in the real world is not one that I want to do business with.

(Technically we could factor in the risk of unavailable support versus the possible benefits of the product if everything just works. But I prefer to just not deal with vendors who've clearly indicated that they aren't interested in my business.)

Sidebar: what I expect a vendor's 'supported configurations' to be

Without strong and visible disclaimers to the contrary, I expect a vendor's supported configurations to be the configurations that they have actively tested and thus certify as working. Since these are carefully tested configurations, people who never want to run into surprises and issues that have to be worked through can then use one of them.

I do not demand that vendor products be trouble-free even outside the supported configurations. I do expect that the vendor be willing to try to fix any problems that turn up.

RealWorldSupport written at 23:19:39

2007-05-07

What computer security is

There is a vital thing to remember about computer security: security is not math, security is people. Thinking that security is math gets you mathematical perfection and white-hot disasters.

(These disasters happen for much the same reason that firewalls are dangerous: people think that they are completely protected once they have math on their side and thus don't take any further precautions.)

You can create the most mathematically perfect computer security system known to humanity and people will misuse it, or bypass it. If you make it impossible to bypass, you will sooner or later discover that all actual work is being done on people's personal laptops, passed around on USB keys, and only enters your perfect system when someone needs to file an archival copy (assuming they remember to).

(This is true of any cumbersome system, of course. People are lazy and so are very good at getting their work done in the most efficient way, without much concern for the larger picture.)

The near stranglehold that math has on computer security is very unfortunate. We would probably all have much more practically secure machines if computer security was considered a subfield of human factors research.

One corollary: anything involving people involves compromises. Real security is not mathematically perfect. It is better to have a usable but somewhat flawed security system than a flawless one that is unusable in practice because it is too complex and unwieldy.

SecurityIsPeople written at 23:12:58

