The irritation of single-instance applications
Applications these days have a certain habit of only allowing you to run one copy of themselves; Firefox, for example. I find myself more and more irritated by this, because it is usually implemented sloppily.
A good implementation must hide the fact that it is not actually running a second copy of the application. Sloppy implementations fail to completely hide this fact, generally by missing various differences between the environment that the original version of the application is running in and the environment that the second copy is running in.
Firefox on Unix provides a particularly blatant example of this, but it's a hard problem
in general; there are a lot of different bits of the environment and
it is in practice very challenging to make sure that you have covered
everything. It may even be impossible; what if you're using a library
that changes its behavior in response to environment variables? If this
seems far-fetched, consider how much code quietly honors things like
$http_proxy: a URL handed off to the first copy gets fetched with
whatever settings that copy started with, not with yours.
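To make one of these leaks concrete, here is a minimal sketch of the usual handoff mechanism. Everything in it is an assumption for illustration: the rendezvous socket path is made up, and no real application works exactly this way. The one fix it shows is that relative filenames must be resolved against the second copy's working directory before forwarding, because the first copy's directory is usually somewhere else entirely.

```python
import os
import socket

# Hypothetical rendezvous point; not any real application's socket.
SOCK_PATH = "/tmp/myapp.sock"

def normalize_request(args, cwd):
    """Resolve relative filenames against the sender's directory.

    A sloppy implementation forwards argv verbatim; the first copy then
    resolves 'doc.pdf' against its *own* working directory.  Resolving
    before forwarding fixes this particular leak (though not, say,
    environment variable differences).
    """
    return [a if a.startswith("-")
            else os.path.normpath(os.path.join(cwd, a))
            for a in args]

def hand_off(argv):
    """Try to pass our arguments to an already-running first copy.

    Returns True if a first copy took the request (so we should exit),
    False if nobody is listening and we should become the first copy.
    """
    client = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    try:
        client.connect(SOCK_PATH)
    except OSError:
        return False  # no first instance: run normally
    request = "\0".join(normalize_request(argv, os.getcwd()))
    client.sendall(request.encode())
    client.close()
    return True
```

Even with the filename fix, everything else about the second copy's context (environment variables, resource limits, the controlling terminal) silently stays behind.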
Given the difficulties, you should really not do this unless you have to, for example because running two copies of the application at once corrupts data somewhere. If you want to offer this as a convenience feature, please make sure that it can be turned off, ideally with both a global configuration option and a command line switch.
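A sketch of what checking those escape hatches might look like, before any handoff is attempted. The names --new-instance and MYAPP_NO_REUSE are invented for illustration, standing in for a command line switch and a global configuration setting respectively; they are not any real application's flags.

```python
import argparse

def want_single_instance(argv, env):
    """Return False if the user asked for a genuinely separate copy.

    --new-instance and MYAPP_NO_REUSE are hypothetical names; the point
    is that both a per-invocation switch and a persistent global setting
    can turn the single-instance behavior off.
    """
    parser = argparse.ArgumentParser(add_help=False)
    parser.add_argument("--new-instance", action="store_true")
    opts, _rest = parser.parse_known_args(argv)
    if opts.new_instance:
        return False            # per-invocation override
    if env.get("MYAPP_NO_REUSE"):
        return False            # standing in for a config file option
    return True
```

Checking this first also keeps the convenience feature honest: a user who hits one of the sloppy edge cases always has a way out.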
Sidebar: how to do this really badly
Even worse than sloppy single-instance applications are programs that do
not even bother pretending, where running a second copy of the program
does bad things to whatever you were doing in the first copy. For
example, Adobe Acrobat Reader, where you can only ever be reading one
PDF at once; if you try to start a second copy of
acroread, the second
PDF replaces what you were reading in the first one.
(Back in the days when Acroread was my only real choice for PDFs, this was infuriating. Now that I have evince it is merely extremely irritating.)