The fun of 32-bit bugs

August 16, 2006

As computers (and disk space, and memory, and so on) get larger, the quantities that we deal with routinely get bigger too. And when they get bigger, fun things start happening.

Today's fun thing was that I was doing some measurements of disk IO speed on a machine with 2 gigabytes of memory. My usual rule of thumb is to work on at least twice the amount of main memory to crush cache effects, which meant telling my benchmarking program to read and write 4 GB.

Which turned out to be kind of a problem, because I had declared a variable as int instead of long (or better yet, off_t). At 4 GB it rolled over and various interesting things happened. I count myself fortunate that it was instantly obvious that something was wrong; it could have just resulted in quietly wrong numbers that I might not have noticed.

As our systems and what we do with them get bigger, I imagine I can look forward to more and more incidents like this. Already, 2 GB files are becoming pretty common and it is more and more irritating when tools don't deal with them, or don't deal well with them (some versions of less will eat large files but mangle the percentages and jumping to a given percentage, as I found out recently).

One benefit of 64-bit computing is that many of these problems can be papered over on 64-bit platforms by a recompile, since that makes long big enough again. (Some people will decry that as a quick fix.)

(Yes, technically this wasn't a 32-bit bug, it was a 31-bit bug. Close enough, says I.)

Sidebar: why not just use bonnie++ or the like?

Two reasons. First, bonnie++ benchmarks too much; at the moment I'm only interested in streaming read and write speeds. Second, bonnie++ was giving me odd results and I wanted to crosscheck them with something else.


Comments on this page:

From 24.98.83.96 at 2006-08-18 22:27:01:

Have you looked at filebench?:

http://sourceforge.net/projects/filebench

It works great, and allows you to apply workload profiles (e.g., OLTP database workloads or mail spool processing) in your load tests.

- Ryan

By cks at 2006-09-22 00:21:41:

I've now looked at filebench, and all I can report is that it doesn't compile on Fedora Core 5, and I'm not impressed:

  • the only release on Sourceforge is from 2005 and is labeled 'alpha'.
  • despite having a configure script, it bombs out during compilation if you don't have the GNU Scientific Library installed.
  • having fixed that, it bombs out trying to compile filebench/parser_gram.c with an undefined identifier.

Debugging the parser_gram.c failure is interesting, because the identifier in question is theoretically -D'd on the command line. You have to look really closely to see the spurious '-I' on the command line that swallows the -Dblahblah as a very peculiarly named directory. In turn this seems to come from the GNU Scientific Library not needing a special -I directory, as the Makefile.in has a '-I@GSLINC@' bit.

I could probably get this going if I wanted to work hard at it. However, this many problems in compiling it gives me no confidence that the results will be trustworthy and useful.

(And the 2005 last release date gives me no confidence that filing bugs about these issues would do anything except waste my time.)

(I persisted, of course. It really expects the GSL to be a local library, not a system one, and it wants -laio without throwing in an explicit configure check for it. But I did eventually club it into compiling.)
