The aftermath of our mirroring
- On Monday, we transferred 198.64 gigabytes in 7,781 requests, despite not getting much traffic until about 10am Eastern time.
- On Tuesday, we transferred 316.36 gigabytes in 13,979 requests.
(At a later date I may wrangle gnuplot into producing some nice graphs of this.)
There's no logging for how many simultaneous connections the web servers saw, but I was looking periodically with lsof and never saw more than 30 or so connections to each mirror. The load was split roughly evenly between the two mirrors, so each saw about half the requests and pushed half the bandwidth.
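(If you're curious what 'looking periodically with lsof' amounts to in practice, here's a minimal sketch of that sort of spot check. It assumes an lsof that supports TCP state filtering via -sTCP:ESTABLISHED, and port 80 is just the obvious default; it's an illustration, not what I actually ran.)

    # Rough spot check of simultaneous HTTP connections via lsof.
    # Assumes lsof is installed and supports -sTCP:ESTABLISHED, and that
    # we're running as a user that can see the web server's descriptors.
    import subprocess

    def established_http_connections(port=80):
        # -n/-P skip name lookups, -iTCP:<port> selects the port,
        # -sTCP:ESTABLISHED keeps only fully established connections.
        out = subprocess.run(
            ["lsof", "-nP", f"-iTCP:{port}", "-sTCP:ESTABLISHED"],
            capture_output=True, text=True,
        )
        # The first line of lsof output is a header; each remaining
        # line is one open connection.
        lines = out.stdout.splitlines()
        return max(len(lines) - 1, 0)

    if __name__ == "__main__":
        print(established_http_connections())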
At a rough guess, we averaged about 32 Mbps of outbound traffic on Monday from 10am onwards, and about 30 Mbps on Tuesday. I was watching subnet utilization graphs on Monday, and there were occasional peaks up towards 100 Mbps over the two subnets involved. (Both subnets have routine traffic fluctuations from other activities, which made it hard to be sure what the mirrors were adding.)
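(The averages are just arithmetic on the daily totals; here's a quick sketch of the conversion, assuming decimal gigabytes and megabits and roughly 14 busy hours on Monday.)

    # Back-of-the-envelope conversion from transfer volume to average
    # bandwidth: gigabytes over a window of hours, as megabits per second.
    def average_mbps(gigabytes, hours):
        bits = gigabytes * 1e9 * 8
        return bits / (hours * 3600) / 1e6

    # Monday: 198.64 GB, mostly in the ~14 hours from 10am onwards -> ~31.5 Mbps
    print(average_mbps(198.64, 14))
    # Tuesday: 316.36 GB over the full 24 hours -> ~29.3 Mbps
    print(average_mbps(316.36, 24))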
Although these numbers look impressive, they're not all that large compared to what we expected and feared; my mirror system turned out to be rather over-engineered, and I actually could just have used an existing Apache installation. (I suspect that a lot of people found the Google Video version good enough, or at least the clip not compelling enough to get them to download 53+ Mbytes.)
I don't regret the preparation; it was fun, I learned a number of interesting things, and better safe than sorry. Still, it was a little bit disappointing to prepare for a flood and then just get my toes lapped by a lethargic little wave. (The actual THEMIS site apparently got a much more impressive amount of traffic.)
Sidebar: an inbound traffic surge
Interestingly, outbound traffic for the movie clips isn't the full story; there's also inbound traffic.
[Table: per-day inbound connections and inbound traffic volume for mirror 1 and mirror 2.]
Mirror 1 usually has no inbound traffic, and mirror 2 usually runs about 60,000 inbound connections and 4 Mbytes or so of inbound traffic a day on weekdays.
Some of this extra traffic is simply the inbound HTTP requests for the movies. Some of it is from other requests to the web servers (people looking for a favicon, the ASU mirror monitoring system checking our status, and so on); there were 3,162 such additional web requests on Monday and 5,213 on Tuesday. My cynical guess is that much of the rest of it is from lots of people poking the machines because they were suddenly much more visible to the world.
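(For anyone wanting to do a similar breakdown, here's a rough sketch of splitting movie downloads out from the other requests in an Apache access log. The log path and the '/movies/' URL prefix are hypothetical stand-ins, not our actual layout.)

    # Tally movie downloads vs. other requests from an Apache access log
    # (common/combined log format assumed; path and prefix are placeholders).
    import re

    LOGLINE = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+"')

    movie, other = 0, 0
    with open("/var/log/apache2/access_log") as log:
        for line in log:
            m = LOGLINE.search(line)
            if not m:
                continue
            if m.group(1).startswith("/movies/"):
                movie += 1
            else:
                other += 1

    print("movie requests:", movie)
    print("other requests:", other)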
Update: It turns out that what our traffic monitoring system was reporting as 'connections' was actually the packet count, which makes these numbers far more reasonable. A trawl through our IDS logs suggests that the machines were poked by the outside world no more often than usual.
On the good side, this did cause us to find and fix this problem in the traffic monitoring system's reports.