When I measure the combined upstream speed of the bonded lines with speedtest.aa.net.uk, I see a lot of statistical noise: upstream results vary wildly within a group of back-to-back tests, for reasons unknown, so I always take the max and discard the rest. Over a timescale of weeks or months there is an even larger variation: sometimes the max values are consistently up to 50% higher across whole test groups, and at other times this variation is absent. There seem to be ‘high-rate seasons’, periods during which every test group gives a high max result, and then these high group maxima go away again. A high-season result is 1.50-1.65 Mbps; a low-season result is 1.0-1.2 Mbps.
I just have no clue why this is happening. The upstream sync rates do not change. It’s not associated with cold weather. I recently got a record high upstream of 1.65 Mbps, again from speedtest.aa.net.uk, and now a week or so later this high result has gone away, yet the temperature has not changed significantly. In any case wouldn’t a change in temperature have to cause a change in sync rate for there to be any effect?
The sync rates are currently:
#1: down 3194 kbps, up 570 kbps
#2: down 3030 kbps, up 528 kbps
#3: down 2913 kbps, up 396 kbps
#4: down 3290 kbps, up 576 kbps
Line 3 upstream is always rubbish; lines 1 and 4 are pretty good, although I have occasionally seen figures over 600 kbps.
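For scale, it may help to sum the four upstream sync rates and subtract typical framing overhead. This is a minimal sketch under my own assumption (not from the figures above) that ADSL over ATM delivers roughly 0.81-0.86 of the sync rate as IP-level throughput for large packets, once the ATM cell tax and AAL5/PPP/IP headers are accounted for:

```python
# Rough sanity check: best-case bonded upstream throughput from the sync rates.
# The 0.81-0.86 efficiency range is an assumed rule of thumb for ADSL over
# ATM (ATM cell tax plus AAL5/PPP/IP framing), not a measured value.

sync_up_kbps = {1: 570, 2: 528, 3: 396, 4: 576}  # upstream sync per line

total_sync = sum(sync_up_kbps.values())  # total upstream sync in kbps
low, high = 0.81 * total_sync, 0.86 * total_sync  # assumed efficiency band

print(f"total upstream sync: {total_sync} kbps")
print(f"expected IP-level ceiling: {low / 1000:.2f}-{high / 1000:.2f} Mbps")
```

On these numbers the total sync is 2070 kbps, giving a ceiling of very roughly 1.7-1.8 Mbps, so the 1.65 Mbps high-season result looks like near-perfect use of all four lines, while the 1.0-1.2 Mbps low-season figure is far enough below the ceiling to suggest one or more lines contributing little during those periods.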
What am I missing? How can the upstream test figures vary like this? Some weirdness in the tester?
And remember that I always do a group of tests and take only the max, so as to rule out the slowing effect of any competing traffic, whether at my end or on the path into the speed tester. I sometimes run tests in the early hours too; I suppose backups could be running then, but the ‘take the max only’ rule discards those degraded results.