I know we're digressing a bit, but why do Be not use those top-end tones?
/snip/
I don't know how to translate that into speed/throughput.
Impossible to say - it would entirely depend on how many bits your line would be able to allocate to those missing tones.
Each bit in a channel needs 3dB of real SNR for that particular frequency, with a max bit allocation of 15 per tone (min 2 bits, or 6dB SNR).
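The bit-loading rule above can be sketched in a few lines. This is just my reading of the rule of thumb (3dB per bit, 15-bit cap, 2-bit/6dB floor), not an exact DMT spec:

```python
# Sketch of the bit-loading rule of thumb: ~3dB of real SNR per bit,
# capped at 15 bits per tone; anything under the 2-bit / 6dB floor
# carries nothing and is discarded.
def bits_for_tone(snr_db):
    bits = int(snr_db // 3)   # ~3dB of SNR buys one bit
    if bits < 2:              # below the 6dB (2-bit) floor
        return 0              # tone is discarded
    return min(bits, 15)      # hard cap of 15 bits per tone

print(bits_for_tone(45))  # strong tone -> capped at 15
print(bits_for_tone(24))  # mid-range tone -> 8 bits
print(bits_for_tone(5))   # under 6dB -> discarded, 0
```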
Therefore the really close lines will lose more speed than a line which is a tad further away. If you're still getting a good strong signal with good bit allocation at tone 475, then the amount of speed lost for those missing tones will be more than for someone whose line is already starting to tail off and whose router is only able to allocate, say, 8 bits to those particular frequencies.
I estimate that block could carry up to a maximum of around 1.5Mbps.
Blocking those tones shifts the bit allocation to higher frequencies, which obviously are more likely to be attenuated. Therefore the SNR won't be quite as good for the higher frequencies after the block as it would have been at the slightly lower frequencies which are blocked.
At a guess, for a line that's doing pretty nicely before the blocked tones - circa 1Mb loss. On a much longer line the loss would be negligible, since anything below 6dB is going to be discarded.
On my own line, which is allocating 12 bits per tone at around carrier 475, I'd say 1.1 Mbps. (I'd previously worked out that my own bitrate is 4.14kbps - sad git that I am).
Off the top of my head, based on your bit-to-bin allocation, I'd say around 800 kbps loss would not be too far off the mark.
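Putting the estimates above together: the loss is roughly blocked tones x bits per tone x kbps per bit. Here I'm assuming 24 blocked tones and 4.14 kbps per bit (the per-bit figure quoted for my own line - yours may differ), so the outputs land in the same ballpark as the figures quoted, not exactly on them:

```python
# Rough loss estimate for the blocked tones, assuming 24 blocked tones
# and ~4.14 kbps carried per bit (assumption taken from the post above).
BLOCKED_TONES = 24
KBPS_PER_BIT = 4.14

def blocked_loss_kbps(bits_per_tone):
    # kbps lost = tones blocked x bits each tone would carry x kbps per bit
    return BLOCKED_TONES * bits_per_tone * KBPS_PER_BIT

print(blocked_loss_kbps(15))  # tones maxed out: ~1490 kbps (~1.5Mb)
print(blocked_loss_kbps(12))  # 12 bits/tone: ~1190 kbps
print(blocked_loss_kbps(8))   # line tailing off: ~795 kbps (~800 kbps)
```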
- BTW the x100 is a coincidence that doesn't indicate the underlying calculation, but is based instead on your bitrate being similar to mine, which I'm not sure if everyone's is. And tbh I'm not sure how interleaving would affect the sync and bitrate... just that I know it does.
As regards why Be do this - very good question.. there's been various theories and explanations given by Be, some of which turned out to be false, which Brett did admit when queried further. There isn't a real reason that I know of, and afaik they are the only provider which is blocking them.
---
For a really quick and nasty cross-check you can divide your sync speed by the number of channels in use for the downstream (say 452) to get the average speed allocation per channel, then multiply that by the number of blocked tones (24).
Obviously this latter calculation won't work very well if your bit allocation before those blocked channels is low. Mine comes out at 1.25Mb, which is a bit higher because it's based on averages rather than actual bits allocated.
I can't stress enough that this method is very crude, since a tone carrying 15 bits obviously carries more data than a bin with only 6 bits, and the more bits, the higher the sync speed - so an estimate based on 15 bits will always be an over-estimate.
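The quick-and-nasty cross-check above boils down to one line: average kbps per in-use channel, times the number of blocked tones. The sync speed here is a made-up example figure, not one from this thread:

```python
# Crude cross-check: (sync speed / downstream channels in use) x blocked tones.
# 452 channels and 24 blocked tones are the figures from the post;
# the 23000 kbps sync speed below is just an illustrative example.
def crude_blocked_loss_kbps(sync_kbps, channels_in_use=452, blocked=24):
    return sync_kbps / channels_in_use * blocked

print(crude_blocked_loss_kbps(23000))  # e.g. ~23Mb sync -> ~1221 kbps
```

As the caveat above says, this averages over all tones, so it over-estimates whenever the blocked region would have carried fewer bits than the line's average.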
* kitz hopes I've done my sums right.