ADSL, ADSL2+ and VDSL2 all use a "symbol rate" of 4,000 symbols per second.

That means that any one tone, configured with N bits of bit-loading, can transfer (N * 4000) bits per second.
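That per-tone arithmetic is simple enough to sketch out (a minimal example; the bit-loading value used here is illustrative, not taken from any particular line):

```python
SYMBOL_RATE = 4000  # symbols per second, common to ADSL, ADSL2+ and VDSL2

def tone_bitrate(bits_loaded: int, symbol_rate: int = SYMBOL_RATE) -> int:
    """Bits per second carried by one tone with the given bit-loading."""
    return bits_loaded * symbol_rate

# A tone loaded with 15 bits (the VDSL2 maximum) carries:
print(tone_bitrate(15))  # 60000 bits/s, i.e. 60 kbit/s
```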

I always wondered why 4,000 was chosen, and why it stayed the same across each iteration of the technology, and now I understand why.

G.fast is designed for 48,000 symbols per second. However, while it uses 106MHz of spectrum, it doesn't use more tones than VDSL2. It uses 2048 tones, each with 51.75kHz of spectrum. And that turns out to be the link - the wider 51.75kHz allocation per tone is what makes 48,000 symbols per second possible.

In older DSL iterations, the tones were each allocated 4.3125kHz - enough to carry 4,000 symbols per second. In a DMT system, the useful part of each symbol lasts the reciprocal of the tone spacing (1/4312.5Hz ≈ 232µs), and a little extra time is spent on the cyclic extension between symbols, which brings the rate down to an even 4,000 symbols per second. So the bandwidth allocated to a tone dictates how long the DSP hardware needs to identify the analogue QAM components of each symbol.
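The scaling between the two generations is easy to verify from those numbers (a quick sketch; the tone spacings are the standard values for VDSL2 and the 106MHz G.fast profile):

```python
# Tone spacing (Hz) and symbol rate (symbols/s) for each generation
vdsl2_spacing, vdsl2_rate = 4312.5, 4000
gfast_spacing, gfast_rate = 51750.0, 48000

# Both ratios come out to exactly 12: making each tone 12x wider
# speeds the symbol clock up by the same factor.
print(gfast_spacing / vdsl2_spacing)  # 12.0
print(gfast_rate / vdsl2_rate)        # 12.0

# In both generations the symbol rate sits a bit below the tone
# spacing; the gap is the cyclic-extension overhead between symbols.
print(vdsl2_rate / vdsl2_spacing)  # ~0.93 for VDSL2
print(gfast_rate / gfast_spacing)  # ~0.93 for G.fast too
```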

Lightbulb moment