My theory (& that's all it is) is that as much bit loading as possible is carried out in the higher frequency bands on those connections that can actually make use of them.
This would then reduce the potential 'swamping' effect at the lower frequencies, which are all that longer connections (mine, for example) are able to use.
I think there may be something in that theory. Further complicated by the various PSD masks of course.
When it comes to bit loading, the water-filling method is traditionally used. However, when it comes to FTTC, particularly in the U1 band, it's evident that this is not the case and some other algorithm is in place.
From my own line (see below, with that lovely 'L' shape) the U1 band shows something that, for sure, is indicative of neither water-filling nor PSD masking.
The bit loading in my U1 band looks to me more like bog-standard greedy bit loading. "Greedy" bit loading isn't always good, because the lower tones fill up first, and the effect of that is more x-talk; that's why water-filling is the preferred method, as it forces shorter lines to make more use of the higher tones. There's no reason why BT would move to straightforward greedy bit loading, so that implies something else is going on with the various channels. There are algorithms that can analyse multiple lines and adjust accordingly, but AFAIK these can be time-consuming and processor-intensive. Therefore BT are up to something else.
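To illustrate the difference between the two approaches, here's a toy sketch in Python. This is purely illustrative: the gain profile, tone count and power budget are made-up numbers, not anything measured from a real FTTC line, and it's certainly not a claim about what BT's DSLAMs actually run. The greedy routine is a Hughes-Hartogs-style loop (add a bit wherever it costs the least extra power), which on a falling gain profile fills the lower tones first.

```python
import math

def water_filling(gains, total_power):
    """Classic water-filling: pick power[i] so that power[i] + 1/gains[i]
    equals a common 'water level' mu on every active tone."""
    active = list(range(len(gains)))
    mu = 0.0
    while active:
        mu = (total_power + sum(1.0 / gains[i] for i in active)) / len(active)
        # Drop tones whose inverse gain sits above the water level.
        dropped = [i for i in active if mu - 1.0 / gains[i] <= 0]
        if not dropped:
            break
        active = [i for i in active if i not in dropped]
    power = [0.0] * len(gains)
    for i in active:
        power[i] = mu - 1.0 / gains[i]
    return power

def greedy_bits(gains, total_power, bit_cap=15):
    """Greedy (Hughes-Hartogs style) loading: keep adding one bit to
    whichever tone needs the least extra power until the budget runs out.
    On a long line the high-gain (low-frequency) tones fill up first."""
    bits = [0] * len(gains)
    used = 0.0
    while True:
        # Extra power to go from b to b+1 bits on tone i is 2^b / gain[i]
        # (from the idealised rate formula b = log2(1 + p*g)).
        costs = [(2 ** bits[i]) / gains[i] if bits[i] < bit_cap else math.inf
                 for i in range(len(gains))]
        i = min(range(len(gains)), key=lambda k: costs[k])
        if used + costs[i] > total_power:
            break
        used += costs[i]
        bits[i] += 1
    return bits

# Toy "long loop": gain falls off steadily with tone index (frequency).
gains = [10.0 / (1 + t) for t in range(8)]
print("water-filling power per tone:",
      [round(p, 2) for p in water_filling(gains, 4.0)])
print("greedy bits per tone:        ", greedy_bits(gains, 4.0))
```

Running it, the greedy allocation piles bits onto the first few tones while the last tones get nothing, whereas water-filling spreads the power budget across every tone whose gain clears the water level — which is the crosstalk argument above in miniature.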
The rest of my channels (other than U1) don't seem to display this behaviour. Is it possible that they have some algorithm that dispenses with water-filling for U1 and, if the line is capable, shunts what bits it can over to U2? The effect of this would be less x-talk from around tone 1000 onwards for the longer lines.
(Yes, I'm aware I'm still using an old version of DSLstats, but a sick PC and a poorly & busy kitz has meant she hasn't got around to sorting things yet - but at least you get the idea.)