That’s an important point: let’s have a look at the bit-loading (below). Here’s line 1:
And for comparison, here's line 3, which is much faster downstream (by about 13%):
With modem 3, I can see a lot more tones carrying 12 bits. Also, the high end falls off faster: in (tone, bits) coordinates you can more or less draw a straight line for modem 3, e.g. from (80, 10) down to (120, 5). Notice how the diagonal lines of the two modems have different gradients. The equivalent diagonal line for modem 1 runs from (78, 10) down to (120, 4). The left shift by a couple of tones perhaps doesn't matter in itself, but modem 1's line descends to an endpoint that is a whole bit lower.

Since each bit loaded on a tone is worth 4 kbps (the DMT symbol rate is 4000 symbols per second), averaging it out gives (120 − 80) × (5 − 4)/2 × 4000 bps = 80 kbps for the wedge between the two slopes. That's a very rough calculation, and it doesn't yet account for the left shift of about two tones that you see when comparing the two slopes. To estimate the effect of the left shift, do we just knock out a column 2 tones wide and 10 bits high? That would be 2 × 10 × 4000 bps = 80 kbps again. Adding the two contributions, we get 80 + 80 = 160 kbps. But we still have a long way to go to make up the difference in sync rate; in fact we have not quite made it halfway.
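To keep the arithmetic honest, here is the same estimate as a minimal Python sketch. The endpoint coordinates are just the values read off the plots above, and the 4000 figure assumes the standard DMT symbol rate of 4000 symbols per second, so one bit on one tone is worth 4 kbps.

```python
SYMBOL_RATE = 4000  # DMT symbols per second: 1 bit/tone = 4 kbps

def wedge_kbps(tone_lo, tone_hi, gap_lo, gap_hi):
    """Average bit gap between the two slopes over a tone range,
    converted to kbps."""
    return (tone_hi - tone_lo) * (gap_lo + gap_hi) / 2 * SYMBOL_RATE / 1000

# Modem 3's slope runs (80, 10) -> (120, 5); modem 1's runs
# (78, 10) -> (120, 4).  The gap is ~0 bits at tone 80 and ~1 bit
# at tone 120, so the wedge between them averages half a bit.
slope_kbps = wedge_kbps(80, 120, 0, 1)    # -> 80.0

# Left shift: knock out a column ~2 tones wide and ~10 bits high.
shift_kbps = 2 * 10 * SYMBOL_RATE / 1000  # -> 80.0

print(slope_kbps + shift_kbps)            # 160.0 kbps so far
```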
Now for my guess at the contribution from the very top: the missing 12-bit tones. I think there are about 12 tones up there in the case of modem 3 that are absent in modem 1, and I can see only about 3 12-bit tones for modem 1. So roughly 12 tones × 1 bit × 4000 bps = 48 kbps, very roughly.
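Same caveat as before: the tone count is read off the plots, and I'm treating each of those top tones as carrying one bit more on modem 3 than on modem 1. A sketch, continuing the snippet above:

```python
# ~12 tones near the top where modem 3 loads 12 bits but modem 1
# appears to load one bit fewer; one bit per tone at 4 kbps/bit.
top_kbps = 12 * 1 * SYMBOL_RATE / 1000  # -> 48.0
```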
Adding that contribution in too, we have 80 + 80 + 48 = 208 kbps, so we are now starting to make some progress.
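And the running total against the sync-rate gap, continuing the same sketch (the 350 kbps figure is the difference from the original question):

```python
total_kbps = slope_kbps + shift_kbps + top_kbps       # 80 + 80 + 48
print(total_kbps)                                     # 208.0
print(f"{total_kbps / 350:.0%} of the 350 kbps gap")  # ~59%
```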
But the question now is the original one: why is the bit-loading so much worse that we lose 350 kbps of downstream sync rate?