I personally can't see the need to inject a noise signal in series with the line, as injecting it differentially means that one leg of the pair is pushing while the other leg is pulling! If the signal were injected into both legs of the pair in the same phase, then, as you say, the interfering signal would cancel itself out at the modems. The fact that the differentially injected signal travels to both ends of the circuit does not stop it from being injected. If my assumption were not correct, then there would never be a problem with crosstalk on cables.
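The push/pull point can be sketched with a few lines of Python. This is only an idealised toy model (the leg values, noise amplitude, and the perfect differential receiver are my assumptions, not anything from the paper): the wanted signal rides as +s on one leg and -s on the other, and the receiver takes the difference, so common-mode noise cancels while differentially injected noise does not.

```python
# Idealised twisted pair: leg A carries +s, leg B carries -s.
# Common-mode injection adds +n to BOTH legs; differential injection
# adds +n to leg A and -n to leg B.

def differential_receive(signal, noise, common_mode):
    """Samples recovered by an ideal differential receiver."""
    out = []
    for s, n in zip(signal, noise):
        leg_a = +s + n
        leg_b = -s + (n if common_mode else -n)
        out.append((leg_a - leg_b) / 2)  # receiver sees only the difference
    return out

signal = [1.0, -1.0, 1.0, 1.0]
noise = [0.25, 0.25, 0.25, 0.25]

print(differential_receive(signal, noise, common_mode=True))   # [1.0, -1.0, 1.0, 1.0] - noise gone
print(differential_receive(signal, noise, common_mode=False))  # [1.25, -0.75, 1.25, 1.25] - noise present
```

So in this toy model the common-mode case cancels exactly at the receiver, while the differential case lands the full noise amplitude on top of the signal, which is just the crosstalk argument above in miniature.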
I must confess that I have not had the time to study all of the postings, or the original university paper on this topic, but from what I have seen, I am not convinced that a number of the assumed facts in the paper were correct. The other point I wish to make is that if the line were a Virgin Media one, using coax cable to the cabinet, it would be a proper 75 ohm transmission line terminated in the correct impedance. Test equipment of the correct impedance would therefore be available to get meaningful results.
But as I suggested in an earlier post, underground cables behave more like a capacitor than a proper transmission line, so their impedance will tend to drop with frequency. Given that, how do you measure what power of noise signal you are actually injecting into the modem, relative to the signal being received?
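To put a rough number on the "impedance drops with frequency" point, here is a short sketch treating the line as a lumped capacitance. Both the per-kilometre capacitance (~50 nF/km is a common textbook figure for a telephone pair) and the 3 km line length are my illustrative assumptions, not measured values, and a real line also has series resistance and inductance that this ignores.

```python
import math

# Assumed figures, purely for illustration:
C_PER_KM = 50e-9        # ~50 nF/km, typical textbook value for a pair
LINE_KM = 3.0           # hypothetical line length
C_TOTAL = C_PER_KM * LINE_KM

def capacitive_impedance(freq_hz, capacitance_f=C_TOTAL):
    """Magnitude of a pure capacitance's impedance: |Z| = 1 / (2*pi*f*C)."""
    return 1.0 / (2 * math.pi * freq_hz * capacitance_f)

# Roughly the bottom, middle and top of the ADSL downstream band:
for f in (25e3, 138e3, 1.1e6):
    print(f"{f / 1e3:6.0f} kHz : |Z| = {capacitive_impedance(f):7.1f} ohms")
```

Under these assumptions the impedance falls by more than a factor of forty from the bottom to the top of the band, so a noise source calibrated into a fixed 75 or 100 ohm load would deliver a quite different power into the actual line at each frequency, which is exactly why the injected noise power is hard to pin down.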
In my opinion, because BT lines have such varying characteristics, perhaps the best way of evaluating the different modems is simply to compare them in the field on various lines. In fact I am sure that BT has probably carried this out at some point and already knows the answer!