After a couple of days toying with this, it is clear that injecting measured amounts of noise into a local loop is something of a black art!
For this experiment, enough Gaussian noise has to be injected to induce line errors (to test the modem's response). But too much noise causes irretrievable framing loss and, ultimately, loss of sync. The threshold between those two noise levels is very fine.
Furthermore, these noise tests using the Hantek AWG, above, exhibit something of the Boiled Frog syndrome! That is: "start with a cool pan, slowly raise the heat, and the frog won't even notice!". That seems to be the case with line noise, too!
Suddenly injecting noise at a high power level almost certainly causes an immediate loss of connection. One loss of the framing signal is enough.
By contrast, there is 'success' when the line is first synced with no noise, and only then is the noise slowly increased. Starting at just a few microvolts of amplitude, the noise can be raised to several hundred millivolts over a span of a few tens of seconds, or minutes, presumably as the CPE furiously bit-swaps in search of better carriers.
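For what it's worth, that slow ramp amounts to a simple geometric amplitude schedule, so each step is a small relative increase the CPE can bit-swap around. A minimal sketch (the AWG control call is hypothetical, and the start/stop figures are just the ones quoted above):

```python
def noise_ramp(start_v=10e-6, stop_v=300e-3, steps=30):
    """Geometric amplitude schedule from a few microvolts up to a few
    hundred millivolts: each step multiplies the amplitude by a fixed
    ratio, the 'boiled frog' ramp rather than a sudden blast."""
    ratio = (stop_v / start_v) ** (1 / (steps - 1))
    return [start_v * ratio**i for i in range(steps)]

for amplitude in noise_ramp():
    # set_awg_amplitude(amplitude)  # hypothetical AWG control call
    # time.sleep(2)                 # dwell so the CPE can bit-swap
    pass
```

A linear ramp would work too, but at microvolt levels a linear step is a huge relative jump; the geometric version keeps every step equally gentle.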
So what? Who cares?
Well, we might speculate that this behaviour has some relevance to the effects of crosstalk noise, as VDSL2 is rolled out further.
When a VDSL2 circuit is newly commissioned, it's possible that it will act as an immediate and severe 'disturber' on its neighbouring pairs, inducing large amounts of Gaussian noise into them from the moment the new port is activated at the DSLAM. There is nothing in the VDSL2 specs (sfaik) about building up transmit power on a slowly-slowly basis. So what is going to happen in those cases?
This is pure speculation, but it might be found that as a new VDSL2 circuit is activated, the crosstalk it creates causes the immediate de-syncing of (borderline) neighbours. Those lines then re-sync, knocking out the new circuit, which then tries to re-sync itself. A ping-pong effect! Just speculation, though: no evidence of that actually happening, yet!
Though what has been seen is as follows. As the Gaussian noise power level is incrementally raised (simulating disturbance from increased crosstalk), the actual net data rate remains static, a property of the DLM configuration, but the attainable rate still drops. That causes the Relative Capacity Occupation (RCO), a percentage measurement, to rise, quite possibly above 100%. Meanwhile, with the noise floor raised, the SNR margins, especially in the higher downstream bands, are seen to wither downwards towards zero.
As those SNR margins are slashed, and with the actual net data rate held static, the error rates creep up. Eventually the DLM algorithm has to intervene: it re-profiles the line to a lower transmission rate band, and the SNR margins rise again.
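A toy sketch of that intervention step (an assumption only: the real DLM thresholds and re-profiling rules aren't public, and every number here is invented for illustration):

```python
def dlm_step(net_rate_kbps, errored_seconds_per_day,
             es_threshold=1000, step_down=0.9):
    """Toy DLM rule, not the operator's actual algorithm: if the daily
    errored-seconds count breaches a threshold, re-profile the line to a
    lower net rate. The SNR margin then recovers, because the same
    attainable capacity now backs a smaller payload."""
    if errored_seconds_per_day > es_threshold:
        return int(net_rate_kbps * step_down)  # drop to a lower-rate profile
    return net_rate_kbps                       # line is stable, leave it alone
```

Run daily against the error counters, this is enough to reproduce the observed cycle: noise up, errors up, rate band down, margins back up.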
Sfaics, those stages could mark the typical effects of crosstalk at the CPE as more and more subscribers sharing the same cable binder opt for a VDSL2 service.
cheers, a