As you can see, the SNRM increases abruptly over 5 minutes, just as the DS TxPwr drops along with the ATTDR. Would anyone care to hazard a guess as to why this would happen? Other than it being a long line, of course...
Downstream Transmitter Power is determined and controlled by the DSLAM. Why is it doing that? To mitigate DSL crosstalk onto other pairs, perhaps? If the DSLAM transceiver's output power is being cut back - i.e., that's not a reporting error - I can't think of any other reason.
Ahh, apparently, according to my mother, an engineer some years ago reduced the power on the line to make it more stable. The analogy he used was that of a leaky hosepipe: if you turn the water pressure up too high, a lot of water will escape through the gaps; turn the pressure down and you lose less along the way.
God knows. Not something I understand. Perhaps one of our resident (off-duty) Openreach engineers could explain?
Would this be general "line gain"? In terms of transmission line theory, when the TX power is increased, the SNR increases (lifting the signal further above the background noise), and the reliability of the communication is therefore enhanced (a reduced bit error rate, or similar measure). But there's a compromise: too great a TX power causes crosstalk onto other twisted pairs in the same cable bundles between you and the cabinet/exchange. DSL services on those pairs will interpret that crosstalk as noise, and it will impair the transmissions on those lines. That's why CPE (and DSLAM linecards) have Power Spectral Density (PSD) masks, laid down by Ofcom. The PSD masks limit the TX power according to frequency, in the hope of limiting that crosstalk, for the benefit of all subscribers.
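To put some numbers on that trade-off, here's a toy sketch in Python. All the constants (the flat TX PSD, the mask shape, the loss model, the noise floor) are made up for illustration - they are not real ANFP/Ofcom mask values - but the arithmetic shows the two points above: the effective per-tone PSD is capped by the mask, and cutting TX power by some dB cuts per-tone SNR by exactly the same dB.

```python
import numpy as np

# Hypothetical subcarriers at the standard DSL tone spacing
tones = np.arange(1, 512)
freqs_hz = tones * 4312.5

tx_psd_dbm_hz = np.full(freqs_hz.shape, -40.0)     # assumed flat TX PSD
# Made-up frequency-dependent mask, just to give it a downward slope:
psd_mask_dbm_hz = -36.5 - 10 * np.log10(freqs_hz / 138e3 + 1)

# The transmitter may not exceed the mask, whatever power it could manage:
effective_psd = np.minimum(tx_psd_dbm_hz, psd_mask_dbm_hz)

attenuation_db = 0.02 * np.sqrt(freqs_hz)   # crude sqrt(f) copper loss model
noise_floor_dbm_hz = -140.0                 # assumed background noise

snr_db = effective_psd - attenuation_db - noise_floor_dbm_hz
# A 3 dB TX power cutback lowers per-tone SNR by the same 3 dB:
snr_after_cutback_db = (effective_psd - 3.0) - attenuation_db - noise_floor_dbm_hz
```

In dB arithmetic the TX power change passes straight through to SNR, which is why a deliberate power cutback shows up one-for-one in the reported margin.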
That said, the DSLAM appears to be cutting back its TX power some time after initialisation, yet the PSD mask should already be in place at initialisation. I really don't know, sorry.
Is crosstalk a strong function of length? Or would, say, a really old bit of line that might have some damage along the way have higher crosstalk than a newer bit of line?
There are two types of DSL crosstalk: near-end (NEXT) and far-end (FEXT). FEXT must be a function of loop length: the further away the far end is, the more noise the signal will acquire en route. NEXT must presumably have some correlation with length, too. A signal from a long loop arrives at the DSLAM very degraded, so NEXT would only aggravate that.
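The length dependence of FEXT can be sketched with the textbook simplification that FEXT coupling grows with frequency squared and with the shared length, while the coupled noise also suffers the same line loss as the signal. The coupling constant and the loss model below are invented for illustration, not measured cable values:

```python
import numpy as np

def fext_psd_dbm_hz(tx_psd_dbm_hz, f_hz, length_km, k_db=-45.0):
    """Toy FEXT estimate: coupling rises ~f^2 and ~length, but the
    coupled noise is attenuated by the line loss over that length.
    k_db is a hypothetical coupling constant; real values vary per cable."""
    line_loss_db = 15.0 * length_km * np.sqrt(f_hz / 1e6)  # assumed loss model
    coupling_db = k_db + 20 * np.log10(f_hz / 1e6) + 10 * np.log10(length_km)
    return tx_psd_dbm_hz + coupling_db - line_loss_db

short_loop = fext_psd_dbm_hz(-40.0, 1e6, 0.5)   # 0.5 km at 1 MHz
long_loop = fext_psd_dbm_hz(-40.0, 1e6, 2.0)    # 2 km at 1 MHz
high_freq = fext_psd_dbm_hz(-40.0, 2e6, 0.5)    # 0.5 km at 2 MHz
```

Under these assumed numbers the received FEXT is actually weaker on the long loop, because the extra attenuation outweighs the extra coupling length; but so is the wanted signal, which is why a long loop is still more vulnerable in SNR terms, as the post above says.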
The line statistics obtainable from a CPE modem, e.g. the HG612, are limited insofar as they are only frequency-domain based. The metrics Hlog, SNR, Bit Loading and QLN obtained from the modem are all measured per subcarrier (DSL tone, or bin), spaced every 4312.5 Hz. Those statistics are not going to capture Repetitive Electrical Impulse Noise (REIN) - an electric fence discharging every two seconds, perhaps. Nor are they likely to show a physical fault in the copper pair, such as a high-resistance joint or a bridged tap on the line. Only something like a TDR will identify those problems. The HG612 does capture data (HLin) which could, in theory, be transformed into the time domain for those purposes.
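For what it's worth, the "transform HLin into the time domain" idea is just an inverse FFT: HLin is the per-tone complex channel transfer function, so its inverse transform is the channel impulse response, a crude one-sided pseudo-TDR trace in which a bridged tap or a bad joint would appear as a delayed echo. The HLin below is synthetic (a direct path plus a small echo), standing in for a real HG612 capture:

```python
import numpy as np

TONE_SPACING_HZ = 4312.5   # DSL subcarrier spacing
n_tones = 512              # assumed number of captured tones

f = np.arange(n_tones) * TONE_SPACING_HZ
direct_delay_s, echo_delay_s = 5e-6, 12e-6   # assumed path delays
# Synthetic per-tone transfer function: direct path + 0.2x echo,
# the frequency-domain signature a bridged-tap reflection would leave
hlin = (np.exp(-2j * np.pi * f * direct_delay_s)
        + 0.2 * np.exp(-2j * np.pi * f * echo_delay_s))

impulse = np.fft.irfft(hlin)   # half-spectrum in, real time-domain trace out
t = np.arange(impulse.size) / (impulse.size * TONE_SPACING_HZ)

main_peak_s = t[np.argmax(np.abs(impulse))]  # lands near direct_delay_s
```

On real data the trace would be messier (the modem reports HLin with limited precision, and the rectangular band edge smears each echo into a sinc), but well-separated reflections should still stand out.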
Les-70 (from this very forum) investigated this at some length, but a breakthrough remains elusive.
cheers, a