Concerning some of the points above:
The info I reported from --linediag was for my ADSL2+ connection. The modem/router is a Billion 7800DXL. This is a high-end model, so it's quite possible that it processes commands more quickly than some others.
The speed of the modem quite possibly has an impact, as do the PC's speed and the vastly reduced amount of data produced on an ADSL connection.
Pro-rata, I estimate my setup would produce the --linediag data in about 0.22 seconds on an ADSL2+ connection.
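For what it's worth, that pro-rata figure can be sanity-checked with a back-of-envelope scaling by tone count: a 17a VDSL2 connection uses up to 4096 sub-carriers versus 512 for ADSL2+, so there's roughly 8x less per-tone data to fetch. The 1.8 s VDSL2 baseline below is purely a placeholder for illustration, not anyone's measured figure:

```python
# Back-of-envelope pro-rata estimate: --linediag duration is assumed
# to scale roughly with the number of sub-carriers being reported on.
VDSL2_TONES = 4096    # VDSL2 profile 17a tone count
ADSL2P_TONES = 512    # ADSL2+ tone count

def scale_linediag_time(vdsl2_seconds: float) -> float:
    """Scale a VDSL2 --linediag duration down to an ADSL2+ estimate,
    assuming the time is proportional to the tone count."""
    return vdsl2_seconds * ADSL2P_TONES / VDSL2_TONES

# Placeholder baseline (NOT a measured value): a 1.8 s VDSL2 read
print(scale_linediag_time(1.8))  # about 0.22-0.23 s
```

Obviously the modem's command-processing overhead won't scale linearly, so this is only a rough cross-check on the order of magnitude.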
The attenuation values reported in the main stats correspond with the LATN (line attenuation) values in --linediag.
Thanks for confirming that.
Line attenuation shouldn't really change much at each resync, apart from a slight increase during warmer weather or if there is indeed a physical change to the cable properties (such as a permanent or intermittent HR issue).
Signal attenuation is a more dynamic measure, probably reflecting 'interference' such as equipment being switched on/off, or even increased crosstalk.
Quite small changes in signal attenuation (less than 0.5 dB) appear to make quite a difference (2 to 3 Mbps) to sync speed when resyncing after the signal attenuation has changed.
That's on my own connection, where I can't achieve anywhere near the 40 Mbps service cap.
Others achieving their service cap, with spare attainable rate/SNRM in hand, probably wouldn't see any difference from such small changes in signal attenuation.
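As a rough cross-check on those numbers, a Shannon-style estimate comes out in the same ballpark: at high SNR each ~3 dB of SNR is worth about one bit per tone per symbol, and DMT runs at 4000 symbols/s. The tone count below is an assumption for a mid-length VDSL2 line, not taken from anyone's actual stats:

```python
import math

DMT_SYMBOL_RATE = 4000            # DMT symbols per second (ADSL/VDSL2)
DB_PER_BIT = 10 * math.log10(2)   # ~3.01 dB of SNR per bit/tone at high SNR

def rate_change_mbps(delta_db: float, usable_tones: int) -> float:
    """Rough sync-rate change for a uniform SNR shift of delta_db
    across all usable downstream tones."""
    bits_per_tone = delta_db / DB_PER_BIT
    return bits_per_tone * usable_tones * DMT_SYMBOL_RATE / 1e6

# Assumed ~2700 usable downstream tones (hypothetical mid-length line):
# a 0.5 dB shift works out at roughly 1.8 Mbps
print(rate_change_mbps(0.5, 2700))
```

With more usable tones (shorter lines) the same 0.5 dB shift approaches 2-3 Mbps, which is consistent with what I'm seeing.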
I can see those changes in my pbParams data, but they aren't reflected in the LATN & SATN figures.
As we now know that some modems report a difference between line & signal attenuation on ADSL connections, plotting changes in SATN alongside the already-plotted LATN data might just be useful.
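If anyone wants to experiment, a minimal parser along these lines would pull both figures out for logging/plotting. The line format in the sample is hypothetical — adjust the regex to match whatever your modem's stats output actually looks like:

```python
import re

# Hypothetical stats format - real modems word these lines differently.
SAMPLE = """\
LATN(dB): 18.5
SATN(dB): 17.9
"""

def parse_attenuation(text: str) -> dict:
    """Extract line (LATN) and signal (SATN) attenuation in dB."""
    result = {}
    for key in ("LATN", "SATN"):
        m = re.search(rf"{key}\(dB\):\s*([\d.]+)", text)
        if m:
            result[key] = float(m.group(1))
    return result

print(parse_attenuation(SAMPLE))  # {'LATN': 18.5, 'SATN': 17.9}
```

Appending both values to a CSV at each resync and plotting the two series together would show whether SATN really does move independently of LATN.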
My interleave values are the same as b*cat's (32 down, 1 up), but if I tweak the SNRM down by 3 dB, the downstream interleave depth increases to 64, and sometimes the upstream becomes 2. Over the years these values have always been in that region, so I consider them 'normal'.
It seems that interleaving depth adjustments are far more granular for VDSL2 connections, possibly down to an adjustment of 1.
It certainly can be adjusted by less than 10, as seen on my connection.
I have no idea what constitutes low, medium or high depths for VDSL2, but perhaps the smaller adjustments allow more flexibility when DLM determines whether or not to apply the next (more severe) stage of delay & INP?