The DLM info is immense (as Kitz alludes to herself). I can clarify that FECs are taken into account; however, I can't find any info on the threshold values.
The bulk performance metrics DLM uses appear to comprise 39 values reported every 15 mins (such as FEC, errors, attenuation, traffic count) and 26 values reported every 6 hrs (such as bits-per-tone, SNR-per-tone, INP).
Before I start.. no, I'm not shooting the messenger. I very much value your input, BS.
However, that seems odd: why would the DLM function within RAMBo need all those parameters, such as attenuation and traffic count?
Everything I've seen says that only errors and retrains are used by the element managers for DLM, and sync speeds for RAP. It also records uptime in seconds for each 15-min bin.
The DLM algorithms are supposed to be:

MTBR = Uptime / Retrain Count
MTBE = Uptime / (Errored Second Count + Severely Errored Second Count)

both normalised over a 24 hr period.
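To make the arithmetic concrete, here's a minimal sketch of those two formulas, assuming the 24 hr normalisation is just a sum over the 96 15-min bins in a day. The field names (uptime_s, retrains, errored_s, sev_errored_s) are illustrative, not the element manager's actual schema.

```python
def mtbr_mtbe(bins):
    """Compute Mean Time Between Retrains / Errors from 15-min bins.

    bins: list of dicts, one per 15-minute reporting interval.
    """
    uptime = sum(b["uptime_s"] for b in bins)
    retrains = sum(b["retrains"] for b in bins)
    err_secs = sum(b["errored_s"] + b["sev_errored_s"] for b in bins)
    # Avoid division by zero on a clean line: no events means "no limit".
    mtbr = uptime / retrains if retrains else float("inf")
    mtbe = uptime / err_secs if err_secs else float("inf")
    return mtbr, mtbe

# Example: a day of 96 clean bins except 2 retrains and 120 errored secs.
day = [{"uptime_s": 900, "retrains": 0, "errored_s": 0, "sev_errored_s": 0}
       for _ in range(96)]
day[10]["retrains"] = 2
day[40]["errored_s"] = 120
print(mtbr_mtbe(day))  # (43200.0, 720.0)
```

So a line that was up all day (86400 s) with 2 retrains scores MTBR = 43200 s, i.e. one retrain per 12 hours on average.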
If a line is classified as performing very poorly (there is mention of 10/20 retrains within a short time), then RAMBo will take over and perform direct monitoring of the line with the DSLAM (i.e. cutting out the usual process), monitoring other parameters such as SNRM variance.
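A sketch of what that escalation trigger might look like: the 10-retrains figure comes from the post above, but the window length and the function name here are purely my assumptions for illustration.

```python
def needs_direct_monitoring(retrain_times, threshold=10, window_s=3600):
    """Return True if `threshold` retrains fall within any `window_s` span.

    retrain_times: sorted list of retrain timestamps in seconds.
    The threshold value is from the post; the 1-hour window is a guess.
    """
    for i in range(len(retrain_times) - threshold + 1):
        # Compare the first and last retrain of each run of `threshold` events.
        if retrain_times[i + threshold - 1] - retrain_times[i] <= window_s:
            return True
    return False

# 12 retrains spaced 2.5 min apart -> escalate to direct DSLAM monitoring.
burst = [i * 150 for i in range(12)]
print(needs_direct_monitoring(burst))  # True
```

The same 12 retrains spread over a whole day would not trip the check, which matches the idea that only a burst of retrains in a short time hands the line over to RAMBo.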
---
PS - also, quoting:

"Changing these parameters is based on two performance metrics, errors (in particular, in this embodiment, errors caused by code-violations) and re-trains (i.e. re-syncs)."

FEC isn't a code violation - those are CRCs / Errored Secs / Severely Errored Secs.
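To illustrate that distinction, here is a tiny sketch that sums only the code-violation counters and ignores FEC, per the point above. The counter names are illustrative, not real DSLAM MIB names.

```python
# Counters the quoted text would treat as "errors" (code violations);
# FEC corrections are deliberately excluded.
CODE_VIOLATION_COUNTERS = {"crc", "errored_secs", "sev_errored_secs"}

def dlm_error_count(counters):
    """Sum only the counters DLM would treat as errors, ignoring e.g. FEC."""
    return sum(v for k, v in counters.items() if k in CODE_VIOLATION_COUNTERS)

sample = {"crc": 5, "fec": 10000, "errored_secs": 3, "sev_errored_secs": 1}
print(dlm_error_count(sample))  # 9
```

Note how a line racking up thousands of FEC corrections still scores only 9 here - which is exactly why whether FEC feeds the thresholds matters.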