About @NS & @BE's line settings...
I'm trying to figure out the motivation for setting such low figures for INP and/or delay, which results in very different pictures for the FEC and interleaving settings (R, D, I, and N).
In the old "standard" settings, INP would take a value of 3.0 or 4.0 symbols; as DSL runs at 4,000 symbols per second, these INP settings demand recovery from a noise burst that is 0.75 to 1.0ms in length. Each tone within a symbol carries from 2 to 15 bits, so a full symbol can carry thousands of bits - meaning the amount of data that would need to be recovered is high, requiring a high FEC overhead. I guess the end result is to spread the interleaved data over 8ms, while expecting one eighth (1ms) of it to be corrupted. That would perhaps explain why FEC tends to steal one eighth (12.5%), or more, of the bandwidth as overhead.
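To make the arithmetic concrete, here's a quick sketch (my own back-of-envelope figures, not anything published by BT) of how the INP figure translates into burst protection time and a minimum FEC overhead:

```python
SYMBOL_RATE = 4000  # DMT symbols per second

def inp_protection_ms(inp_symbols):
    """Length of noise burst (in ms) that a given INP setting must survive."""
    return inp_symbols / SYMBOL_RATE * 1000

for inp in (3.0, 4.0):
    print(f"INP={inp}: protects against a {inp_protection_ms(inp):.2f}ms burst")

# If interleaving spreads data over 8ms and up to 1ms of it may be corrupt,
# the FEC must carry at least that fraction of the data as redundancy:
spread_ms, corrupt_ms = 8.0, 1.0
print(f"Minimum FEC overhead: {corrupt_ms / spread_ms:.1%}")  # 12.5%
```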
This means that FEC settings like these ought to overcome a 1ms noise burst within an 8ms spread of data; that would work so long as the bursts of noise occurred no more frequently than every 8ms - no faster than 125Hz. Electrically-induced REIN would happen, I guess, at either 50Hz (mains frequency) or 100Hz (its rectified harmonic), so these settings would solve that. In fact, they look designed to fix precisely that problem - but at a heavy cost.
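And a similarly hedged sketch of the periodicity argument, assuming the scheme can absorb one burst per interleave window:

```python
spread_ms = 8.0
max_burst_rate_hz = 1000 / spread_ms  # one burst per 8ms window = 125Hz

for rein_hz in (50, 100):  # mains-related REIN rates
    period_ms = 1000 / rein_hz
    verdict = "correctable" if period_ms >= spread_ms else "too frequent"
    print(f"REIN at {rein_hz}Hz: a burst every {period_ms:.0f}ms -> {verdict}")
```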
However, perhaps BT are now motivated to try some softer settings first, to see if they help the line, rather than jumping straight in and assuming heavy-grade REIN. The increased take-up of FTTC means many more people are being hit by the double-whammy of losing speed to crosstalk (perhaps 20%) and then having DLM intervene too (perhaps another 15%) - plenty of motivation there.
Thinking further along those lines ... if crosstalk is now the source of most of the extra errors encountered on lines, then the errors will no longer look like REIN in either duration or periodicity. I suspect they would be much smaller errors, occurring far more randomly.
In such a world, I think these small errors need a very different type of FEC - just a few percent of overhead - with much less need for data to be spread over a long period.
I *think* that the newer style of DLM settings, such as INP=0 and delay=1, helps achieve that. Or settings that turn on FEC without any interleaving at all. These kinds of "strange" settings are turning up more commonly now; perhaps it is a consequence of the same sort of research that has led to G.INP being activated.
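For illustration only (this is standard Reed-Solomon arithmetic, not BT's actual profiles): a codeword of N bytes carrying R check bytes corrects up to R/2 byte errors at a cost of R/N in overhead, and with interleave depth D=1 there is essentially no added delay - which is the sort of light-touch protection that INP=0/delay=1 seems to imply:

```python
def rs_summary(n_codeword, r_check):
    """Overhead and per-codeword correction power of a Reed-Solomon code."""
    overhead = r_check / n_codeword      # fraction of bandwidth spent on FEC
    correctable = r_check // 2           # byte errors fixable per codeword
    return overhead, correctable

for n, r in ((255, 16), (255, 8)):
    ov, t = rs_summary(n, r)
    print(f"N={n}, R={r}: {ov:.1%} overhead, corrects {t} bytes per codeword")
```

At 6.3% or 3.1% overhead, that is far cheaper than the 12.5%-plus of the heavy profiles, yet plenty for small, scattered errors.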
If crosstalk is turning into the dominant noise effect, rather than REIN, then the "default line profile" (as used by a RESET) might be set based on the level of occupancy of the cabinet (high occupancy leads to a high expectation of crosstalk), or might be set based on some individual crosstalk-detection mechanism.
The only question left is this: Why did BaldEagle have his line set this way a *long* time ago?
I'm left wondering if BE, having such a troublesome line, might have found himself subject to some manual settings - perhaps even as a guinea pig in some research.