On the XdB off-switch...
I'm still convinced that DLM got redesigned (after G.INP fiasco #1) into 2 layers:
- one operational layer, working as before, each night
- one monitor layer, which sets the boundaries that the first layer works within.
This monitor layer would identify external conditions (such as modem type) to decide whether G.INP was a suitable quality-improvement tool, on that line, for day-to-day use.
I then think a lot of new features come with an inhibitory off-switch ... including G.INP and XdB. Such a switch forces the monitor to discount that feature entirely.
When we see a rollout take 3+ months, turning on N lines per day, I suspect that the underlying DLM process actually changed on day 1. The slow release is achieved by turning the off-switch off for a few more lines each day.
In this model, each process has its own history for a line; the operational process changes daily, but the monitor is slower to respond. What we used to see as a reset of DLM can now happen in two ways, depending on which histories get cleared.
With this model, I think banding changed ... becoming part of the monitor layer, hard to dislodge, and immune to some DLM resets.
So... with such a model in mind, it is easy to imagine multiple inhibitory off-switches existing, some controlled manually (as BS did), and some automatically (during a rollout, say, or by modem detection).
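To make the model above concrete, here is a minimal sketch of it in code. It is purely illustrative: every class, feature name, and modem string here is invented to capture the idea of a slow monitor layer constraining a nightly operational layer, with per-line inhibitory off-switches and two separate histories. None of it reflects anything known about the real DLM implementation.

```python
# Illustrative sketch only: all names are invented, not taken from the real DLM.
from dataclasses import dataclass, field

@dataclass
class Line:
    line_id: str
    modem_type: str
    # Inhibitory off-switches: a feature listed here is discounted by
    # the monitor layer, whatever the operational layer might prefer.
    off_switches: set = field(default_factory=set)
    monitor_history: list = field(default_factory=list)      # slow to change
    operational_history: list = field(default_factory=list)  # changes nightly

class MonitorLayer:
    """Sets the boundaries the operational layer must work within."""
    GINP_CAPABLE_MODEMS = {"HG612", "ECI B-FOCuS"}  # assumed list, for illustration

    def allowed_features(self, line):
        candidates = {"G.INP", "XdB", "interleaving"}
        candidates -= line.off_switches          # manual / rollout inhibition
        if line.modem_type not in self.GINP_CAPABLE_MODEMS:
            candidates.discard("G.INP")          # modem-detection veto
        return candidates

class OperationalLayer:
    """Runs each night, only within the monitor's boundaries."""
    def nightly_run(self, line, monitor):
        applied = monitor.allowed_features(line)
        line.operational_history.append(applied)
        return applied

def rollout_step(lines, feature, n_per_day):
    """Slow release: clear the off-switch on a few lines each day.
    The underlying DLM code changed on day 1; only the switches move."""
    still_inhibited = [l for l in lines if feature in l.off_switches]
    for l in still_inhibited[:n_per_day]:
        l.off_switches.discard(feature)

# Two distinct kinds of "DLM reset", depending on which history clears:
def reset_operational(line):
    line.operational_history.clear()   # the familiar overnight-layer reset

def reset_monitor(line):
    line.monitor_history.clear()       # rarer; banding would otherwise survive
```

In this sketch, a line with XdB manually off-switched but a capable modem still gets G.INP applied overnight, while a rollout is nothing more than `rollout_step` flipping a few switches per day.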