Sorry if I didn't explain what I meant about the Tx power very well.
One of the things that can change during the bitswap process is the gain: the power across all of the tones can be redistributed slightly to try to increase the overall SNR. This gives a tad more overhead, allowing other tones to load an extra bit (or bits) that they may not have been able to carry during the original sync. It's why, on a fresh sync, your power may be at 18dBm, but after a while it may show as, say, 19 or 20dBm. I'm not quite sure how else to phrase what I meant. :/
The bitswapping process can adjust both bits and power. These two aspects are called bit-swap and gain-swap.
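As a rough sketch of why gain-swap shows up in the aggregate Tx power figure: the reported dBm is derived from the sum of the per-tone powers, so nudging individual tone gains moves the total. The tone counts and per-tone milliwatt values below are purely illustrative, not taken from any real line.

```python
import math

def total_dbm(per_tone_mw):
    """Aggregate Tx power (dBm) from a list of per-tone powers in mW."""
    return 10 * math.log10(sum(per_tone_mw))

# Fresh sync: even power across a (hypothetical) set of 126 tones
tones = [0.5] * 126
print(round(total_dbm(tones), 1))   # 18.0

# After gain-swaps boost some tones to squeeze in extra bits,
# the same line reports a slightly higher aggregate figure
tones_after = [0.5] * 100 + [1.0] * 26
print(round(total_dbm(tones_after), 1))   # 18.8
```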
>> in my adsl days generally my modem would mark the tone unuseable once bitloading hit 0, if I ever seen a used tone hit 0 it became unused until a fresh sync event.
If the SNRm does get so low that 0 bits are loaded, yes the router can then mark that tone as unused.
That tone won't become available again until a fresh resync, and the SNRm will be reported as 0dB, as you say.
However, some routers do still continue to show the 'real' SNR even if it's too low to load any bits.
I don't know if the HG612 is one of them or not. BE gave his SNR (not SNRm) figures.
>> I dont know if vdsl is still the same but on adsl 3db of snr was required per bitloading on a tone, adsl1 had a min bitloading of 2 so 6db of snr was required to make it useable, with 45db of snr needed for a 15 bitloading on the tone,
Agreed again. The DMT technology behind it is still the same. ADSL1 used a less efficient coding algorithm for its error-correction overheads, which is why it needed the extra bit loaded and a 6dB minimum.
IIRC it's the lower error-correction overhead that allows the 1-bit minimum loading.
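Putting the figures above into a quick sketch of the ~3dB-per-bit rule of thumb (the rule of thumb from this thread, not a line-code specification):

```python
def min_snr_db(bits, db_per_bit=3.0):
    """Approximate SNR (dB) needed on a tone to load `bits` bits,
    using the ~3dB-per-bit rule of thumb."""
    return bits * db_per_bit

# ADSL1: minimum loading of 2 bits -> ~6dB to make a tone usable
print(min_snr_db(2))    # 6.0
# Maximum 15-bit loading -> ~45dB of SNR on that tone
print(min_snr_db(15))   # 45.0
```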
TBH I think we are all singing from the same hymn sheet.
Yes I agree, although in my case I am learning some new verses!
Thanks for the explanation, Kitz, I was not aware of gainswapping. On my line the Tx power never seems to change by more than 0.1dBm.
GB gave another very valid reason why a line could show the messy zero bits like those we see at the end of BE's range, and I'm not disagreeing with what he says.
All I was trying to say is that I have a strong feeling that on BE's line it was more to do with how the channel analysis had judged his line during the sync-up process, rather than anything after a period of bit loading, purely because it was still there after a fresh resync.
BE said "I forced a modem resync today, just in case a fresh connection changed anything, but it didn't." <<---- That is why I suspected channel analysis. If it was due to bitswapping, then I'd expect those tones to show at least some SNR after a fresh boot.
Yes, I guess it depends on the predominant types of noise/interference/crosstalk seen by the particular line. The sync process can only take a detailed snapshot of the line characteristics over a period of a few seconds, whereas bitswapping can then carry out dynamic adjustments as conditions change. On BE's line the crosstalk may be more or less constant, whereas on mine the noise on some tones seems to be more intermittent (e.g. HF radio interference in particular atmospheric conditions?), so that it is less likely to be "seen" during sync.
From BE's stats the new algorithms seem to be managing to load one or two bits on
some D2 tones up to about Tone 1750 on his line, whereas the old ones gave up completely above about Tone 1550.