>> the bitloading adapts to available snr kitz.
Yes, agreed - I know that.
Sorry if I didn't explain well what I meant about the tx power.
One of the parameters that can be adjusted during the bitswap process is the power gain: power across all of the tones can be redistributed slightly to try and increase the overall SNR. This gives a tad more overhead for the other tones to be able to load an extra bit (or bits), which they may not have been able to do during the original sync process. This is why during a fresh sync your power may be at 18dBm, but after a while it may show as, say, 19 or 20dBm. I'm not quite sure how else to phrase what I meant. :/
The bitswapping process can adjust bits and power. These two aspects are called bit-swap and gain-swap.
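If it helps to picture it, here's a very rough sketch of the gain-swap idea in Python. The tone numbers, figures and function are made up purely for illustration, using the simplified 3dB-per-bit rule discussed further down - nothing like a real DSLAM's algorithm, and real power redistribution is subtler than moving dB straight from one tone to another:

```python
# Very rough sketch of gain-swap (made-up figures, simplified 3dB-per-bit rule).
# Power is shifted from a tone with SNR to spare onto a tone sitting just
# below a bit boundary, so the second tone can load an extra bit.

def bits_for_tone(snr_db, target_snrm_db=6.0, min_bits=1):
    """Bits a tone can carry once the target margin is reserved (3dB/bit)."""
    bits = int((snr_db - target_snrm_db) // 3.0)
    return bits if bits >= min_bits else 0

tones = {"tone_100": 14.0, "tone_101": 11.5}            # per-tone SNR (dB) at sync
print({t: bits_for_tone(s) for t, s in tones.items()})  # {'tone_100': 2, 'tone_101': 1}

# Gain-swap: nudge ~0.6dB of power from tone_100 onto tone_101.
tones["tone_100"] -= 0.6   # floor((13.4 - 6) / 3) = 2 bits, unchanged
tones["tone_101"] += 0.6   # floor((12.1 - 6) / 3) = 2 bits, one more than before
print({t: bits_for_tone(s) for t, s in tones.items()})  # {'tone_100': 2, 'tone_101': 2}
```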
>> in my adsl days generally my modem would mark the tone unusable once bitloading hit 0; if I ever saw a used tone hit 0 it became unused until a fresh sync event.
If the SNRm does get so low that 0 bits are loaded, yes, the router can then mark that tone as unused.
That tone won't become available again until a fresh resync, and the SNRm will be reported as 0dB as you say.
However, some routers do still continue to show the 'real' SNR even if it's too low to load any bits.
I don't know if the HG612 is one of them or not. BE gave his SNR (not SNRm) figures.
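A quick sketch of that zero-bit behaviour as I understand it - assumed behaviour for illustration only, not taken from any particular firmware, and the HG612 may well differ:

```python
# Sketch of the zero-bit tone behaviour (assumed, not from any real firmware).
# Once bitswap drives a tone below the minimum loading it gets parked as
# unused until the next resync; some routers then report 0dB for it, while
# others keep showing the 'real' (too low) SNR.

class Tone:
    def __init__(self, snr_db):
        self.snr_db = snr_db
        self.unused = False

    def bitswap_update(self, new_snr_db, target_snrm_db=6.0):
        self.snr_db = new_snr_db
        if int((new_snr_db - target_snrm_db) // 3.0) < 1:
            self.unused = True              # below min loading: park the tone

    def reported_snr(self, shows_real_snr=False):
        if self.unused and not shows_real_snr:
            return 0.0                      # many routers just report 0dB here
        return self.snr_db

t = Tone(10.0)
t.bitswap_update(7.5)                       # noise rises; tone can't hold 1 bit
print(t.unused, t.reported_snr())           # True 0.0
print(t.reported_snr(shows_real_snr=True))  # 7.5 on routers showing the real SNR
```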
>> I don't know if vdsl is still the same but on adsl 3dB of snr was required per bit loaded on a tone; adsl1 had a min bitloading of 2 so 6dB of snr was required to make it usable, with 45dB of snr needed for a 15-bit loading on the tone.
Agreed again - the DMT technology behind it is still the same. adsl1 used a less efficient coding algorithm for its error correction overheads, which is why it needed the extra bit loaded & the 6dB minimum.
iirc it's the lower error correction overhead on adsl2 that allows the 1-bit minimum loading.
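To put quick numbers on that (same simplified 3dB-per-bit rule, with a 6dB target margin folded in - a made-up example, not a real modem calculation):

```python
# Worked version of the minimum-loading rule (simplified 3dB/bit plus a 6dB
# target margin; real modems are messier than this).

def bits_for_tone(snr_db, target_snrm_db=6.0, min_bits=1):
    bits = int((snr_db - target_snrm_db) // 3.0)
    return bits if bits >= min_bits else 0  # below min loading -> tone unused

snr = 10.0                             # 4dB of headroom over the 6dB margin
print(bits_for_tone(snr, min_bits=2))  # 0  - unusable on adsl1 (needs 12dB total)
print(bits_for_tone(snr, min_bits=1))  # 1  - usable on adsl2 (needs 9dB total)
print(bits_for_tone(51.0))             # 15 - 45dB over the margin for a 15-bit load
```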
TBH I think we are all singing from the same hymn sheet.
GB gave another very valid reason why a line could show the messy zero bits like those we see at the end of BE's range, and I'm not disagreeing with what he says.
All I was trying to say is that I have a strong feeling that on BE's line it was more to do with how the channel analysis had deemed his line during the sync-up process, rather than after a period of bit loading... purely for the fact it was still there after a fresh resync.
BE said
"I forced a modem resync today, just in case a fresh connection changed anything, but it didn't." <<---- That is why I suspected channel analysis. If it was due to bitswapping, then I'd expect those tones to show at least some SNR after a fresh boot.
>> Obviously bear in mind the target snrm had to be factored in also, so if a 6dB target noise margin was set then on adsl1 12dB of snr was needed to make a tone usable and 9dB on adsl2.
Oops, you got me... I simplified things in my example by using 6dB.
Although I am fully aware there's a floor level which is loosely based around the target SNRm, what I forgot to include was the fact that it needs another 3dB to load one bit (even though I know this). It was supposed to be a very quick example showing what happens during channel analysis - I did say it was a quick example using made-up figures. In reality, it's actually far more complex than

Sync speed = Tones_in_use x ((SNR - Target SNRm) / 3dB)

because the router also has to allow somewhere for interleaving and error correction (BER) when it calculates the sync speed. There is an algorithm out there somewhere involving QAM, but my eyes just glaze over when I look at it.
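For what it's worth, here's the back-of-envelope version as a sketch. The 4000 figure is the standard DMT data-symbol rate; everything else is made up, and the real sync calculation also allows for framing, interleaving and the target BER:

```python
# Back-of-envelope sync estimate from the rough formula above (sketch only;
# ignores framing, interleaving and error-correction overheads).

SYMBOL_RATE = 4000   # DMT data symbols per second
TARGET_SNRM = 6.0    # dB
DB_PER_BIT = 3.0
MAX_BITS = 15        # per-tone bit cap

def rough_sync_kbps(tone_snrs_db):
    bits_per_symbol = 0
    for snr in tone_snrs_db:
        bits = int((snr - TARGET_SNRM) // DB_PER_BIT)
        if bits >= 1:                          # tones below min loading carry nothing
            bits_per_symbol += min(bits, MAX_BITS)
    return bits_per_symbol * SYMBOL_RATE / 1000.0

# e.g. 400 usable tones at ~30dB SNR -> 8 bits each -> 12800.0 kbps pre-overhead
print(rough_sync_kbps([30.0] * 400))
```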
My example would be about right as a very rough guide for adsl2+ & VDSL... but because BToR sets a minimum 6dB* target SNRm (6dB target + 3dB for the bit = a 9dB floor), my example should have said 9.1 for the pass & 8.9 for the fail when it comes to BE's connection.
* or whatever the target SNRm is set at.