Broadband Related > Broadband Technology

Unused time exploitation in a modem

Weaver:
I wonder if a modem could be designed to use idle time, when no data is being transmitted in one direction or the other, to send test patterns of certain types and thereby improve performance. A kind of continual training. Part of me thinks this is impossible, but then we already exploit opportunities to adapt to changing line conditions, in bitswap. Also, when a retrain happens the modems make better choices than they did before, which makes me think more frequent retrains would be a very good thing. Of course we cannot practically do that with things as they are, because it would upset the DLM gods and because it takes time.
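For anyone unfamiliar with bitswap: it shuffles bits between DMT tones as per-tone SNR drifts, without a retrain. A toy sketch of the underlying bit-loading rule, with made-up SNR numbers and a simplified formula that ignores the margin and coding-gain terms a real modem would include:

```python
import math

def bits_per_tone(snr_db, gamma_db=9.8, max_bits=15):
    """Simplified bit loading: b = floor(log2(1 + SNR/gamma)),
    capped at a per-tone maximum of 15 bits (as in ADSL2+)."""
    snr = 10 ** (snr_db / 10)
    gamma = 10 ** (gamma_db / 10)
    return min(max_bits, max(0, int(math.log2(1 + snr / gamma))))

# A noise source appears near the third tone; another tone improves.
before = [bits_per_tone(s) for s in (40, 35, 30)]  # [10, 8, 6]
after  = [bits_per_tone(s) for s in (40, 38, 18)]  # [10, 9, 2]
# Bitswap would move bits from the degraded tone onto the improved
# one, preserving the aggregate rate without a full retrain.
```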

I wonder about some system that would give us an opportunity to apply some kind of detailed precoding, though. What would happen if you regularly got a picture of the impulse response of the line, for example? Probably that never ever changes unless something rare and dramatic goes wrong, so I suspect that is a really bad example. You could look out for nonlinearity in the link, if there ever is any, and you could look for echoes too. But is there anything else that is actually worthwhile?
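On the impulse-response idea: one textbook way to get it is to cross-correlate the received signal against a known pseudo-random training sequence, since such a sequence has a near-impulsive autocorrelation. A toy sketch with hypothetical channel taps and none of the real DMT machinery:

```python
import random

def estimate_impulse_response(channel_taps, seq_len=4096, n_taps=8):
    """Send a random +/-1 training sequence through an FIR channel,
    then recover the taps by cross-correlating input with output."""
    random.seed(1)
    x = [random.choice((-1.0, 1.0)) for _ in range(seq_len)]
    # FIR channel: y[n] = sum_k h[k] * x[n-k]
    y = [sum(h * x[n - k] for k, h in enumerate(channel_taps) if n >= k)
         for n in range(seq_len)]
    # Cross-correlation r_xy[k] approximates h[k] because the training
    # sequence is (nearly) uncorrelated with shifted copies of itself.
    return [sum(x[n - k] * y[n] for n in range(k, seq_len)) / seq_len
            for k in range(n_taps)]

# A direct path plus a small echo three samples later:
h_est = estimate_impulse_response([1.0, 0.0, 0.0, 0.3])
# h_est recovers roughly [1.0, 0, 0, 0.3, 0, ...]
```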

niemand:
If I recall correctly there is no 'unused time'. The modems don't send bursts; like ATM, it's a constant transmission of frames/cells, with synchronisation frames sent periodically to ensure timing remains correct.

If the FIFO buffers on either side are empty, i.e. they're sending 'null' frames (though these won't look empty after going through the scrambler), the best thing to do is lower the spectral density and, in turn, the transmission power, to save electricity and reduce heat output from the chipsets.

Weaver:
Indeed so. It could not be done without radical change. The two devices would have to talk to one another and put some kind of marker into the low-level data stream. This would also add a very small amount of latency.

niemand:
I must admit I don't get this. They can determine the quality of the signal with very high accuracy from how close to the centre of the constellation each carrier's point lands, and from that determine whether there are phase or other distortions to try to correct for. No need for a predictable pattern.
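To illustrate: a per-carrier SNR estimate falls out of nothing but the distance between each received point and the nearest ideal constellation point. A minimal QPSK sketch in pure Python (complex numbers standing in for carrier symbols, a made-up noise level, and assuming gain/phase are already aligned):

```python
import math, random

def snr_from_constellation(received, points):
    """Estimate SNR (dB) from slicing error: take the nearest ideal
    point as the decision and treat the residual vector as noise.
    No known or predictable transmit pattern is needed."""
    sig = err = 0.0
    for r in received:
        nearest = min(points, key=lambda p: abs(r - p))
        sig += abs(nearest) ** 2
        err += abs(r - nearest) ** 2
    return 10 * math.log10(sig / err)

random.seed(2)
qpsk = [1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]
rx = [random.choice(qpsk)
      + complex(random.gauss(0, 0.1), random.gauss(0, 0.1))
      for _ in range(2000)]
snr_db = snr_from_constellation(rx, qpsk)  # ~20 dB at this noise level
```

This only works while the noise is small enough that the slicer almost always picks the symbol that was actually sent, which is exactly the operating regime a live modem sits in.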

Weaver:
I confirm that I was not optimistic about this.

There are two questions:

i. What info, if any, can be gained in situ about intrinsic line characteristics that would help increase performance? (About the physical line itself, not including noise, which is an environmental factor rather than an intrinsic property of the line.)

ii. What changes, if any, might be seen in those results if measured at a later time?

So my musing is concerned with whether there are any opportunities that can be exploited in the answer to (ii) rather than just (i).

IIRC, some decades ago digital signal processing was used in analogue tellies to measure ghosting/multipath and then cancel it out by deriving the correct precoding parameters. I think this might have been done using a test signal, but I am not 100% sure as memory fails me. Another example that set me thinking, although I have no idea about the details, is the 1990s' 56k modem and its downstream link. These are examples of (i), and as far as I know they do not speak for the worth of (ii).
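The TV ghost canceller can be sketched in a few lines: once the channel is measured as "direct path plus attenuated echo", the echo is removed recursively by subtracting a delayed, scaled copy of the already-cleaned signal. A toy version with a hypothetical echo delay and amplitude, not the actual broadcast ghost-cancellation standard:

```python
def add_ghost(x, delay, amp):
    """Channel model: y[n] = x[n] + amp * x[n - delay]."""
    return [x[n] + (amp * x[n - delay] if n >= delay else 0.0)
            for n in range(len(x))]

def cancel_ghost(y, delay, amp):
    """Recursive (IIR) canceller: x_hat[n] = y[n] - amp * x_hat[n - delay].
    Exactly inverts the ghost channel once delay and amp are known."""
    x_hat = []
    for n in range(len(y)):
        echo = amp * x_hat[n - delay] if n >= delay else 0.0
        x_hat.append(y[n] - echo)
    return x_hat

signal = [0.0, 1.0, 0.0, 0.5, -1.0, 0.0, 0.0, 2.0]
ghosted = add_ghost(signal, delay=2, amp=0.4)
restored = cancel_ghost(ghosted, delay=2, amp=0.4)
# restored matches signal to floating-point accuracy
```

The hard part in the real system was of course measuring `delay` and `amp` in the first place, which is where the test signal came in.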
