Kitz Forum

Broadband Related => Broadband Technology => Topic started by: Weaver on July 17, 2018, 04:14:23 AM

Title: Unused time exploitation in a modem
Post by: Weaver on July 17, 2018, 04:14:23 AM
I wonder if a modem could be designed that used idle time, when there is no data being transmitted in one direction or the other, to send test patterns of certain types and thereby improve performance: a kind of continual training. Part of me thinks that this is impossible, but then we already have opportunities for adaptation to changes in line conditions that are exploited now, in bitswap. Also, when a retrain happens the modems make better choices than they did before, which makes me think that more frequent retrains would be a very good thing. Of course we cannot practically do that with things as they are, because it would upset the DLM gods and because it takes time.

I wonder about some system that would give us an opportunity for the application of some kind of detailed precoding, though. What would happen if you regularly got a picture of the impulse response of the line, for example? Probably that never changes unless there is something rare and dramatically wrong, so I suspect that is a really bad example. You could look out for nonlinearity in the link, if there ever is any, and you could look for echoes too. But is there anything else that is actually worthwhile?
Title: Re: Unused time exploitation in a modem
Post by: niemand on July 17, 2018, 02:08:02 PM
If I recall correctly there is no 'unused time'. The modems don't send bursts; like ATM, they transmit a constant stream of frames/cells, with synchronisation frames sent periodically to ensure timing remains correct.

If the FIFO buffers on either side are empty (i.e. they're sending 'null' frames, though these won't look empty after going through the scrambler), the best way to go is to lower spectral density and, in turn, transmission power to save electricity and reduce heat output from the chipsets.
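The scrambling niemand mentions is why even idle 'null' frames look like noise on the wire. A minimal sketch of a self-synchronising LFSR scrambler; the 1 + x^-18 + x^-23 feedback polynomial used here is illustrative, as each xDSL standard specifies its own:

```python
def scramble(bits, taps=(18, 23)):
    """Self-synchronising scrambler of the general kind xDSL uses.

    The scrambled output bit is fed back into the shift register, so the
    far end can descramble without sharing any state beyond the taps.
    """
    state = [1] * max(taps)  # shift register, seeded non-zero
    out = []
    for b in bits:
        fb = b ^ state[taps[0] - 1] ^ state[taps[1] - 1]
        out.append(fb)
        state = [fb] + state[:-1]  # shift the output bit in
    return out

# An all-zero 'null' frame no longer looks empty once scrambled.
null_frame = [0] * 64
scrambled = scramble(null_frame)
print(sum(scrambled), "one-bits in a 64-bit all-zero frame after scrambling")
```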
Title: Re: Unused time exploitation in a modem
Post by: Weaver on July 17, 2018, 07:01:47 PM
Indeed so. It could not be done without radical change. The two devices would have to talk to one another and put some kind of marker into the low-level data stream. This would also add a very small amount of latency.
Title: Re: Unused time exploitation in a modem
Post by: niemand on July 17, 2018, 11:30:12 PM
I must admit I don't get this. They can determine the quality of the signal with very high accuracy by how close to the centre of the constellation points each carrier comes out, and from that determine if there are phase or other distortions to try and correct for. No need for a predictable pattern.

(https://i.stack.imgur.com/Bva7g.jpg)

(https://www.gaussianwaves.com/gaussianwaves/wp-content/uploads/2012/10/16QAM_EbN0_8dB.png)
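The "how close to the centre" measure described above is essentially an error-vector-magnitude style estimate. A rough sketch, assuming a plain 16-QAM grid and additive Gaussian noise (both illustrative choices), of how SNR could be estimated from received symbols alone, with no predictable test pattern needed:

```python
import math
import random

# Ideal 16-QAM constellation points on an illustrative unit grid.
IDEAL = [complex(i, q) for i in (-3, -1, 1, 3) for q in (-3, -1, 1, 3)]

def snr_from_symbols(received):
    """Slice each received symbol to the nearest ideal point and compare
    signal power to error power, returning an SNR estimate in dB."""
    sig = err = 0.0
    for r in received:
        nearest = min(IDEAL, key=lambda p: abs(r - p))
        sig += abs(nearest) ** 2
        err += abs(r - nearest) ** 2
    return 10 * math.log10(sig / err)

# Simulate a noisy carrier: random symbols plus Gaussian noise.
random.seed(1)
rx = [random.choice(IDEAL) + complex(random.gauss(0, 0.1), random.gauss(0, 0.1))
      for _ in range(2000)]
print(round(snr_from_symbols(rx), 1), "dB")
```

With noise of standard deviation 0.1 per axis against an average symbol power of 10, the estimate should come out near 27 dB.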
Title: Re: Unused time exploitation in a modem
Post by: Weaver on July 18, 2018, 01:54:18 AM
I confirm that I was not optimistic about this.

There are two questions:

i. what info, if any, can be gained about intrinsic line characteristics in situ that will help increase performance? (About the physical line itself, not including noise, which is an environmental factor rather than an intrinsic property of the line.)

ii. what changes, if any, may be seen in those results if measured at a later time?

So my musing is concerned with whether there are any opportunities that can be exploited in the answer to (ii) rather than just (i).

IIRC, some decades ago digital signal processing was used in an analog telly to measure ghosting / multipath and then cancel it out by deriving the correct precoding parameters. I think this might have been done using a test signal, but I am not 100% sure as memory fails me. Another example that set me thinking, although I have no idea about the details, is the 1990s 56k modem and its downstream link. These are examples of (i), and do not as far as I know speak for the worth of (ii).
Title: Re: Unused time exploitation in a modem
Post by: niemand on July 18, 2018, 11:23:22 AM
Gotcha.

I would recommend looking into how cable modems do this. DOCSIS 3.0 and below use SC-QAMs: in Europe the spectrum is split into 8 MHz channels, each carrying 6.952 million symbols a second at either 64QAM or 256QAM, so 6 or 8 bits per symbol. Scale up the carrier count and scale down the channel width and you have basically the same story as xDSL. Both run on the same basic methodology at the physical level: two signal components, one in phase and one in quadrature, with bits encoded by phase and amplitude. These carriers can be decoded to form constellations like the ones I posted earlier. From there you can detect impediments over the transmission medium and compensate. Then, by measuring the performance of individual QAMs against one another, you can build up a picture of the characteristics across the different frequencies: where there's extra or lower attenuation, phase irregularities, etc.
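The raw per-channel rate follows directly from those figures. A quick sketch using the EuroDOCSIS numbers quoted above:

```python
def raw_channel_rate(symbol_rate_hz, bits_per_symbol):
    """Raw (pre-FEC) bit rate of a single SC-QAM channel."""
    return symbol_rate_hz * bits_per_symbol

EURO_DOCSIS_SYM = 6.952e6  # symbols/s in an 8 MHz EuroDOCSIS channel

for name, bits in (("64QAM", 6), ("256QAM", 8)):
    rate = raw_channel_rate(EURO_DOCSIS_SYM, bits)
    print(f"{name}: {rate / 1e6:.1f} Mb/s raw per 8 MHz channel")
```

Which gives the familiar ~41.7 Mb/s (64QAM) and ~55.6 Mb/s (256QAM) raw per-channel figures.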
Title: Re: Unused time exploitation in a modem
Post by: Weaver on July 19, 2018, 12:01:34 AM
I don't even know if there is much non-linearity (in whatever sense you like, your choice) in the copper links that we have to endure. If there is any, then we might want to map it, in case it varies between one situation and another or, even worse but less likely, changes over time. Faults that are time-dependent, in the sense that they have an onset time, could represent a significant case. In that case, though, the question is whether we want to bother dealing with them or ignore them, because the user just needs to get their line mended. Or we could at least detect them from the symptoms, as telling the user would be nice.

In fact that is an excellent separate topic: a modem that detects faults and reports them to the user. Why do we not have one of these?
Title: Re: Unused time exploitation in a modem
Post by: burakkucat on July 19, 2018, 12:09:54 AM
Quote from: Weaver
I don't even know if there even is much non-linearity (in whatever sense you like, your choice) in the copper links that we have to endure.

The classic case being a joint that is beginning to show semi-conductive or HR (high-resistance) tendencies. Viewing the US SNRM, listening with an ear, or using the Openreach CIDT will usually disclose such a fault.
Title: Re: Unused time exploitation in a modem
Post by: Weaver on July 19, 2018, 12:26:05 AM
That was indeed what I was thinking of: the crystal cat's-whisker / rectifier / diode-like nightmare physical defect. Aside from horrid faults like that (point contacts, corrosion and contacts with inadequate mechanical contact pressure), I wonder if there is anything tasty of this kind to be measured when all is 'well'.
Title: Re: Unused time exploitation in a modem
Post by: burakkucat on July 19, 2018, 12:41:47 AM
According to a relatively old TI document (to which K5 recently posted a link), without an xDSL link active on a circuit (i.e. just the basic underlying telephony circuit) there should be an average noise floor of -140 dBm/Hz. (That is a figure I tend to quote almost like a reflex reaction.) So anything noisier will either be the wanted xDSL signal or a metallic pathway defect.
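That per-hertz figure becomes a total power once integrated across a band. A small sketch; the ~1.1 MHz ADSL bandwidth used here is just an illustrative choice:

```python
import math

def total_noise_dbm(psd_dbm_per_hz, bandwidth_hz):
    """Total power in dBm when a flat PSD is integrated over a band:
    dBm/Hz + 10*log10(bandwidth in Hz)."""
    return psd_dbm_per_hz + 10 * math.log10(bandwidth_hz)

# -140 dBm/Hz across a ~1.1 MHz ADSL band.
print(round(total_noise_dbm(-140, 1.104e6), 1), "dBm total")
```

So a -140 dBm/Hz floor over roughly 1.1 MHz amounts to about -79.6 dBm in total, which is why anything noticeably above it stands out.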
Title: Re: Unused time exploitation in a modem
Post by: konrado5 on July 19, 2018, 12:56:09 AM
Quote from: burakkucat
So anything noisier will either be the wanted xDSL signal or a metallic pathway defect.
Or crosstalk. :)
Title: Re: Unused time exploitation in a modem
Post by: burakkucat on July 19, 2018, 01:18:06 AM
Quote from: konrado5
Or crosstalk. :)

Yes, indeed.  :D