Kitz Forum

Broadband Related => Broadband Technology => Topic started by: WWWombat on June 02, 2016, 06:20:13 PM

Title: DSL Symbol Rate - Why?
Post by: WWWombat on June 02, 2016, 06:20:13 PM
ADSL, ADSL2+ and VDSL2 all use a "symbol rate" of 4,000 symbols per second.

That means that any one tone, configured with N bits of bit-loading, can transfer (N * 4000) bits per second.

I always wondered why the number of 4,000 was chosen, and why it stayed the same over each iteration of technology, and now I understand why.

G.Fast is designed for 48,000 symbols per second. However, while it uses 106MHz of spectrum, it doesn't use more tones than VDSL2. It uses 2048 tones, each with 51.75kHz of spacing. And that turns out to be the link - the ~51.75kHz of spectrum for each tone is what makes 48,000 symbols per second possible.

In older DSL iterations, the tones were each allocated 4.3125kHz - enough to carry 4,000 symbols per second. I guess the bandwidth allocated to a tone dictates how much time the DSP hardware requires to identify the analogue QAM components of each symbol.
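
As a quick sketch of that arithmetic (the figures are the nominal ones discussed in this thread; the function and names are just illustrative):

```python
# Per-tone bit rate = bit-loading x symbol rate. Illustrative only.
profiles = {
    # name: (symbols per second, tone spacing in Hz)
    "ADSL/ADSL2+/VDSL2": (4_000, 4_312.5),
    "G.fast (106 MHz)":  (48_000, 51_750.0),
}

def tone_bitrate(bits_loaded: int, symbol_rate: int) -> int:
    """Bits per second carried by one tone with a given bit-loading."""
    return bits_loaded * symbol_rate

for name, (sym_rate, spacing) in profiles.items():
    # e.g. a tone loaded with 15 bits
    print(f"{name}: {tone_bitrate(15, sym_rate)} bit/s per 15-bit tone, "
          f"{spacing:.1f} Hz tone spacing")
```

So the same 15-bit tone carries 60,000 bit/s under ADSL timing but 720,000 bit/s under G.fast timing - the symbol rate, not the bit-loading, is what changed.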

Lightbulb moment  :idea:
Title: Re: DSL Symbol Rate - Why?
Post by: zhadoom on June 02, 2016, 08:09:31 PM
ADSL, ADSL2+ and VDSL2 all use a "symbol rate" of 4,000 symbols per second.

That means that any one tone, configured with N bits of bit-loading, can transfer (N * 4000) bits per second.

I always wondered why the number of 4,000 was chosen, and why it stayed the same over each iteration of technology, and now I understand why.

G.Fast is designed for 48,000 symbols per second. However, while it uses 106MHz of spectrum, it doesn't use more tones than VDSL2. It uses 2048 tones, each with 51.75kHz of spacing. And that turns out to be the link - the ~51.75kHz of spectrum for each tone is what makes 48,000 symbols per second possible.

In older DSL iterations, the tones were each allocated 4.3125kHz - enough to carry 4,000 symbols per second. I guess the bandwidth allocated to a tone dictates how much time the DSP hardware requires to identify the analogue QAM components of each symbol.

Lightbulb moment  :idea:

The old POTS system used the 300Hz-3400Hz range for voice in analogue systems. The digital PBX migration considered the voice bandwidth to be 4kHz and adopted an 8kHz sampling rate (8,000 samples per second - the minimum under the Nyquist-Shannon rule).
The first DSL systems adopted 256 sub-bands over a 0-1.1MHz range (1104kHz to be precise), resulting in 4.3125kHz per sub-band.
The first sub-band (bin 0) is used for the voice channel. Since its width is a little larger than 4000Hz, it's backward compatible with the analogue channel using a simple low-pass filter, or a low-pass + high-pass filter (splitter).

The only exception to the 4k sub-bands (in xDSL technology) is the VDSL2 30a profile. It uses an 8.625kHz spacing - that is, 2x 4.3125kHz - and correspondingly an 8,000 symbols-per-second rate.
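
The sub-band arithmetic above can be checked in a few lines (values are the standard nominal ones; the variable names are mine):

```python
# Voice band and Nyquist sampling, as in the digital PBX migration.
VOICE_BANDWIDTH_HZ = 4_000                     # nominal voice band edge
NYQUIST_SAMPLE_RATE = 2 * VOICE_BANDWIDTH_HZ   # 8,000 samples per second

# ADSL: 256 sub-bands at 4.3125 kHz each cover 0-1104 kHz.
ADSL_TONES = 256
ADSL_TONE_SPACING_HZ = 4_312.5
adsl_band_hz = ADSL_TONES * ADSL_TONE_SPACING_HZ

# VDSL2 profile 30a doubles the spacing (and the symbol rate).
VDSL2_30A_SPACING_HZ = 2 * ADSL_TONE_SPACING_HZ

print(NYQUIST_SAMPLE_RATE)     # 8000
print(adsl_band_hz)            # 1104000.0
print(VDSL2_30A_SPACING_HZ)    # 8625.0
```

Note that each ADSL sub-band (4.3125kHz) is indeed a little wider than the 4kHz voice channel, which is what makes bin 0 line up with POTS.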
Title: Re: DSL Symbol Rate - Why?
Post by: BrianG61UK on September 30, 2019, 01:45:01 AM
I guess the bandwidth allocated to a tone dictates how much time the DSP hardware requires to identify the analogue QAM components of each symbol.
I think you may not completely understand.
If you send symbols faster, each tone will take up too much bandwidth and disturb reception of its neighbour tones at the other end.
Once the symbol rate is decided, they want to send as many tones as possible to make the broadband as fast as possible, so essentially they pack the tones as close together as they can without them interfering with each other. The transitions between symbols must also be carefully smoothed for it to work with the tones this close.
https://en.wikipedia.org/wiki/Orthogonal_frequency-division_multiplexing
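
The "as close as possible without interfering" point is exactly the OFDM orthogonality condition: tones spaced at integer multiples of 1/T (the symbol rate) have zero correlation over one symbol period, while anything closer does not. A small numerical check (my own sketch, using ADSL-like numbers):

```python
import cmath

T = 1 / 4_000        # symbol duration: 250 us, as in ADSL
N = 4_096            # samples per symbol for this numerical check

def correlation(f1: float, f2: float) -> float:
    """Magnitude of the average of tone1 * conj(tone2) over one symbol."""
    dt = T / N
    s = sum(cmath.exp(2j * cmath.pi * (f1 - f2) * n * dt) for n in range(N))
    return abs(s) / N

# Tones spaced by exactly 1/T (4 kHz): orthogonal, correlation ~0.
aligned = correlation(10 * 4_000, 11 * 4_000)

# Tones spaced by only half of 1/T: strongly correlated (~0.64),
# i.e. they would interfere with each other's reception.
offset = correlation(10 * 4_000, 10.5 * 4_000)

print(f"{aligned:.6f}")
print(f"{offset:.6f}")
```

So the tone spacing can equal the symbol rate, but no less - which is why 4,000 symbols per second goes hand in hand with ~4kHz spacing.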
Title: Re: DSL Symbol Rate - Why?
Post by: Weaver on September 30, 2019, 03:19:19 AM
Sending symbols faster implies the duration of each symbol is shorter; taking the Fourier transform of a gated periodic waveform - one which is off, then on for so long, then off again - the shorter the duration, the wider the distribution of frequency components spreads out in frequency space. This would then bleed out into neighbouring tones' frequency ranges. DSL uses some clever trickery to improve things when you change state from one symbol's interval to the next; this is written up in various DSL textbooks.
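
That spreading follows the familiar sinc shape: the magnitude spectrum of a tone gated for duration T falls off as |sin(pi*df*T)/(pi*df*T)|, with nulls at multiples of 1/T, so halving the symbol duration doubles the spread. A rough illustration (ignoring the transition-smoothing mentioned above):

```python
import math

def gated_tone_leakage(df_hz: float, t_s: float) -> float:
    """Relative spectral amplitude a distance df from a tone's centre,
    for a tone gated on for duration t_s (unwindowed sinc model)."""
    x = math.pi * df_hz * t_s
    return 1.0 if x == 0 else abs(math.sin(x) / x)

T_ADSL = 1 / 4_000     # ~250 us symbol
T_FAST = 1 / 48_000    # G.fast-style shorter symbol

# Leakage 4 kHz away from the tone centre:
print(gated_tone_leakage(4_000, T_ADSL))  # ~0: sits on a spectral null
print(gated_tone_leakage(4_000, T_FAST))  # ~0.99: barely attenuated
```

With the long ADSL symbol, a neighbour 4kHz away sits on a null and sees essentially nothing; with the much shorter symbol, a tone 4kHz away is hit at nearly full amplitude - hence G.fast's need for the much wider ~51.75kHz spacing.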

It seems to me that having wider tone spacing is more efficient, because there are fewer gaps in frequency space between the tones, so less waste. Against this there is the counter-argument that conditions in the environment and the medium vary across different frequencies - this is why the multi-tone design is used in the first place - and if things change fast as a function of frequency, then what is optimal at a frequency at the bottom end of a bin's range will not be optimal at the top end, and vice versa. For example, a really narrow noise spike could wipe out the entirety of a tone, and if the tone is wider than needed, then a more valuable tone occupying more frequency space will have been lost. It could be that tweaking the bin width ( = tone spacing ) based on experience would look at dX/df, where X is whatever observable you like, and at the frequency-width 𝚫f of some typical noise-spike; these are the things that guide you, and if in doubt you make the bins somewhat wider.

Did VDSL2 30a make the change to double width as an optimisation because of reduced wastage?
Title: Re: DSL Symbol Rate - Why?
Post by: burakkucat on September 30, 2019, 04:48:59 PM
I'm not too sure of the motive for this three years old thread being resurrected.  :-\
Title: Re: DSL Symbol Rate - Why?
Post by: Weaver on September 30, 2019, 05:23:12 PM
I didn't notice the date. I don't routinely look at such things. Mea culpa
Title: Re: DSL Symbol Rate - Why?
Post by: licquorice on September 30, 2019, 08:16:38 PM
nolite vestra culpa, you just replied to the resurrectee (doubt there is such a word)
Title: Re: DSL Symbol Rate - Why?
Post by: grahamb on October 01, 2019, 08:08:37 PM
Think you mean "re-animator"?  ;)
Title: Re: DSL Symbol Rate - Why?
Post by: kitz on October 01, 2019, 11:10:57 PM
I'm not too sure of the motive for this three years old thread being resurrected.  :-\

Happens.  The poster seems legit.   

I nearly did the same thing myself a month or so ago when, googling for something, I came across a post on TBB where someone had quoted something I'd posted on here and said I was wrong.   What had really happened was that it was them who had totally misinterpreted what I'd said, because they were under the mistaken impression that Interleave=1 and the FAST channel are exactly the same thing - which they are not, as I'd been talking about the paths.
It was only after I'd logged in that I realised the post was well over a year old, so I cba'd and closed down the browser.    In days gone by I may well have continued and explained further.  These days, the [advantage] of pain means I type less..  and morphine makes me care less.    :D