(Work In Progress)
So ... I've been thinking a little about this Sync-Speed : IP-Profile ratio, and trying to figure out *why* it could differ.
The difference is 0.1%. When a line is running at 80 Mbps, that amounts to 80 kbps - or, put another way, 1 extra overhead byte per 1,000 data bytes. My mind immediately suggests it could just as easily be 1 byte in 1,024 - I don't think we measure closely enough to distinguish the two.
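As a back-of-the-envelope check (my arithmetic only, not from any spec), here's how close those two ratios are at an 80 Mbps line rate:

```python
# At 80 Mbps, compare a 1-in-1000 overhead against a 1-in-1024 one.
line_rate_bps = 80_000_000

per_1000 = line_rate_bps / 1000   # 1 extra byte per 1000 data bytes
per_1024 = line_rate_bps / 1024   # 1 extra byte per 1024 data bytes

print(f"1/1000 overhead: {per_1000 / 1000:.1f} kbps")  # 80.0 kbps
print(f"1/1024 overhead: {per_1024 / 1000:.1f} kbps")  # ~78.1 kbps
```

The two figures differ by under 2 kbps on an 80 Mbps line, which is well inside the noise of what we can observe.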
I started by playing around with the different "AgR" figures produced by Broadcom chips, to see if I could find something with that difference, but I can't.
Then, having looked at @maybecrazy's latest statistics, compared to mine, I noticed something else:
- His statistics show, on bearer 0, a non-zero OHFErr count alongside a zero OHF count.
- My statistics show zero for both.
In previous analysis, I had figured that the OH (overhead) had moved into bearer 1, so there was no way to get either a count or an error in bearer 0.
But perhaps this is wrong.
I then went to look into the G.INP specifications, to see what happened to CRC checks: I wondered what happened when you moved the OH (which holds the CRC) out of bearer 0.
It turns out that G.INP places considerable limitations on the framing structures that can be used, and on how the existing frame structure is packaged into retransmittable DTUs.
But, significantly, one of the options they *do* allow for is whether to include a CRC byte within a DTU.
If we end up with a DTU of around 1,000 bytes, and some implementations use a CRC byte while others don't ... could that explain the difference? This sounded like a promising avenue to pursue, I thought...
In the G.INP specification, the restriction on DTU size is that the user-data portion must be made up of an integer number of 65-octet PTM "codewords" (i.e. 65-byte blocks), while the encoded/FEC-protected output (including the header and any CRC byte) must be an integer number of RS blocks. The specification embodies this in a formula:
- Q * H - 2 - V - W = A * 65, where
- Q = the integer number of RS blocks; can be seen in Broadcom line statistics.
- H = number of "proper" bytes in an RS codeword. Is this the same as "N-R"?
- V = extra bytes stuffed into the datastream, to round it out. Can be seen in Broadcom line statistics.
- W = the CRC bytes, if used. Guesswork?
- A = integer number of PTM codewords. Guesswork?
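Just to make the relationship concrete, here's a tiny sketch of that identity as I understand it (variable names as in the list above; nothing here comes from a real chipset API):

```python
def dtu_framing_holds(Q, H, V, W, A):
    """Check the G.INP DTU framing identity: Q*H - 2 - V - W == A*65.

    Q: integer number of RS blocks per DTU
    H: bytes per RS codeword
    V: padding bytes stuffed into the datastream
    W: CRC bytes, if used (my guess: 0 or 1)
    A: integer number of 65-octet PTM codewords
    """
    return Q * H - 2 - V - W == A * 65
```

This makes it easy to test candidate (W, A) guesses against the Q, H and V values visible in the line statistics.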
For my line, the formula might work out as follows:
- Q * H - 2 - V - W = A * 65, where
- Q = 16
- H = 131
- V = 14
- W = 0
- A = 32
- Total DTU size = 16 * 131 = 2096
For @maybecrazy's line, the formula might work out as:
- Q * H - 2 - V - W = A * 65, where
- Q = 16
- H = 179
- V = 2
- W = 0
- A = 44
- Total DTU size = 16 * 179 = 2864
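For what it's worth, both sets of numbers do satisfy the identity. A quick sketch plugging them in (the Q/H/V values come from the Broadcom stats; W and A are my guesses):

```python
# Check Q*H - 2 - V - W == A*65 for both lines.
lines = {
    "mine":        dict(Q=16, H=131, V=14, W=0, A=32),
    "@maybecrazy": dict(Q=16, H=179, V=2,  W=0, A=44),
}

for name, p in lines.items():
    lhs = p["Q"] * p["H"] - 2 - p["V"] - p["W"]
    rhs = p["A"] * 65
    dtu = p["Q"] * p["H"]
    print(f"{name}: {lhs} == {rhs}? {lhs == rhs}, total DTU size = {dtu}")
```

Both come out balanced (2080 = 2080 for mine, 2860 = 2860 for @maybecrazy's), which at least suggests the guessed W and A values are plausible.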
Now that I know the formula, I can see that "V" is also significant. It represents empty space in a DTU: perhaps more empty space affects that percentage calculation.
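For a rough feel, here's the padding (V) as a fraction of each DTU - purely my arithmetic, and I'm not claiming this is where the 0.1% comes from:

```python
# Padding bytes (V) as a fraction of total DTU size (Q*H), for both lines.
for name, V, dtu in [("mine", 14, 16 * 131), ("@maybecrazy", 2, 16 * 179)]:
    print(f"{name}: V = {V} of {dtu} bytes -> {100 * V / dtu:.3f}% padding")
```

My line carries roughly ten times the padding fraction of @maybecrazy's (about 0.67% versus 0.07%), so it's at least the right order of magnitude to matter.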
Anyway, I have to leave my investigation for a few days, but I thought someone might find this train of thought interesting, even if incomplete. Unfortunately, I can't see that we'll ever get the right level of detail out of the ECI/H5A boxes to compare....