Interleave - psychology
Weaver:
Why is it that some people really really hate interleaving on their DSL lines? Gamers are one well-known group. Is that all though?
johnson:
I guess a lot of it is that latency is a clear number you can measure, and that creates a strong subjective influence.
But modern websites can generate an insane number of requests to other places before loading, and a good deal of those will not be in parallel. With enough serial requests you can get up into noticeable fractions of a second.
Say your baseline latency is 10ms and the minimal level of interleaving adds 16ms: it would then take 20 ads/scripts/widgets/etc. that depend on each other - far from unheard of on the modern web - to generate over half a second of delay before your "simple page" loads. For every page load. That sounds tiny, but the results seem to be clear in browsing feel.
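To put numbers on it, a quick sketch (assuming one round trip per request, no parallelism, no slow-start - purely illustrative):

```python
# Numbers from above: 10 ms baseline + 16 ms minimal interleaving,
# and a chain of 20 objects where each must finish before the next
# request can even be issued.
base_ms = 10
interleave_ms = 16
rtt_ms = base_ms + interleave_ms    # 26 ms per round trip
chain_length = 20                   # serially dependent ads/scripts/widgets
print(chain_length * rtt_ms, "ms")  # -> 520 ms before the page is usable
```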
I remember articles from years ago - can't remember the analysts, it might even have been Google - about whether people stayed on a page. It was strongly correlated with page load time in the millisecond range. People are super sensitive to it, so a doubling in latency (which interleaving often does, and more) could have a significant effect on people's perception of their experience.
That at least is one reason I believe people care about latency.
As for the gaming one, it's an even stronger subjective effect... "I lost because of this", etc.
Weaver:
Given that the client is allowed to fire up multiple parallel TCP connections (in HTTP/1.x anyway, not sure about HTTP/2), as you say, objects that depend on one another would be bad. A CSS file that references a bitmap or a font file would be an example. I wouldn't have thought you could get long chains of them, although an iframe that references something else might add another step, making the chain longer still.
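A rough model of why the chain depth matters more than the raw object count (assuming HTTP/1.x with six parallel connections per host, a common browser default, one RTT per request, and ignoring slow-start and server think time):

```python
RTT_MS = 26    # e.g. 10 ms base + 16 ms interleaving
PARALLEL = 6   # typical per-host connection limit in HTTP/1.x browsers

def fetch_time_ms(independent: int, chain_depth: int) -> int:
    # Independent objects share the parallel connections...
    parallel_rounds = -(-independent // PARALLEL)  # ceiling division
    # ...but every link in a dependency chain costs a full extra RTT.
    return (parallel_rounds + chain_depth) * RTT_MS

print(fetch_time_ms(independent=30, chain_depth=0))  # 130 ms
print(fetch_time_ms(independent=30, chain_depth=4))  # 234 ms (HTML -> CSS -> font...)
```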
But given that we have to put up with TCP slow-start in HTTP every time, there is that to consider in the long-chain case. I wonder how much delay that contributes compared to the round-trip time?
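As a rough answer, a sketch of how many round trips slow-start costs per object (assuming an initial congestion window of 10 segments per RFC 6928 - older stacks started at 2-4 - and no losses):

```python
MSS = 1460         # bytes per segment, typical for Ethernet-sized MTUs
INITIAL_CWND = 10  # segments (RFC 6928; older stacks used 2-4)

def slow_start_rtts(object_bytes: int) -> int:
    """Round trips to deliver an object, with cwnd doubling each RTT."""
    segments_left = -(-object_bytes // MSS)  # ceiling division
    cwnd, rtts = INITIAL_CWND, 0
    while segments_left > 0:
        segments_left -= cwnd
        cwnd *= 2
        rtts += 1
    return rtts

for size_kb in (15, 100, 500):
    n = slow_start_rtts(size_kb * 1024)
    print(f"{size_kb} KB: {n} RTTs, ~{n * 26} ms at a 26 ms RTT")
```

By this reckoning a small object only costs an extra RTT or two beyond the request itself, so for the typical little scripts and icons in a chain the per-request round trip still dominates; slow-start mainly hurts the larger transfers.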
I currently have an RTT of ~40ms with interleave; it has recently been reduced a bit thanks to PhyR (Broadcom's L2 retransmission) on ADSL2.
So, my questions:
* Do we really think that anyone notices that, or twice that, or even three times that in a longer chain?
* Or, do they just think that they will notice it?
* Or do they actually experience slowness, but it is due to other reasons, like slow-start? Or lack of cacheability of the objects on the server due to rubbish design or server config?
Rubbish web design with no cacheability will really hurt visitors and the server, because all this junk - JS, CSS, bitmaps, favicons, fonts, P3P - is highly cacheable, so it just needs the server config to be done right. Everything needs the usual headers - date stamps, ETags - and the handling of conditional GETs must be done properly. Cacheability does away with absolutely all of those delays, because if it is done right there should not even need to be any requests at all, of any kind, for those files.

Maybe there is still a lot of this madness and ignorance about. I have indeed seen some shocking sites. A particularly sluggish tourism website for somewhere in the Channel Isles that I remember had the browser refetching absolutely everything, even when you move from page to page within the site and then hit back! Some poor fools who are just copying magic incantations they have seen somewhere, with zero understanding, actually go out of their way to explicitly disable all caching because "caching is bad", someone has told them, "dangerous". It is vital to thrash the server as hard as possible at all times, and to rack up the biggest possible bill for network I/O.
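For what "done properly" might look like, a minimal sketch of conditional-GET handling with ETags - the handler name and framing here are hypothetical, not any particular server's API:

```python
import hashlib

def serve_static(body: bytes, if_none_match: str | None):
    """Hypothetical handler: return (status, headers, body), honouring
    If-None-Match so a revalidation costs headers only, never the body."""
    etag = '"%s"' % hashlib.sha1(body).hexdigest()
    headers = [("ETag", etag),
               ("Cache-Control", "public, max-age=86400")]
    if if_none_match == etag:
        # The client's cached copy is still valid: send no body at all.
        return "304 Not Modified", headers, b""
    return "200 OK", headers, body
```

And with a long max-age the browser does not even send the conditional request until the object expires, which is the "no requests at all" case.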
It does seem to me that websites have remained fairly sluggish in time-to-fire-up-a-page over the last 15 years or so, even though internet connections have got vastly faster. It is perhaps the increasing page complexity: mountains of JavaScript, lots of ads and trackers, stupid social media buttons proliferating - junk that was not there 16 years ago. The latest bloatware fads are (i) a new wave of moronic, cowardly cookie-law notices and (ii) a rash of CSS position:fixed (iirc) windows that do nothing but cut down your viewport and rob you of vital screen real estate, all because the designers are pushing some function as so important that not having it permanently visible could bring disaster, and having to scroll would be an unforgivable tragedy.
4candles:
On the original question - maybe because their router stats used to say "FastPath", so they assume they're now on a "slow path".
johnson:
You make a lot of good points, Weaver. I'd love an actual web dev to weigh in, though.