Kitz Forum
Broadband Related => FTTC and FTTP Issues => Topic started by: Samad on September 29, 2017, 04:13:55 PM
-
I have FTTC and a Samknows Whitebox which has flagged that there is a minor problem with Website Loading Times.
I reported this to BT and was fobbed off with talk of download speeds and contractual obligations, which does not answer my question, and was referred to the Ombudsman if I am not happy with the explanation.
For the record, my download speed is what I would expect at 28Mbps.
I have had FTTC for a year and all was ok until Samknows flagged the minor problem around 23 July 2017.
Attached is a screenshot of website loading times for 3 sites since 1 July 2017, and it clearly shows a deterioration in times from around 23 July 2017:
1) BBC Homepage
2) Youtube Homepage
3) Ofcom test page
Can anyone enlighten me as to cause?
-
You may need to ask samknows to explain exactly what the results of the test mean.
If it's what it sounds like - it loads the BBC or youtube home pages in a web browser then records the time taken - an increase in time could also be due to changes to the BBC or youtube sites, not just a change to your internet connection.
-
You may need to ask samknows to explain exactly what the results of the test mean.
If it's what it sounds like - it loads the BBC or youtube home pages in a web browser then records the time taken - an increase in time could also be due to changes to the BBC or youtube sites, not just a change to your internet connection.
From Samknows Website:-
Website load
We test the load speeds of your most popular websites. This is a sort of combination test that includes download, latency and DNS in one test that accurately mimics real world usage. We record how long it takes to receive all of the assets on a website. We even measure whether any problems are caused by the site itself or anything in between you or the site, like a content delivery network like Akamai.
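The description above combines DNS lookup, connection and transfer into one measurement. As an illustration only (this is not SamKnows' actual test code, and the local throwaway server stands in for a real website so the sketch runs offline), the phases can be timed separately like this:

```python
# Hypothetical sketch of a combined "website load" style measurement
# (DNS lookup + TCP connect + transfer) -- NOT SamKnows' actual code.
# A tiny local HTTP server stands in for a real site.
import http.client
import http.server
import socket
import threading
import time

def serve_once():
    """Start a local one-shot HTTP server so the sketch needs no internet."""
    class Handler(http.server.BaseHTTPRequestHandler):
        def do_GET(self):
            body = b"<html><body>hello</body></html>"
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        def log_message(self, *args):   # keep test output quiet
            pass
    srv = http.server.HTTPServer(("127.0.0.1", 0), Handler)
    threading.Thread(target=srv.handle_request, daemon=True).start()
    return srv

def timed_fetch(host, port):
    """Return (dns_ms, connect_ms, transfer_ms, body_bytes) for one GET."""
    t0 = time.monotonic()
    socket.getaddrinfo(host, port)                 # DNS phase (trivial for an IP)
    t1 = time.monotonic()
    conn = http.client.HTTPConnection(host, port)
    conn.connect()                                 # TCP handshake phase
    t2 = time.monotonic()
    conn.request("GET", "/")
    body = conn.getresponse().read()               # request + full transfer phase
    t3 = time.monotonic()
    conn.close()
    return ((t1 - t0) * 1000, (t2 - t1) * 1000, (t3 - t2) * 1000, len(body))

srv = serve_once()
dns_ms, connect_ms, transfer_ms, nbytes = timed_fetch("127.0.0.1", srv.server_address[1])
print(f"dns={dns_ms:.1f}ms connect={connect_ms:.1f}ms transfer={transfer_ms:.1f}ms bytes={nbytes}")
```

A real per-site result, as the Whitebox reports it, would be the sum of phases like these for every asset on the page.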
-
Apart from looking at the graphs, have you actually noticed any poor performance?
-
I run Smokeping (https://oss.oetiker.ch/smokeping/) probes from a server here to a number of sites including Apple, BBC, eBay etc., and I've not seen an increase in latency, in fact for the BBC quite the opposite as seen in the attached image.
What I notice, now I've looked a little closer, is that all the aforementioned sites respond to an HTTP probe with an HTTP 301 (https://en.wikipedia.org/wiki/HTTP_301) status code, i.e., permanently moved, and the new location is the secure (HTTPS) site.
smf22@lampu:/etc/smokeping/config.d$ curl -v http://www.bbc.co.uk
* Rebuilt URL to: http://www.bbc.co.uk/
* Trying 212.58.246.91...
* Connected to www.bbc.co.uk (212.58.246.91) port 80 (#0)
> GET / HTTP/1.1
> Host: www.bbc.co.uk
> User-Agent: curl/7.47.0
> Accept: */*
>
< HTTP/1.1 301 Moved Permanently
< Server: nginx
< X-BBC-No-Scheme-Rewrite: 1
< X-Cache-Action: HIT
< X-Cache-Hits: 11049
< Vary: X-BBC-Edge-Scheme
< Cache-Control: public, max-age=3600
< X-Cache-Age: 1573
< Content-Type: text/html
< Date: Fri, 29 Sep 2017 16:05:11 GMT
< Location: https://www.bbc.co.uk/
< Content-Length: 178
< Connection: Keep-Alive
<
<html>
<head><title>301 Moved Permanently</title></head>
<body bgcolor="white">
<center><h1>301 Moved Permanently</h1></center>
<hr><center>nginx</center>
</body>
</html>
* Connection #0 to host www.bbc.co.uk left intact
Not sure how long it's been this way, but perhaps this is why you're seeing an increase? My probes don't follow any redirection so I wouldn't have seen any change, other than the decrease for the BBC where I'm no longer pulling real content.
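The same kind of non-following probe can be reproduced without curl. In this sketch a local server mimics the 301-with-Location response shown above (it is a hypothetical stand-in, not the real www.bbc.co.uk), and the low-level `http.client` module, like the Smokeping probe, never follows the redirect, so we see the 301 itself:

```python
# Sketch of an HTTP probe that, like the Smokeping/curl check above, does NOT
# follow redirects.  A local server stands in for www.bbc.co.uk so the example
# is self-contained; its 301 + Location header mimics what curl showed.
import http.client
import http.server
import threading

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Mimic the scheme-upgrade redirect seen in the curl output.
        self.send_response(301)
        self.send_header("Location", "https://www.bbc.co.uk/")
        self.send_header("Content-Length", "0")
        self.end_headers()
    def log_message(self, *args):   # keep test output quiet
        pass

srv = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=srv.handle_request, daemon=True).start()

# http.client is low level: unlike a browser it never follows redirects,
# so the probe records the 301 response itself, not the HTTPS page behind it.
conn = http.client.HTTPConnection("127.0.0.1", srv.server_address[1])
conn.request("GET", "/")
resp = conn.getresponse()
print(resp.status, resp.reason, resp.getheader("Location"))
```

A browser-like test that follows the redirect would instead pay for a second (TLS) connection and the full page transfer, which is one way a measured "load time" can grow with no change to the line.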
-
Interesting ... my Samknows box does not report separate results for different targets.
All I get (for Plusnet) is this:
-
Apart from looking at the graphs, have you actually noticed any poor performance?
Difficult to say, as one gets used to a slowdown over time.
-
Interesting ... my Samknows box does not report separate results for different targets.
All I get (for Plusnet) is this:
There was a software upgrade earlier this year, after which I was able to split the results.
I note your website load times are measured in milliseconds whereas mine are measured in "seconds".
-
I run Smokeping (https://oss.oetiker.ch/smokeping/) probes from a server here to a number of sites including Apple, BBC, eBay etc., and I've not seen an increase in latency, in fact for the BBC quite the opposite as seen in the attached image.
What I notice, now I've looked a little closer, is that all the aforementioned sites respond to an HTTP probe with an HTTP 301 (https://en.wikipedia.org/wiki/HTTP_301) status code, i.e., permanently moved, and the new location is the secure (HTTPS) site.
smf22@lampu:/etc/smokeping/config.d$ curl -v http://www.bbc.co.uk
* Rebuilt URL to: http://www.bbc.co.uk/
* Trying 212.58.246.91...
* Connected to www.bbc.co.uk (212.58.246.91) port 80 (#0)
> GET / HTTP/1.1
> Host: www.bbc.co.uk
> User-Agent: curl/7.47.0
> Accept: */*
>
< HTTP/1.1 301 Moved Permanently
< Server: nginx
< X-BBC-No-Scheme-Rewrite: 1
< X-Cache-Action: HIT
< X-Cache-Hits: 11049
< Vary: X-BBC-Edge-Scheme
< Cache-Control: public, max-age=3600
< X-Cache-Age: 1573
< Content-Type: text/html
< Date: Fri, 29 Sep 2017 16:05:11 GMT
< Location: https://www.bbc.co.uk/
< Content-Length: 178
< Connection: Keep-Alive
<
<html>
<head><title>301 Moved Permanently</title></head>
<body bgcolor="white">
<center><h1>301 Moved Permanently</h1></center>
<hr><center>nginx</center>
</body>
</html>
* Connection #0 to host www.bbc.co.uk left intact
Not sure how long it's been this way, but perhaps this is why you're seeing an increase? My probes don't follow any redirection so I wouldn't have seen any change, other than the decrease for the BBC where I'm no longer pulling real content.
Thanks for this insight. I notice that your times are measured in milliseconds whereas mine are measured in "seconds" after 21 July 2017.
-
In Smokeping the axes of the graphs are automatically scaled based upon the results. In the attached you can see that the graph for Google is scaled in milliseconds, but that for Sithon is shown in seconds.
Whilst they may appear to be in seconds, the SamKnows graphs are likewise scaled automatically based on the results. If you look at WWWombat's graphs you can see they show times of between 0.100 s (100ms) and 0.150 s (150ms).
I don't know what SamKnows uses 'under the covers' but curl allows you to break down the transaction time into component parts, and there's a neat little Python utility called httpstat (https://github.com/reorx/httpstat) that "visualizes curl(1) statistics in a way of beauty and clarity."
If you've a Linux host you can use this to break down the times you're seeing. For example:
smf22@lampu:~$ httpstat https://www.bbc.co.uk
Connected to 212.58.244.69:443 from 192.168.1.200:33524
HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8
ETag: W/"43c62-yTK69WxJpxTrrZWJzalO0ViC39w"
X-Frame-Options: SAMEORIGIN
Content-Length: 277602
Date: Sat, 30 Sep 2017 10:45:14 GMT
Connection: keep-alive
Set-Cookie: BBC-UID=75a93c4f4715ebbaba5d2f5811a35650b744bccf5744c4d65a2057024d94a11c0curl/7.47.0; expires=Wed, 29-Sep-21 10:45:14 GMT; path=/; domain=.bbc.co.uk
X-Cache-Action: HIT
X-Cache-Hits: 416
X-Cache-Age: 45
Cache-Control: private, max-age=0, must-revalidate
Vary: Accept-Encoding, X-CDN, X-BBC-Edge-Scheme
Body stored in: /tmp/tmpApSeZ0
  DNS Lookup   TCP Connection   TLS Handshake   Server Processing   Content Transfer
[    28ms    |      11ms      |     118ms     |       24ms        |      217ms      ]
             |                |               |                   |
    namelookup:28ms           |               |                   |
                        connect:39ms          |                   |
                                      pretransfer:157ms           |
                                                    starttransfer:181ms
                                                                        total:398ms
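To read that output: the lower figures (namelookup, connect, pretransfer, starttransfer, total) are cumulative timestamps, as curl reports them, while the top row is the per-phase durations httpstat derives from them. The conversion is just successive differences, as this small sketch shows using the figures from the output above:

```python
# The lower row of httpstat's output is cumulative (curl's time_* variables);
# the top row is per-phase durations.  Deriving one from the other is just
# successive subtraction.  Values (in ms) taken from the output above.
cumulative = {
    "namelookup": 28,      # DNS done
    "connect": 39,         # TCP handshake done
    "pretransfer": 157,    # TLS handshake done
    "starttransfer": 181,  # first response byte received
    "total": 398,          # last response byte received
}
marks = list(cumulative.values())
phases = ["DNS Lookup", "TCP Connection", "TLS Handshake",
          "Server Processing", "Content Transfer"]
# Each phase duration is this mark minus the previous mark (0 at the start).
durations = {name: b - a for name, a, b in zip(phases, [0] + marks, marks)}
print(durations)
# {'DNS Lookup': 28, 'TCP Connection': 11, 'TLS Handshake': 118,
#  'Server Processing': 24, 'Content Transfer': 217}
```

Note the per-phase figures sum back to the 398ms total, which is the single number a "page load time" graph would plot.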
-
Equally, my SamKnows graph is shown in seconds (see my first posting), and having checked the raw data I can confirm it is scaled correctly.
In any event, my website loading times are now between 3 and 5 seconds, rather than the under one second seen between 1 July and 21 July.
-
Hi
I am a samknows tester as well, and just checked my graphs/timings.
I think the tests may have changed slightly, and now show full overall loading times (images, text, assets, advertising etc.).
However, as most websites load progressively, a normal user may not notice the full load time, since the site may be usable before it has fully loaded. eBay, for example, uses progressive loading.
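The gap between "usable" and "fully loaded" is easy to demonstrate. In this sketch (a made-up local server, not any real site) the server sends the visible part of the page immediately and the trailing assets after a delay, so the time to first byte is far shorter than the full transfer time that a load test records:

```python
# Sketch illustrating why a progressively loading page can feel usable well
# before the *full* load time that a test records.  A local stand-in server
# sends the first part of the body immediately and the rest after a delay.
import http.client
import http.server
import threading
import time

class SlowHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        head = b"<html><body>visible content"
        tail = b"<!-- late assets -->" * 10 + b"</body></html>"
        self.send_response(200)
        self.send_header("Content-Length", str(len(head) + len(tail)))
        self.end_headers()
        self.wfile.write(head)     # visible content arrives straight away
        self.wfile.flush()
        time.sleep(0.3)            # slow trailing assets / adverts
        self.wfile.write(tail)
    def log_message(self, *args):  # keep test output quiet
        pass

srv = http.server.HTTPServer(("127.0.0.1", 0), SlowHandler)
threading.Thread(target=srv.handle_request, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", srv.server_address[1])
t0 = time.monotonic()
conn.request("GET", "/")
resp = conn.getresponse()
resp.read(1)                       # first body byte: page starts rendering
first_byte = time.monotonic() - t0
resp.read()                        # remainder: the "full load" a test measures
full_load = time.monotonic() - t0
print(f"first byte after {first_byte:.2f}s, full load after {full_load:.2f}s")
```

A user would start reading at the first-byte mark; the test's graph plots the full-load mark.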
I hope that helps but apologies if I'm wrong
Many thanks
John
-
I have referred my minor problem to Samknows.
-
I took a look at the Web Browsing Test on the SamKnows CPE Embedded Client SDK (https://www.testmyisp.com/embedded-client-sdk) page to see what it was doing and why what you were seeing was so vastly different from the results of the Smokeping HTTP tests I'm running. So that I could compare, I downloaded and compiled the webget source code (https://github.com/steffiejacob/simweb) and ran my own tests without the need for a SamKnows Whitebox.
When I run a test with Smokeping, if I tell it to measure to the BBC web site it simply connects to the site, grabs the index page and counts the total time taken to do that. The SamKnows webget test connects to the site, grabs the index page, extracts all the URLs referenced within that page, including those for the CSS, JavaScript etc., and then goes and grabs those too. For tests using the BBC home page, SamKnows currently pulls 34 different URLs and 3,765,118 bytes of data, compared to the single URL and 217,797 bytes of data that my Smokeping test grabs.
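That asset-extraction step can be sketched in a few lines. This is not SamKnows' actual parser, and the page below is a made-up stand-in rather than the real BBC home page; it just shows the idea of pulling sub-resource URLs out of an index page for a webget-style test to fetch:

```python
# Sketch of the asset-extraction step described above: parse an index page
# and collect the URLs a webget-style test would then go and fetch.  The
# HTML is a made-up stand-in; this is not SamKnows' actual parser.
from html.parser import HTMLParser

class AssetCollector(HTMLParser):
    """Collect src/href URLs for sub-resources (CSS, JS, images)."""
    def __init__(self):
        super().__init__()
        self.assets = []
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script") and "src" in attrs:
            self.assets.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.assets.append(attrs.get("href"))

page = """
<html><head>
  <link rel="stylesheet" href="/style/main.css">
  <script src="/js/app.js"></script>
</head><body>
  <img src="/img/logo.png">
  <a href="/news">News</a>   <!-- navigation link, not a sub-resource -->
</body></html>
"""
collector = AssetCollector()
collector.feed(page)
print(collector.assets)   # ['/style/main.css', '/js/app.js', '/img/logo.png']
```

The test's total is then the time to fetch the index plus every collected asset, which is why a redesign of the page alone can move the graph.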
I guess what that means is that the results could change simply because the web site owner has changed things. It will be interesting to hear what SamKnows have to say.
-
I have received a reply from Samknows regarding this problem:-
"The short version is that we've recently been updating our web browsing test, not only fixing bugs but allowing it to present itself as a more modern client and be given more data as a result. This combination of updating / fixing the application and changing the parameters of the test did result in higher load times for more complicated sites".