I am hoping that well-tuned, cache-friendly HTTP headers might reduce Kitz's traffic considerably, saving money and helping with snappy responsiveness.
There are limits though: some info changes dynamically all the time, even as part of an old page. When an old, supposedly unchanged page is re-viewed, these auxiliary bits of included information change the actual total content of the page and so mess up its cacheability. I am thinking of "Total time logged in", "Recent topics" and "Currently logged-in users", which are shown on some pages. Those destroy cacheability because they are calculated dynamically. That could be fixed, though, by putting them into an external file and pulling that in via an iframe, so that the main page would not change and would then be cache-friendly. Also, in old pages, blurb about particular users could have changed since the time the page itself was last really altered.
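As a sketch of that iframe idea (the file name and sizes here are made up, just to illustrate):

```html
<!-- Main page: now static, so it hashes to the same ETag every time -->
<iframe src="/live-stats.html" width="200" height="120"></iframe>
<!-- /live-stats.html is the only thing regenerated per request; it
     carries "Currently logged-in users" etc. and would be served with
     Cache-Control: no-store so it is never cached -->
```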
Adding calculated ETags to the headers is easy and does not require any brain power - I did it once, so that proves it :-). Adding a Last-Modified date header for cacheability is, I think, too hard for forum HTML files, and so is If-Modified-Since support, but supporting the other conditional query, If-None-Match, would be very good.
Saving a huge amount of traffic is only possible if there is nothing in the way in the form of annoying little dynamic content changes, so the iframe tweaks will fix things if nothing is missed - and we will know, because the ETags will stay the same if they are derived by hashing the content. (You can derive ETags in any way you like, from last-changed dates if you have any, from content hashes, or both; dates are cheap on the server because you avoid the cost of reading the whole file and the CPU time of hashing it all. I used dates because I had them available, but here I think you would have to use hashing: just run any fast hash over the whole file content.)
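A minimal sketch of the hashing approach (the function names and the stripped-down request/response framing are my own invention, purely for illustration):

```python
import hashlib

def make_etag(body):
    # Any fast hash over the whole response body will do; MD5 is fine
    # here because this is a cache validator, not a security feature.
    return '"%s"' % hashlib.md5(body).hexdigest()

def respond(body, if_none_match):
    # Returns (status, headers, payload) in a made-up minimal framing.
    etag = make_etag(body)
    if if_none_match == etag:
        # Client's cached copy is still valid: 304 and no body at all.
        return 304, {"ETag": etag}, b""
    return 200, {"ETag": etag}, body

page = b"<html>...the forum page...</html>"
status, headers, payload = respond(page, None)         # first visit
status2, _, payload2 = respond(page, headers["ETag"])  # revisit with If-None-Match
```

On the revisit the server sends only headers, which is where the traffic saving comes from.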
There will be a lot of people around who know all about how to help with this, or perhaps your hosting company would just do it all for you. Just add ETags to the headers and make sure the server is being generally Cache-Control friendly.
All images should be fully cacheable, with ETags (answered by If-None-Match queries) and preferably Last-Modified dates too (answered, even better, by If-Modified-Since queries), with a long cache lifetime because images do not change. In Apache you can set that up by file type, so that it knows images do not change and it will not keep sending them to the user again and again.
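In Apache that per-type setup is only a few lines with mod_expires (the particular lifetimes below are just examples, not recommendations):

```apache
# Needs mod_expires enabled (a2enmod expires)
ExpiresActive On
ExpiresByType image/png  "access plus 1 year"
ExpiresByType image/jpeg "access plus 1 year"
ExpiresByType image/gif  "access plus 1 year"
ExpiresByType text/css   "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"
# ETags derived from the file's modification time and size
FileETag MTime Size
```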
Articles may end up being fully cacheable anyway because they do not change very often, and as long as they are not pulling in minor calculated content they will be cache-friendly.
If we find that forum pages are actually changing for no good reason we will see it, because their ETags will keep changing if the ETags are derived by hashing. Then we will know that we are not saving as much traffic as we potentially could, and perhaps the iframe trick would help. Another thing that will really save traffic is not having any inline JS or CSS, but putting it all into external files so they are never refetched, because they do not change.
Those files could have a long lifetime too, plus date-related cache-control info, like the images - again treated separately by file type. That is what I did: images, txt files, fonts, PDFs, JS, CSS, favicons and a dozen more things were all listed with different long cache lifetimes in Apache.
Whether this is worth doing depends on how much money there is to be saved. The work involves someone making some changes to the web server config file; in Apache it would mean adding a few lines. Then an optional stage 2 would make it much more effective, by doing things like the iframe tweak or a simple copying of the JS and CSS out into external files. All current advice is to include JS and CSS inline for high performance, because this removes the latency of a TCP connection startup to fetch each external file, but that is the opposite of saving money on traffic - and it is irrelevant if the JS and CSS are cached anyway because cache-friendliness has been set up properly.
There are bound to be some people around who know a hundred times more about this than me. I wrote some code to plug into a web server to do all of this automatically many years ago; that is how I know a little about it.
There used to be a tool that would assess a website for cache-friendliness and give an excellent report and checklist.
Just a thought. God, I am so sorry this post has grown so long. Sincere apologies if you knew all of this stuff already.