However, I have made one change to my website that seems to have made a difference to all of the above, and a subsequent rise in income.
Most of my site visitors are in the US, but since I'm located in the UK, so was my web hosting. I figured it didn't really matter where the host was. I have recently moved hosting to the US for a variety of reasons, including the fact that most of my visitors are there, and it just might give them faster page load times. I guess the pages load faster in the US now - they are certainly loading faster here.
Interestingly enough, the effect was immediate once the domain had transferred: CTR and clicks went up. I've always kept the pages minimalist to provide faster loading times - especially the main index page. If the assumptions are correct, then it would appear that load times are pretty crucial to whether a visitor stays, and whether they click or not!
"You have 7 seconds to sell a prospect that they have found a site that is relevant to them".
The 7 seconds includes the time it takes to load the page to the point of usability. If your page takes longer than 7 seconds to load you have zero selling time. If it takes 3 seconds you have 4 seconds of selling time, and so on.
The faster the load, the more time you have to sell to the prospect! Simple stuff! I never offer a page that takes longer than a second to load at 14.4K.......I like to maximize selling time!
I've been shopping online for the last 2 hours and have given up now! Can someone please explain to me why 95% of the sites I've visited make it hard for me to find the prices of their products? That is rule #2: know what your prospect is expecting to find and make it easy for them!
Some questions for you guys:
HTTP compression - where and how do we implement this?
Caching of images only (especially high-KB logos). Is there an easy way to apply this?
I'm on a Unix box using PHP (div/CSS combo).
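On the image-caching question: one easy way, if your Unix host runs Apache and allows .htaccess files, is to set Expires headers on image types so browsers cache them locally. A sketch assuming mod_expires is enabled (the one-month lifetime is just an example, not a recommendation):

```apache
# .htaccess - let browsers cache images (requires mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/gif  "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
</IfModule>
```

Returning visitors then re-fetch the logo only after it expires, which is where the high-KB graphics stop hurting.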
Here is one crude method to determine if you already have HTTP GZIP compression.
Below you'll see two hits on the same webpage from a website's logs, one from the new Mozilla Googlebot and one from the old Googlebot. Google is still crawling the web with both bots.
"GET /AAABBB.htm HTTP/1.0" 200 20365 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
"GET /AAABBB.htm HTTP/1.1" 200 5976 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
Notice the size difference! 5976 bytes versus 20365 bytes for the same web page - about a 3.4X reduction.
Basically Google is doing this test for you.
If you see no difference you have no compression. First thing to do is ask your webhost for GZIP compression. I have found that in many cases even the tech support departments will not know what you mean!
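If you do have server access, here's a sketch of how it's commonly enabled on Apache (assuming mod_deflate is available - these directives are standard Apache 2 ones, not something specific to this thread):

```apache
# .htaccess or httpd.conf - compress text responses (requires mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/plain
  # Don't waste CPU recompressing images - they're already compressed
  SetEnvIfNoCase Request_URI "\.(?:gif|jpe?g|png)$" no-gzip
</IfModule>
```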
There is a way to implement this with PHP yourself. There are some real gotchas with this, so be careful!
TEST, TEST, TEST - especially error pages and tests of non-existent pages. I used to like to redirect misspelled pages to my site map; that should return a not-found code too. After I first installed GZIP I found out I was serving blank error pages!
If you have control of the compression level, use level one only: most of the gain for the least CPU.
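You can see why level one is usually enough with a quick local experiment - a Python sketch (the sample HTML here is made up, but real pages are similarly repetitive):

```python
# Compare gzip compression levels on a repetitive HTML-like payload.
import gzip

html = ("<html><body>" + "<p>Some repetitive page content.</p>" * 600
        + "</body></html>").encode()

fast = gzip.compress(html, compresslevel=1)  # cheapest CPU
best = gzip.compress(html, compresslevel=9)  # most CPU

print(f"original: {len(html)} bytes")
print(f"level 1:  {len(fast)} bytes")
print(f"level 9:  {len(best)} bytes")
```

Level one already squeezes out the bulk of the redundancy; the higher levels shave off comparatively little while burning far more CPU on every request.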
Brett's explanation for not using GZIP: it's not always right for everyone.
NOTICE the old dates on these posts - keep that in mind. Wow, I found some good new stuff myself. Thanks.
"Bad bots" are run by:
1) The terminally lazy/stupid who try to download your entire site automatically rather than wait for the next page to load.
2) Content thieves/scrapers looking to purloin your content for their MFA/whatever site.
3) The 100-or-so robots that are on every one of my sites all the time (and everyone else's) wasting our bandwidth looking for (mis)uses of trademarked/registered terms such as "Coka-Kolla" or "Prince" or whatever. A VERY significant waste of bandwidth and money for me, BTW.
4) Bots run on compromised home PCs or directly by SPAMmers scraping for email addresses.
Category (4) can be blocked with careful use of, for example, the Spamhaus SBL-XBL block list.
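For the curious, the mechanics of a DNSBL check are simple: reverse the IP's octets, prepend them to the list's zone, and try to resolve the name - it resolves only if the IP is listed. A Python sketch (the zone is the one mentioned above; the IP is a documentation example):

```python
# Build and use the DNS name for querying a DNSBL such as Spamhaus SBL-XBL.
import socket

def dnsbl_name(ip, zone="sbl-xbl.spamhaus.org"):
    # 203.0.113.7 -> 7.113.0.203.sbl-xbl.spamhaus.org
    return ".".join(reversed(ip.split("."))) + "." + zone

def is_listed(ip):
    try:
        socket.gethostbyname(dnsbl_name(ip))  # resolves only if listed
        return True
    except socket.gaierror:
        return False

print(dnsbl_name("203.0.113.7"))
```

Doing a DNS lookup per request is slow, so real setups cache the answers or push the check down into the mail server or firewall.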
(3) and (2) and (1) are probably best regulated by bandwidth and related automatic "reasonable usage" limits.
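A common way to implement that kind of "reasonable usage" limit is a token bucket per client IP - a minimal sketch (the rate and burst numbers are arbitrary examples):

```python
# Minimal token-bucket rate limiter: allows short bursts, then throttles.
import time

class TokenBucket:
    def __init__(self, rate, burst):
        self.rate = rate            # tokens added per second
        self.burst = burst          # maximum bucket size
        self.tokens = float(burst)
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill for the time elapsed since the last request.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(rate=2, burst=5)   # 2 requests/sec, bursts of up to 5
results = [bucket.allow() for _ in range(7)]
print(results)
```

A well-behaved visitor never notices the limit; a bot ripping the whole site drains the bucket in seconds and gets refused until it slows down.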
PS. I am probably almost alone in believing that robots.txt is not the way to block any of these major misuses because it relies on the remote machine to cooperate with you and their behaviour has already shown them to be unwilling to behave well. I use robots.txt only to stop duplicate content being indexed, not to try to protect the content itself.
PPS. I really have had too many coffees this afternoon!
and on a somewhat different note... From some chats I've had, it actually is getting affordable, even for small businesses, to use one of the major content-delivery companies like Akamai to speed things up... Thoughts?
I use 5 geo-balanced servers (UKx2, US, AU, SG), because I believe it *will* help with perceived speed and with local search, especially given how lumpy AsiaPac connectivity apparently is, and AP is an area that I am targeting.
But I can't prove that geo-sensitive serving helps (yet), and bandwidth anywhere outside the US (and possibly UK/DE) is much more expensive, so the jury hasn't even met for my experiment!
I could tell a difference, because while I had steadily increasing traffic, the PV ratio had remained constant with the increase.
Once I made the changes, you could tell a big difference in PV ratios.
It's not that I moved content around or spread it out. Instead it was this:
- Removed an ad unit
- Changed my navigation around to be more simple
- Reduced the number of article categories (consolidated them in some cases)
- Cleaned up the graphics I had on the page (logo/header) and worked to reduce their size.
- Reduced the number of stories displayed on the front page down to around 8 (from 15 stories previously - I'll see 8 new stories every 2-3 days as it is).
- Reduced the amount of information about each story on the front page (basically I reduced content from each story, that was then placed on the page that somebody clicking the "read more" link would see).
- Had a large number of links in one area of the sidebar (65 or so) about individual widgets - reduced those down to four links to pages that then linked to the widgets in question (about 16 or so widget links per page, grouped logically)
This last thing helped people find more information more easily (increasing the PVs). On those pages that each linked to 16 or so sub-pages, I gave much better descriptions of those sub-pages than was possible in the sidebar. This increased interest, and was probably the biggest thing overall both to reduce page load times and to increase PVs.
As a result, my site is much less cluttered, loads faster, saves me on bandwidth, and looks more professional and well-maintained.
A large group of people interested in my site are on dial-up - I know they probably especially appreciated my work.
In general I have always tried to keep page payload size to something bearable on 9600bps dial-up, though I think I may have taken my eye off the ball a bit with ads.
I have spent the last couple of days wrapping ad content in iframe tags to ensure that at least the page body displays quickly, and have had one of the busiest Sundays for a while. Which proves nothing, but is encouraging.
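For reference, the wrapping itself is just an iframe around the ad markup - a sketch where /ads/banner.html is a placeholder for a page of your own containing nothing but the ad code:

```html
<!-- Body text renders immediately; the ad fetches in parallel inside the frame -->
<iframe src="/ads/banner.html" width="468" height="60"
        frameborder="0" scrolling="no" marginwidth="0" marginheight="0">
</iframe>
```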
The idea of putting ads in iframes to make the page content display more quickly sounds good, but I thought it wasn't recommended. From the Adsense Help pages:
Our targeting technology is not optimized to serve ads within an IFRAME. If you placed the AdSense code in an IFRAME, your site may display less targeted ads or public service ads.
# Try number of CPU's*2 for thread_concurrency
It has improved my website's performance a lot. Add this only if you have at least 512 MB of RAM. You can find the templates somewhere in your MySQL installation. Search for my-large.cnf or my-medium.cnf.
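For anyone wanting to see it in context, this is the sort of fragment those templates contain - a sketch for a box with, say, 2 CPUs (the value is an example of the rule of thumb above, not a recommendation):

```ini
# my.cnf - starting point borrowed from the bundled my-medium.cnf template
[mysqld]
# rule of thumb: number of CPUs * 2 (here, 2 CPUs)
thread_concurrency = 4
```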
[That sounds funny :-D lol]