| 6:50 am on Feb 24, 2006 (gmt 0)|
Have to say that EPC and eCPM are pretty well unchanged. What has changed is impressions and clicks, and consequently earnings. Although the stats from the raw log files are up (including a slight rise in page views per visitor), they aren't up by as much as the Google stats.
| 6:57 am on Feb 24, 2006 (gmt 0)|
There is a very, very, very old rule in website design:
"You have 7 seconds to sell a prospect that they have found a site that is relevant to them".
The 7 seconds includes the time it takes to load the page to the point of usability. If your page takes longer than 7 seconds to load, you have zero selling time; if it takes 3 seconds, you have 4 seconds of selling time, and so on.
The faster the load, the more time you have to sell to the prospect! Simple stuff! I never offer a page that takes longer than a second to load at 14.4K... I like to maximize selling time!
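As a back-of-the-envelope check on that budget, here is a quick sketch (assuming a sustained 14.4 kbit/s line rate and ignoring TCP and modem overhead):

```python
# Rough page-weight budget for a given load time on a 14.4K modem.
# Assumption: sustained 14.4 kbit/s line rate, no protocol overhead.

def max_page_bytes(seconds, line_kbps=14.4):
    """Bytes deliverable in `seconds` at `line_kbps` kilobits per second."""
    return int(seconds * line_kbps * 1000 / 8)

if __name__ == "__main__":
    print(max_page_bytes(1.0))   # 1800 bytes: a 1-second page at 14.4K
    print(max_page_bytes(7.0))   # 12600 bytes: the whole 7-second window
```

So a one-second page at 14.4K means staying under roughly 1.8 KB total, which shows just how aggressive that rule is.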
I've been shopping online for the last 2 hours and have given up. Can someone please explain to me why 95% of the sites I've visited make it hard for me to find the prices of their products? That is rule #2: know what your prospect is expecting to find and make it easy for them!
| 7:10 am on Feb 24, 2006 (gmt 0)|
Alright, this is a great thread with lots of useful info.
Some questions for you guys:
HTTP compression - where and how do we implement this?
Caching of images only (especially high-KB logos) - is there an easy way to apply this?
I'm on a Unix box using PHP (div/CSS combo).
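On the caching question: if that Unix box runs Apache with mod_expires available (worth confirming with your host first), a far-future expiry header on images is one common approach. A sketch, with example lifetimes:

```apache
# .htaccess sketch -- requires Apache mod_expires; lifetimes are examples
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/gif  "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/png  "access plus 1 month"
</IfModule>
```

With that in place, returning visitors pull the logo from their browser cache instead of re-downloading it on every page view.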
| 1:49 pm on Feb 24, 2006 (gmt 0)|
Here is one crude method to determine whether you already have HTTP GZIP compression.
Below you'll see two hits on the same web page from a website's logs, one from the new Mozilla Googlebot and one from the old Googlebot. Google is still crawling the web with both bots.
|"GET /AAABBB.htm HTTP/1.0" 200 20365 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)" |
"GET /AAABBB.htm HTTP/1.1" 200 5976 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
Notice the size difference: 5,976 bytes versus 20,365 bytes for the same web page - better than a 3X reduction.
Basically Google is doing this test for you.
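That kind of size comparison can also be reproduced offline. A rough sketch (the ratio depends entirely on how repetitive your markup is; the sample below is artificially repetitive):

```python
import zlib

# gzip is DEFLATE plus a small header, so zlib gives the same ballpark.
# HTML is full of repeated tags and attributes, which is why 3-4X is common.
html = "<tr><td class='item'><a href='/widget'>blue widget</a></td></tr>\n" * 300

raw = html.encode("ascii")
packed = zlib.compress(raw, 6)

print("uncompressed:", len(raw), "bytes")
print("compressed:  ", len(packed), "bytes")
```

Real pages with more varied text won't compress quite this well, but markup-heavy pages routinely see the 3-4X range the log lines above show.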
If you see no difference, you have no compression. The first thing to do is ask your webhost for GZIP compression. I have found that in many cases even the tech support department will not know what you mean!
There is a way to implement this with PHP yourself. There are some real gotchas with this, so be careful!
TEST, TEST, TEST - especially error pages and tests for non-existent pages. I used to like redirecting misspelled pages to my site map; that should return a not-found code too. After I first installed GZIP I found out I was serving blank error pages!
If you have control of the compression level, use only level one: most of the gain for the least CPU.
Brett has explained his reasons for not using GZIP; it's not always right for everyone.
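For those with access to the Apache config (2.x with mod_deflate), a minimal sketch along these lines turns compression on. Note that DeflateCompressionLevel can only be set at server or virtual-host level, not in .htaccess, and level 1 matches the advice above:

```apache
# httpd.conf / vhost sketch -- requires Apache 2.x with mod_deflate
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css text/plain
    # Level 1: most of the compression gain for the least CPU
    DeflateCompressionLevel 1
</IfModule>
```

As the poster says, test error pages and not-found responses after enabling it.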
NOTICE the old dates on these posts and keep that in mind. Wow, found some good new stuff myself - thanks.
| 2:27 pm on Feb 24, 2006 (gmt 0)|
Hey look webmasterworld is GZIPped, congratulations Brett!
| 4:06 pm on Feb 24, 2006 (gmt 0)|
|Posted by incrediBILL: |
Turned out when bad bots overloaded the server at the same time Google/Yahoo/MSN/visitors were trying to use it, revenues went down.
What are the "bad bots" that we should block using robots.txt?
| 4:17 pm on Feb 24, 2006 (gmt 0)|
"Bad bots" are run by:
1) The terminally lazy/stupid who try to download your entire site automatically rather than wait for the next page to load.
2) Content thieves/scrapers looking to purloin your content for their MFA/whatever site.
3) The 100-or-so robots that are on every one of my sites all the time (and everyone else's) wasting our bandwidth looking for (mis)uses of trademarked/registered terms such as "Coka-Kolla" or "Prince" or whatever. A VERY significant waste of bandwidth and money for me, BTW.
4) Bots run on compromised home PCs or directly by SPAMmers scraping for email addresses.
Category (4) can be blocked with careful use of, for example, the sbl-xbl SPAMHAUS block list.
(3) and (2) and (1) are probably best regulated by bandwidth and related automatic "reasonable usage" limits.
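A "reasonable usage" limit of the kind described can be as simple as a per-IP sliding window. A minimal sketch (the window and threshold are invented for illustration; tune them to your own traffic):

```python
import time
from collections import defaultdict, deque

# Minimal per-IP sliding-window limiter -- illustrative thresholds only.
WINDOW_SECONDS = 60
MAX_REQUESTS = 120          # ~2 requests/sec sustained

_hits = defaultdict(deque)  # ip -> timestamps of recent requests

def allow(ip, now=None):
    """Return False once `ip` has made MAX_REQUESTS in the last window."""
    if now is None:
        now = time.time()
    q = _hits[ip]
    while q and q[0] <= now - WINDOW_SECONDS:
        q.popleft()         # forget hits older than the window
    if len(q) >= MAX_REQUESTS:
        return False        # over budget -- serve an error page or 429 instead
    q.append(now)
    return True
```

A human browsing stays far under the threshold; a bot slurping the whole site trips it within seconds.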
PS. I am probably almost alone in believing that robots.txt is not the way to block any of these major misuses, because it relies on the remote machine cooperating with you, and their behaviour has already shown them unwilling to behave well. I use robots.txt only to stop duplicate content being indexed, not to try to protect the content itself.
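Used that way, robots.txt is purely a politeness file for compliant crawlers. A sketch of the duplicate-content usage (the paths are made-up examples):

```
# robots.txt sketch -- keeps duplicate views out of the index.
# It does NOT protect content: bad bots simply ignore this file.
User-agent: *
Disallow: /print/
Disallow: /search
```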
PPS. I really have had too many coffees this afternoon!
| 4:25 pm on Feb 24, 2006 (gmt 0)|
Take a look at WW's robots files:
From what I can see, there is now a script for dynamic robot response and filtering.
| 10:01 pm on Feb 24, 2006 (gmt 0)|
"You have 7 seconds to sell a prospect that they have found a site that is relevant to them".
Agree - old but still-working rule.
| 9:05 am on Feb 26, 2006 (gmt 0)|
surprised that no one has mentioned geographical load balancing thus far--this has helped even within the united states (having a couple servers on the east-coast and a couple on the west and then using third-party DNS servers). I would be curious to see if anyone's colocated or rented in several countries and how this has impacted revenue. Worth it for the small guys?
and on a somewhat different note... From some chats I've had, it actually is getting affordable, even for small business, to use one of the major content-delivery companies like Akamai to speed things up... Thoughts?
| 2:16 pm on Feb 26, 2006 (gmt 0)|
I use 5 geo-balanced servers (UK x2, US, AU, SG), because I believe it *will* help with perceived speed and with local search, especially given how lumpy AsiaPac connectivity apparently is, and AP is an area that I am targeting.
But I can't prove that geo-sensitive serving helps (yet), and bandwidth anywhere outside the US (and possibly UK/DE) is much more expensive, so the jury hasn't even met for my experiment!
| 3:47 am on Feb 27, 2006 (gmt 0)|
I've found that PVs went up when I reduced page load size (and therefore time). I went from a 100k+ front page down to the 40k or so mark, and this translated throughout the site to faster load times overall.
I could tell a difference, because while I had steadily increasing traffic, the PV ratio had remained constant with the increase.
Once I made the changes, you could tell a big difference in PV ratios.
It's not that I moved content around or spread it out. Instead it was this:
- Removed an ad unit
- Changed my navigation around to be more simple
- Reduced the number of article categories (consolidated them in some cases)
- Cleaned up the graphics I had on the page (logo/header) and worked to reduce their size.
- Reduced the number of stories displayed on the front page down to around 8 (from 15 stories previously - I'll see 8 new stories every 2-3 days as it is).
- Reduced the amount of information about each story on the front page (basically I reduced content from each story, that was then placed on the page that somebody clicking the "read more" link would see).
- Had a large number of links (65 or so) in one area of the sidebar about individual widgets - reduced those down to four links to pages that then link to the widgets in question (about 16 or so widget links each, grouped logically)
This last change helped people find information more easily (increasing the PVs). On those pages that each linked to 16 or so sub-pages, I gave much better descriptions of those sub-pages than was possible in the sidebar. This increased interest, and was probably the biggest thing overall both to reduce page load times and to increase PVs.
As a result, my site is much less cluttered, loads faster, saves me bandwidth, and looks more professional and well-maintained.
A large group of people interested in my site are on dial-up - I know they probably especially appreciated my work.
| 11:51 am on Feb 27, 2006 (gmt 0)|
In general I have always tried to keep page payload size to something bearable on 9600 bps dialup, though I think I may have taken my eye off the ball a bit with ads.
I have spent the last couple of days wrapping ad content in iframe tags to ensure that at least the page body displays quickly, and have had one of the busiest Sundays for a while. Which proves nothing, but is encouraging.
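For what it's worth, the wrapping described above can be as simple as the sketch below (the src and dimensions are placeholders); the ad markup then loads as its own document while the main page body renders - but note the AdSense caveat raised later in the thread:

```html
<!-- Sketch: the ad slot loads in its own document, so the page body
     does not block waiting on the ad server. Placeholder values. -->
<iframe src="/ads/banner-slot.html" width="468" height="60"
        frameborder="0" scrolling="no"></iframe>
```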
| 4:54 pm on Feb 27, 2006 (gmt 0)|
The idea of putting ads in iframes to make the page content display more quickly sounds good, but I thought it wasn't recommended. From the Adsense Help pages:
|Our targeting technology is not optimized to serve ads within an IFRAME. If you placed the AdSense code in an IFRAME, your site may display less targeted ads or public service ads. |
If ads within iframes do generate good income, I'd be interested in knowing a risk-free way of implementing it, and why AdSense advises against it.
| 4:56 am on Feb 28, 2006 (gmt 0)|
Just to add to the optimization discussion: I recently realized that my MySQL was not optimized, so I researched and added the following line under [mysqld] in my my.cnf file (usually found in the /etc directory):
# Try number of CPU's*2 for thread_concurrency
It has improved my website's performance a lot. Add this only if you have at least 512 MB of RAM. You can find the templates somewhere in your MySQL installation - search for my-large.cnf or my-medium.cnf.
[That sounds funny :-D lol]
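For context, the relevant stanza in the shipped my-large.cnf template looks roughly like this (the value is only a starting point, not a tuned setting; note that thread_concurrency only ever had an effect on Solaris and was removed in later MySQL versions):

```ini
[mysqld]
# Try number of CPU's*2 for thread_concurrency
thread_concurrency = 8
```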
| This 46-message thread spans 2 pages |