
Google AdSense Forum

Optimized Pageload Time Increased CTR & Income
david_uk

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 12281 posted 7:17 am on Feb 23, 2006 (gmt 0)

I've been around AdSense, and around here, long enough to know that fluctuations in clicks, CTR, etc. can be caused by almost anything, and that jumping to conclusions is a pointless exercise.

However, I have made one change to my website that seems to have made a difference to all of the above, with a subsequent rise in income.

Most of my site visitors are in the US, but since I'm located in the UK, so was my web hosting. I figured it didn't really matter where the host was. I have recently moved hosting to the US for a variety of reasons, including the fact that most of my visitors are there and it just might give them a faster load time for the pages. I guess the pages load faster in the US now; they are certainly loading faster here.

Interestingly enough, the effect was immediate once the domain had transferred: CTR and clicks went up. I've always kept the pages minimalist so as to provide faster loading times, especially the main index page. If the assumptions are correct, then it would appear that load times are pretty crucial to whether a visitor stays and whether they click or not!

 

david_uk

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 12281 posted 6:50 am on Feb 24, 2006 (gmt 0)

Have to say that EPC and eCPM are pretty much unchanged. What has changed is impressions and clicks, and consequently earnings. Although the stats from the raw log files are up (including a slight rise in page views per visitor), they aren't up by as much as the Google stats.

percentages

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 12281 posted 6:57 am on Feb 24, 2006 (gmt 0)

There is a very, very, very old rule in website design:

"You have 7 seconds to sell a prospect that they have found a site that is relevant to them".

The 7 seconds includes the time it takes to load the page to the point of usability. If your page takes longer than 7 seconds to load, you have zero selling time. If it takes 3 seconds, you have 4 seconds of selling time, and so on.

The faster the load, the more time you have to sell to the prospect! Simple stuff! I never offer a page that takes longer than a second to load at 14.4K... I like to maximize selling time!

I've been shopping online for the last 2 hours and have given up now. Can someone please explain to me why 95% of the sites I've visited make it hard for me to find the prices of their products? That is rule #2: know what your prospect is expecting to find and make it easy for them!
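
To put rough numbers on that: a 14.4K modem moves at best about 1.8 KB per second, so keeping load time under one second at 14.4K means a total page weight of not much more than a kilobyte or two, while a 40 KB page ties up roughly 22 seconds on that link - far past the 7-second window.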

CainIV

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 12281 posted 7:10 am on Feb 24, 2006 (gmt 0)

Alright, this is a great thread with lots of useful info.

Some questions for you guys:

HTTP compression - where and how do we implement this?

Caching of images only (especially high-KB logos) - is there an easy way to apply this?

I'm on a Unix box using PHP (div/CSS combo).

bumpski

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 12281 posted 1:49 pm on Feb 24, 2006 (gmt 0)

CainIV

Here is one crude method to determine whether you already have HTTP GZIP compression.

Below you'll see two hits on the same web page from a website's logs, one from the new Mozilla Googlebot and one from the old Googlebot. Google is still crawling the web with both bots.

"GET /AAABBB.htm HTTP/1.0" 200 20365 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"

"GET /AAABBB.htm HTTP/1.1" 200 5976 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

Notice the size difference! 5976 bytes versus 20365 bytes for the same web page - a reduction of more than 3X.
Basically Google is doing this test for you.
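
If you'd rather run the check yourself instead of digging through logs, here is a quick-and-dirty sketch using PHP's curl functions (my own illustration - the URL is just a placeholder): request the same page with and without an Accept-Encoding: gzip header and compare the sizes.

<?php
// Fetch the page twice - once asking for gzip, once not - and compare how
// many bytes actually come back. Roughly equal numbers mean no compression.
function transfer_size($url, $askForGzip) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    if ($askForGzip) {
        // Setting the header by hand means curl will NOT decode the body,
        // so strlen() below reports the compressed size on the wire.
        curl_setopt($ch, CURLOPT_HTTPHEADER, array('Accept-Encoding: gzip'));
    }
    $body = curl_exec($ch);
    curl_close($ch);
    return strlen($body);
}

$url = 'http://www.example.com/AAABBB.htm'; // placeholder page
echo 'Without gzip: ' . transfer_size($url, false) . " bytes\n";
echo 'With gzip:    ' . transfer_size($url, true) . " bytes\n";
?>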

If you see no difference, you have no compression. The first thing to do is ask your web host for GZIP compression. I have found that in many cases even the tech support departments will not know what you mean!
There is a way to implement this with PHP yourself. There are some real gotchas with this, so be careful!
TEST, TEST, TEST, especially error pages and non-existent pages. I used to like to redirect misspelled pages to my site map; that should return a not-found code too. After I first installed GZIP I found out I was serving blank error pages!
If you have control of the compression level, use only level one - it gives most of the gain for the least CPU.
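
For anyone who wants to try the PHP route, here is a minimal sketch of my own (not production code - test it thoroughly, especially those error pages): start a gzip-aware output buffer at the very top of each page, before anything is sent.

<?php
// Minimal sketch of per-page GZIP output in PHP, assuming the zlib extension
// is available. ob_gzhandler looks at the visitor's Accept-Encoding header
// and only compresses when the client says it can handle gzip.
if (function_exists('ob_gzhandler')) {
    ob_start('ob_gzhandler');
} else {
    ob_start(); // zlib not available - fall back to plain buffering
}
?>
<html>
<head><title>Example page</title></head>
<body>
<!-- page content as usual; remember to test error pages and 404s too -->
</body>
</html>

As far as I know you can also switch compression on site-wide with zlib.output_compression (and zlib.output_compression_level = 1) in php.ini instead, but don't combine that with ob_gzhandler().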

Some links:
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]

Brett's explanation for not using GZIP; it's not always right for everyone.
[webmasterworld.com...]

NOTICE the old dates on these posts and keep that in mind. Wow, I found some good new stuff myself - thanks.

bumpski

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 12281 posted 2:27 pm on Feb 24, 2006 (gmt 0)

Hey look webmasterworld is GZIPped, congratulations Brett!

georgei

10+ Year Member



 
Msg#: 12281 posted 4:06 pm on Feb 24, 2006 (gmt 0)

Posted by incrediBILL:
Turned out that when bad bots overloaded the server at the same time Google/Yahoo/MSN/visitors were trying to use it, revenues went down.

What are the "bad bots" that we should block using robots.txt?

Thanks

DamonHD

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 12281 posted 4:17 pm on Feb 24, 2006 (gmt 0)

Hi,

"Bad bots" are run by:

1) The terminally lazy/stupid who try to download your entire site automatically rather than wait for the next page to load.

2) Content thieves/scrapers looking to purloin your content for their MFA/whatever site.

3) The 100-or-so robots that are on every one of my sites all the time (and everyone else's) wasting our bandwidth looking for (mis)uses of trademarked/registered terms such as "Coka-Kolla" or "Prince" or whatever. A VERY significant waste of bandwidth and money for me, BTW.

4) Bots run on compromised home PCs or directly by SPAMmers scraping for email addresses.

etc...

Category (4) can be blocked with careful use of, for example, the sbl-xbl SPAMHAUS block list.
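
For example (a rough sketch of my own, not something from this thread - in practice you would cache the lookup result rather than hit DNS on every request):

<?php
// Check the visitor's IP against the Spamhaus SBL-XBL DNS blocklist
// before serving the page.
$ip = $_SERVER['REMOTE_ADDR'];

// Reverse the octets (1.2.3.4 -> 4.3.2.1) and query the blocklist zone.
$reversed = implode('.', array_reverse(explode('.', $ip)));
$query    = $reversed . '.sbl-xbl.spamhaus.org';

// A positive A-record answer means the IP is listed.
if (checkdnsrr($query, 'A')) {
    header('HTTP/1.1 403 Forbidden');
    exit('Access denied.');
}
?>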

(3) and (2) and (1) are probably best regulated by bandwidth and related automatic "reasonable usage" limits.
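
And to make the "reasonable usage" idea concrete, a very crude sketch of a per-IP throttle in PHP might look like this (my own illustration - no file locking, no whitelist for good bots, and the limits are arbitrary):

<?php
// Crude per-IP throttle: remember recent request times per IP in a flat
// file and return 503 once an address goes over the limit in the window.
$limit  = 60;              // max requests allowed per window
$window = 60;              // window length in seconds
$dir    = '/tmp/throttle'; // assumed writable directory

if (!is_dir($dir)) {
    mkdir($dir, 0700);
}

$ip   = $_SERVER['REMOTE_ADDR'];
$file = $dir . '/' . md5($ip);

$hits = array();
if (is_file($file)) {
    $hits = unserialize(file_get_contents($file));
    if (!is_array($hits)) {
        $hits = array();
    }
}

// Keep only the hits that fall inside the current window.
$now   = time();
$fresh = array();
foreach ($hits as $t) {
    if ($t > $now - $window) {
        $fresh[] = $t;
    }
}
$hits = $fresh;

if (count($hits) >= $limit) {
    header('HTTP/1.1 503 Service Unavailable');
    header('Retry-After: ' . $window);
    exit('Too many requests.');
}

$hits[] = $now;
file_put_contents($file, serialize($hits));
// ...normal page processing continues below...
?>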

Rgds

Damon

PS. I am probably almost alone in believing that robots.txt is not the way to block any of these major misuses, because it relies on the remote machine to cooperate with you, and their behaviour has already shown that they are unwilling to behave well. I use robots.txt only to stop duplicate content being indexed, not to try to protect the content itself.

PPS. I really have had too many coffees this afternoon!

bumpski

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 12281 posted 4:25 pm on Feb 24, 2006 (gmt 0)

Take a look at WW's robots files:

[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]

[webmasterworld.com...]

From what I can see now, there is a script for dynamic robot response and filtering.
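
That is not WebmasterWorld's actual code, of course, but the general idea of a dynamic robots script is simple enough: map robots.txt to a script (for example with a rewrite rule) and vary the rules by user-agent or IP. A hypothetical sketch:

<?php
// Hypothetical dynamic robots.txt (assumes the server is set up to serve
// /robots.txt through this script, e.g. via a rewrite rule).
header('Content-Type: text/plain');

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

if (preg_match('/Googlebot|Slurp|msnbot/i', $ua)) {
    // Known major crawlers: allow the site, keep duplicate-content areas out.
    echo "User-agent: *\n";
    echo "Disallow: /print/\n";
} else {
    // Everyone else gets a much more restrictive file.
    echo "User-agent: *\n";
    echo "Disallow: /\n";
}
?>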

wmuser

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 12281 posted 10:01 pm on Feb 24, 2006 (gmt 0)

"You have 7 seconds to sell a prospect that they have found a site that is relevant to them".

Agree - an old but still-working rule.

rohitj

5+ Year Member



 
Msg#: 12281 posted 9:05 am on Feb 26, 2006 (gmt 0)

I'm surprised that no one has mentioned geographical load balancing so far - it has helped even within the United States (having a couple of servers on the East Coast and a couple on the West, then using third-party DNS servers). I would be curious to hear whether anyone has colocated or rented in several countries and how that has impacted revenue. Is it worth it for the small guys?

And on a somewhat different note: from some chats I've had, it is actually getting affordable, even for small businesses, to use one of the major content-delivery companies like Akamai to speed things up... Thoughts?

DamonHD

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 12281 posted 2:16 pm on Feb 26, 2006 (gmt 0)

Hi,

I use 5 geo-balanced servers (UK x2, US, AU, SG), because I believe it *will* help with perceived speed and with local search, especially given how lumpy AsiaPac connectivity apparently is, and AP is an area I am targeting.

But I can't prove that geo-sensitive serving helps (yet), and bandwidth anywhere outside the US (and possibly UK/DE) is much more expensive, so the jury hasn't even met for my experiment!

Rgds

Damon

gendude

5+ Year Member



 
Msg#: 12281 posted 3:47 am on Feb 27, 2006 (gmt 0)

I've found that PVs have gone up since I reduced page load size (and therefore time). I went from a 100k+ front page down to the 40k or so mark, and this translated throughout the site to faster load times overall.

I could tell a difference because, while traffic had been steadily increasing, the PV ratio had remained constant through that increase.

Once I made the changes, you could tell a big difference in PV ratios.

It's not that I moved content around or spread it out. Instead, I did the following:

- Removed an ad unit
- Changed my navigation around to be more simple
- Reduced the number of article categories (consolidated them in some cases)
- Cleaned up the graphics I had on the page (logo/header) and worked to reduce their size.
- Reduced the number of stories displayed on the front page down to around 8 (from 15 stories previously - I'll see 8 new stories every 2-3 days as it is).
- Reduced the amount of information about each story on the front page (basically I reduced content from each story, that was then placed on the page that somebody clicking the "read more" link would see).
- Had a large number of links in one area of the sidebar (65 or so) about individual widgets - reduced those to four links to pages that then link to the widgets in question (about 16 or so widget links each, grouped logically)

This last change helped people find more information more easily (increasing PVs). On those pages that each link to 16 or so sub-pages, I gave much better descriptions of those sub-pages than was possible in the sidebar. This increased interest, and was probably the biggest thing overall in both reducing page load times and increasing PVs.

As a result, my site is much less cluttered, loads faster, saves me bandwidth, and looks more professional and well-maintained.

A large group of people interested in my site are on dial-up - I know they probably especially appreciated my work.

DamonHD

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 12281 posted 11:51 am on Feb 27, 2006 (gmt 0)

Hi,

In general I have always tried to keep page payload size to something bearable on 9600 bps dialup, though I think I may have taken my eye off the ball a bit with ads.

I have spent the last couple of days wrapping ad content in iframe tags to ensure that at least the page body displays quickly, and have had one of the busiest Sundays in a while. Which proves nothing, but is encouraging.
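
Roughly speaking, the pattern looks like this (file and host names below are just placeholders, not my real setup): the main page carries only a small iframe, and the ad network's script tag lives in the little page the iframe loads, so a slow ad server holds up only that frame rather than the whole document.

<?php // main page: the third-party ad tag is isolated in its own document ?>
<iframe src="/adframe.php" width="300" height="250"
        frameborder="0" scrolling="no" marginwidth="0" marginheight="0"></iframe>

<?php // adframe.php: nothing but the ad network's tag lives here, so only
      // this small frame waits on the ad server, not the page body. ?>
<html>
<body style="margin:0; padding:0">
<script type="text/javascript" src="http://ads.example.com/tag.js"></script>
</body>
</html>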

Rgds

Damon

21_blue

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 12281 posted 4:54 pm on Feb 27, 2006 (gmt 0)

Damon,

The idea of putting ads in iframes to make the page content display more quickly sounds good, but I thought it wasn't recommended. From the AdSense Help pages:

Our targeting technology is not optimized to serve ads within an IFRAME. If you placed the AdSense code in an IFRAME, your site may display less targeted ads or public service ads.

If ads within iframes do generate good income, I'd be interested to know the risk-free way of implementing it, and why AdSense advises against it.

DamonHD

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 12281 posted 6:15 pm on Feb 27, 2006 (gmt 0)

Hi,

It is not the AdSense ads that are going into iframes; it is my Tribal Fusion and AdBrite ads that were slowing down page building. Casale Media already uses an iframe, and although AdSense runs JavaScript initially, (1) it seems to drop into an iframe itself and (2) it loads very quickly almost all the time anyway.

Rgds

Damon

roycerus

5+ Year Member



 
Msg#: 12281 posted 4:56 am on Feb 28, 2006 (gmt 0)

Just to add to the optimization discussion: I recently realized that my MySQL setup was not optimized, so I did some research and added the following lines under the [mysqld] section in my my.cnf file (usually found in the /etc directory):

[mysqld]
skip-locking
set-variable= key_buffer=256M              # MyISAM index cache
set-variable= max_allowed_packet=1M
set-variable= table_cache=256              # number of open tables kept cached
set-variable= sort_buffer=1M
set-variable= record_buffer=1M
set-variable= myisam_sort_buffer_size=64M  # used for REPAIR TABLE and index builds
set-variable= thread_cache=8               # threads kept for reuse between connections
# Try number of CPU's*2 for thread_concurrency
set-variable= thread_concurrency=8
log-bin
server-id= 1

It has improved my website's performance a lot. Add this only if you have at least 512 MB of RAM. You can find template files in your MySQL installation; search for my-large.cnf or my-medium.cnf.

[That sounds funny :-D lol]
