Avoiding the over-optimisation penalty
...are the rules upside-down now?
AnonyMouse




msg:150239
 8:27 am on May 22, 2004 (gmt 0)

This was posted under the "dropped-site checklist thread", but I'd like to know more about it:

=======================
Have you more aggressively optimized recently?
Internal changes that can lead to potential problems include:
• More aggressive kw optimization, e.g., changes to Titles, META's, <Hx> tags, placement and density of kw's, etc.
• Link structure changes, and especially link text changes. Updates to link text or structure, if done for optimization reasons, can push a site into filter/penalty territory. Look in particular for overuse of kw's.
=======================

Given that the above were more-or-less the rules of SEO, I'm confused now - are we supposed to *stop* doing all the above?!? In which case, how do you indicate what your target keywords/subject is?

I've been out of the loop for a little while, so would appreciate any pointers to threads on this subject (wish there was a search function for the forums!), or a re-visit to the topic by our learned members?

If these were the old "rules", what are the new ones:
1. Keywords in Title, Meta tags, H1 - all matching.
2. Keywords mixed into other Hx tags
3. Keywords at beginning, middle, end of page - good density, but natural
4. Keywords in alt text of images
5. Keywords in anchor text of links to your page, matching the keyword phrase used in your Title/Meta/H1 from above.
6. More links to the pages that you want to have more PR - i.e. link back to home page on every page, assuming that home page is important!

Comments appreciated, need to dig myself out of the SERPS hole I just fell into...any guidance on what the new rules of SEO are is much appreciated!
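
As a rough illustration of auditing a page against the checklist above, here is a short Python sketch. The element handling, the phrase counting and the idea of a single "density" number are purely illustrative assumptions on my part, not rules Google has published, and the thresholds you would compare against are anyone's guess.

# Illustrative only: checks a page for the on-page factors listed above
# (phrase in title, in headings, in alt text) and a crude keyword density.
from html.parser import HTMLParser
import re

class OnPageAudit(HTMLParser):
    """Collects the title, heading text, img alt text and body words of a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.in_heading = False
        self.title = ""
        self.headings = []   # text of h1..h6 elements
        self.alts = []       # alt attributes of <img> tags
        self.words = []      # every visible word, for the density figure

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.in_heading = True
            self.headings.append("")
        elif tag == "img":
            alt = dict(attrs).get("alt", "")
            if alt:
                self.alts.append(alt.lower())

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
        elif tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.in_heading = False

    def handle_data(self, data):
        text = data.lower()
        if self.in_title:
            self.title += text
        if self.in_heading and self.headings:
            self.headings[-1] += text
        self.words.extend(re.findall(r"[a-z0-9']+", text))

def audit(html, phrase):
    parser = OnPageAudit()
    parser.feed(html)
    phrase = phrase.lower()
    target = phrase.split()
    hits = sum(
        1 for i in range(len(parser.words) - len(target) + 1)
        if parser.words[i:i + len(target)] == target
    )
    density = hits * len(target) / max(len(parser.words), 1)
    return {
        "phrase_in_title": phrase in parser.title,
        "phrase_in_heading": any(phrase in h for h in parser.headings),
        "phrase_in_alt": any(phrase in a for a in parser.alts),
        "density": round(density, 3),   # share of body words taken by the phrase
    }

page = """<html><head><title>Widget Hotels Guide</title></head>
<body><h1>Widget Hotels</h1>
<img src="sign.jpg" alt="widget hotels sign">
<p>Book widget hotels online with our widget hotels guide.</p>
</body></html>"""
print(audit(page, "widget hotels"))

Whether a "good" density is 2% or 10%, and whether matching every element helps or hurts, is exactly what the rest of this thread argues about.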

 

AnonyMouse




msg:150269
 2:15 pm on May 25, 2004 (gmt 0)

Of course, just about everything talked about in these forums is no more than theory. Some theories withstand more scrutiny and last longer than others as the empirical evidence is clearer, e.g. more links = more PR.

Of course, if it doesn't happen to you, then you don't have the evidence to validate the theory. I too ignored the theme...until it appeared to have happened to me!

Given that my theory is that I tripped some sort of over-opt filter, the obvious way to add more validity to my theory is to roll-back the optimisation. I'll let you know how I get on...thanks for the support, DVDBurning!

MHes




msg:150270
 4:03 pm on May 25, 2004 (gmt 0)

The people who can't get their head around the over optimised filter are usually:

1) People who have not experienced it
2) People with sites not targeting the search phrases that it hits.

When it first hit, there was a massive effect, and then I believe Google narrowed the phrases they were targeting, so some sites came back with no changes done, and others remained in oblivion unless they made those changes. The combination of refreshing the content and probably making better pages usually does the trick, but a match between H1, title and anchor is a dangerous game in some sectors.

It is tiresome to keep reading posts saying it does not exist. This algo hit a lot of people's lives, and there are too many reports for it never to have existed.

"is it now a no-no to put your keywords in title/H1/top page/throughout content/in links/in anchor text"

It's good to do this in some sectors, but with certain keyword phrases you need to avoid an exact match, to which, rightly or wrongly, Google then applies a more ruthless algo.

'Black hat' is a nonsense phrase. As far as I can see anybody who merely thinks about seo is delving into 'black hat'. We are all here to try and get our sites up the ranking, from doing stuff to let the spider in, to doing stuff to make it think we are the best site on the planet. There are degrees of 'Black hat', some clever and fair play, others obviously spammy and unhelpful to both Google and the user. However, 'Black Hat' is what the webmaster does.... the 'over optimisation algo' is something Google does and is thus very different and not under our control. It is their attempt to find sites that are trying to manipulate the spider, if we do too many 'tricks' on a page, they will spot it as suspicious and apply a stronger algo to that page. They only target certain sectors because they ain't got the time to apply it everywhere.... and it keeps us guessing!

ogletree




msg:150271
 7:08 pm on May 25, 2004 (gmt 0)

There are a lot of things going on at Google. An over-optimisation penalty is not one of them. For anybody who thinks they have a penalty, I can show you another site that does everything they do and is still doing well. I can't say I understand why a site falls off and never comes back. I do know how to start over and fix it.

I think there is an over-optimisation penalty but it does not involve good practice on-page factors. I have seen a site fall off because it had a 1-pixel link on it. There may be some credit to the theory that your site does not do well because you don't have enough one-way links from quality sites. I think that is the culprit most of the time. People say "I have a ton of links." That is good, but if they are not good links, or are links that Google has deemed not to be worth as much, then Google will devalue those links.

steveb




msg:150272
 9:59 pm on May 25, 2004 (gmt 0)

It is so bizarre to see anyone actually say, let alone believe, this overoptimization nonsense. Normally someone posts about "my site" when talking about it, which of course is silly. Look at 1000 sites at the top of 100 competitive searches, then draw some conclusions. The vast majority have either solid seo-within-the-rules, or solid spam-seo, or are a very legitimate site with bad seo that is so powerful it can't hurt itself. The over-optimization idea is laughable if you look at what sites are ranking well.

Mostly it comes down to people not understanding what optimization is. If a dude is 15 pounds overweight, he can work out and diet and lose ten pounds and be in better shape. If he loses fifteen via diet and exercise, that is optimized. (Losing 15 via diet alone likely will NOT be optimized, though.) But if he loses 20 or 25, or 50, or starves to death, then that is not "optimized" in any sense.

SEO is not some simpleton exercise. It is mixing a variety of elements to make a healthy soup. Too much pepper ruins the soup, even if too little makes it less tasty. The soup is at its "optimized" peak with the correct amount of pepper.

MHes




msg:150273
 10:38 pm on May 25, 2004 (gmt 0)

Ogletree- "I think there is an over-optimisation penalty.."

steveb- "Too much pepper ruins the soup..."

So you both believe you can 'over optimise' and this can cause a problem.

Fact: It can only be a problem if Google looks for it, finds it, and acts accordingly.... hence an 'over optimised' algo. How can you call it nonsense when you agree?

grant




msg:150274
 10:52 pm on May 25, 2004 (gmt 0)

Like many SEOs, I had sites tumble from the Florida update. However, I don't buy an "over-optimization" theory.

I believe that what happened is that onsite factors such as the title tag became far less important than offsite factors. There was a day when you could simply put your keywords in the title tag and an H1 and your site would rank high.

If you read some of the research papers by Taher Haveliwala (Stanford grad now at Google), you'll see how much effort goes into efficiency. Google simply does not take the time to identify subtle or not-so-subtle optimization tactics.

ogletree




msg:150275
 11:17 pm on May 25, 2004 (gmt 0)

I'm not saying there is no penalty. What happens is that Google devalues your incoming links and your site really has no PR and no backlinks. Make up several words, put them in your title, and you will see it show up.

steveb




msg:150276
 11:27 pm on May 25, 2004 (gmt 0)

"So you both believe you can 'over optimise' and this can cause a problem."

Of course not. It's a ridiculous statement. Why is it some people refuse to understand the simple basics? If you have a page about widgets, it is good to have the word widgets on the page. It is not good to have the word widgets be the ONLY word on the page, and repeat it 1000 times!

Frankly, only a person who has no idea what the word "optimization" means would call that 1000-widget-word page "over-optimized". It's beyond dumb.

Some folks may need to either open a dictionary or find another line of work. Optimization is merely getting your pages seen in the most favorable light. The way to do that changes (a bit) all the time.

JayC




msg:150277
 11:40 pm on May 25, 2004 (gmt 0)

>>Fact: It can only be a problem if Google looks for it, finds it, and acts accordingly.... hence an 'over optimised' algo. How can you call it nonsense when you agree?

My objections to the "over-optimization penalty" thing are on two points: first, the concept of "over optimization" itself is bogus. If you do "too much" of something, you're not doing it optimally. SEO isn't about using any one technique heavyhandedly and as much as you can; it never was. When keyword density was what people obsessed over, the concept was clearly that there was a range within which kwd was optimized... a page that simply repeated the keyword phrase over and over was not optimized -- and it didn't rank well. Did that mean there was an "over-optimization penalty" in play?

If you heavy-handedly use the same keyword phrase in every possible on- and off-page element, you won't rank well. But it's not "over-optimization" that's working against you; it's failure to change your optimization techniques as the rules have changed. An approach that used to work in a lot of cases is now not likely to.

So my second objection is the use of the term "penalty." Rankings in these cases are not because of a penalty -- they're algorithmically attained results.

steveb




msg:150278
 11:48 pm on May 25, 2004 (gmt 0)

Excellent post, JayC.

jaffstar




msg:150279
 9:41 am on May 26, 2004 (gmt 0)

Nice Post , JayC!

MHes




msg:150280
 10:13 am on May 26, 2004 (gmt 0)

Jeeze this forum is fun! If only I was a masochist I would enjoy it more.... spank me, spank me!

Some of you senior members are a bit muddled, so let me try and explain in simple terms.

Once upon a time Google evaluated the subject and focus of pages according to a set of rules, including keywords in title, h1, bold etc. and keyword density. They also thought links in were a good sign of 'quality'. Then they discovered that webmasters were deliberately putting these keywords in all these special places, and this no longer produced good quality results for Google. So they decided to ignore text that was 'too good to be true' and repeated in every single special place. They implemented this in 'Florida', but this was too harsh and took up too much time, so they only applied it to some keywords, identified from AdWords data, and let high-PR sites survive.

This had a great side effect: it totally confused senior members at WebmasterWorld. So they change the keywords it applies to every now and again, along with other algo changes..... and chuckle at the mayhem.

The confusion over the phrase 'over optimisation' is where people are missing the point. In possibly the majority of sectors you can optimise your site with a title and h1 keyword match and rank well. This tactic is thus inherent in what we all understand 'optimised' to mean: 'The procedure or procedures used to make a system or design as effective or functional as possible'. However, in other sectors this tactic is not the optimal way of designing your page, because that procedure is identified as being 'optimised' for the current global algo and then a new set of rules is applied. Thus, Google is looking for 'optimisation' techniques and 'optimised' pages, then applying a special algo for some keywords. It no longer just looks for a set of requirements, be it KW density or PR, and ranks accordingly across all sectors globally. Google now, in some sectors, will apply a special algo to sites that appear to have exactly what it is looking for; in other words, they now actively isolate pages that are currently 'optimised' and apply a different set of rules. This makes these pages not optimised for these sectors, but optimised for others.... hence the confusion.

Thus the phrase 'over optimised', which means the page is optimised for the overall algo, but in some sectors this optimisation then triggers a new set of rules. These pages could be described as obviously 'not optimised', but now consider the cure..... take out some of the optimisation techniques which work in other sectors and the page ranks better. Hence the phrase 'over optimised', which applies to some keywords/sectors: the page IS optimised for the current algo, but then the rules change and it becomes not optimised for the new algo which is triggered.

Google's tactic is even more clever in these targeted sectors, in that I suspect on a search they isolate the pages that have all the correct optimisation, pick the one with the highest PR and let that go, and then run a new algo on the rest.... which confuses people even more.
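
Purely as a toy model of the two-pass re-rank being speculated about here: every name, score, flag and threshold in the sketch below is a made-up illustration of the hypothesis, not Google's actual algorithm or anything it has confirmed.

# Hypothetical two-pass re-rank: ordinary ranking first, then (in "targeted"
# sectors only) pages showing the exact-match footprint are demoted, except
# the single highest-PR page, which is left alone. All values are invented.
from dataclasses import dataclass

@dataclass
class Page:
    url: str
    pagerank: float
    base_score: float                   # score from the ordinary, global algo
    exact_match_title_h1_anchor: bool   # the "too good to be true" footprint

def rerank(results, sector_is_targeted):
    ranked = sorted(results, key=lambda p: p.base_score, reverse=True)
    if not sector_is_targeted:
        return ranked

    footprint = [p for p in ranked if p.exact_match_title_h1_anchor]
    spared = max(footprint, key=lambda p: p.pagerank, default=None)

    def adjusted(p):
        if p.exact_match_title_h1_anchor and p is not spared:
            return p.base_score * 0.3   # arbitrary demotion factor
        return p.base_score

    return sorted(ranked, key=adjusted, reverse=True)

pages = [
    Page("a.example", pagerank=7, base_score=0.92, exact_match_title_h1_anchor=True),
    Page("b.example", pagerank=4, base_score=0.90, exact_match_title_h1_anchor=True),
    Page("c.example", pagerank=3, base_score=0.70, exact_match_title_h1_anchor=False),
]
print([p.url for p in rerank(pages, sector_is_targeted=True)])
# -> ['a.example', 'c.example', 'b.example']  (b demoted, highest-PR page spared)

If something like this were happening, it would produce exactly the confusing pattern described: one heavily optimised page survives at the top while an almost identical one vanishes.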

In order to move this discussion on, the phrase 'over optimised' helps identify this effect. Playing semantics is not helpful; many people on the board understand what we all mean.

Analogy? People say stuff like "He was literally red with rage"....(senior member reply) "You cannot be literally red as literally means 'exactly' or 'precisely'....blah blah blah"

For heaven's sake get a life! The above is my own understanding of what may be happening; we are all dealing with theory, so arrogance of opinion and suggestions that people should give up help no one.

Hey, I can be patronising too :)

Wail




msg:150281
 11:09 am on May 26, 2004 (gmt 0)

I'm not a senior member.

Over Optimisation is an oxymoron.

Clearly, I don't believe in it. I don't think Google divides the web up into sectors either.

"Thus the phrase 'over optimised', which means the page is optimised for the overall algo, but in some sectors this optimisation then triggers a new set of rules."

There's one algorithm for the main Google index. It takes a huge amount of resources to index the web, apply the algorithm to the internal contents of pages, to the content of the pages which link to them, to the content of the pages which are linked to by them and then establish the relationships between the set. Imagine how much more intense that would be if you had to do this once, work out which sector specific algorithm/filter to apply, apply it - and then, what, stop there or look again at the new view of the web pages and apply another wave of sector specific algorithms?

Google mines data. I believe (yes, speculation) that Google is looking for the natural use of language - the things people talk about, and the way they talk about them (hello Gmail) - and this includes, for example, knowing what sort of phrases tend to be used as headings and titles. If a site declares that something "unnatural" is a significant header and title, then I believe Google will notice that.

Monkscuba




msg:150282
 11:25 am on May 26, 2004 (gmt 0)

What about this new UOP I hear about? I have a site where the index page has no H1 tag, the title is "Title" and I have almost no links in..it doesn't rank well at all for my keywords. I think Google has an "Under Optimisation Penalty" and it's not fair. Waaah! Mummeeee! The nasty man stole my ranking!

an attempt at light relief

MHes




msg:150283
 11:29 am on May 26, 2004 (gmt 0)

Hi Wail

The oop algo applied could be quite simplistic. They already do a similar thing with duplicate content, and list the highest pr site, dropping the rest.

ukgimp




msg:150284
 11:31 am on May 26, 2004 (gmt 0)

I find myself on both sides considering the last two posts. I don't believe that there is an OO filter. The problem comes with defining OO. Is keyword stuffing OO, or even producing a big long list of terms in an H1, or cloaking absolute garbage content? There may well be an unnatural filter, in that it is highly unlikely that a page about blue widgets on some lowly site will get 1000 links with those exact phrases as the anchor text etc. When has that ever happened naturally? Yeah, that's right: never. The flaw was exploited and now it has been shrunk.
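
As a toy illustration of that "unnatural link profile" point (my own sketch with made-up numbers, not anything reported from Google): if nearly every inbound link to a page uses the identical anchor text, the distribution looks nothing like naturally acquired links, and that concentration is trivial to measure.

# Fraction of inbound links sharing the single most common anchor text.
from collections import Counter

def anchor_concentration(anchors):
    if not anchors:
        return 0.0
    counts = Counter(a.strip().lower() for a in anchors)
    return counts.most_common(1)[0][1] / len(anchors)

# Hypothetical profiles
natural = ["Acme Blue Widgets", "acme.example", "great widget shop",
           "blue widgets", "click here", "this review"]
engineered = ["blue widgets"] * 1000

print(anchor_concentration(natural))      # ~0.17
print(anchor_concentration(engineered))   # 1.0 -- the pattern described above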

Now on the other side, some filters have occurred before now. It was reported that links on pages called "links", or on guestbooks, have been filtered so they don't pass their benefit. This was the same as FFA pages. This was discussed heavily in Supporters a while back; it was occurring.

There is nothing wrong, or filter-provoking, with having natural text and dead-on-balls-accurate titles, h*, etc. The problem comes when you move into the realms of what G thinks is spam. After all, it was not spam to use FFAs until G said it was, or meta tag stuffing. So being aware of the environment is mucho important.

When all the above is said and done, I think there are multiple things and tests going on: sandbox, semantics and, dare I say it, even a balls-up here and there :)

stever




msg:150285
 11:37 am on May 26, 2004 (gmt 0)

(I am not steveb nor his glove-puppet)

MHes

I agree with much of your analysis when you keep it to the facts. Where I am a confused non-Senior Member is in understanding why you define what you describe as "over-optimisation".

Yes, Google treats certain selections of words and phrases "differently". But where does this translate into "over-optimisation"?

That implies that one can "de-tune" a web page and return to an optimum level of optimisation. And that is something that I have yet to see satisfactorily demonstrated in any work I have done or in looking at the work of others.

(Sorry, Liane, I appreciate your honesty and the work that went into your posts, but I have fundamental disagreements with your conclusions.)

Far more likely, IMO, is that pages in certain keyword areas have to reach a certain standard of authority AS WELL AS do all the other things they did before (whether those standards are to do with semantic scope or differing kinds of links is another debate).

And it is entirely possible that the idea of networks and neighbourhoods has been rethought.

None of these ideas are in contradiction to what you posted above - yet if we debate "over-optimisation" there is a great danger that it becomes an accepted wisdom just by being the subject of so many posts.

Show me the money. Take a ranking page. Over-optimise it so it disappears. Now de-optimise it so it reappears. Now over-optimise it again. Now line up for an evening of drinks on me at Pubcon...

Monkscuba




msg:150286
 11:44 am on May 26, 2004 (gmt 0)

"Take a ranking page. Over-optimise it so it disappears. Now de-optimise it so it reappears. Now over-optimise it again."

Oh wow, I think we're going to need David Blaine or maybe Houdini.

stever




msg:150287
 11:54 am on May 26, 2004 (gmt 0)

>>Oh wow, I think we're going to need David Blaine or maybe Houdini.

LOL

But there was a serious point there.

Quite a number have said: "My page was over-optimised, I de-optimised it and it came back".

Which implies that a believer in "over-optimisation" subsequently realises what to "de-optimise" and thus what not to do to "over-optimise".

AnonyMouse




msg:150288
 3:15 pm on May 26, 2004 (gmt 0)

"Show me the money. Take a ranking page. Over-optimise it so it disappears. Now de-optimise it so it reappears. Now over-optimise it again. Now line up for an evening of drinks on me at Pubcon..."

You're on! As I said earlier on this thread, I have a PR7 site, which sat at positions between 4 and 6 in the SERPS for many 2-word phrases - but within the site, I had optimised for 3-word phrases. An example: optimising for "[location] usa hotels" resulted in great SERPS for "[location] hotels"...

...but I wanted to be in the top 3, and the guys above me had done *some* optimisation for "[location] hotels", so I changed the optimisation on my pages to "[location] hotels" rather than "[location] usa hotels" in the title/H1 etc etc. NOTHING ELSE WAS CHANGED. According to the accepted SEO rules, by targeting "[location] hotels" more closely, my SERPS should have improved - but no, they dropped between 5 and 20 positions depending on the term.

Please note (again) that when I refer to optimising my pages, I don't mean stuffing, I mean the usual title/H1/alt tags etc plus a smattering in the content.

I'm about to roll back my changes to that of e.g. "[location] usa hotels" - if my SERPS go back up, will you buy me that drink?!?

(BTW, great post MHes!)

[edited by: ciml at 3:35 pm (utc) on May 26, 2004]
[edit reason] Examplified. [/edit]

MHes




msg:150289
 3:52 pm on May 26, 2004 (gmt 0)

" There may well be an unnatural filter, in that it is highly unlikely; that a page about blue widgets on some lowly site will get 1000 links with those exact phrases as the anchor text etc. When has that ever happened naturally? Yeah that’s right, never."

Errr, I could give 1000 links to that page, no problem :)

stever -

"Yes, Google treats certain selections of words and phrases "differently". But where does this translate into "over-optimisation"?"

I may not be understanding you correctly, but if Google has decided that the search 'hotel in Spain' has become a spam paradise, it will isolate all sites that have an exact match of the phrase in recognised optimisation techniques, usually h1, title etc. It will then apply the oop filter.

"That implies that one can "de-tune" a web page and return to an optimum level of optimisation. And that is something that I have yet to see satisfactorily demonstrated in any work I have done or in looking at the work of others."

People here were ranked well, then dropped. If you avoid the filter by 'de-tuning', but maintain a degree of optimisation, you rank well and become optimised again! Semantics dictate that you are just 'optimising', but hopefully we are now beyond that silly argument.

"Far more likely, IMO, is that pages in certain keyword areas have to reach a certain standard of authority AS WELL AS do all the other things they did before (whether those standards are to do with semantic scope or differing kinds of links is another debate). "

'Authority' seems to just mean non-reciprocal links out pretending to be useful to the user. This probably explains why so many directory sites do well.... I can't see it lasting. Websites are no different to shops, and good shops do not put signs up saying 'try the shop next door'.

"None of these ideas are in contradiction to what you posted above - yet if we debate "over-optimisation" there is a great danger that it becomes an accepted wisdom just by being the subject of so many posts. "

I hope so. No smoke without fire.....

Wail




msg:150290
 3:56 pm on May 26, 2004 (gmt 0)

*cough*

Sorry.

Google for: [location] hotels.
That's a search for two words.

Google for: "[location] hotels".
That's a search for the phrase.

By taking "usa" out of your keyword real estate you flipped from words in proximity to adjacent words, from two words to the phrase.

Midhurst




msg:150291
 5:23 pm on May 26, 2004 (gmt 0)

DVDBurning.
I really don't think having the same anchor text on all pages referring to the home page, as you describe, is the problem.
I run a disease site, and I decided to optimise the site for ONE disease only. So I put on every disease page (about 100) "Back to Widget Disease Home Page". Of course the Home page had a lot about Widget Disease on it.
The site has been stable for Widget Disease at #10 in the serps before and since Florida. Nary a wobble.

But, this might interest you.

Another site called Fancy Widgeteer was doing well for Fancy Widgets at #5 in the serps, although for the more popular (ie competitive) term Fancy Widgeteer never rose higher than circa #30.
I decided to try and push Fancy Widgets to the top with on-page tweaking and adding more keywords to the text, but in good semantic context.
But I didn't bother to arrange with the outside websites linked to ours to change the INCOMING ANCHOR TEXT.
Result: the main keywords ON PAGE were now out of sync with the INCOMING ANCHOR TEXT.
Result: disaster. Went down to circa #50 for Fancy Widgets.
This was a classic example of bad optimisation, I would submit, not over-optimisation.
Was your internal anchor text just in sync with the title, or in sync with the front page content?
Regards
Midhurst

seomike2003




msg:150292
 5:35 pm on May 26, 2004 (gmt 0)

I got a solution!

Ok go to Google.com >> click about google at the bottom.
Under the Our search category >> click Google Services & Tools

At the very bottom of the page >> click Add Google to your Browser

At the very bottom of the page in the Macintosh OS X Service project >>
click press the Shift-Cmd-G link

It takes you to a doorway page that links out to various sites, all of which have a high PR due to the Google link. Some are page 1 for the link text.

WHY DON'T YOU ASK GOOGLE FOR A LINK FROM THEIR SPAM DOORWAY PAGE?

Good crosslinking is a key to getting the spider in there, but G is heavy on inbound high-PR links that are on topic. Once you get your site built and keywords in place, I'd say about 90% of doing well in the serps on hard terms from that point on is about inbound links.

stever




msg:150293
 5:56 pm on May 26, 2004 (gmt 0)

"I may not be understanding you correctly, but if Google has decided that the search 'hotel in <location>' has become a spam paradise, it will isolate all sites that have an exact match of the phrase in recognised optimisation techniques, usually h1, title etc. It will then apply the oop filter."

Hmm, I don't see that that is what it does (isolates specific sites (pages?) and then applies a filter).

I would argue that the 'hotel in <location>' search trips a different or additional algo for ALL relevant pages and that the trip is not due to over-optimisation of specific pages in the original results. Perhaps enough authority pages? Perhaps relative popularity of the search? Perhaps, as you say, that it has become "a spam paradise".

But my impression of what happens then is that results are held to another standard (or, if you will, a different algo or a filter).

If your page passes those extra hurdles, it doesn't matter two figs whether you "overoptimised" or not (in other words, it doesn't matter any more or less than it would in any other SERPs).

"'Authority' seems to just mean non-reciprocal links out pretending to be useful to the user. This probably explains why so many directory sites do well....."

That's not what I meant when I used that word - my observations lead me to believe that variety and quantity of inbound and outbound links relevant to the query, and scope of related and supporting content (which is what I meant by "semantic scope") are more likely to have a greater effect than any degrees of on-page or sitewide optimisation. But as I said before that's probably outside the scope of this thread.

DVDBurning




msg:150294
 5:58 pm on May 26, 2004 (gmt 0)

seomike2003,
Wow! Interesting find. How on earth would Google link to this? The resulting page has a PR of 9! And who are these guys?

I don't get it... but I wish I could get this kind of backlink.

seomike2003




msg:150295
 6:02 pm on May 26, 2004 (gmt 0)

me too :)

DVDBurning




msg:150296
 6:08 pm on May 26, 2004 (gmt 0)

seomike2003,
Did you actually download this Search Google tool? I guess it is a Macintosh CD disc image. Has anyone installed this tool?

Just amazing... Google links to this guy's site, and he has all kinds of spammy links from his site that offers this Google search tool... lawyers, hotels, pharmaceuticals.

Life isn't fair.

seomike2003




msg:150297
 6:11 pm on May 26, 2004 (gmt 0)

Actually this topic is pretty big news on other forums, just not this one.

DVDBurning




msg:150298
 6:19 pm on May 26, 2004 (gmt 0)

Midhurst,
Well, I have 2 sites that were dropped from the index recently. One was an industry related / hobby site (my WW nick might give you a clue). The other is a site about my home town where the domain is city-state.tld. In both cases, the biggest key phrase is a 2 word phrase, and in both cases I had the phrase in the title, description, H1 tag, and fairly frequently in the content. The content is written for humans, and it makes sense for what I am trying to say... it is hard to refer to the topic on each site without using the 2 word key phrase.

My hometown site was previously #2 for the city state search in Google. It is #2 in Y! It just recently was dropped from the index.

My industry / hobby related site is in a category of widgets that is full of spammy affiliate marketing or fraud sites. I have a feeling that special rules apply to related searches in this area.

Anyhow, I am reducing keyword density, and trying to fix it... but only time will tell what the problem is and what the fix turns out to be.

AnonyMouse




msg:150299
 6:25 pm on May 26, 2004 (gmt 0)

"By taking "usa" out of your keyword real estate you flipped from words in proximity to adjacent words, from two words to the phrase."

Don't you think it's a little weird that my SERPS were better for that phrase BEFORE I targeted the exact phrase?!?
