Improving Google Webmaster Guidelines

Updates and ideas you wish to see added to the guidelines

     
9:00 am on May 23, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 6, 2005
posts:1678
votes: 71


Hi Folks!

As you might already know, the Google Webmaster Guidelines are suggestions that help webmasters understand the main principles of dealing with the Google index. At the top of the guidelines, we read:

.. these guidelines will help Google find, index, and rank your site. Even if you choose not to implement any of these suggestions, we strongly encourage you to pay very close attention to the "Quality Guidelines," which outline some of the illicit practices that may lead to a site being removed entirely from the Google index or otherwise penalized. If a site has been penalized, it may no longer show up in results on Google.com or on any of Google's partner sites.

Because Google -- and especially the Search Quality / WebSpam Team -- penalizes sites that do not comply with the guidelines, it's important to make those guidelines easy to understand and to keep them updated as new factors affecting ranking and indexing emerge.

Personally, I'd like to see the following two points added to the guidelines ASAP:

-More details and tips about how to file a reinclusion request. At present there is very valuable relevant info on Matt Cutts' blog, and I'd like to see it added to the guidelines.

-A paragraph covering Google's position regarding paid links.

Anything else you wish to see updated or added to the guidelines?

11:20 pm on May 23, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 28, 2005
posts:3053
votes: 16


I think G needs to separate the guidelines into basic and advanced [in-depth explanations]. The advanced guidelines could be communicated inside "Webmaster Tools".

On specifics:

- Duplicate content guidelines with multiple TLDs [webmasterworld.com...]
- Interlinking of sites, what's permissible, what's not.

1:50 am on May 24, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


Adam Lasnik's comments about thin affiliate pages [webmasterworld.com], plus boilerplate and stub pages [webmasterworld.com], would be good to broadcast more widely, too.

6:39 am on May 24, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 6, 2005
posts:1678
votes: 71


Search geeks certainly know the importance of the Google Webmaster Guidelines [google.com] and where to find them on Google's site. But for the rest of webmasters and site owners, finding those guidelines may not be an easy task.

Therefore I suggest adding a link to the guidelines on Google's front page, either beside or just under the search box.

That way Google would also signal the importance of following those guidelines.

8:42 pm on June 5, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


Google has now updated their guidelines -- the wording looks a lot more explicit and clear, and it no longer feels like they're saying everyone is a spammer until proven innocent!

[google.com...]

9:06 pm on June 5, 2007 (gmt 0)

Junior Member

10+ Year Member

joined:Jan 25, 2006
posts:106
votes: 0


Very nice additions, including an official stance on paid links that calls for reporting them. It also hints at an official statement that exchanged links are in the same boat.

[google.com...]

Google works hard to ensure that it fully discounts links intended to manipulate search engine results, such as link exchanges and purchased links.

9:12 pm on June 5, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 6, 2005
posts:1678
votes: 71


Thanks for the heads up, tedster.

And it seems that they have improved the section under:

Quality guidelines - specific guidelines

by linking to more info.

I still wish to see a link to the guidelines somewhere around the search box on Google's front page. That would signal that GOOG means business when asking webmasters and site owners to follow the Google Webmaster Guidelines or else... ;-)

5:19 am on June 6, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 28, 2005
posts:3053
votes: 16


Reseller - I just want to congratulate you on what appears to be a great response from G to your initiative here, or maybe it was just timely ;)

I found the following improved guidelines very useful and more explicit:

-Duplicate Content - [google.com...]
-Thin affiliates - [google.com...]

Are there any other major clarifications and improvements noted?

What I still find ambiguous is the use of "paid" links, and I started a thread over here to discuss it [webmasterworld.com...]

[edited by: Whitey at 5:20 am (utc) on June 6, 2007]

5:47 am on June 6, 2007 (gmt 0)

Full Member

5+ Year Member

joined:Apr 30, 2006
posts:349
votes: 0


Google works hard to ensure that it fully discounts links intended to manipulate search engine results, such as link exchanges and purchased links.

How far does Google go in discounting link exchanges? Is the value of an incoming link from an exchange partner similar to that of a "nofollow" incoming link?

If I link to a partner's homepage from http://www.example.com/directory/page1.html and my partner links back to my homepage from his (i) homepage or (ii) his internal pages, will this also be discounted?

Or will it only be discounted if the incoming and outgoing links point at each other? In this case, http://www.example.com/ pointing to his homepage, and his homepage pointing back at my homepage (i.e. the site that carries his backlink).

6:33 am on June 6, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


I am pretty sure that there is no one-size-fits-all answer in this area of link weighting and devaluing. Just this week at SMX, Matt Cutts made a comment that they take the market niche's common practices into consideration when looking at the linking profile for a domain. So a retail store might not "get away with" the same linking practices that a real estate site might.

7:31 am on June 6, 2007 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 6, 2005
posts:1678
votes: 71


The section How do I request reinclusion of my site? [google.com], which I consider a very important part of the guidelines, is still very poor; it lacks sufficient info and tips.

IMO, Matt Cutts' blog still contains better relevant info than the official Webmaster Guidelines. For example, Filing a reinclusion request [mattcutts.com].

Hopefully our good friends at the plex will pay more attention to that matter next time they update the guidelines ;-)

12:06 pm on June 7, 2007 (gmt 0)

Preferred Member

10+ Year Member

joined:Oct 19, 2004
posts:351
votes: 0


WOW,

Google's explaining things in words that even I can understand.

Vimes.

12:39 pm on June 7, 2007 (gmt 0)

Junior Member

10+ Year Member

joined:June 29, 2004
posts:81
votes: 0


This is very good info. But what I take from it is that thematic links are still worth pursuing.

It all comes down to this: we should look for thematic links.

1:33 pm on June 7, 2007 (gmt 0)

Full Member

10+ Year Member

joined:Feb 27, 2003
posts:298
votes: 0


Guidelines to identify a penalised site so that webmasters can avoid linking to it; in other words, more information on linking to bad neighbourhoods.

2:38 pm on June 7, 2007 (gmt 0)

Junior Member

10+ Year Member

joined:June 29, 2004
posts:81
votes: 0


Guidelines to identify a penalised site so that webmasters can avoid linking to it; in other words, more information on linking to bad neighbourhoods.

It all comes down to what you've got to look for:
- PR (beware: people also fake PR).
- How many pages of the site are indexed, and whether the page that will carry your link is cached.
- The linking methodology they are using.
- How thematic the site is for your industry, because one thematic link is better than 10 worthless links.
- Lastly, understanding the behavior of the bots on your website and on the linking website.

Cheers!

3:38 pm on June 7, 2007 (gmt 0)

Junior Member

5+ Year Member

joined:June 12, 2006
posts:72
votes: 0


Automated Queries guidelines
"Google's Terms of Service do not allow the sending of automated queries of any sort to our system without express permission in advance from Google. Sending automated queries absorbs resources and includes using any software (such as WebPosition Gold™) to send automated queries to Google to determine how a website or webpage ranks in Google search results for various queries."

I run WebPositionGold once a week for my clients. How could GG possibly enforce this, since anyone could run multiple automated queries a day using a competitor's domain? Don't get me wrong, I understand they don't want the extra resource usage, but is this a punishable offense or a favor Google is asking?

What alternatives should I consider beyond looking at the sitemap data in the webmaster area or hand checkin'?

Thanks,
Greg

12:58 am on June 8, 2007 (gmt 0)

Preferred Member

10+ Year Member

joined:June 10, 2003
posts:410
votes: 0


So... an honest question for the group. We check our SERPs nightly for a small number of terms, and yes, we do it programmatically. We issue perhaps 30 queries a night covering the main keywords across several different sites. We do it nicely: we wait a few seconds between each hit and request 100 results per page, unless we know most of our terms are on the first page, in which case we request only 10. We have been doing this for several years across several sites; so far we have seen no negative implications.

Needless to say, this is valuable, and more convenient than doing it manually, which I would probably do otherwise, for example by setting up a set of Firefox tabs to open in a single click. I don't believe doing the same thing manually would violate the terms of use.

We're also trying to cover our tracks a little. Our software sets a single user agent that matches a browser's signature, and we also randomize the hits. Both are done in a feeble attempt to look human, since of course all the queries come from the same IP address.
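For illustration, the kind of throttled nightly check described above might be sketched like this in Python. The engine host, parameter names, user-agent string, and keyword list are all hypothetical placeholders, and (as discussed in this thread) running such queries against Google would still violate its Terms of Service:

```python
# Minimal sketch of a throttled, randomized rank check (hypothetical).
import random
import time
import urllib.parse
import urllib.request

# A browser-like User-Agent string (hypothetical example).
USER_AGENT = "Mozilla/5.0 (Windows NT 5.1; rv:1.8) Gecko/20070515 Firefox/2.0"

def build_serp_url(query, num=100):
    """Build a results-page URL; host and parameter names are placeholders."""
    return "http://www.example-engine.com/search?" + urllib.parse.urlencode(
        {"q": query, "num": num}
    )

def fetch_serp(query, num=100):
    """Fetch one results page while identifying as an ordinary browser."""
    req = urllib.request.Request(
        build_serp_url(query, num), headers={"User-Agent": USER_AGENT}
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

def nightly_check(keywords):
    """Run the whole keyword list once, politely spaced out."""
    random.shuffle(keywords)                 # randomize the order of the hits
    for kw in keywords:
        html = fetch_serp(kw)                # ~30 queries total per night
        # ... scan `html` for our own domain's position here ...
        time.sleep(3 + random.random() * 5)  # wait a few seconds between hits
```

The shuffle and the randomized sleep correspond to the "randomize the hits" and "wait a few seconds" steps; everything still arrives from one IP, which is why the post calls the disguise feeble.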

So this is clearly in violation of Google's terms of service, and I recognize this.

My question is whether this is more like going 50 MPH in a 45 MPH zone, which most cops would not care about (and yes, I would hit the brakes if I saw the cop, in a blatant attempt to disguise my crime). Or, if anyone actually noticed, would it potentially be something we could get dinged for? Maybe a nice note to our webmaster account?

Any opinions as to whether this is going to bite us in the butt, or whether gentle use such as ours would be overlooked even if it were noticed?

Thx.

2:27 am on June 8, 2007 (gmt 0)

Preferred Member

10+ Year Member

joined:Mar 1, 2004
posts:433
votes: 3


What alternatives should I consider

Consider using Google's API. That is the way Google would like you to run your automated queries.