Forum Moderators: Robert Charlton & goodroi

Focusing site with <meta name="robots" content="none">

worthwhile or harmful? (I'm using this a lot now.)


Tonearm

5:21 pm on Aug 4, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've decided to remove pages from Google's index that I don't think are valuable to searchers, by adding this tag to the pages:

<meta name="robots" content="none">

I'm removing pages like these:

About Us
Log In
View Shopping Cart
Checkout
New Products
Best Sellers

as well as a myriad of URLs that have somewhat unique content, but aren't at all focused and bring in "unfocused" traffic.
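
For reference, here's how the tag sits in the head of each of these pages ("none" is just shorthand for "noindex, nofollow" - don't index the page, don't follow its links). The title here is only an example:

<head>
<title>View Shopping Cart</title>
<!-- "none" = "noindex, nofollow" -->
<meta name="robots" content="none">
</head>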

I'm hoping this will keep Google focused on my pages that are valuable to searchers, and thereby increase their visibility. Is this worthwhile, or could I be doing more harm than good?

tedster

7:08 pm on Aug 4, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I can see how Login, Checkout, and View Shopping Cart pages might seem superfluous. But especially for the others, my preference would be to make those pages valuable for searchers, rather than to keep them out of Google's index.

Ever notice that when Google awards SiteLinks to the #1 domain on a search, these are often some of the pages that they feature? Yes, this might be because they are often universally linked within the site. But still, why would Google continue to include them in the SiteLinks algorithm if they do not hold value for their end user? As a consumer, I have clicked on the "About Us" SiteLink many times.

My preference is to let Google index as it wants, as much as possible. I've seen it over and over - as a general rule, actions that attempt to control Google too tightly can have an unforeseen downside. Yes, there certainly are pages that must remain private - but the pages you mention aren't of that type.

There is also a kind of "bootstrap" effect that pages indexed from within the same domain have for each other. The PageRank equation makes no mention of "domains"; it only talks about "pages". One of our members here once called it something like "internally generated PR". I'm reluctant to remove any page from Google's index that is not a true problem when viewed by the general public.
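
For reference, the equation from the original Brin and Page paper is (T1...Tn are the pages linking to page A, C(T) is the number of outgoing links on page T, and d is the damping factor, usually set around 0.85):

PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

Every term there is a page - a link from a page on your own domain feeds PR into the calculation exactly the same way a link from anywhere else does.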

I am also aware that many people are currently doing this kind of page de-indexing. It seems like a fad to me - the SEO flavor of the month. The thinking behind it seems to be related to "PageRank leakage" or something like that. I think it's misguided - not disastrous, just misguided.

Tonearm

8:17 pm on Aug 4, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Internally generated PR... I see what you mean about PageRank being defined in terms of pages rather than domains. That sounds like something that could be easily abused, though.

Reno

8:46 pm on Aug 4, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



As usual tedster, your commentary is very insightful. I do have a brief comment/question...

It seems that if a siteowner is doing OK with Google -- decent traffic, minimal pages in supplementals, etc. -- then using meta tags and/or robots.txt to alter the movement of Googlebot may be counterproductive. In other words, leave well enough alone.

However, if the site has been on the web for a while and is getting little or no traffic from Google, then wouldn't this sort of approach be something to consider? The idea being, "when ya' ain't got nothing, ya' got nothing to lose" (to quote Mr Bob).

I'm certainly not suggesting anything against the guidelines, such as hidden text, link farming, etc. Instead, I'm wondering if "shaking up" the bot's normal indexing might produce a more positive response - especially if, as I said, the siteowner has not seen any measurable traffic from Google over a reasonable length of time.

Is there anything to that line of thinking?

..................................................

tedster

9:59 pm on Aug 4, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes, I can see that point of view. If you're doing everything you can think of and still have no workable results, then why not try something else? However, if you're doing poorly already, removing a few URLs from Google will probably not generate a windfall for you.

My concern is that so many people who are actually doing OK will try to tweak that "OK" into something better by trying almost everything they hear or read about - and they shoot themselves in the foot.

There was a near-phobia about the Supplemental Results tag. A year ago - with Google's buggy initial implementation of the Supplemental Index - there may have been some good reason for those fears, but not now.

The keys to success rarely lie in some arcane corner of knowledge. The basics of a good website (and a good business altogether) are almost always where long term improvements need to happen.

SEO has a kind of "legacy of tricks". Especially in the '90s, there were all kinds of easily exploitable holes in the search engine algos of the day, and many of us drove trucks through those holes. But that legacy still drives people to look for the sneaky quick fix today, rather than putting their energies toward solid business development. That's a very short-term plan, IMO, and not really the way forward.

Tonearm

10:22 pm on Aug 4, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



tedster,

While I agree with what you're saying, I wouldn't consider this a trick or a hole in the algo. It seems more like another tool with which to communicate with Google.

I think this whole issue comes down to two opposing theories:

1. PR is generated internally, so the more pages linking to each other, the better.

2. Internal PR is somewhat finite, so indexed pages should be limited to those likely to draw quality traffic.

edit: There is another possible issue raised here:

[webmasterworld.com...]

and if Google does throttle your traffic, you would want your "quota" filled with the highest-quality traffic possible.

[edited by: Tonearm at 10:26 pm (utc) on Aug. 4, 2007]

Reno

10:49 pm on Aug 4, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Thanks tedster for your thoughts. But I'm still pondering Tonearm's quote:
>> It seems more like another tool with which to communicate with Google. <<

That makes real sense to me, in that just about every website on the internet has pages with no real "indexable value", such as "contact us", "trade links with us", "report dead links", etc.

So I too was thinking -- perhaps mistakenly -- that I'd be doing Google a favor in saying "don't bother with these pages -- there is nothing there that is worthy of your index".

Let's say, for example, that there are 100 million different websites, and each of them told Google it had 3 pages of no real search value. Google would then save the space/time it takes to deal with 300,000,000 pages. Seems like that would be a good thing (not that I'm expecting a pat on the back -- just trying to be helpful!)

Are we off-base here?

........................................................

g1smd

12:24 am on Aug 5, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>> A year ago - with Google's buggy initial execution of the Supplemental Index - there may have been some good reason for those fears, but not now. <<

I still believe there are issues if you haven't resolved at least the "/" vs. "/index.html" and the "www" vs. "non-www" problems, as far as duplicate content goes.
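
On Apache, the usual fix is a pair of 301 redirects in .htaccess - a rough sketch, with example.com standing in for your real hostname:

RewriteEngine On
# www vs non-www: force the www hostname
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
# "/" vs "/index.html": redirect explicit index.html requests to the root
# (the THE_REQUEST check stops mod_dir's internal subrequest from looping)
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html
RewriteRule ^index\.html$ http://www.example.com/ [R=301,L]

Once both versions resolve to a single URL, the duplicate listings tend to consolidate on their own.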

Tonearm

12:47 am on Aug 5, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



g1smd,

I'm curious what you think about using the above mentioned meta tag.

g1smd

4:46 pm on Aug 5, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I wouldn't show login and checkout pages, but I would show the others.

StickyNote

9:22 pm on Aug 5, 2007 (gmt 0)

10+ Year Member



Personally, I prefer to keep any 'shopping cart' pages out of the index. This is not so much for SEO reasons, but rather to keep anyone from finding my sites via cart pages. I have had too many people looking for particular loopholes in shopping cart backends, and too few paying customers coming in from shopping cart search results.

I usually put them in a robots-excluded directory, or better yet on another domain/subdomain.
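
For example, if the cart pages live under their own directory, a couple of lines in robots.txt take care of it (the directory names here are only examples):

User-agent: *
Disallow: /cart/
Disallow: /checkout/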

'New items', 'best sellers', etc. can actually bring in people you really want at your site.

willybfriendly

10:06 pm on Aug 5, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Tedster, sometimes it is a dup content issue. We have the following (in part) on a shopping site:

Disallow: /login.php
Disallow: /shopping_cart.php
Disallow: /create_account.php
Disallow: /product_reviews_write.php

We have also modified the script so that review pages without actual reviews "don't exist".

Making the above changes had a profound effect within 30 days of implementation.

Empty product review pages are a perfect example of duplicate content - sometimes hundreds or even thousands of pages with different URLs and identical content.
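
For anyone wondering about the "don't exist" part, here is a rough sketch of the idea at the top of the review page script - get_reviews_for_product() and $product_id are stand-ins for whatever your cart software actually uses:

<?php
// If this product has no reviews yet, answer with a real 404
// before rendering anything, so there is no page left to index.
$reviews = get_reviews_for_product($product_id); // stand-in lookup
if (count($reviews) == 0) {
    header('HTTP/1.1 404 Not Found');
    exit;
}
?>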

tedster

10:17 pm on Aug 5, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



That makes sense to me, willy.