
Google SEO News and Discussion Forum

    
How to get a page removed from Google's supplemental index?
vicky
msg:3812687 - 12:27 pm on Dec 22, 2008 (gmt 0)

How do I get a page removed from Google's supplemental index?

 

tonynoriega
msg:3812813 - 4:04 pm on Dec 22, 2008 (gmt 0)

Maybe try setting <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
in the <head> for that specific page?

Also, if you're not registered, sign up for Google Webmaster Tools...

See if that page is listed as a "sitelink" - you can block it from there...

Don't know if that will work, but it's worth a look.
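A minimal placement sketch, assuming the goal really is to keep that one page out of Google entirely (the title and surrounding markup are purely illustrative):

<html>
<head>
<!-- illustrative page - the noindex, nofollow meta asks Google not to index this URL or follow its links -->
<title>Example page to keep out of the index</title>
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
</head>
<body>
...page content...
</body>
</html>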

travelin cat
msg:3812836 - 4:16 pm on Dec 22, 2008 (gmt 0)

Vicky,

Are you asking how to move it from the supplemental index into the main index, or how to remove it entirely from Google?

vicky
msg:3813166 - 4:27 am on Dec 23, 2008 (gmt 0)

Yes, I am asking how to get a page moved from the supplemental index to the main index.

tedster
msg:3813196 - 6:01 am on Dec 23, 2008 (gmt 0)

Three basics that I know of:

1. More PR for the URL - an external backlink or two works wonders, as does improving the internal link structure to circulate more PR to the page.

2. Unique Title element, relevant to the page.

3. Unique meta description, relevant to the content of the page and concise (no more than 160 total characters).

If there are any canonical URL issues [webmasterworld.com] with the domain, get them patched up as much as you can, too. That will also help make the most of your PR.
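For points 2 and 3, a quick sketch of what a page-specific head might look like (the page topic and wording are invented for illustration - the key is that no other page on the site reuses them, and the description stays under 160 characters):

<head>
<!-- hypothetical example - title and description are unique to this one page -->
<title>Blue Widget Sizes and Prices - Example Store</title>
<meta name="description" content="Compare sizes, colours and prices for blue widgets, with shipping options and volume discounts.">
</head>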

Halfdeck
msg:3815669 - 4:58 pm on Dec 29, 2008 (gmt 0)

I don't believe unique TITLE/META Description will help. Duplicate content is an issue in that people may link to multiple versions of the same content, diluting PageRank.

I assume there are various undisclosed factors, and I also assume disclosed factors change over time, but two other factors that have been mentioned by Google are 1) URL complexity and 2) content staleness.

tedster
msg:3815727 - 6:28 pm on Dec 29, 2008 (gmt 0)

Unique meta descriptions and title elements have definitely been an issue in borderline or "edge" cases - low PR specifically. I have restored scores of urls to "findable" status by putting attention on just these two factors.

The Supplemental Index, in its original incarnation at any rate, was not fully searchable. That is, all the content was not analyzed and tagged for relevance, but key factors such as Title and Description always were. So if those two were not unique, the page looked more like something Google wouldn't include in the main search results.

tibiritabara
msg:3815756 - 7:15 pm on Dec 29, 2008 (gmt 0)

Hi Vicky,

The most common problems that send a page to the supplemental index (SPI):
- duplicate pages
- identical title tags and identical description meta tags
- not enough indexable content
- too many parameters in the URL
- the page is not linked to well enough to warrant regular crawling, or the page has no incoming links but remains indexed because it used to have links (see the sketch after this list)
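A rough sketch of the last two points (all URLs here are made up): a parameter-heavy URL is more likely to sit in the supplemental index than a simpler one that is linked from a well-crawled page.

<!-- parameter-heavy URL - more likely to be crawled with low priority -->
<a href="/product.php?id=123&amp;cat=7&amp;sessionid=abc123">Blue widget</a>

<!-- simpler URL, linked internally from a well-crawled page so PR flows to it -->
<a href="/widgets/blue-widget.html">Blue widget</a>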

Halfdeck
msg:3816303 - 4:12 pm on Dec 30, 2008 (gmt 0)

Unique meta descriptions and title elements have definitely been an issue in borderline or "edge" cases - low PR specifically. I have restored scores of urls to "findable" status by putting attention on just these two factors.

Tedster, you are also changing on-page content, which we do know influences a page's "findable" status.

It's also extremely difficult to isolate factors in an online SEO "experiment" because there are often unanticipated factors at play. For example, tests run to prove that Google dropped the second link when there are two links pointing to the same URL failed because the testers didn't anticipate factors like identical anchor text.

A site's PageRank distribution also shifts like waves over time - it isn't a constant even if you don't lay one finger on a site and even if backlinks to the site remain the same.

all the content was not analyzed and tagged for relevance, but key factors such as Title and Description always were.

Right, but duplicate text isn't an issue, as Googlers have often reiterated.

The supplemental index is a way to prioritize crawl order: to pick up fresh content on high-authority sites quickly while shelving stale content on untrusted sites for later crawling.

In that process, content freshness, URL complexity and PageRank are three factors we know with absolute certainty influence a URL's supplemental status.

[edited by: Halfdeck at 4:17 pm (utc) on Dec. 30, 2008]

jimbeetle
msg:3816403 - 6:11 pm on Dec 30, 2008 (gmt 0)

Put me solidly in the unique title and description camp; all the evidence I've seen over the past few years has pointed to those being crucial in getting pages crawled. If a page looks like a possible dupe during G's initial sniff test, Google isn't going to index it completely until the PR crosses whatever the threshold happens to be at that time.

tibiritabara
msg:3816436 - 6:57 pm on Dec 30, 2008 (gmt 0)

Hi there,

I've run some tests to identify Google SPI factors, covering a small site (15 pages) and a medium-sized site (1,500 pages) in two different languages, and I've identified the several factors mentioned in my last post. You should also consider that Google works with several data centers around the world.
