
How to get a page removed from Google's supplemental index?

     
12:27 pm on Dec 22, 2008 (gmt 0)

Junior Member from IN 

5+ Year Member

joined:Dec 17, 2008
posts: 77
votes: 0


How can I get a page removed from Google's supplemental index?

4:04 pm on Dec 22, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 8, 2006
posts:1232
votes: 0


Maybe try setting <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
in the <head> for that specific page?
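
For illustration, a minimal sketch of where that tag would sit (the page name and wording here are hypothetical, not from this thread):

<html>
<head>
<title>Some Old Page - Example Site</title>
<!-- NOINDEX asks Google to drop the page from its index entirely;
     NOFOLLOW tells it not to follow the links on this page -->
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
</head>
<body>
...page content...
</body>
</html>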

Also, if you're not registered, sign up for Google Webmaster Tools...

See if that page is listed as a "sitelink"; if so, you can block it from there...

Don't know if that will work, but it's worth a look.

4:16 pm on Dec 22, 2008 (gmt 0)

Moderator from US 

WebmasterWorld Administrator travelin_cat is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Feb 28, 2004
posts:3199
votes: 9


Vicky,

Are you asking how to move it from the supplemental index into the main index, or how to remove it from Google entirely?

4:27 am on Dec 23, 2008 (gmt 0)

Junior Member from IN 

5+ Year Member

joined:Dec 17, 2008
posts:77
votes: 0


Yes, I am asking how to get a page moved from the supplemental index into the main index.

6:01 am on Dec 23, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


Three basics that I know of:

1. More PR for the URL - an external backlink or two works wonders, or improving the internal link structure to circulate more PR to the page.

2. A unique Title element, relevant to the page.

3. A unique meta description, relevant to the content of the page and concise (no more than 160 characters total) - see the sketch below for 2 and 3.
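
A minimal sketch of points 2 and 3 for a hypothetical page (the site name and wording are invented for illustration):

<head>
<title>Blue Widget Pricing - Example Widgets</title>
<meta name="description" content="Current prices, specs and bulk discounts on blue widgets, updated monthly.">
</head>

The idea is that every indexable page carries its own title/description pair like this, rather than sharing one boilerplate pair site-wide.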

If there are any canonical URL issues [webmasterworld.com] with the domain, get them patched up as much as you can, too. That will also help make the most of your PR.
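
One common canonical issue, sketched against a hypothetical example.com domain: internal links that mix several forms of the same address, which splits PR across duplicate URLs.

<!-- pick one preferred form and link to it consistently: -->
<a href="http://www.example.com/widgets/">Widgets</a>

<!-- variants like these count as separate URLs and dilute PR: -->
<!-- http://example.com/widgets/ -->
<!-- http://www.example.com/widgets/index.html -->

The usual fix is to standardize internal links on one form and 301-redirect the other forms to it.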

4:58 pm on Dec 29, 2008 (gmt 0)

Full Member

10+ Year Member

joined:Nov 10, 2005
posts: 240
votes: 0


I don't believe unique TITLE/meta description tags will help. Duplicate content is an issue in that people may link to multiple versions of the same content, diluting PageRank.

I assume there are various undisclosed factors, and I also assume the disclosed factors change over time, but two other factors that have been mentioned by Google are 1) URL complexity and 2) content staleness.

6:28 pm on Dec 29, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


Unique meta descriptions and title elements have definitely been an issue in borderline or "edge" cases - low PR specifically. I have restored scores of URLs to "findable" status by focusing attention on just these two factors.

The Supplemental Index, in its original incarnation at any rate, was not fully searchable. That is, not all of the content was analyzed and tagged for relevance, but key factors such as Title and Description always were. So if those two were not unique, the page looked more like something Google wouldn't include in the main search results.

7:15 pm on Dec 29, 2008 (gmt 0)

New User from ES 

5+ Year Member

joined:Aug 5, 2008
posts: 32
votes: 0


Hi Vicky,

Most common reasons a page goes supplemental (SPI):
- duplicate pages
- identical title tags and identical description meta tags
- not enough indexable content
- too many parameters in the URL (see the sketch after this list)
- the page is not linked to well enough to warrant regular crawling, or the page has no incoming links but remains indexed because it used to have links
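
To make the head and URL items concrete, a hedged before/after sketch (the example.com URLs and markup are hypothetical):

<!-- supplemental-prone: boilerplate head on a parameter-heavy URL -->
<!-- http://www.example.com/item.php?id=42&sess=af19&sort=1&ref=home -->
<head>
<title>Example Widgets</title>
</head>

<!-- friendlier: unique head on a short, static-looking URL -->
<!-- http://www.example.com/widgets/blue-widget.html -->
<head>
<title>Blue Widget - Example Widgets</title>
<meta name="description" content="Specs, photos and pricing for the blue widget.">
</head>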

4:12 pm on Dec 30, 2008 (gmt 0)

Full Member

10+ Year Member

joined:Nov 10, 2005
posts: 240
votes: 0


Unique meta descriptions and title elements have definitely been an issue in borderline or "edge" cases - low PR specifically. I have restored scores of URLs to "findable" status by focusing attention on just these two factors.

Tedster, you are also changing on-page content, which we do know influences a page's "findable" status.

It's also extremely difficult to isolate factors in an online SEO "experiment" because there are often unanticipated factors at play. For example, tests run to prove that Google drops the second link when two links point to the same URL failed because the testers didn't anticipate factors like identical anchor text.

A site's PageRank distribution also shifts like waves over time - it isn't constant even if you don't lay a finger on the site and even if its backlinks remain the same.

not all of the content was analyzed and tagged for relevance, but key factors such as Title and Description always were.

Right, but duplicate text isn't an issue, as Googlers have often reiterated.

The supplemental index is a way to prioritize crawl order: to pick up fresh content on high-authority sites quickly while shelving stale content on untrusted sites for later crawling.

In that process, content freshness, URL complexity and PageRank are three factors we know with absolute certainty influence a URL's supplemental status.

[edited by: Halfdeck at 4:17 pm (utc) on Dec. 30, 2008]

6:11 pm on Dec 30, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member jimbeetle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Oct 26, 2002
posts:3295
votes: 6


Put me solidly in the unique title and description camp; all the evidence I've seen over the past few years has pointed to those being crucial in getting pages crawled. If a page looks like a possible dupe during G's initial sniff test, Google isn't going to index it completely until the PR crosses whatever the threshold happens to be at that time.

6:57 pm on Dec 30, 2008 (gmt 0)

New User from ES 

5+ Year Member

joined:Aug 5, 2008
posts:32
votes: 0


Hi there,

I've run some tests to identify Google's SPI factors, covering a small site (15 pages) and a medium-sized site (1,500 pages) in two different languages, and they confirmed several of the factors mentioned in my last post. You should also keep in mind that Google works with several data centers around the world.

 
