This 36 message thread spans 2 pages.
Some Supplemental URLs Now Rank in Google
12:25 pm on Jan 7, 2008 (gmt 0)
I have been seeing some movement on supplementals over the last few days. I noticed an increase in traffic on one of my sites since the 4th; having now investigated, it is due to thousands of my pages that certainly used to be supplemental and are now ranking fairly competitively.
From a few brief searches in areas I know, I am pretty sure that other sites have been given the same treatment, so I am a little surprised that there is no talk of it here yet.
I am wondering if other people are seeing this or if it is just my site that has got lucky.
[edited by: tedster at 6:31 pm (utc) on Jan. 7, 2008]
[edit reason] split from another thread [/edit]
5:01 am on Feb 14, 2008 (gmt 0)
I'm thinking that any PR gain for supplemental pages would be minimal with a noindex,follow attempt at PageRank sculpting - unless, of course, the website is rather small. Might be enough to pop some marginal urls out of Supplemental, but I would be surprised at any far-ranging effects.
I can see your point Ted.
But what about point number two: Allowing Google to spider through your site as normal, but telling Google not to index 20%-60% of your content in an effort to have your REAL content indexed correctly?
Here is my situation:
I have a site with 50,000 pages with only 2500 pages total that are not in the supplemental index.
Of the 50,000 pages, 20,000 are real content pages with value for the search engines.
The other 30,000 pages are of value to an end user and serve SEO purposes.
These are pages such as archives by date, tag clouds, category pages and HTML sitemaps, all of which help an end user while onsite and have SEO benefits, but they are never landing pages in the SERPs.
So why index them?
I guess what I am saying here is, I would much rather have my real content indexed and out of the supplemental index than have my sitemaps indexed.
And can I use the meta tag <meta name="robots" content="follow, noindex"> to help show Google what is of value and what is not?
[edited by: kamikaze_Optimizer at 5:34 am (utc) on Feb. 14, 2008]
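The "follow, noindex" approach described above can be sketched roughly as follows. This is a minimal illustration, not the poster's actual implementation; the URL prefixes and the `robots_meta` helper are hypothetical, standing in for whatever archive/tag/category/member URL structure a real site uses.

```python
# Hypothetical sketch: decide which robots meta tag a page should emit.
# Navigational pages (tags, archives, categories, member lists, sitemaps)
# get "follow, noindex" so crawlers still follow their links but do not
# index the page itself; content pages keep the default "follow, index".
# The prefixes below are assumptions for illustration only.

NOINDEX_PREFIXES = ("/tag/", "/archive/", "/category/", "/member/", "/sitemap")

def robots_meta(path: str) -> str:
    """Return the robots meta tag for a given URL path."""
    if path.startswith(NOINDEX_PREFIXES):
        return '<meta name="robots" content="follow, noindex">'
    return '<meta name="robots" content="follow, index">'

print(robots_meta("/tag/widgets"))             # navigational page
print(robots_meta("/articles/widget-review"))  # real content page
```

The point of `follow` here is that the excluded pages still pass link discovery to the real content pages, while `noindex` keeps them out of the index where they would otherwise sit as supplemental results.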
6:08 am on Feb 14, 2008 (gmt 0)
Sounds like a good experiment. Let us know how it works out, if you do try it. That certainly is a lot of "extraneous" content.
6:35 am on Feb 14, 2008 (gmt 0)
At this point, I only have room for improvement, and seeing that I am the Kamikaze Optimizer, here I go (Tora! Tora! Tora!). :)
I will report back on this, I assume in a few weeks.
Regarding the "extraneous" content:
Percentage-wise, it is actually less than that of the number one performing website on Google, Wikipedia, which has about 10+ supporting pages for each actual content page. But then again, I am not sitting on a PR8 url.
3:06 am on Feb 16, 2008 (gmt 0)
I did it.
On one site with 126,000 pages listed in Google but with only 10,700 in the main index, I just put the “follow, noindex” on 46,000 member pages, 1500 archive pages and 143 other pages.
This site actually has 143,000 pages in total.
On a second site with 53,700 pages listed in Google but with only 2740 in the main index, I just put the “follow, noindex” on 30,000 tag pages, 300 archive pages, 2000 category pages and 50 other pages.
This site actually has 50,000 pages in total.
My thinking is that it will take about a month to see if this has any positive/negative/neutral results.
[edited by: Robert_Charlton at 5:52 am (utc) on Feb. 16, 2008]
[edit reason] fixed typo per poster request [/edit]
5:18 pm on Apr 7, 2008 (gmt 0)
I see fewer supplementals, as mentioned in another thread (maybe 3% fewer); let's see what happens next, maybe 4%.
5:39 pm on Apr 7, 2008 (gmt 0)
Well, now Google has promoted a site from page two to #2 on page 1 here which:
- can't be accessed as the server is down
- hasn't been updated since 2004
- is in a business-related niche