Forum Moderators: Robert Charlton & goodroi


Again supplemental results

supplemental results


rusmart

2:02 am on Jun 23, 2006 (gmt 0)

10+ Year Member



Please tell me why all my pages show as Google supplemental results.

I even removed old pages from Google, and now only new pages remain. They are just 5 months old and still supplemental. What do I need to do?

tedster

3:13 am on Jun 23, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Each site's situation can have its own reasons -- but as a general rule, I find that paying attention to title tags and meta descriptions can work miracles. Both titles and meta descriptions should be unique and exactly tuned to the details of the page itself, not the whole site. Keep the title tag relatively short, and the meta description more extended.

As I mentioned above, make sure not to fill up either element with general wording that really applies to the entire website rather than the specific page. If these two tags look too similar from page to page, then Google in recent months has been dropping such URLs into the Supplemental index.
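For anyone wanting to check their own site against this advice, here is a rough audit sketch that flags title tags and meta descriptions shared by more than one page. Python is used purely for illustration (the thread's sites are ASP), the HTML parsing is deliberately crude, and all function names are hypothetical:

```python
# Hypothetical audit sketch: flag pages whose <title> or meta description
# is duplicated across a site, per the advice above. "pages" is a dict of
# url -> raw HTML; a real audit would crawl the live site instead.
import re
from collections import defaultdict

def extract_title_and_description(html):
    """Pull the <title> text and meta description out of raw HTML."""
    title = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    desc = re.search(
        r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
        html, re.I | re.S)
    return (title.group(1).strip() if title else "",
            desc.group(1).strip() if desc else "")

def find_duplicate_tags(pages):
    """Return every (tag kind, tag text) pair shared by 2+ URLs."""
    seen = defaultdict(list)
    for url, html in pages.items():
        title, desc = extract_title_and_description(html)
        seen[("title", title)].append(url)
        seen[("description", desc)].append(url)
    return {tag: urls for tag, urls in seen.items() if len(urls) > 1}
```

Anything this reports is a candidate for rewriting so each page gets its own title and description.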

rjwmotor

5:01 am on Jun 23, 2006 (gmt 0)

10+ Year Member



"similar title and description tags"

What to do then? If pages go supplemental, can they be revived? I believe this is what may have happened to one of my sites.

tedster

5:13 am on Jun 23, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes, I have recently revived pages for two sites with this approach.

benc007

10:06 am on Jun 23, 2006 (gmt 0)

10+ Year Member



tedster,

How did you revive pages that were previously in Google's supplemental index? I am having this problem: the same description meta tag is used for every instance of a page, ABC.asp.

ABC.asp is counted as supplemental for each instance.

eg.

ABC.asp?Id=126
ABC.asp?Id=23432
ABC.asp?Id=33

If I change the description tag so that it is unique, will G remove these pages from the supplemental index? Do I have to email them to do this?

I am also in the process of finishing a project that rewrites the ABC.asp page from:

ABC.asp?Id=126 into ABC_126.asp
ABC.asp?Id=23432 into ABC_23432.asp
ABC.asp?Id=33 into ABC_33.asp

and the old ABC.asp?Id=someNumber will return a 302 Object moved status. What do you think?
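One note on the redirect status: in HTTP, 302 signals a temporary move, while 301 ("Moved Permanently") tells crawlers the old URL is permanently replaced, which is usually what you want for a one-way rewrite like this. A minimal sketch of the mapping described above (Python/WSGI used only for illustration, since the poster's site is ASP; the helper names are hypothetical):

```python
# Sketch of the rewrite described above: map the legacy query-string
# URL onto the new static-looking one and answer with a 301 redirect.
from urllib.parse import urlparse, parse_qs

def rewrite_old_url(path_and_query):
    """'/ABC.asp?Id=126' -> '/ABC_126.asp'; None if it doesn't match."""
    parsed = urlparse(path_and_query)
    if parsed.path.endswith("ABC.asp"):
        ids = parse_qs(parsed.query).get("Id")
        if ids:
            return parsed.path[:-len("ABC.asp")] + f"ABC_{ids[0]}.asp"
    return None

def app(environ, start_response):
    """Minimal WSGI handler: 301 legacy URLs to the new form."""
    target = rewrite_old_url(environ.get("PATH_INFO", "") + "?" +
                             environ.get("QUERY_STRING", ""))
    if target:
        start_response("301 Moved Permanently", [("Location", target)])
        return [b""]
    # Non-legacy URLs would fall through to normal page handling here.
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"not found"]
```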

Halfdeck

12:44 pm on Jun 23, 2006 (gmt 0)

10+ Year Member



I'm also seeing more thin content pages (100 words or less) ending up in the supplemental index, irregardless of title/meta description. So to be on the safe side, on top of tuning your title/metas I'd also beef up word count on every page.
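The 100-word rule of thumb above is easy to spot-check mechanically. A quick sketch (Python for illustration only; the tag stripping is crude and the threshold is the poster's guess, not anything Google has confirmed):

```python
# Rough "thin content" check per the 100-word rule of thumb above.
# Strips scripts, styles, and tags crudely before counting words.
import re

def visible_word_count(html):
    """Count words after removing scripts, styles, and markup."""
    html = re.sub(r"<(script|style)[^>]*>.*?</\1>", " ", html,
                  flags=re.I | re.S)
    text = re.sub(r"<[^>]+>", " ", html)
    return len(text.split())

def is_thin(html, threshold=100):
    return visible_word_count(html) < threshold
```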

asiaseo

1:04 pm on Jun 23, 2006 (gmt 0)

10+ Year Member



On one site I deal with, we have lots of pages showing as supplemental, with a cache from 6 months ago. If I take a unique line of text as the search term, I find the page is 'not' supplemental, and when I check the cache date it's 2 weeks ago.
So there seem to be 2 different caches of the same page.
I guess this may have been answered 'somewhere' but would be pleased if someone could explain.

These pages are unique in content, and their only link is to the main index. More 'frustrating' is that these pages are in an Asian script, so it's amazing that Google can even decide they're supplemental.
Every other new page indexed within the last month, on various sites (ours and those I have been asked to monitor), has gone straight supplemental.

malachite

1:09 pm on Jun 23, 2006 (gmt 0)

10+ Year Member



Both titles and meta descriptions should be unique and exactly tuned to the details of the page itself, not the whole site. Keep the title tag relatively short, and the meta description more extended.

Agree with the theory, but recently Google has been completely ignoring the title tag and meta description on some sites. A site: search that used to show all my pages individually, each with a unique description, now shows no unique meta descriptions.

If these two tags look too similar from page to page, then Google in recent months has been dropping such URLs into the Supplemental index.

Again, agree with the theory, but in practice Google is currently putting a lot of pages with unique meta descriptions and completely unique content into the supplemental index.

I'm also seeing more thin content pages (100 words or less) ending up in the supplemental index, irregardless of title/meta description.

Irregardless? What a lovely word! It's not just pages short on content that are going supplemental. One of my sites, which contains 100% unique content and long articles, has gone supplemental since the beginning of May.

Halfdeck

1:31 pm on Jun 23, 2006 (gmt 0)

10+ Year Member



It's not just pages short on content that are going supplemental.

Obviously. That's not what I said.

One of my sites, which contains 100% unique content and long articles, has gone supplemental since the beginning of May.

Not a rare occurrence.

randle

2:17 pm on Jun 23, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Having some supplemental struggles ourselves on a few sites.

I know this is probably a really stupid question, but why do they have this thing? Conceptually I understand it, but I never really understood why they created it. It just seems like it's causing more problems than it solves. It's fine to have a junk drawer of the internet where you throw everything you don't really need but don't want to get rid of (I've got one of those drawers in my kitchen), but I would appreciate it if they wouldn't throw my stuff in there. It does appear as if BD has vastly expanded what Google thinks should go in there.

Wouldn't it be a lot simpler to either have a page in the index or not?

Anyone else ever really wonder why this thing exists? I don't see it in MSN or Yahoo.

Quadrille

4:00 pm on Jun 23, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There's no way to know exactly why google uses the 'supplemental' thing so much.

I'm guessing there are two main motivators, and they are related.

Recent fashions (blogging, article publishing, the most recent spam methods) all involve plastering the web with multiple copies (often thousands of copies) of the same machine-generated twaddle. All this may provide the occasional author with a very slight boost to their page rank (dread term), but mostly it is simply clutter, which threatens to paralyze Google, and eventually the others.

As the 'leader', Google is the one this twaddle is tailor-made for, so they will inevitably be worse off than the others (like MS Windows/IE getting more viruses than Linux, etc.).

Google needs to separate the wheat from the chaff, without giving detailed feedback for spammers. That's why 'total penalties' are getting rarer, and 'markdowns' much more common. Which in turn, explains why spammers are increasingly going for throwing multiple stuff at the fan, hoping some sticks, as more sophisticated methods are being undermined.

Like every other attempt to clean the Augean stables, there is collateral damage. We wait and cross our fingers.

Very largely guesswork - but much of it fits the facts.

LuckyGuy

8:22 am on Jun 24, 2006 (gmt 0)

10+ Year Member



Don't make me laugh. Google says in their guidelines: make pages for visitors, not for search engines. But what I see ranking high in the SERPs is low-quality pages created only for search engines. They all have in common: low quality, little text, poor navigation.
Google has heavy problems, and all they do is make webmasters responsible for the bad SERPs. And we are supposed to believe it and try to find out what we are doing wrong. But Yahoo and even MSN don't have these problems and work very well with our pages. IMO, at the moment they do a better job than Google does.
Google was working quite well with our pages and was able to find on-page content, but now they have turned their content algo off.

I made pages for visitors, with more navigation on each page and lots of content. Title tags and METAs all different. No link exchange, no black-hat techniques. Vanished. Thanks a lot, Google. It's time to pay attention to the dark side of the web.....

Quadrille

8:55 am on Jun 24, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's no laughing matter.

But Google didn't make those pages, which were deliberately made to exploit loopholes in their algo.

We call it 'spam' ;)

Because Google is #1, most spammers aim specifically at Google. It's not that the others are better at dealing with spam (they have it too), just that 95% of it is aimed at Google.

What makes you think that Google "makes webmasters responsible"? That's quite new to me.

I wish you luck in your new career; but before you sign up for your black hat, look again at those serps.

Yes, spam often - too often - gets in at the top. But look at the bottom - there's much, much, much more there. Some spammers succeed, for a while - many never do. And most decent sites that get clobbered in the crossfire recover when Google fine tunes the algo fixes.

LuckyGuy

9:39 am on Jun 25, 2006 (gmt 0)

10+ Year Member



"And most decent sites that get clobbered in the crossfire recover when Google fine tunes the algo fixes"

And that is what you see now. It does not work. Thousands of decent websites have vanished and have not recovered. All Google did was bring the major pages to the top; that is the easiest way to have SERPs without spam. Google has lost its way.
What do I mean by webmasters being made responsible? I mean that every webmaster who was doing just fine before BD, and has now vanished, is going crazy looking for errors in HTML code, duplicate content, external links and so on. MSN and Yahoo don't treat webmasters that way. It's only Google: DO NO EVIL!
Do I need to find 30 price-comparison pages in the top 30 SERPs when searching for a widget? No, I do not. The small, clean, good pages are not in the SERPs anymore. In a few years there will only be big business on the Internet.

g1smd

3:48 pm on Jun 25, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>> So there seem to be 2 different caches of the same page. <<

Yes. This has been in effect for a while. Google keeps a copy of the current version of the page as well as the previous version of that page.