Forum Moderators: Robert Charlton & goodroi
As I mentioned above, make sure not to fill either element with generic wording that applies to the entire website rather than the specific page. If these two tags look too similar from page to page, Google has in recent months been dropping such URLs into the Supplemental index.
How did you revive pages that were previously in Google's Supplemental index? I am having this problem: the same meta description tag is used for every instance of a page, ABC.asp.
ABC.asp is counted as supplemental for each instance.
eg.
ABC.asp?Id=126
ABC.asp?Id=23432
ABC.asp?Id=33
If I change the description tag so that it is unique, will G remove these pages from the supplemental index? Do I have to email them to do this?
I am also in the process of finishing a project that rewrites the ABC.asp page from:
ABC.asp?Id=126 into ABC_126.asp
ABC.asp?Id=23432 into ABC_23432.asp
ABC.asp?Id=33 into ABC_33.asp
and the old ABC.asp?Id=someNumber will return a 302 Object moved status. What do you think?
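For what it's worth, the mapping you describe can be sketched language-neutrally. This is a hypothetical Python sketch (your site is classic ASP, so `rewrite_url` and `redirect_headers` are illustrative names, not your actual code); note that for a permanent move, search engines generally expect a 301 rather than the 302 "Object moved" that classic ASP's `Response.Redirect` sends by default:

```python
import re

def rewrite_url(old_url):
    """Map the old query-string form to the new static-looking form,
    e.g. ABC.asp?Id=126 -> ABC_126.asp (hypothetical helper)."""
    match = re.fullmatch(r"(\w+)\.asp\?Id=(\d+)", old_url)
    if match is None:
        return old_url  # leave unrecognized URLs untouched
    page, page_id = match.groups()
    return f"{page}_{page_id}.asp"

def redirect_headers(old_url):
    """Headers the old URL should answer with. A 301 signals a
    permanent move; a 302 tells crawlers the old URL may come back."""
    return {"Status": "301 Moved Permanently",
            "Location": rewrite_url(old_url)}
```

In classic ASP you would get the 301 by setting `Response.Status = "301 Moved Permanently"` and adding a `Location` header yourself, rather than calling `Response.Redirect`.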
These pages are unique in content, and their only link is to the main index. More 'frustrating' is that these pages are in an Asian language, so how Google can decide they are supplemental is amazing.
Every new page indexed within the last month, on various sites (ours and those I have been asked to monitor), has gone straight to supplemental.
Both titles and meta description should be unique and exactly tuned to the details of the page itself, and not the whole site. Keep the title tag relatively short, and the meta description more extended.
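One way to audit this advice is to flag any title/description combination shared by more than one URL. A hypothetical Python sketch (the page list and `find_duplicate_tags` helper are made up for illustration, not tied to any real crawler):

```python
from collections import defaultdict

def find_duplicate_tags(pages):
    """Group page URLs by (title, meta description), normalised for
    case and whitespace, and report any combination shared by more
    than one URL (hypothetical helper)."""
    groups = defaultdict(list)
    for url, title, description in pages:
        key = (title.strip().lower(), description.strip().lower())
        groups[key].append(url)
    return {tags: urls for tags, urls in groups.items() if len(urls) > 1}

pages = [
    ("ABC.asp?Id=126", "Widgets", "Buy widgets"),
    ("ABC.asp?Id=33", "Widgets", "Buy widgets"),
    ("about.asp", "About Us", "Who we are"),
]
# The first two URLs share identical tags and would be flagged.
```

Running something like this over your own pages would show exactly which URLs present Google with look-alike tags.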
Agree with the theory, but recently Google has been completely ignoring the title tag and meta description on some sites. A site: search which used to show all my pages as individuals with unique descriptions now shows no unique meta descriptions.
If these two tags look too similar from page to page, Google has in recent months been dropping such URLs into the Supplemental index.
Again, agree with the theory, but in practice Google is currently putting a lot of pages with unique meta descriptions and completely unique content into the supplemental index.
I'm also seeing more thin content pages (100 words or less) ending up in the supplemental index, irregardless of title/meta description.
Irregardless? What a lovely word! It's not just pages short on content that are going supplemental. One of my sites, which contains 100% unique content and long articles, has gone supplemental since the beginning of May.
I know this is probably a really stupid question, but why do they have this thing? Conceptually I understand it, but I never really understood why they created it. It just seems like it's causing more problems than it solves. It's fine to have a junk drawer of the internet to throw everything you don't really need, but don't want to get rid of, into (I've got one of those drawers in my kitchen), but I would appreciate it if they wouldn't throw my stuff in there. It does appear as if BD has vastly expanded what Google thinks should go in there.
Wouldn't it be a lot simpler to either have a page in the index or not?
Anyone else ever really wonder why this thing exists? Don’t see it in MSN and Y.
I'm guessing there are two main motivators, and they are related.
Recent fashions (blogging, article publishing, the most recent spam methods) all involve plastering the web with multiple copies (often thousands of copies) of the same machine-generated twaddle. All this may give the occasional author a very slight boost to their page rank (dread term), but mostly it is simply clutter, which threatens to paralyze Google, and eventually the others.
Because Google is the 'leader', much of this twaddle is tailor-made for it, so Google will inevitably be worse off than the others (like M$-Windows-IE getting more viruses than Linux, etc.).
Google needs to separate the wheat from the chaff without giving detailed feedback to spammers. That's why 'total penalties' are getting rarer and 'markdowns' much more common. Which, in turn, explains why spammers are increasingly throwing masses of stuff at the fan, hoping some of it sticks, as more sophisticated methods are undermined.
Like every other attempt to clean the Augean stables, there is collateral damage. We wait and cross our fingers.
Very largely guesswork - but much of it fits the facts.
I made pages for visitors, with some more navigation on each page and lots of content. Title tags and META all different. No link exchange, no black hat techniques. Vanished. Thanks a lot, Google. It's time to pay attention to the dark side of the web.....
But Google didn't make those pages, which were deliberately made to exploit loopholes in their algo.
We call it 'spam' ;)
Because Google is #1, most spammers aim specifically at Google. It's not that the others are better at dealing with spam (they have it too); it's just that 95% of it is aimed at Google.
What makes you think that Google "makes webmasters responsible"? - that's quite new to me.
I wish you luck in your new career; but before you sign up for your black hat, look again at those serps.
Yes, spam often (too often) gets in at the top. But look at the bottom: there's much, much, much more there. Some spammers succeed for a while; many never do. And most decent sites that get clobbered in the crossfire recover when Google fine-tunes the algo fixes.
And that is what you see now. It does not work. Thousands of decent websites have vanished and did not recover. All they did was bring the major pages to the top; that is the easiest way to have SERPs without spam. Google has lost its way.
What do I mean by webmasters being made responsible? I mean that every webmaster who was doing just fine before BD, and has now vanished, is going crazy looking for errors in HTML code, duplicate content, external links, and so on. MSN and Yahoo don't treat webmasters that way. It's only Google: DO NO EVIL!
Do I need to find 30 price-comparison pages in the top 30 SERPs when searching for a widget? No, I do not. The small, clean, good pages are not in the SERPs anymore. In a few years there will only be big business on the Internet.