Forum Moderators: Robert Charlton & goodroi
Does the re introduction of "lots of new pages"* after the fix trigger a "flag" that then inhibits proper results from appearing as one would normally expect?
*[ Matt has said sites with more than 5000 pages launched per week will be flagged .. refer to his video ]
Some clarification on the sandboxing procedures and how to get out of the sandbox would be appreciated per [webmasterworld.com...]
The reason I ask is that this has occurred with us. We did the fixes in July/Aug, but we seem to have some "filter" still on our results after around 8 weeks. Should we be worried, or just let things work themselves through?
[edited by: Whitey at 12:16 am (utc) on Sep. 27, 2006]
You are 100% right, but once your site is affected, it's very hard to crawl out of the supplemental results.
The question should be: if our site is in supplemental results due to redirects, canonicalisation, URL rewriting, duplicate content, etc., and the problem has been fixed by the webmaster (the page has been 301 redirected, etc.), how do we get those pages out of supplemental results?
Once a page is marked supplemental, Google hardly looks at those pages again and supplemental pages hardly ever get indexed again, which has a huge impact on rankings.
What would you ask him?
How can I consistently get the number one position for "all" of my targeted keyword phrases? :)
>> Once a page is marked supplemental, Google hardly looks at those pages again and supplemental pages hardly ever get indexed again, which has a huge impact on rankings. <<
There are ways to influence Google's crawling pattern to get those Supplementals out of there. But, if you did not address the problem that put them there to begin with, you'd be chasing your tail. ;)
Only URLs that return "200 OK" and are tagged as Supplemental need to be investigated for problems with the site. Those problems are usually multiple URLs for the same content. See the longer thread [webmasterworld.com]
It would be very BAD to have the actual page removed from the index, but there should be a way to remove the cache if the page is Supplemental; otherwise that cache will just sit out there forever, waiting for an exploit to happen.
There have been enough server exploits and sites getting hacked lately without Google providing them with a roadmap to help them out. And they ARE doing just that with some caches of pages sitting out there that give holes for script kiddies to tap into.
[edited by: Marcia at 12:25 am (utc) on Sep. 27, 2006]
Ask this engineer if pages can be pushed into supplemental because of lack of inbound links. I'm not talking about from the same base href but outside inbound links. I am highly suspicious that there has been a trend since Big Daddy to send pages into the supplemental if they have none. If so, can they ever be restored to the regular index.
I don't believe that canonicalization and redirects are the only reasons for supplemental results. Most of my site went supplemental long AFTER I did a 301 - some 10 months later.
A site is indexed with both non-www and www URLs showing up (for a site where you prefer www to be listed). Some non-www URLs are shown as normal results, and so are some www URLs too. Some URLs of both types already show as Supplemental or as URL-only too.
One day you decide to install the redirect to www, and more www pages become indexed in the next few weeks. Once several months have elapsed after the redirect was put in place you see that many of those now redirected non-www URLs suddenly reappear in the SERPs as Supplemental Results.
I have seen that many times. They stick around for a year, and then vanish.
If that is what you have then no need to worry.
If the URLs that you redirect to are Supplemental, then you have more issues to sort out: could be more of the same, or some other problem.
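To illustrate the scenario above (this helper and the hostnames are hypothetical, not from the thread): the www redirect can be sanity-checked by confirming that the old hostname answers with a 301 whose Location header points at the preferred host and keeps the same path. A minimal Python sketch:

```python
from urllib.parse import urlparse

def is_canonical_redirect(status, location, requested_url, preferred_host):
    """Check whether a response properly 301-redirects a non-preferred
    hostname (e.g. example.com) to the preferred one (e.g. www.example.com).
    status is the HTTP status code; location is the Location header value."""
    if status != 301:  # only a permanent redirect consolidates the URLs
        return False
    req = urlparse(requested_url)
    loc = urlparse(location)
    # The host must change to the preferred one; the path must be preserved.
    return loc.netloc == preferred_host and loc.path == (req.path or "/")
```

For example, `is_canonical_redirect(301, "http://www.example.com/page.html", "http://example.com/page.html", "www.example.com")` is true, while a 302 in the same situation is not, since a temporary redirect does not tell the engine which URL is canonical.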
I have seen some sites dip into Supplemental due to a lack of inbound links to the site as a whole, and Matt Cutts confirmed that does happen - several months ago.
What you also need to be aware of is throwing away your internal PR through inefficient internal linking.
Every page should link back to your root index page, but don't throw all of your PageRank at /index.html, because Google usually prefers to list www.domain.com/ instead; so link to that one as http://www.domain.com/ every time.
Use breadcrumb navigation to spread PR around the site, too, if you can.
[edited by: g1smd at 12:39 am (utc) on Sep. 27, 2006]
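The root-link advice above can be sketched as a small clean-up pass (a hypothetical helper using a made-up hostname, not code from this thread) that rewrites any internal link to /index.html so it points at the preferred root URL instead, consolidating PageRank on one URL:

```python
import re

def normalize_root_links(html, preferred_root="http://www.example.com/"):
    """Rewrite links pointing at /index.html (relative or absolute) so they
    all point at the preferred root URL, e.g. http://www.example.com/ ."""
    return re.sub(r'href="(?:https?://[^"/]+)?/index\.html"',
                  'href="%s"' % preferred_root, html)
```

This is only a sketch; a real site would do this in its templates rather than post-processing HTML with a regex.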
>> Ask this person about the whole supplemental pages problem and what exactly is it that Google expects from us so that we can change our sites if need be. <<
I think that one has been done to death, both here and at Matt Cutts' blog already.
Redirects and canonicalisation - that's all it is.
Let's assume two URLs went into supplemental due to duplicate content, and we decide to keep one and 301 redirect the other URL.
How do we get the one we decide to keep out of supplemental? Do we just wait for it to happen?
They hardly ever re-cache supplemental pages; for that reason, we could probably end up waiting forever...
Today is really crazy for me, looking at Google SERPs. Many old, outdated sites that no longer exist, and many, many subdomains with at best a hint of keyword relevance, in many sectors that I'm watching.
There must be many things brewing at the complex right now. I'm really curious when some serious new updates will happen, without outdated, no-longer-existing URLs.
"But, if you did not address the problem that put them there to begin with, you'd be chasing your tail. "
My question is: if the problem has been addressed, how do you get Google to put these pages back into their regular index?
I'm with you, F_Rose! Please, if you say there are ways to get your site out of supplemental, let us know. I'm going through my site now trying to figure out what is wrong. I'm cleaning up the code and site, but I really do not see anything that is wrong, unless it is the duplicate content filter, since I sell so many products that are similar.
I'd be more than happy to clean up and fix my site if I only had a clue as to what is suddenly wrong with it. I am doing the 301 redirect now but I never knew about this being an issue until now. Could that one thing be what put most of my pages in supplemental?
Please do give us the steps to get our pages out of supplemental. It would really be appreciated by many of us.
To the OP: please do ask the engineer if something as simple as the www versus non-www issue could put most of your pages in supplemental. If not, then what are the major triggers that may have done it, and how do you get out?
Thank you in advance!
Please ask if the meta tag "noarchive", applied to every page in a site, causes any degradation in ranking at all.
I have a lot of good reasons (legal among others) for not wanting cached versions of my pages - but I keep hearing the siren song of increased rankings if I remove "noarchive".
I would love to know, once and for all, if there is ANY effect in having that tag on the ranking.
Thank-you.
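For reference, the "noarchive" tag discussed above is just a robots meta tag; a minimal sketch of building one (a hypothetical helper, not code from this thread) makes the point that noarchive only suppresses the cached copy while still allowing the page to be indexed:

```python
def robots_meta(noarchive=True, noindex=False):
    """Build a robots meta tag. noarchive asks engines not to keep a cached
    copy; noindex would keep the page out of the index entirely."""
    directives = []
    if noindex:
        directives.append("noindex")
    if noarchive:
        directives.append("noarchive")
    content = ", ".join(directives) or "all"
    return '<meta name="robots" content="%s">' % content
```

By default this emits `<meta name="robots" content="noarchive">`, which is the tag the poster above is asking about.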
I just had my meeting with the engineer. It was pretty cool. He had tips for getting reindexed: Fix everything and start using google sitemaps. Sites that used a lot of spam or are from known spammers will not be reindexed.
About the sandbox... it's related to links; older domains benefit from having more links.
The main goal should be getting links with good link descriptions from relevant sites. (so what else is new...)
And no, he didn't know who Google Guy was...
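The sitemaps tip above amounts to submitting an XML file listing your URLs. A minimal sketch of generating one (hypothetical helper, made-up example URL; real sitemaps can also carry lastmod and priority fields):

```python
from xml.sax.saxutils import escape

def make_sitemap(urls):
    """Emit a minimal XML sitemap listing the given URLs, in the
    format that Google Sitemaps accepts."""
    entries = "\n".join(
        "  <url><loc>%s</loc></url>" % escape(u) for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            '%s\n</urlset>' % entries)
```

The resulting file is uploaded to the site root and then submitted through the Google Sitemaps interface.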
>> I just had my meeting with the engineer. It was pretty cool. He had tips for getting reindexed: Fix everything and start using google sitemaps. Sites that used a lot of spam or are from known spammers will not be reindexed. <<
I am indexed, but since June 27th with a filter penalty reducing Google traffic by 85%
I tried to fix everything I heard about in this forum.
Sometimes, I feel like a witch doctor performing rain dances.
2. no strange JavaScripts with links built into your site
3. no "nofollow" tags on your site.
4. don't use your keyword too many times in a URL
[edited by: tedster at 2:14 pm (utc) on Sep. 27, 2006]
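Point 4 in the list above can be made concrete with a small check (a hypothetical helper with a made-up example URL, not a known Google threshold) that counts how often a keyword repeats in a URL's path:

```python
import re
from urllib.parse import urlparse

def keyword_repeats(url, keyword):
    """Count how many times a keyword appears in a URL's path, treating
    hyphens, underscores, dots and slashes as word separators."""
    words = re.split(r"[-_/.]+", urlparse(url).path.lower())
    return words.count(keyword.lower())
```

For example, `keyword_repeats("http://example.com/widgets/blue-widgets/cheap-widgets.html", "widgets")` returns 3 - the kind of repetition the tip warns against.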
I have a safety-critical page that I do not want cached anywhere - historically #1 on all engines for its highly obscure keywords.
I added nocache tags about three months ago, and it vanished from Google's SERPs just like that. None of the other search engines were affected. I took the nocache off and it's back where it should be.
A highly unsatisfactory situation, but there it is.