Yes, one of my long-term "perfectly listed" sites, one that I use as a "reference", suddenly had 30 of its 180 pages completely dropped from the index a few weeks ago.
They aren't in the normal index, and they aren't supplemental either. They are not listed at all.
Have I hit an over-optimisation penalty for having a unique title and unique meta data on every page, perfect code, perfect site navigation, and so on?
Or is it just some sort of new PR and IBL (inbound link) thing?
I guess having the index as thumbnails wasn't such a bright idea, meaning a page with mostly images as links (photo albums). There are some text links like "first picture" and "last picture" which hold the framework together in the index, but the actual content is falling out from in between ;)
PR seems to be passed on only for the first couple of images (i.e. PR 2, 1, 0), probably because G is following nothing but the text links, and PR seems to drop by a point with each step along the chain.
And in the end, the PR0 pages are starting to vanish.
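Just to sanity-check my own theory, here is a toy power-iteration sketch of that structure in Python (a simplified model with made-up page counts, not Google's actual algorithm): an index page whose only followed links are "first picture" and "last picture", with each picture linking to the next and back to the index. The middle pictures end up with the lowest scores, which fits the "falling out from in between" pattern:

```python
# Toy PageRank power iteration over a photo-album-style link structure.
# Simplified model only: it just illustrates how rank injected at the
# ends of a chain decays geometrically toward the middle pages.

DAMPING = 0.85

def pagerank(links, n, damping=DAMPING, iterations=200):
    """links: dict page -> list of pages it links to."""
    pr = [1.0 / n] * n
    for _ in range(iterations):
        new = [(1.0 - damping) / n] * n
        for page, outs in links.items():
            if outs:
                share = damping * pr[page] / len(outs)
                for target in outs:
                    new[target] += share
            else:  # dangling page: spread its rank evenly
                for target in range(n):
                    new[target] += damping * pr[page] / n
        pr = new
    return pr

def album(n_pictures):
    """Index = page 0; pictures = pages 1..n_pictures."""
    links = {0: [1, n_pictures]}      # "first picture" / "last picture" text links
    for k in range(1, n_pictures):
        links[k] = [k + 1, 0]         # "next picture" plus a link back to the index
    links[n_pictures] = [0]
    return links

if __name__ == "__main__":
    n = 11  # index + 10 pictures
    for page, score in enumerate(pagerank(album(10), n)):
        label = "index" if page == 0 else f"picture {page}"
        print(f"{label:<10} {score:.4f}")
    # Expect: the index and the first/last pictures score highest, while
    # the middle pictures trail off toward the (1 - d) / n floor.
```

On a roughly logarithmic toolbar scale, that geometric decay would look like whole PR points dropping off page by page down the chain.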
Cool... I didn't know what came next... at least I've figured it out now (today, when it's probably already too late).
A new, additional internal 301 redirect from the old URLs to the new ones has at least kept traffic flowing when it comes in on the "wrong" URL from search engine results.
However, that is incidental to the main point, which is about the 30 non-moving URLs that were completely dropped. They do not show up anywhere, in any type of search I do.
I am now wondering, though, whether there is an external link pointing to the old folder at either a non-www or a .co.uk URL, in which case an internal 301 redirect then points to the old folder at www...com, and that makes Google reluctant to drop the target URL even though it has returned 404 for a few weeks.
The single-folder redirect wasn't put in place until at least three weeks after the folder was moved, so for a while there was a "hanging redirect": a 301 whose destination URL generated a 404 error. Now there is a new 301 redirect from the old folder on non-www, and from the old folder on both www and non-www of every alternative domain, all pointing directly to the new folder at www...com (no chain).
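To see whether any chains or hanging redirects are still lurking, a quick script like this can trace each hop (this is just a sketch using the third-party Python requests library, and the URLs are placeholders rather than the real site):

```python
# Follow a URL's redirects hop by hop, flagging chains and "hanging"
# redirects (a 301 whose destination returns 404). Sketch only: uses
# the third-party `requests` library, and the URLs are placeholders.
from urllib.parse import urljoin
import requests

def trace_redirects(url, max_hops=10):
    hops = []
    for _ in range(max_hops):
        resp = requests.get(url, allow_redirects=False, timeout=10)
        hops.append((url, resp.status_code))
        location = resp.headers.get("Location")
        if resp.status_code in (301, 302, 303, 307, 308) and location:
            url = urljoin(url, location)  # Location may be relative
        else:
            return hops
    return hops  # gave up after max_hops

if __name__ == "__main__":
    for start in ("http://example.co.uk/old-folder/page.html",
                  "http://example.com/old-folder/page.html"):
        hops = trace_redirects(start)
        for u, status in hops:
            print(f"{status}  {u}")
        if len(hops) > 2:
            print("-> chain: more than one redirect before the final URL")
        if hops[-1][1] == 404:
            print("-> hanging redirect: the final destination is a 404")
        print()
```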
Anyway, that is still incidental to the main point: 30 of the non-moving pages have vanished from the SERPs.
> 30 of the non-moving URLs that were completely dropped
Would those "newly moved" URLs have had backlinks to the dropped non-moving pages? I'm thinking that taking some backlinks out of Google's calculations (temporarily, you hope) might have dropped PR below some critical threshold. Still, it's odd that they would vanish completely rather than go Supplemental.
> Have I hit an over-optimisation penalty for having a unique title and unique meta data on every page, perfect code, perfect site navigation, and so on?
Maybe excessive 301s? I am working on a theory there. Only SEOs seem to know much about the 301 header; most hosts and registrars are clueless, and so are most webmasters. So it seems to me that a lot of 301s might raise the question of an attempt to manipulate things.
I work with plenty of sites that show all the other signs you mentioned above, and they are thriving. But I treat 301 redirects like radioactive isotopes: they can help heal in some situations, but...
I need to check whether the site has an HTML sitemap page. I'm not sure if one was ever made and uploaded. If not, I'll recommend that be done soon.
I hear what you say about redirects. All www and non-www URLs on all alternative domains are redirected to www on the .com only, and the one moved folder now has a redirect, for all domains, to the new folder location on the .com. That new redirect comes first in the .htaccess file.
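As a sanity check that those rules really do collapse every variant into a single hop, something like this can be pointed at each old URL (again a sketch with the requests library, and example.com domains standing in for the real ones):

```python
# Spot-check that every host/folder variant 301s in exactly one hop to
# the canonical URL, i.e. no redirect chains. Sketch only: assumes the
# `requests` library, and the domains below are placeholders.
import requests

CANONICAL = "http://www.example.com/new-folder/"
VARIANTS = [
    "http://example.com/old-folder/",
    "http://www.example.com/old-folder/",
    "http://example.co.uk/old-folder/",
    "http://www.example.co.uk/old-folder/",
]

for url in VARIANTS:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location == CANONICAL
    print(f"{'OK  ' if ok else 'FAIL'} {url} -> {resp.status_code} {location}")
```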
I've got reservations about the site: operator and also about how it is being applied in "webmastery". This is why I started this thread: [webmasterworld.com...]
I am using site:domain.com and site:www.domain.com to see all that is listed.
But it's probably broken (see Site Tool Issues [webmasterworld.com]), and/or it's probably telling us other things as well which haven't been explained by our good friends at Google or recognised by our community.
If we're making these assumptions, I'd just make the point that it would be good to be able to rely on this tool, since it underpins a lot of our assumptions and conversations.
It worries me that webmasters are given tools to measure their sites' performance with little explanation on the one hand, while on the other hand there is doubt as to whether those tools work at all.
Webmasters badly need some clarification from Google on this, IMO.
It gets pretty bad when we have to worry that, because some of us know the correct way of doing things, we need to beware of doing them in case it is misconstrued as manipulative. In other words, we need to dumb our sites down to stay under the radar. Wasn't there a book along these lines?... hmmm
> Or is it just some sort of new PR and IBL thing?
Did you see any Java bot activity from Google IPs: Java/1.3.1_03, Java/1.4.1_04, or Java/1.5.0_06?
Maybe your site is in Web Purgatory for some reason.
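If you want to check, a rough script along these lines can pull those user agents out of an access log and tally the requesting IPs (it assumes Apache's combined log format, with the user agent in the last quoted field, and the log path is a placeholder):

```python
# Tally requests from the Java user agents mentioned above, grouped by
# client IP. Sketch only: assumes Apache "combined" log format and a
# placeholder log path; look the IPs up afterwards to see if they're Google's.
import re

LOG_PATH = "/var/log/apache2/access.log"
JAVA_UAS = ("Java/1.3.1_03", "Java/1.4.1_04", "Java/1.5.0_06")

ip_re = re.compile(r"^(\S+)")  # the client IP is the first field

hits = {}
with open(LOG_PATH) as log:
    for line in log:
        if any(ua in line for ua in JAVA_UAS):
            match = ip_re.match(line)
            if match:
                ip = match.group(1)
                hits[ip] = hits.get(ip, 0) + 1

for ip, count in sorted(hits.items(), key=lambda kv: -kv[1]):
    print(f"{count:>5}  {ip}")
```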