Forum Moderators: Robert Charlton & goodroi
The problem isn't anything on-page, it's not duplicate content, it's not bad code. Could it possibly be caused by sitewide inbound links? I doubt it.
Note: these are some of the best internally linked pages of my site, yet others that are equally well linked are still alive and kicking.
Anyone care to shed some light on the situation?
I think I may have discovered what was triggering my site's Google problem: text and a link inside noscript tags that had been added months ago. The text and link were non-spammy, but I removed them after realizing they could be considered hidden text/links. I then submitted a reinclusion request to Google. Soon after I did this, the repeated indexing/deindexing problem ended... And hopefully it won't return.
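For reference, the kind of markup being described is a noscript block whose text and link are only shown to visitors without JavaScript, which crawlers can interpret as hidden content. This is a made-up sketch; the script name, URL, and anchor text are invented for illustration:

```html
<!-- Hypothetical example of the sort of noscript fallback described above.
     Nothing spammy by intent, but the text and link are invisible to
     most human visitors while still readable by search engine crawlers. -->
<script type="text/javascript" src="widget.js"></script>
<noscript>
  <p>Our widget requires JavaScript.
     <a href="http://example.com/widgets">Browse the widget catalog</a></p>
</noscript>
```

Because typical users never see that paragraph or link, a search engine may treat it the same way as deliberately hidden text, regardless of intent.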
It seems as if the daily updates created some instability in the SERPs.
Guess:
If we assume that a daily update doesn't mean the whole index is re-ranked every day, but only that changes to filters and the like are rolled in daily, then the delay until everything is re-ranked would, I assume, cause some "ripples", like water. The more changes, the more ripples, and then wild effects.
On a monthly update the ripples would soon fizzle out, but with daily updates they could persist and produce artifacts. Any system that is constantly perturbed will show this effect.
Maybe there are long-term effects. It's now a highly dynamic system, not only on the webmaster side but also on the 'plex side.
And the changes will be huge for some and unnoticeable for others.
< Discussion continues here: [webmasterworld.com...] >
[edited by: tedster at 9:17 pm (utc) on Jan. 13, 2007]