We have a site with free, quality content. We launched it last summer; page indexing and ranking started well and kept improving until the end of September, when traffic was around 30k uniques per month. After that, the site dropped suddenly in the rankings and our traffic fell to nearly zero. We made no changes in that period, and we do not use aggressive SEO practices.
We monitor the site with Google Webmaster Tools and no problems are reported there.
In December the site reappeared in the SERPs and our traffic rose a little, but only to about a third of September's level. We had another drop to zero in February, and since March we have been back at December levels.
From January to March, though, our indexed page count dropped sharply, from about 250,000 pages to 14,000. Bear in mind that our site has more than 1 million distinct pages.
What could we do to have our site treated, if not well, then at least decently?
Thanks in advance to all of you.
Have you set up a Google Webmaster Tools account and checked in there for any reports or messages that might be involved in this indexing problem? I'd also suggest an XML sitemap if you aren't already generating one.
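If you need to generate one, here is a minimal sketch using Python's standard library; write_sitemap() and the example URL are placeholders of my own, not anything your CMS necessarily provides. Keep in mind that Google caps a single sitemap file at 50,000 URLs, so a site with over a million pages would need a sitemap index pointing at multiple files.

# Minimal sketch: writing an XML sitemap with Python's standard library.
# The page list and output path are hypothetical; plug in your own data.
import xml.etree.ElementTree as ET

def write_sitemap(pages, path="sitemap.xml"):
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod  # W3C date: YYYY-MM-DD
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

write_sitemap([("http://www.example.com/page-1", "2009-05-02")])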
We use Google Webmaster Tools and no problems are reported there.
Moreover, we have similar sites in other languages and have never had this kind of problem with them.
Another strange thing is the Googlebot activity graph shown in Webmaster Tools. The number of pages crawled per day used to be high (peaking at 43,000/day), then fell sharply (as low as 813!) and does not seem to be recovering. I don't know if there is a connection with the drop in indexed pages.
It's an old site with good traffic... any suggestions?
No specifics please, but where did you get the content for these 250k pages?
Is the content unique on the web?
How many of those 250k pages have incoming links from external sources? A better question might be: what percentage? If only 1% of pages have incoming links, that's 2,500 pages. Would you index 250k pages deep if you were Google? No. (Maybe I'm off base here, but those are my quick thoughts given no real context for the situation.)
I suspect Googlebot has determined, for either PageRank or quality reasons, that your site is not worthy of the crawl depth you would like to have.
To me, even 14k seems like a lot for a site that's less than a year old.
One thing to keep in mind about this situation: suppose you 'added steadily' under the threshold, but Google didn't spider all 2,700 new URLs in one day (or even for a couple of days in a row). If the site is dynamic, it probably doesn't serve 'Last-Modified' headers, so when Googlebot came back and spidered again, it could have looked like you added 5,400 or 8,100 (or more) pages in a single day, even though you really didn't...
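To illustrate the header side of that: a dynamic page can still report when its underlying data last changed, so a repeat crawl gets a 304 Not Modified instead of a page that looks new every visit. Below is a minimal sketch using only Python's standard library; last_modified_for() is a hypothetical stand-in for your own data layer, not a real API.

# Minimal sketch: honoring If-Modified-Since on a dynamic page so a
# crawler can tell the page hasn't changed since its last visit.
from http.server import BaseHTTPRequestHandler, HTTPServer
from email.utils import formatdate, parsedate_to_datetime

def last_modified_for(path):
    # Hypothetical: return a UNIX timestamp for when this page's data changed.
    return 1241280000

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        modified = last_modified_for(self.path)
        ims = self.headers.get("If-Modified-Since")
        if ims:
            try:
                if parsedate_to_datetime(ims).timestamp() >= modified:
                    self.send_response(304)  # unchanged: nothing to re-crawl
                    self.end_headers()
                    return
            except (TypeError, ValueError):
                pass  # unparseable header; fall through and serve the page
        body = b"<html><body>page content</body></html>"
        self.send_response(200)
        self.send_header("Last-Modified", formatdate(modified, usegmt=True))
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("", 8000), Handler).serve_forever()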