Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

    
Indexed pages drop from 250,000 to 14,000
giateno

Msg#: 3905582 posted 5:41 pm on May 2, 2009 (gmt 0)

Hello,

We have a site with free, quality content. We launched it last summer; page indexing and rankings started well and kept improving until the end of September, when traffic was around 30k uniques per month. After that, the site suddenly dropped in the rankings and our traffic fell to around zero. We made no changes in that period, and we do not use aggressive SEO practices.

We do monitor the website with Google Webmaster Tools, and no problem is reported there.

In December the site appeared in the SERPs again and our traffic rose a little, but only to about a third of the September level. We had another drop to zero in February, and since March we have been back at December levels.
But from January to March we saw our indexed page count drop sharply, from about 250,000 pages to 14,000. Consider that our site has more than 1 million distinct pages.

What could we do to have our site treated, if not well, at least decently?

Thanks in advance to all of you.

[edited by: tedster at 5:54 pm (utc) on May 2, 2009]

 

tedster

Msg#: 3905582 posted 1:50 am on May 3, 2009 (gmt 0)

Some kind of ranking and traffic drop a month or two after launching a website is a common experience - but at this point it sounds like something beyond that "honeymoon period" effect is going on. It sounds like the site may not have very strong backlinks/PR - or there may be a technical problem involved, such as multiple URLs for the same content, duplicate title elements or meta descriptions, server bugs, or robots.txt errors - many technical possibilities come to mind.

Have you set up a Google Webmaster Tools account and checked in there for any reports or messages that might be involved in this indexing problem? I'd also suggest an XML sitemap if you aren't already generating one.
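As an illustration of one of the technical checks mentioned above (not from the thread itself), here is a minimal sketch that finds duplicate title elements across a set of pages. The sample pages are invented; in practice you would fetch your own site's HTML.

```python
# Hypothetical sketch: detect duplicate <title> elements across pages,
# one of the technical problems listed above. Sample data is made up.
from collections import defaultdict
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collect the text of the first <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self.title:
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

def find_duplicate_titles(pages):
    """pages: dict url -> html. Return titles used by more than one URL."""
    by_title = defaultdict(list)
    for url, html in pages.items():
        p = TitleParser()
        p.feed(html)
        by_title[p.title.strip()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Invented sample pages for illustration:
sample = {
    "/widgets/red":  "<html><head><title>Widgets</title></head></html>",
    "/widgets/blue": "<html><head><title>Widgets</title></head></html>",
    "/about":        "<html><head><title>About Us</title></head></html>",
}
dupes = find_duplicate_titles(sample)
```

Running the same idea over a full crawl of your site would surface the kind of duplicate-title problem that can suppress indexing.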

dailypress

Msg#: 3905582 posted 5:07 am on May 3, 2009 (gmt 0)

"I'd also suggest an XML sitemap if you aren't already generating one."

Tedster, do you update the XML sitemap yourself, or do most people have it generated automatically? Any links where I can learn more?
tedster

Msg#: 3905582 posted 2:56 pm on May 3, 2009 (gmt 0)

Google has a nice full Help section about XML sitemaps:
https://www.google.com/webmasters/tools/docs/en/protocol.html

There are several approaches you can take, depending on your site's size and technical needs.
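One common approach to the automation question above is to generate the sitemap from your page database on a schedule (e.g. a nightly cron job). A minimal sketch, assuming the sitemaps.org protocol that Google's help page documents; the URLs and dates are placeholders:

```python
# Minimal sketch of automatic XML sitemap generation, following the
# sitemaps.org protocol. URL list and dates are invented placeholders;
# a real site would pull them from its database or filesystem.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """urls: iterable of (loc, lastmod). Return sitemap XML as a string."""
    ET.register_namespace("", NS)
    urlset = ET.Element("{%s}urlset" % NS)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "{%s}url" % NS)
        ET.SubElement(url, "{%s}loc" % NS).text = loc
        ET.SubElement(url, "{%s}lastmod" % NS).text = lastmod
    return ET.tostring(urlset, encoding="unicode")

pages = [
    ("http://www.example.com/", "2009-05-01"),
    ("http://www.example.com/widgets", "2009-04-28"),
]
xml_out = build_sitemap(pages)
```

For very large sites, the protocol also allows a sitemap index file pointing at multiple sitemaps of up to 50,000 URLs each.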

giateno

Msg#: 3905582 posted 12:01 pm on May 4, 2009 (gmt 0)

Thanks for your remarks.

We use Webmaster Tools, and no problems are reported there.

Moreover, we have similar sites in other languages and have never had this kind of problem.

Another strange thing is the Googlebot activity graph shown in Webmaster Tools. The number of pages crawled per day was high (max 43,000/day) but has dropped very low (min 813!) and does not seem to be recovering. I don't know if there is a connection with the drop in indexed pages.
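One way to cross-check the Webmaster Tools crawl graph is to tally Googlebot requests per day from your own access logs. A rough sketch, using invented Apache combined-format sample lines; note that the user-agent string can be spoofed, so a real check should also verify the bot via reverse DNS:

```python
# Rough sketch: count Googlebot hits per day from Apache-style access
# logs, to compare against the Webmaster Tools crawl-activity graph.
# Log lines below are invented samples; parsing is deliberately naive.
import re
from collections import Counter

LOG_DATE = re.compile(r'\[(\d{2}/\w{3}/\d{4})')  # e.g. [02/May/2009:...]

def googlebot_hits_per_day(lines):
    counts = Counter()
    for line in lines:
        if "Googlebot" in line:
            m = LOG_DATE.search(line)
            if m:
                counts[m.group(1)] += 1
    return counts

sample_log = [
    '66.249.66.1 - - [02/May/2009:05:41:00 +0000] "GET /a HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [02/May/2009:05:42:00 +0000] "GET /b HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '10.0.0.5 - - [02/May/2009:05:43:00 +0000] "GET /c HTTP/1.1" 200 512 "-" "Mozilla/4.0"',
    '66.249.66.1 - - [03/May/2009:08:00:00 +0000] "GET /d HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
]
hits = googlebot_hits_per_day(sample_log)
```

If your own logs confirm the same steep drop in daily crawl volume, that points at the crawler losing interest in (or access to) the site rather than a reporting glitch.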

experienced

Msg#: 3905582 posted 1:12 pm on May 5, 2009 (gmt 0)

We have a site with almost 3,000 pages of original content. Since January 2008 we have been trying to get them back into Google, but Google is not indexing them at all. We are regularly adding the fresh content pages the site needs, but there is still no improvement. I tried a deep search, but Google did not return enough pages. The rankings are fine, but I believe that if the indexed page count increases, I can get more traffic.

It's an old site with good traffic... any suggestions?

peterdaly

Msg#: 3905582 posted 4:47 pm on May 5, 2009 (gmt 0)

You have 250k pages and your site is less than a year old.

No specifics please, but where did you get the content for these 250k pages?

Is the content unique to the web?

How many of those 250k pages have incoming links from external sources? A better question might be: what percentage? 1% of pages with incoming links would be 2,500 pages. If only 1% of pages have incoming links, would you index 250k pages deep if you were Google? No. (Maybe I'm off base here, but those are my quick thoughts given no real context for the situation.)

I suspect GoogleBot has determined, for either PageRank or quality reasons, that your site is not worthy of the level of crawl depth you would like to have.

To me, even 14k seems like a lot for a site that is less than a year old.

SEOPTI

Msg#: 3905582 posted 5:06 pm on May 5, 2009 (gmt 0)

Peter, I don't think so.

It does not matter how old your site is. What matters is trusted links, PR, good content, and not tripping any on-site penalties. 250k URLs indexed is not a problem at all for a new site.


peterdaly

Msg#: 3905582 posted 6:26 pm on May 5, 2009 (gmt 0)

"Trusted links, PR, good content and not tripping any on-site penalties matters."

I don't disagree. I question whether all of that is in place, and intended my remarks to dance around those topics.

johnnie

 
Msg#: 3905582 posted 11:14 am on Sep 2, 2009 (gmt 0)

If you have a 250k-page site with few inbound links, Google might consider your content unworthy. Also, with that many pages, you have to make sure both the spider and any link juice can flow freely throughout your site.
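The crawlability point above can be checked mechanically: a breadth-first search over the site's internal link graph shows how many clicks each page is from home, and which pages are unreachable. Deep or orphaned pages are the ones a spider (and PageRank flow) is most likely to miss. The graph here is a toy example:

```python
# Sketch: breadth-first search over a site's internal link graph to
# compute each page's click depth from the home page. The site graph
# below is invented for illustration.
from collections import deque

def click_depths(links, start="/"):
    """links: dict page -> list of linked pages. Return page -> depth."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in depth:
                depth[nxt] = depth[page] + 1
                queue.append(nxt)
    return depth

site = {
    "/": ["/cat-a", "/cat-b"],
    "/cat-a": ["/a1", "/a2"],
    "/cat-b": ["/b1"],
    "/a1": [], "/a2": [], "/b1": [],
    "/orphan": [],   # no internal link points here
}
depths = click_depths(site)
unreachable = set(site) - set(depths)
```

On a million-page site, anything more than a handful of clicks from the home page, or in the `unreachable` set, is a likely candidate for being dropped from the index.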

fishfinger

Msg#: 3905582 posted 12:34 pm on Sep 2, 2009 (gmt 0)

"Consider that our site has more than 1 million different pages"

...in just over a year. How many people do you have working on the content for your site?

jd01

Msg#: 3905582 posted 7:42 pm on Sep 2, 2009 (gmt 0)

Unless Google has changed the rules since I have been reading... Matt 'I Cutts The Spam' Cutts has stated that 5,000 pages a day is the maximum you should add to a site. It seems possible you could have steadily added ~2,700 pages a day to reach your current page count, but it also seems possible you tripped the 'too big, too fast' filter by adding too many pages at once...

One thing to keep in mind here: if you 'added steadily' under the threshold, but Google didn't spider all 2,700 new URLs one day (or even for a couple of days in a row), and the site is dynamic - which means it probably doesn't serve 'Last-Modified' headers - then when Googlebot came back and spidered again, it could have looked like you added 5,400 or 8,100 (or more) pages in a single day, even though you really didn't...
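The 'Last-Modified' handling described above can be sketched as a small, framework-agnostic handler (not any specific library's API): a dynamic page that answers If-Modified-Since with 304 Not Modified lets a crawler re-check cheaply instead of re-downloading everything.

```python
# Sketch of conditional-GET handling: serve Last-Modified, and answer
# a matching If-Modified-Since with 304 and no body. Hypothetical
# handler signature; not tied to any particular web framework.
from email.utils import format_datetime, parsedate_to_datetime
from datetime import datetime, timezone

def respond(request_headers, page_mtime, body):
    """Return (status, headers, body), honoring If-Modified-Since."""
    ims = request_headers.get("If-Modified-Since")
    if ims:
        try:
            if page_mtime <= parsedate_to_datetime(ims):
                return 304, {"Last-Modified": format_datetime(page_mtime)}, b""
        except (TypeError, ValueError):
            pass  # malformed date: fall through to a full response
    return 200, {"Last-Modified": format_datetime(page_mtime)}, body

mtime = datetime(2009, 5, 1, 12, 0, tzinfo=timezone.utc)
# First crawl: no conditional header, so a full 200 response.
status1, hdrs1, _ = respond({}, mtime, b"<html>...</html>")
# Re-crawl sending back the date we served: 304, empty body.
status2, _, body2 = respond(
    {"If-Modified-Since": format_datetime(mtime)}, mtime, b"<html>...</html>")
```

Serving accurate Last-Modified dates on a large dynamic site makes each Googlebot visit cheaper and the crawl picture steadier, which bears directly on the 'looked like too many new pages at once' scenario.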
