Forum Moderators: Robert Charlton & goodroi
I have a site with, say, 5,000 or 6,000 pages total. Maybe only 1,500 of those are actual files, like .php and .html pages. My current indexed page count in Google is about 2,800 pages.
Today I was about to upload a Google Sitemap when I got to thinking:
If this script runs from WITHIN my directories and is ignorant of all that database content and scripted content (3,000+ pages), then won't it likely hurt my site rather than help it?
I started thinking that, without spidering my site conventionally, this sitemap would effectively distinguish hard-coded content from data-driven content. Even my newsfeed pages would then show up not as actual content but as scripted, aggregated feeds, etc.
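For what it's worth, a sitemap generator doesn't have to walk the file system at all; you can build the sitemap from whatever list of URLs you like, including the database-driven ones. A minimal sketch (the `slugs` list stands in for whatever your database query returns; names and URLs are invented for illustration):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(base_url, slugs):
    """Build sitemap XML from a list of page slugs,
    e.g. pulled from the database rather than the file system."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for slug in slugs:
        url = ET.SubElement(urlset, "url")
        # <loc> is the only required child element per URL
        ET.SubElement(url, "loc").text = f"{base_url}/{slug}"
    return ET.tostring(urlset, encoding="unicode")
```

That way the 3,000+ dynamic pages end up in the sitemap alongside the hard-coded ones.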
So, that's my story and my question: is a sitemap a good thing for me or not? I haven't used one yet and have done pretty well. Does anybody have a before-and-after experience that I can draw from?
Thanks.
If I understood your question correctly, you have 2,800 distinct URLs indexed by Google and another 3,000 or so pages of database-driven content, which seem to reside under one or only a few URLs. I was in a similar situation with almost 1,000 landing pages in my shop system. In parallel to the database-driven navigation inside the shop, I designed one dedicated page for each category. A single PHP routine generates the pages via fwrite, but you have to pay careful attention to generating different page titles and descriptions.
I also noticed that it is not wise to submit this mass in one bunch; you seem to trigger some sort of unnatural-growth filter if you double your number of pages in one day.
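So instead of adding all the new URLs to the sitemap at once, drip-feed them in over several days. A trivial sketch of the batching (the batch size is an arbitrary choice, not a known threshold):

```python
def daily_batches(urls, per_day):
    """Split a list of new URLs into consecutive daily batches,
    so the sitemap grows gradually instead of doubling overnight."""
    return [urls[i:i + per_day] for i in range(0, len(urls), per_day)]
```

Each day you would append the next batch to the sitemap and resubmit it.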
I see no reason why making that content indexable might hurt your site, provided each individual page contains enough information to be worth indexing, i.e., it is of particular value to your visitors and thus to Google's spider.