|How many pages will Google index?|
Google index maximum
My site is a genealogy site. It took me a while to understand why people contacted me without following the directions. They dropped in from a search engine and bypassed the home page.
I made a page for every town in the Italian provinces I was covering. Traffic went way up because people were keying in the town names as key words.
Traffic dropped like a stone when I added a page for EVERY town in EVERY province, 14,000 files in one directory.
Meanwhile, I added Mexico, and traffic for that part of the site is brisk. A few dozen or a few hundred towns per state, and I think Google indexes all of them. I think it stopped indexing my Italy pages, or at least some of them.
Well, I've added dozens of countries now. If I add a country as argentina.domainname.com will Google count that as an entire website in its own right, or will those pages count toward the maximum number of pages Google will index under domainname.com?
The pages for all the towns are very similar, and written with a QuickBASIC program. The town name is in the title of each page and in large print near the top of each page, as in "Genealogy surnames for Castelcivita, Salerno, Italy".
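To make the setup concrete, here is a rough sketch of how that kind of one-page-per-town generator works, written in Python rather than the original QuickBASIC. The template, town names, and function name are all hypothetical, just to illustrate why pages that differ only in the town name come out nearly identical:

```python
# Hypothetical template: only {town}, {province}, and {surnames} vary
# between pages, so pages with no surnames differ by a few words only.
TEMPLATE = """<html><head>
<title>Genealogy surnames for {town}, {province}, Italy</title>
</head><body>
<h1>Genealogy surnames for {town}, {province}, Italy</h1>
<p>Surnames submitted: {surnames}</p>
</body></html>"""

def build_page(town, province, surnames):
    """Render one town page from the shared template."""
    names = ", ".join(surnames) if surnames else "(none yet)"
    return TEMPLATE.format(town=town, province=province, surnames=names)

page = build_page("Castelcivita", "Salerno", ["Esposito", "Russo"])
```

With 14,000 towns fed through a loop like this, the only distinct text on most pages is the town name itself.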
I don't know of any limit. Webmaster World has close to 300,000 pages indexed on Google, Amazon more than 7 million.
There can be any number of reasons why traffic dropped. Did you change the linking pattern when you added the new pages? Could extensive navigation be masking the actual content and making it appear to be near duplicate? Run a couple of similar pages through Brett's Sim Spider [searchengineworld.com] to see exactly what the search engines see. Also, if the individual city pages had any page rank before the addition, adding a bunch of new pages might have diluted PR that was flowing top down.
There are probably a bunch more points, and more folks will be around to pitch in, but those are a start.
And yes, Google would see a subdomain as a completely different domain. Unless there are other reasons, I don't see why you would have to go that route.
And by the way, Welcome to Webmaster World.
Not only PR, as Jimbeetle mentioned, but consider the keywords too.
Could it be that you diluted the keywords that used to be concentrated on a single page (giving that page very high relevance for those searches)?
I understand you don't have as many pages for Mexico, which would mean a greater concentration of relevant keywords on each page, right?
Maybe you shouldn't have so many pages with so little content each, but rather fewer pages with more content.
Actually, the keywords were in every page. All the pages are identical except for the name of the town, as in, "Genealogy surnames for Castelcivita, Salerno Province, Italy". And some towns, of course, have some surnames submitted.
In that case might they be seen as 90% duplicate content, with little original/distinct copy?
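One quick way to get a feel for how "duplicate" two of those town pages look is to compare their text with Python's standard `difflib`. The two sample strings below are made-up stand-ins for real pages; a proper check would compare the HTML the crawler actually fetches (or run the pages through Sim Spider as suggested above):

```python
# Estimate how similar two generated town pages are.
# A ratio near 1.0 means the pages are close to identical.
from difflib import SequenceMatcher

page_a = ("Genealogy surnames for Castelcivita, Salerno Province, Italy. "
          "No surnames submitted yet.")
page_b = ("Genealogy surnames for Controne, Salerno Province, Italy. "
          "No surnames submitted yet.")

ratio = SequenceMatcher(None, page_a, page_b).ratio()  # 0.0 to 1.0
```

When only the town name differs, the ratio lands well above 0.8, which is the kind of overlap that can get pages treated as near duplicates.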
|The real simon|
> Actually, the keywords were in every page. All the
>pages are identical except for the name of the town, as
>in, "Genealogy surnames for Castelcivita, Salerno
>Province, Italy". And some towns, of course, have some
I think what might solve the problem is to publish a page
for a town only if there is at least one surname submitted. That way, Google will not consider the pages duplicates.
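That filter is a one-line change in a generator like the one sketched earlier. The town data below is invented for illustration:

```python
# Hypothetical surname submissions keyed by town name.
towns = {
    "Castelcivita": ["Esposito", "Russo"],
    "Controne": [],          # no submissions -> skip this page
    "Aquara": ["Marino"],
}

# Only publish pages for towns with at least one submitted surname.
pages_to_publish = [town for town, surnames in towns.items() if surnames]
```

Each published page then carries at least some unique content (the surnames), instead of thousands of pages that differ only in the town name.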