How do I make sure they are all listed in Google?
Should I create a separate page and place links to all 30,000 files there?
Will Google notice it? (The page will be large because of all the links.)
Or should I place links to all of them from the homepage?
May I use hidden links, or should they all be visible?
How will my ranking increase?
30,000 pages of content is a really good idea for any site, but placing links to all of them on the home page (or on every one of the 30,000 pages) would leave you with no room to write anything else on any page.
I had the same issue earlier. I tried creating many sitemaps instead of just one, and I saw that all the pages were crawled; it worked very well.
Why don't you try that?
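The split-into-several-sitemaps approach can be sketched in a few lines. This is a minimal illustration, not anyone's actual setup: the `example.com` URLs and the 10,000-per-file chunk size are placeholders (the sitemaps.org protocol allows up to 50,000 URLs per file, so any chunk size under that works).

```python
# Hypothetical sketch: split a large URL list into several sitemap files.
# The sitemaps.org protocol caps each file at 50,000 URLs, so a 30,000-page
# site fits in one file, but smaller chunks are often easier to manage.
CHUNK = 10000  # arbitrary number of URLs per sitemap file

# Placeholder URLs standing in for the site's 30,000 pages.
urls = [f"http://example.com/page{i}.html" for i in range(30000)]

def make_sitemap(chunk):
    """Render one sitemap file for a chunk of URLs."""
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in chunk)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n"
            "</urlset>")

sitemaps = [make_sitemap(urls[i:i + CHUNK])
            for i in range(0, len(urls), CHUNK)]
print(len(sitemaps))  # 3 files for 30,000 URLs at 10,000 each
```

Each resulting file would then be listed in a sitemap index and submitted to the engines.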
Why would anyone bother to return with the barrage of infantile replies that this thread produced?
I just love bookmarking threads like this as it gives a great insight into the quality of those members who may approach me for link exchanges.
Andrew, I have to assume that the scale of content simply scared some people and they got defensive thinking no one could legitimately accumulate such an amount.
Don't forget most of the people here are just nickel-and-dime Ma and Pa webmaster operations, so their chances of ever having a 30,000-page site, let alone needing to add 30k pages in one go, are very remote.
Mail me and I will give you instructions on how to add your content for maximum effect.
I can't see any problem with 30,000 or 300,000 pages, or 3 million pages, on a site, provided it's original content and the pages represent different aspects.
If a subject can contain, say, 2 million aspects, and you and a team build content over time, upload it to your site, and it's all indexed correctly, why should that be a problem? I would class a site like that as an authority site, on the basis that it contains more pages of data and information than any other in its sector.
Am I missing the point here? Surely the whole point of the internet is to provide quality information for the end user. More and more people are using search engines as some sort of oracle these days, so why shouldn't a site be able to carry that kind of quantity?
I'm very interested in others' views on this. In my sector the largest site has, I think, 250,000 pages indexed, and the team I work on thinks it's possible over time to take this towards 2 million!
Does anyone know the biggest number of pages Google has indexed for a single site anywhere on the internet?
OK, it could be 30,000 products; lots of catalogues have over 30,000 items! It does not even have to be totally unique: a review site of 30,000 items, say 30,000 DVDs (the titles and the synopses will maybe be the same).
People get so wound up about the number of pages, thinking they are flooding the engines. However, promoting a 30,000-page site is a different skill from promoting a 300-page site.
As long as it is not 30,000 pages of scraped/autogenerated content, or 30,000 search-engine pages linking to themselves, then less criticism would be nice.
This chap might already have a 30,000-item database run via PHP (OK, we know from his previous post that he doesn't) and be working his guts off to make it indexable for a search engine.
However, of course, it could be scraped content.
Factorials grow VERY rapidly; in fact 9! (9 factorial) is 362,880.
That means you could take nine different words and arrange them in over 360,000 ways with no exact repeats!
Make one page for each arrangement .. instant 362,880 pages. (This demands automation.)
The text is absolute garbage of course, but some adsense publishers don't seem to mind that much.
At least the pages are unique, well sort of, and very very short.
No one page could be accused of keyword stuffing.
The page titles could be numbered corresponding to the order of the words ..
123456789.html, 123456798.html, 123456897.html ...
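The numbered-filename scheme above can be generated in a few lines of Python. This is a sketch of the trick as described, nothing more; the nine words are arbitrary placeholders.

```python
# Hypothetical sketch of the permutation trick: one "page" per arrangement
# of nine words, with a filename encoding the word order.
from itertools import permutations

# Any nine distinct placeholder words will do.
words = ["alpha", "bravo", "charlie", "delta", "echo",
         "foxtrot", "golf", "hotel", "india"]

pages = []
for perm in permutations(range(1, 10)):
    # Filename like "123456789.html", digits giving the word order.
    filename = "".join(str(d) for d in perm) + ".html"
    # Page body: the nine words in that order.
    body = " ".join(words[d - 1] for d in perm)
    pages.append((filename, body))

print(len(pages))  # 362880 distinct arrangements, one page each
```

As the post says, the output is garbage text; the only point is that 9! arrangements really do yield 362,880 distinct pages.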
The sitemap would look awfully dull. -Larry