|A large amount of content is king! |
Seems to me that in a small niche sector with tight themes, this would not be the norm...
...but in large sectors with deep vertical/horizontal themes and sub-topics, one would be forced to create a large amount of content to "serve" the user base (whether this is done organically/naturally/manually or machine-generated; if the latter, it had better be done with the human reader in mind, because the engines can spot this type of content pretty easily...)
|- search results pages within content of the site (not extracted from search engines and directories) |
Yes, if you can generate your own search results from your site and store these as additional searchable pages (say, in a global index or something), then you can really home in on your sector with some very interesting content, both vertically and horizontally.
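As a rough sketch of the idea (the schema, table names, and matching logic here are all my own invention, not from any particular package): log each internal search query, then render a plain, crawlable HTML page for it from your own records.

```python
import sqlite3

# Hypothetical schema: a log of internal searches and a table of site articles.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE searches (query TEXT);
    CREATE TABLE articles (title TEXT, url TEXT);
    INSERT INTO searches VALUES ('blue widgets'), ('widget repair');
    INSERT INTO articles VALUES
        ('Blue Widgets 101', '/blue-widgets-101'),
        ('Widget Repair Guide', '/widget-repair');
""")

def render_results_page(query):
    """Build one simple, indexable HTML page for a stored query."""
    rows = conn.execute(
        "SELECT title, url FROM articles WHERE title LIKE ?",
        ('%' + query.split()[0] + '%',)  # naive first-word match, for illustration
    ).fetchall()
    links = "\n".join(f'<li><a href="{u}">{t}</a></li>' for t, u in rows)
    return f"<h1>Results for {query}</h1>\n<ul>\n{links}\n</ul>"

# Each distinct logged query becomes one more page the spiders can find.
pages = {q: render_results_page(q)
         for (q,) in conn.execute("SELECT DISTINCT query FROM searches")}
```

A real version would obviously need proper escaping and a smarter relevance match; the point is only that the pages are built from your own content.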
Righto. Storing user searches as indexable pages is not a new trick; it's been around for a while, and I think some really large portals are doing it, or used to a while back.
Anyway, G would tweak something else after a few months, and they would all be moaning here about how they can't get G to index their sites, millions of pages dropped, etc.
|Anyway, G would tweak something else after a few months, and they would all be moaning here about how they can't get G to index their sites, millions of pages dropped, etc. |
I don't see that happening, however. On the contrary, the sites I'm keeping an eye on have been doing well on the SERPs for some time now.
I guess we are witnessing a new kind of SEO emerging. Dynamic-SEO, we might call it. It's based on cooperation between SEO specialists and script/database programmers, aiming at generating large numbers of pages that Google loves to index ;-)
Sounds good. Anyone with any info on how, or what software to use, to generate a search-results database of pages?
I have two big sites: one is 5,000-plus pages of content, the other is a forum with about 50,000 pages of true content.
I was just wondering what you felt was "large".
Hi kamikaze Optimizer
I would consider 50,000 as "large".
However, I have seen sites of, for example, 300,000 and 700,000 pages following Dynamic-SEO techniques to boost the number of their indexed pages.
[edited by: tedster at 6:53 am (utc) on Oct. 15, 2006]
|It's based on cooperation between SEO specialists and script/database programmers, aiming at generating large numbers of pages that Google loves to index ;-) |
Well, it's just like generating spam, except each landing page contains useful information.
Writing a script and designing a database is the easy part. The hard part is collecting the data and publishing it in a way that doesn't send your entire site into the supplemental index.
None of this is news; it was observed in this forum two years ago.
Mine could easily be 700,000 pages also, but I have worked very hard to point the bots to the real content and removed the chances of dupe content.
|However, I have seen sites of, for example, 300,000 and 700,000 pages etc. |
As is, Google still reports over 300,000 pages of content, but we both know that this is not correct.
"publishing them in a way that doesn't send your entire site into the supplemental index."
Good subject. Any ideas on that?
|Mine could easily be 700,000 pages also, but I have worked very hard to point the bots to the real content and removed the chances of dupe content. |
Dynamic-SEO isn't about generating duplicates. Let's say it's about other creative ways to explore and present info on your site through a large number of pages.
|Well, it's just like generating spam, except each landing page contains useful information. |
I wouldn't associate Dynamic-SEO with spam methods in any way.
Dynamic-SEO is an intelligent approach to Google indexing, while spam is a very primitive, stupid way of operating on Google's index ;-)
Generating a lot of pages from the Google, Yahoo, or MSN index by saving the results of queries; generating a large number of pages from your own site's search results.
Not very different, are they?
Of course, rearranging product titles, descriptions, and details into different categories etc. to make a large number of unique indexable pages is tougher, and a bit farther away from spam.
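To illustrate that rearranging idea with a toy example (the product records and attribute names here are made up): the same small set of records can be regrouped under several attributes, each grouping yielding its own set of category pages.

```python
from collections import defaultdict

# Toy product records; real data would come from your own catalogue.
products = [
    {"title": "Red Mug", "brand": "Acme", "material": "ceramic", "color": "red"},
    {"title": "Blue Mug", "brand": "Acme", "material": "glass", "color": "blue"},
    {"title": "Red Bowl", "brand": "Bolt", "material": "ceramic", "color": "red"},
]

def category_pages(axis):
    """Group the same records under a different attribute: one page per value."""
    pages = defaultdict(list)
    for p in products:
        pages[p[axis]].append(p["title"])
    return dict(pages)

# The same three products yield distinct brand, material, and color pages.
by_brand = category_pages("brand")  # pages for 'Acme' and 'Bolt'
by_color = category_pages("color")  # pages for 'red' and 'blue'
```

Each grouping is a genuinely different view of your own data, which is what separates it from scraped or duplicated content.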
|Generating a lot of pages from the Google, Yahoo, or MSN index by saving the results of queries; generating a large number of pages from your own site's search results. |
Not very different, are they?
In fact, there is a big difference!
In the case of search engines and directories, you generate pages not from your own site, including content which you don't own.
In the case of Dynamic-SEO, you generate pages from your own site, including your own content, intelligently ;-)
None of this is new; many here discussed it and did it two years ago.
(1) Get some content in a database
(2) Present it in many ways
get a database of 2,000 names
split them by gender
split them by origin
split them by first letter
split them by pet type
then have a page for each combination, generated from the database and sitemapped.
female dog names of origin ecuador starting with e
elephant names from india
male dog names starting with f
Hundreds of thousands of content pages from a small database.
It's simple, it works: generate a big site.
And guess what: people DO search for 5-, 6-, 7-word phrases just like this.
Plenty of topics you can do this for.
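A back-of-the-envelope sketch of the fan-out above (the dataset and attribute names are invented for illustration): a handful of records plus a few attribute axes multiply into one candidate page per combination.

```python
from itertools import product

# Toy dataset standing in for the "2,000 names" database.
names = [
    {"name": "Ella", "gender": "female", "origin": "ecuador", "pet": "dog"},
    {"name": "Fido", "gender": "male",   "origin": "england", "pet": "dog"},
    {"name": "Esha", "gender": "female", "origin": "india",   "pet": "elephant"},
]

genders = ["male", "female"]
origins = ["ecuador", "england", "india"]
pets = ["dog", "elephant"]
letters = [chr(c) for c in range(ord("a"), ord("z") + 1)]

def page_for(gender, origin, pet, letter):
    """One page, e.g. 'female dog names of origin ecuador starting with e'."""
    matches = [n["name"] for n in names
               if n["gender"] == gender and n["origin"] == origin
               and n["pet"] == pet and n["name"].lower().startswith(letter)]
    title = f"{gender} {pet} names of origin {origin} starting with {letter}"
    return title, matches

# Every combination of the four axes is a candidate URL for the sitemap:
# 2 genders x 3 origins x 2 pets x 26 letters = 312 pages from three records.
combos = list(product(genders, origins, pets, letters))
```

With 2,000 names and a couple more axes, the combination count climbs into the hundreds of thousands; in practice you would skip the empty combinations rather than sitemap them all.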