Msg#: 4486470 posted 12:51 pm on Aug 20, 2012 (gmt 0)
I have a dynamic website that provides news, fashion updates, etc. We upload at least 20 to 30 posts daily across unique categories, so the total count of new pages may grow to more than 1,000. My question is:
Is it fine to serve visitors as well as search engines this many pages on a daily basis?
Or should I split the traffic by creating separate domains for individual topics?
e.g. currently it has subdomains like news.web.com and fashion.web.com
Msg#: 4486470 posted 3:40 am on Aug 21, 2012 (gmt 0)
Welcome to the forums, webseosolution.
First, the quantity of new pages per day that can be supported pretty much depends on how much PageRank your site has. But the only downside of PR that is too low is that some of the URLs may not get crawled and/or indexed. I'd say if the site has legitimate new URLs being published every day, then let them be available to googlebot.
Second, using topical hostnames (subdomains) will keep your URLs shorter - and it may also help you administer things technically. Years back, there may also have been an SEO advantage of some kind, but I don't think there is one anymore.
Msg#: 4486470 posted 5:02 pm on Aug 21, 2012 (gmt 0)
Please let me know approximately what that number should be, according to PageRank and the number of pages Google has indexed.
That's not the best way to think about it. Some sites (news sites, for example) publish hundreds of pages each day. Other sites with the same level of authority may publish only a dozen times a year (and sometimes not at all).
It's really about the demands of your audience and the topics you cover.
Msg#: 4486470 posted 7:47 am on Aug 23, 2012 (gmt 0)
That message is usually sent for a LOT more than a few thousand pages. It sounds like you may have canonical URL problems, or some kind of potentially infinite crawl space (wildcard subdomains, site search exposed to googlebot, etc.)
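For what it's worth, the usual first step for an infinite crawl space is keeping googlebot out of internal site-search results via robots.txt. A minimal sketch (the /search path is just an assumption - use whatever path your search results actually live under):

```
# robots.txt at the site root
# Block internal site-search result pages from being crawled
User-agent: *
Disallow: /search
```

For the canonical side, adding a `<link rel="canonical" href="...">` tag in each page's head pointing at the one preferred URL helps Google collapse duplicate URLs (e.g. tracking parameters, print views) into a single indexed page.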