Forum Moderators: Robert Charlton & goodroi
1. How many links on a page in the Directory?
We plan to create a Directory that will allow browsing of the content and will help search engines spider it. I've read the Google guideline suggesting 100 links or fewer on a page. Is that a good number to use? Do you run into issues if you have 300 or 500 links?
2. Anything else to ensure that each of the pages are indexed?
Because the content on the individual pages is fairly unique, we want to make sure they get indexed so that, while they will have low PageRank, they may still rank for some obscure search terms because of relevance. Beyond the Directory and Sitemaps, is there anything else we can do to make sure they get indexed?
3. How do we roll out?
We have a domain with an okay PageRank, and we will likely put the Directory and the pages under it. However, I've heard that rolling out a bunch of pages like this all at once may hurt you. Should we consider limiting the number of pages and rolling them out incrementally? If so, what kind of roll-out would make sense?
1. Do you run into issues if you have 300 or 500 links?
2. Beyond the Directory and Sitemaps, anything else we can do to make sure they get indexed?
3. Should we consider limiting the number of pages and rolling them out incrementally? If so, what kind of roll-out would make sense?
On a brand new domain, releasing everything at once and letting Google crawl naturally as it sees fit might make sense. You will be highly dependent on getting inbound links and establishing trust in that case, but you will be in any case.
In my opinion, you are more likely to find trouble when re-adapting an existing domain, because this fits more closely to a spammer's profile. I would definitely consider a phased release. The particulars would need to be tailored to your situation - but I would suggest you start at a relatively low level, and then build out more rapidly as you see a healthy response from Google.
Build your directory in logical pieces for your users. Put as many links as make sense for those users. If you start to have pages with more than 100 links, think about how it might make sense to split it up. If it doesn't make sense to split it up, then don't.
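If you do decide a section needs splitting, the mechanical part is simple. A minimal sketch (the link paths and the 100-per-page cap are just illustrative, matching the guideline discussed above):

```python
def paginate_links(links, per_page=100):
    """Split a long list of directory links into pages of at most
    per_page links each."""
    return [links[i:i + per_page] for i in range(0, len(links), per_page)]

# e.g. 350 links -> 4 directory pages holding 100, 100, 100, 50 links
pages = paginate_links([f"/article-{n}" for n in range(350)])
```

But as noted, only split where the division makes sense to users - the cap is a guideline, not a hard rule.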
To get the pages indexed, I will give the same advice I gave in another similar thread. It takes three things: good internal navigation, deep links, and time.
Those deep links will give you multiple entry points for both visitors and googlebot. The good internal navigation (search on breadcrumbs) will put more deep pages within a click or two from your deep links.
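The breadcrumb idea can be sketched in a few lines: derive a trail of links from each page's URL path, so every deep page links back up the tree and each section page is a click or two away from any entry point. A hypothetical sketch (the paths are made up for illustration):

```python
def breadcrumbs(path):
    """Build a breadcrumb trail of (label, href) pairs from a URL path.

    '/health/diabetes/article-7' ->
    [('Home', '/'), ('health', '/health'),
     ('diabetes', '/health/diabetes'),
     ('article-7', '/health/diabetes/article-7')]
    """
    parts = [p for p in path.strip("/").split("/") if p]
    trail = [("Home", "/")]
    for i, part in enumerate(parts):
        trail.append((part, "/" + "/".join(parts[:i + 1])))
    return trail
```

Render that trail on every page and googlebot can climb from any deep entry point back to the section and home pages.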
Most important is to accept the fact that it will take time. If the content is that good and unique, it will start gaining those links naturally as more people find the resource. Do some moderate things to speed things along, but steer clear of trying to force googlebot to speed up.
As for the rollout, I would take a look at the sections and see if it makes sense to put out specific, more popular sections first.
If it were a medical website, diabetes would be a more popular subject than scurvy. You might have 1000 articles on diabetes.
First week, put up the diabetes section and 10 diabetes articles.
The next week put up your congestive heart failure section with 10 articles and add 10 to the diabetes section.
The third week you do strokes and add 10 articles to the previous categories.
This way you are training googlebot to accept your ever-increasing amount of content and giving it time to digest it.
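The week-by-week schedule above can be sketched as a simple plan generator (the section names and the 10-articles-per-week pace are just the example's; tailor both to your own content):

```python
def rollout_plan(sections, per_week=10, weeks=3):
    """Week n: launch section n with per_week articles, and add
    per_week more articles to every previously launched section.
    Returns cumulative article counts per section for each week."""
    plan = []
    for week in range(1, weeks + 1):
        launched = sections[:week]
        plan.append({s: per_week * (week - i) for i, s in enumerate(launched)})
    return plan

plan = rollout_plan(["diabetes", "heart-failure", "strokes"])
# Week 1: diabetes 10
# Week 2: diabetes 20, heart-failure 10
# Week 3: diabetes 30, heart-failure 20, strokes 10
```

The point is the steadily growing curve, not the exact numbers.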