I'd love to have all my articles within 2 clicks of the homepage, but it isn't possible; most are 3-4 clicks away.
My PageRank is only 4.
On my homepage, however, I have a "new articles" box with links to and descriptions of some new articles we have.
I've found that the only articles Google has indexed on my site are the ones currently in this box.
So here is my idea.
If I rotate more articles into this box, wait for them to show up in Google's results, and then rotate in more, Google should eventually have all my articles indexed.
My main question is: will the articles I rotate out of the box after they're indexed disappear from Google?
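The rotation idea above is easy to sketch as a script. This is a minimal, hypothetical sketch: the box size, article URLs, and the idea of tracking "seen in Google" yourself are all assumptions, not anything the site actually runs.

```python
# Rotate not-yet-indexed articles into the homepage "new articles" box.
# A sketch only: BOX_SIZE and the article/indexed lists are hypothetical;
# in practice you'd check Google yourself to see which URLs are indexed.

BOX_SIZE = 6  # how many links the homepage box holds

def next_batch(all_articles, indexed, box_size=BOX_SIZE):
    """Pick the next articles to feature: unindexed ones first."""
    unindexed = [a for a in all_articles if a not in indexed]
    batch = unindexed[:box_size]
    # Pad with already-indexed articles if few unindexed remain.
    if len(batch) < box_size:
        batch += [a for a in all_articles if a in indexed][:box_size - len(batch)]
    return batch

articles = ["/articles/a.html", "/articles/b.html", "/articles/c.html"]
already_indexed = {"/articles/a.html"}
print(next_batch(articles, already_indexed, box_size=2))
# → ['/articles/b.html', '/articles/c.html']
```

Each time you refresh the box you'd move the newly indexed URLs into the "indexed" set, so every article eventually gets its turn on the homepage.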
This is a typical method of getting pages indexed (linking to them from pages that are already indexed). But another method would be to get external links from on-topic web sites directly to the specific areas you're trying to get into Google.
I probably have more external links to sections under root than I do to my home page. This might not be great for PR, but it's great for traffic and getting indexed.
On one of my sites I have a What's New page which is the only page (other than the home page) that gets a daily freshen up.
I put the 6 most recent additions to the website in there, and yes, they do seem to get discovered and indexed more quickly. Since, however, they are also linked from the site as a whole (as in your case, possibly 4 clicks from the home page), they don't disappear when I change the What's New - they simply don't get revisited till my site is given a good going over by Google, which seems to be about once a month.
In point of fact, this is hardly "tricking" Google - it's simply pointing out to your returning visitors new pages to go and inspect. In that sense, I've found that the Google side-effect is just a bonus to a page that makes the site much more user friendly.
I would recommend an archives section that allows you to browse your older pages.
Here is what really works:
Get yourself a bunch of friends, edit all your pages, and append a NOSCRIPT section to each.
In this section, put about 15 links with the keyword you would like to spam.
It works 100% of the time. A competitor of mine has been doing this for ages.
It survived Florida and is even improving :(
He's number 1 for all of these keywords.
I contacted Google several times, but they don't seem to care much.
I tried to put up a post on it, but Brett doesn't seem to like this topic (maybe nobody is supposed to know about it).
Why can't Google just ignore NOSCRIPT?
This is really discouraging.
If you want examples, sticky me.
I know I use that all the time - and I am sure others do as well. This might not be as immediate a fix to your problem, but should allow shorter routes to some of your older articles:
Home Page -> New Article -> Older Article -> ...
Site maps are nice, but how many people really use them? You want your site to be better for both users and search engines. It will also increase your page views per user - if you honestly pick pages that would interest them from the article they just read.
Google should eventually pick more of them up. What you want is for other sites to pick them up as well - and they are more likely to if you use the technique I mentioned (IMHO), since you are going from targeted content to filtered targeted content.
This is truly one of the more powerful techniques you can use. Slightly different than another technique I used that was extremely successful.
Just my 2 cents.
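The related-article cross-linking described above could be as simple as ranking older articles by shared topic tags. A rough, hypothetical sketch - the catalog, tags, and function name are made up for illustration:

```python
# Pick a few related older articles to link from the bottom of each
# article, based on shared topic tags. Catalog and tags are hypothetical.

def related(article_tags, catalog, limit=3):
    """Rank articles in the catalog by how many tags they share."""
    scored = []
    for url, tags in catalog.items():
        overlap = len(set(article_tags) & set(tags))
        if overlap:
            scored.append((overlap, url))
    scored.sort(reverse=True)  # most tags in common first
    # In real use you'd also exclude the current article itself.
    return [url for _, url in scored[:limit]]

catalog = {
    "/articles/css-layout.html": ["css", "layout"],
    "/articles/css-fonts.html": ["css", "fonts"],
    "/articles/perl-cgi.html": ["perl", "cgi"],
}
print(related(["css", "layout"], catalog, limit=2))
# → ['/articles/css-layout.html', '/articles/css-fonts.html']
```

The point is the same one made above: honestly relevant links keep readers on the site and give the crawler a short path to older pages at the same time.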
100 links was the guideline, but for large sites should we have sitemap1, sitemap2, etc.?
I'll second this question. I JUST got the bulk of my content up this week in time for my monthly enema (deep crawl), but nothing in my deeper pages was picked up, apparently. Everything down to level 3 was picked up great, but level 4 and down (it's an automated thing; I have to go that far down using "next" pages) don't seem to be registering. Maybe the Gbot works in waves, because a while ago it took level 2 and this time it seems to have taken level 3. Maybe next time it will be 4, but I still feel I need a site map.
...anyway, I've got several hundred pages. What is the definitive answer to a sitemap on a site of 300+ pages? Multiple maps? I would just use categories, but I'd really like each page indexed, and using categories might push the articles themselves too far down again. All the articles are unique and each deals specifically with a separate topic, so there's no real "central" point to direct the Gbot to, ya know?
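One common answer to the multiple-maps question is simply to chunk the article list into sitemap pages of at most 100 links each (sitemap1, sitemap2, ...). A rough sketch, assuming the 100-link guideline; the file names and markup here are made up and would need to match your own templates:

```python
# Split a long list of article URLs into sitemap pages of <= 100 links
# each (sitemap1.html, sitemap2.html, ...). A sketch only.

MAX_LINKS = 100

def chunk(urls, size=MAX_LINKS):
    """Break the URL list into consecutive groups of at most `size`."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def write_sitemaps(urls):
    """Write one simple HTML sitemap page per chunk; returns page count."""
    pages = chunk(urls)
    for n, page in enumerate(pages, start=1):
        links = "\n".join('<li><a href="%s">%s</a></li>' % (u, u) for u in page)
        html = "<html><body><ul>\n%s\n</ul></body></html>" % links
        with open("sitemap%d.html" % n, "w") as f:
            f.write(html)
    return len(pages)

# e.g. 350 articles would yield 4 sitemap pages (100 + 100 + 100 + 50)
urls = ["/articles/%d.html" % i for i in range(350)]
```

Each sitemap page would then be linked from the home page (and from each other), so every article stays within a couple of clicks of the root regardless of how deep the category structure goes.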