If the pages are still on your site and Google has already indexed them, Google will recheck them every once in a while and they will remain in the index.
This is a typical method of getting pages indexed (linking to them from pages that are already indexed). But another method would be to get external links from on-topic web sites pointing directly to the specific areas you're trying to get into Google.
I probably have more external links to sections under the root than I do to my home page. This might not be great for PR, but it's great for traffic and for getting indexed.
I think what you're describing is nothing more than a "What's New" page.
On one of my sites I have a What's New page which is the only page (other than the home page) that gets a daily freshen up.
I put the 6 most recent additions to the website in there, and yes, they do seem to get discovered and indexed more quickly. Since, however, they are also linked from the site as a whole (as in your case, possibly 4 clicks from the home page), they don't disappear when I change the What's New - they simply don't get revisited till my site is given a good going over by Google, which seems to be about once a month.
In point of fact, this is hardly "tricking" Google - it's simply pointing out to your returning visitors new pages to go and inspect. In that sense, I've found that the Google side-effect is just a bonus to a page that makes the site much more user friendly.
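For anyone building a similar box, here's a minimal sketch (Python, all names and data hypothetical) that renders the most recent additions as an HTML list:

```python
from datetime import date

# Hypothetical article records: (date added, title, URL).
ARTICLES = [
    (date(2004, 1, 12), "Widget History", "/articles/widget-history.html"),
    (date(2004, 1, 9), "Advanced Widgets", "/articles/advanced-widgets.html"),
    (date(2004, 1, 3), "Widget Care Basics", "/articles/widget-care.html"),
]

def whats_new(articles, count=6):
    """Render the `count` most recently added articles as an HTML list."""
    newest = sorted(articles, key=lambda a: a[0], reverse=True)[:count]
    items = "\n".join(
        f'  <li><a href="{url}">{title}</a></li>' for _, title, url in newest
    )
    return f"<ul>\n{items}\n</ul>"
```

You'd regenerate this on each site update; as noted above, links that rotate out of the box stay reachable through the site's normal navigation.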
Excellent point, Derek. The best thing to keep in mind when optimizing for Google is to not optimize for Google, optimize for your visitors.
Is it not possible for you to implement a 'site map' or 'article map'?
This way all articles will be 2 clicks away from home page
Well, soon my content manager will have an automated site map... but not yet.
While your "What's New" box will get pages indexed by Google, you have to keep in mind that PageRank works by which pages link to other pages. When you rotate out your old pages after they are indexed, they will no longer have your incoming links.
I would recommend an archives section that allows you to browse your older pages.
As opposed to a site map, why not use a site directory with the articles split into categories? That way all articles will be linked to from a page with good on-topic content.
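A rough sketch of that directory idea (Python, field names hypothetical): group the article list by category, then emit one directory page per category, so every article is linked from an on-topic page.

```python
from collections import defaultdict

def build_directory(articles):
    """Group articles into {category: [(title, url), ...]} for directory pages."""
    directory = defaultdict(list)
    for art in articles:
        directory[art["category"]].append((art["title"], art["url"]))
    return dict(directory)
```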
Choosing what links to put on your home page is your call. I'd see nothing wrong with a "What's Cool" that gradually cycles through the articles on your site. But a site map would be better, because then all the articles would be there for your visitors and for bots to find.
Forget about on-page optimization.
Here is what really works:
Get yourself a bunch of friends to edit all your pages and append a NOSCRIPT section.
In this section, put about 15 links with the keyword you would like to spam.
It works 100%. A competitor of mine has been doing this for ages.
It survived Florida and is even improving :(
Number 1 for all of these keywords.
I contacted Google several times, but they don't seem to care much.
I tried to put up a post on it, but Brett doesn't seem to like this topic (Maybe nobody shall know it).
Why can't Google just ignore NOSCRIPT?
This is really discouraging.
If you want examples, sticky me.
How many links should be on a site map? 100 was the guideline, but for large sites do we have sitemap1, sitemap2, etc.?
If you have a hundred pages on your site - you need 100 links. If you have more pages, you might need more. I thought the days of SEO were over! (despite use of the word 'Trick' in this discussion title)
I often use the links I see at the bottom of places like:
I know I use that all the time - and I am sure others do as well. This might not be as immediate a fix to your problem, but should allow shorter routes to some of your older articles:
Home Page -> New Article -> Older Article ->
Site maps are nice, but how many people really use them? You want your site to be better for both users and search engines. It will also increase your page views per user - if you honestly pick pages that would interest them from the article they just read.
Google should eventually pick more of them up. What you want is for other sites to pick them up as well - and they are more likely to if you use the technique I mentioned (IMHO), as you are going from targeted content to filtered targeted content.
This is truly one of the more powerful techniques you can use. Slightly different than another technique I used that was extremely successful.
Just my 2 cents.
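As a sketch of the related-links idea described above (Python, data structure hypothetical): if each article carries topic tags, link it to the other articles sharing the most tags, so readers and bots move from targeted content to related targeted content.

```python
def related_articles(current, articles, limit=5):
    """Return up to `limit` other articles sharing the most tags with `current`."""
    scored = []
    for art in articles:
        if art["url"] == current["url"]:
            continue  # never link an article to itself
        overlap = len(current["tags"] & art["tags"])
        if overlap:
            scored.append((overlap, art["url"], art))
    # Sort by tag overlap (descending), breaking ties by URL for stable output.
    scored.sort(key=lambda item: (-item[0], item[1]))
    return [art for _, _, art in scored[:limit]]
```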
|Site maps are nice, but how many people really use them. |
I do. I use them a lot.
I use them, but nowhere near onsite navigation. If users are using a site map more than onsite navigation - there is a design flaw.
I guess that's far less than 1% of the users.
This filter would have the least collateral damage of all filters.
P.S.: Get yourself a new Phone ;)[/edit]
|P.S.: Get yourself a new Phone ;)[/edit] |
It's a T-Mobile Sidekick. It is new.
|100 was the guideline, but for large sites do we have sitemap1, sitemap2 etc. |
I'll second this question. I JUST got the bulk of my content up this week in time for my monthly enema (deep crawl), but nothing in my deeper pages was picked up, apparently. Everything down to level 3 was picked up great, but 4 and down (it's an automated thing, I have to go that far down using "next" pages) don't seem to be registering. Maybe the Gbot works in waves, because a while ago it took level 2 and this time it seems to have taken level 3. Maybe next time will be 4, but I still feel I need a site map.
...anyway, I've got several hundred pages. What is the definitive answer for a sitemap on a site of 300+ pages? Multiple maps? I would just use categories, but I'd really like each page indexed, and using categories might push the articles themselves too far down again. All the articles are unique and each deals specifically with a separate topic, so there's no real "central" point to direct the Gbot to, ya know?
DRGather, the hundred-links guideline is a suggestion because we sometimes only index the first 101K of a page. So if the site map page is < 100K, you're probably okay. If it would be more, you could break it into alphabetical chunks, or you could also do it chronologically. If you've got that much data, I'd think about how a user would want to see it presented--categories? Maybe a taxonomy? As long as the links are static, Googlebot will often scarf up lots of pages.
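That chunking advice is easy to automate; here's a minimal sketch (Python, names hypothetical) that sorts links alphabetically by title and splits them into pages of at most 100 links each, ready to be written out as sitemap1, sitemap2, and so on:

```python
def split_sitemap(links, per_page=100):
    """Split (title, url) pairs into alphabetical sitemap pages of `per_page` links."""
    ordered = sorted(links, key=lambda pair: pair[0].lower())
    return [ordered[i:i + per_page] for i in range(0, len(ordered), per_page)]
```

Each returned chunk would become one static sitemap page, keeping every page comfortably under the size limit mentioned above.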
You have a point chris_r, site map is my last resort if I can't find what I'm looking for in a site. ;)