It would certainly be beneficial to let Google spider your entire site. If each page has some unique content, it may be exactly what a user is searching for. Relying on the home page alone is like having all of your eggs in one basket.
People search for other things related to your site, not just what you have on your home page.
|is there any benefit or any disadvantage from allowing the spiders to index all of my site even though my Home Page is the only page to be primed for keywords? |
Without knowing a lot more about your site I can only speculate. But in general (and yes, your mileage may vary) it is not a good idea to concentrate all of your keywords into one home page.
In most cases you will do better by having separate pages optimized for different sets of keywords (assuming that you have more than a handful of them).
For example, it is very important to have your keywords in the title. That is quite easy to do if you have two or three keywords per page, but if you have a score of keywords you cannot put them all in the title.
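As a rough illustration of what "two or three keywords per page" looks like in practice (the hotel name and phrases here are made up, borrowing the "widgetland" example used later in this thread):

```html
<!-- Hypothetical page head: the title targets just two related phrases,
     "widgetland hotel" and "beachfront lodging", rather than a score of them -->
<head>
  <title>Widgetland Hotel - Beachfront Lodging in Widgetland</title>
  <meta name="description"
        content="A family-run widgetland hotel with beachfront rooms and an on-site restaurant.">
</head>
```

A separate page about, say, the restaurant would then carry its own title targeting its own small set of phrases.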
Don't target only single phrases. And never, ever put them all on only one page. If you do, your "site" consisting of just one page will be nuked completely in the next Florida-style update.
I had read (probably on WebmasterWorld) that I should NOT let the spiders index any pages that are not primed for keywords/phrases. The reasoning was that, once a spider has visited, it would not return for a long time. Therefore, it was suggested that it might be best to wait until I had primed the relevant pages, so that the search engines would rank them better.
Any thoughts would be appreciated.
|This was because, once a spider has visited, it would not return for a long time. |
That is nonsense.
The only exception I know is when a page is not ready for your users. But then you wouldn't put it online at all.
A crawler should be granted access to all pages that outside visitors can access, too. That's the purpose of search engines.
Thanks for your advice. I was just wondering: why would there be the option of putting NoIndex, NoFollow in the HTML, or Disallow in robots.txt, if all pages should be indexed by the crawlers?
1,064 posts - wow that is some going - hope I get to that one day.
I am quite a newbie and have only my home page primed, for 3 different keywords. At the moment I am building up content but, to be honest, I haven't primed any of my other pages for any other keywords, and my traffic is really low. I still have a lot to learn, obviously.
It sounds a good idea to have separate sets of pages optimised for different keywords, and I will do that in the future. But, as mentioned before, at the moment I have not let the spiders crawl my site, because the pages are not really ready for crawling and I don't think they will help my SE position. Do you agree with dirkz that all pages should be available to the crawlers as long as they are in a reasonable state to be read by surfers? And what if I prime them a day after Google has crawled - will the crawler come back to visit soon, or will it take 6 months to return?
You're smoking the wrong end of the pipe here. The only purpose of search engine spiders is to blaze the trail for humans. And when humans come, and find nothing but spider food, they'll shake the dung off their sandals and leave.
If your pages aren't ready for humans to look at them, get them off of your server, and do it yesterday. If they are, then let the spiders at them.
And don't worry about the spider's schedule. You have no control over that. Monitor the content developer's schedule, because THAT'S the pipe all of your eggs have to go through before you can put them in your basket.
You're getting some advice from a very clueless source. Forget everything you heard from there -- get a lobotomy if you have to, this is critical! -- and start reading old threads in this forum.
|and start reading old threads in this forum. |
And since you claim to be a beginner, start where many of us did: Brett's famous Successful Site in 12 Months with Google Alone [webmasterworld.com] post.
The web is a very dynamic place, and details change very fast, but general principles have a much longer shelf life. Brett's post gives an excellent mix of the two.
You've been shot down in flames a bit here mate. But don't worry - just the WW 'heavies' flexing their buttocks.
A robots.txt / meta robots instruction can be useful to exclude stuff you *really* don't want indexed (because Google indexes a lot of inappropriate stuff). For example, Google decided in 2002 that our 'Checkout Page' was the most important one on one of our sites - but anyone clicking on that link would get the message 'Your Shopping Basket is Empty'. So obviously we excluded that from the mindless (and increasingly irrational) GoogleBot ;) Since then we've excluded a few other files and folders, such as our CGI bin, etc., simply because they are of no interest to a searcher.
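To sketch the kind of exclusions described above (the paths are invented for illustration - they are not the actual files from that site):

```
# robots.txt - keep crawlers out of pages that are useless to a searcher
User-agent: *
Disallow: /cgi-bin/
Disallow: /checkout.html
```

For a single page, the per-page alternative is a meta tag in the HTML head, e.g. `<meta name="robots" content="noindex,nofollow">`, which asks well-behaved bots not to index that page or follow its links.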
But the general advice to make all your info available to the surfer still holds. Trouble is, the recent changes in Google mean that webmasters are forced to add more and more information surrounding their subjects, in order to make their already relevant pages (pre-Florida) 'look' more relevant post-Florida.
It's a short-sighted, snowballing effect - but never mind - we are adding pages as quickly as we can type. Still, piling all this extraneous crap onto the Internet does it no good.
This is what I do: I set up a site and review the search terms used on ALL pages for around 3 months, then tweak the terms on those pages to increase the visitors.
Never set up a page solely for search engines. Target humans first, then the engines, with little tweaks.
I can't understand why anyone would allow only the home page to be spidered.
Essex_boy makes a very good point, keep an eye on your site and then tweak it to reinforce your strengths and cut your losses.
I have often seen certain pages coming in at number 11 for a particular search term and managed to bring it up on the first page with some simple changes.
It's also a good idea to see how your page reads in the SERPs compared to the others. Although my site was coming in at number 8 for a very important search phrase, the other sites had descriptions and snippets that read like a string of keywords, whereas mine flowed naturally.
What's important is bringing visitors to your site and not only getting the site listed high in the SERPs.
|What's important is bringing visitors to your site and not only getting the site listed high in the SERPs. |
When all is said and done, this is probably the "Fundamental Theorem" of SEO. Of course visitors won't come unless you are listed relatively high in the SERPs. But the bottom line is visitors (or, to be more accurate, conversions). The SERPs are a means, not an end.
|will the crawler come back to visit soon or will it take 6 months to return? |
If you have a decent PageRank, you will see the bot come through almost daily. Usually, PR5+ pages get looked at very often, PR4 fairly often, PR3 less often, etc.
linton, don't exclude googlebot from anything. Don't get bogged down concentrating on certain magic kw's. It's better to have one hit a day on 50 different kw phrases than 50 hits a day on one kw phrase. Having many pages with lots of content is the way to accomplish that.
People who zero in on one particular kw phrase are liable to "Florida"-style eradication. (Re-stating another post in the thread.)
In addition to what Essex Boy suggests, set up some pages that are strictly human food, and don't even worry about the bot.
Put together a few articles about your topic, and forget all about Googlebot. Do not "prime the keywords" or whatever.
Why would I recommend this? There are only so many sites (5-10) that can be on the front page for the keywords you have decided you want. Odds are pretty good that you will not always be on the front page for those searches.
For example, if you are putting up a page for a hotel in widgetland and you optimize for "widgetland hotel", you will not only miss out on the searchers for "widgetland lodging", "widgetland accommodations" and the like, but you will also lose out on "widgetland hotel near the beach", "widgetland hotel restaurant", and "widgetland hotel reservations". And even when you are nowhere to be seen for your "main keywords", you will still pick up front-page placement on all sorts of unplanned searches.
All those unplanned searches are the ones that save your butt when you disappear for your main keywords.
By the way, it helps to have good deep links.
|All those unplanned searches are the ones that save your butt when you disappear for your main keywords |
Ain't that the truth!
But what I'd like to know is WHY is google filtering out my entire web site for "widgetland hotel restaurant" (not in the top 1000 when previously at #2) and NOT FILTERING for "widgetland restaurant" or "widgetland hotel"?
Sorry to get off track here, but I just wanted some opinions: is it better to abandon the EVIL KEYWORDS that are provoking what in my view is an obvious filter and focus on the more obscure searches, or to continue optimizing new pages for the main keywords in the hope that they will be combined with other useful word combinations that can deliver relevant traffic?
|My question is, is there any benefit or any disadvantage from allowing the spiders to index all of my site even though my Home Page is the only page to be primed for keywords? |
There is an advantage for PR if you allow Googlebot to index your whole site. (Otherwise there are dead ends and PR is wasted.)
Wow! (haha) I feel like I have been beaten black and blue by the heavies - even though all the suggestions are taken on board. I gotta grow up quick. I wish I could find the article that says don't let the spiders crawl until the pages are primed (I am sure it was on WebmasterWorld, but maybe not?). I guess the lesson to be learnt is loud and clear - let the spiders index nearly everything - so that's what I will do, right now. So get searching for 'widgetland hotels near the beach', guys, and watch my site get to no. 1!
(really) Thanks for helping a newbie.
> I feel like I have been beaten black & blue by the heavies
I'm pretty sure that the bad advice was either not from the "heavies" or simply a misunderstanding.