People do search for other things related to your site than just what you have on your homepage.
Is there any benefit or disadvantage to allowing the spiders to index all of my site, even though my home page is the only page primed for keywords?
Without knowing a lot more about your site I can only speculate. But in general (and yes, your mileage may vary) it is not a good idea to concentrate all of your keywords into one home page.
In most cases you will do better by having separate pages optimized for different sets of keywords (assuming that you have more than a handful of them).
For example, it is very important to have your keywords in the title. That is quite easy to do if you have two or three keywords per page, but if you have a score of keywords you cannot put them all in the title.
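For instance (a made-up illustration, borrowing the hotel theme that comes up later in this thread), a page targeting two or three related phrases might carry a title like:

  <title>Widgetland Hotel - Beachfront Lodging and Restaurant</title>

Try fitting twenty keywords into that and it stops reading like a title at all.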
The reasoning was that once a spider has visited, it would not return for a long time.
That is plain nonsense.
The only exception I know of is when a page is not ready for your users. But then you wouldn't put it online at all.
A crawler should be granted access to all pages that outside visitors get access to, too. That's the whole purpose of search engines.
If your pages aren't ready for humans to look at them, get them off of your server, and do it yesterday. If they are, then let the spiders at them.
And don't worry about the spider's schedule. You have no control over that. Monitor the content developer's schedule, because THAT'S the pipe all of your eggs have to go through before you can put them in your basket.
You're getting some advice from a very clueless source. Forget everything you heard from there -- get a lobotomy if you have to, this is critical! -- and start reading old threads in this forum.
And since you claim to be a beginner, start where many of us did: Brett's famous Successful Site in 12 Months with Google Alone [webmasterworld.com] post.
The web is a very dynamic place, and details change very fast, but general principles have a much longer shelf life. Brett's post gives an excellent mix of the two.
You've been shot down in flames a bit here mate. But don't worry - just the WW 'heavies' flexing their buttocks.
A robots.txt / meta robots instruction can be useful to exclude stuff you *really* don't want indexed (because Google indexes a lot of inappropriate stuff). For example, Google decided in 2002 that our 'Checkout Page' was the most important one on one of our sites - but anyone clicking on that link would get the message 'Your Shopping Basket is Empty'. So obviously we excluded that from the mindless (and increasingly irrational) GoogleBot ;) Since then we've excluded a few other files and folders, such as our CGI bin, etc., simply because they are of no interest to a searcher.
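For anyone who hasn't written one, a minimal robots.txt along those lines would look something like this (the paths here are made up - substitute your own):

  User-agent: *
  Disallow: /cgi-bin/
  Disallow: /checkout.html

For a single page you can get the same effect with a meta tag in that page's head, e.g. <meta name="robots" content="noindex,follow">.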
But the general advice to make all your info available to the surfer still holds. Trouble is, the recent changes in Google mean that webmasters are forced to add more and more information surrounding their subjects, in order to make their already relevant pages (pre-Florida) 'look' more relevant post-Florida.
It's a short-sighted, self-reinforcing effect - but never mind - we are adding pages as quickly as we can type. It does the Internet no good, though, to have all this extraneous crap piled onto it.
Never set up a page solely for search engines: target humans first, then tweak for the engines.
I can't understand why anyone would allow only the home page to be spidered.
I have often seen a page coming in at number 11 for a particular search term and managed to bring it up onto the first page with some simple changes.
It's also a good idea to see how your page reads in the SERPs compared to the others. Although my site was coming in at number 8 for a very important search phrase, the other sites had descriptions and snippets that read like a string of keywords, whereas mine flowed naturally.
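If your snippet reads badly, one place to start is the meta description, since engines will sometimes draw on it for the snippet (the wording here is just an invented example):

  <meta name="description" content="Family-run hotel in Widgetland, two minutes from the beach, with a restaurant and online reservations.">

No guarantee Google will show it, but when it does, a natural sentence beats a keyword string.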
What's important is bringing visitors to your site and not only getting the site listed high in the SERPs.
When all is said and done, this is probably the "Fundamental Theorem" of SEO. Of course visitors won't come unless you are listed relatively high in the SERPs. But the bottom line is visitors (or, to be more accurate, conversions). The SERPs are a means, not an end.
Will the crawler come back to visit soon, or will it take six months to return?
If you have decent PageRank, you will see the bot come through almost daily. Usually, PR5+ pages get looked at very often, PR4 fairly often, PR3 less often, and so on.
linton, don't exclude googlebot from anything. Don't get bogged down concentrating on certain magic kw's. It's better to have one hit a day on 50 different kw phrases than 50 hits a day on one kw phrase. Having many pages with lots of content is the way to accomplish that.
People who zero in on one particular kw phrase are liable to "Florida"-style eradication. (Re-stating another post in this thread.)
Put together a few articles about your topic, and forget all about Googlebot. Do not "prime the keywords" or whatever.
Why would I recommend this? There are only so many sites (5-10) that can be on that front page for the keywords that you decided that you want. Odds are pretty good that you will not always be on the front page for those searches.
For example, if you are putting up a page for a hotel in widgetland and you optimize for "widgetland hotel", you will not only miss out on the searchers for "widgetland lodging", "widgetland accommodations" and the like, but you will also lose out on "widgetland hotel near the beach", "widgetland hotel restaurant", and "widgetland hotel reservations". And even when you are nowhere to be seen on your "main keywords", you will still pick up front-page spots on all sorts of unplanned searches.
All those unplanned searches are the ones that save your butt when you disappear for your main keywords.
Ain't that the truth!
But what I'd like to know is WHY Google is filtering out my entire web site for "widgetland hotel restaurant" (not in the top 1000, when it was previously at #2) and NOT FILTERING for "widgetland restaurant" or "widgetland hotel"?
Sorry to get off track here, but I just wanted some opinions: is it better to abandon the EVIL KEYWORDS, which are provoking what is in my view an obvious filter, and focus on the more obscure searches, or to keep optimizing new pages for the main keywords in the hope that they will combine with other useful words into phrases that deliver relevant traffic?
My question is: is there any benefit or disadvantage to allowing the spiders to index all of my site, even though my home page is the only page primed for keywords?
There is an advantage for PR if you allow Googlebot to index your whole site. (Otherwise there are dead ends and PR is wasted.)
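To see why, recall the original PageRank formula (as published by Brin and Page; what Google actually does these days is anyone's guess):

  PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

where d is the damping factor (usually quoted as 0.85), T1...Tn are the pages linking to A, and C(Ti) is the number of outgoing links on Ti. Every internal link hands a share of PR to its target; if the target is blocked from the index, that share can't circulate back through your site, so it's effectively thrown away.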