Unless you have explicitly barred spiders from your site, the answer is normally yes. However, "can spiders crawl your site properly?" is probably the next question you need to ask...
That will depend on how it was created - some design techniques and construction methods are more suited to being accessed by spiders than others.
For example, regular text-heavy HTML can be *very* spider-friendly if used correctly; equally, a badly thought-out HTML structure can be extremely spider-unfriendly. Until recently, Flash was another extremely spider-unfriendly technology. The key thing to remember is that spiders are looking for text content within a page - if a spider can get to that content easily and you have put some thought into it, you will often be surprised how well you can do...
As I said a moment ago, text content is currently king because of the ease with which search engines can access it, so your goal in creating spider-friendly pages should really be to put the best text content you can into them, nicely formatted and laid out so that users can make the best use of it.
While you are at it, you might also want to get those pages as close to perfect in terms of markup validation as you can - again, something that will generally benefit your users.
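To make that a little more concrete, here is a minimal sketch of the kind of plain, text-focused markup spiders cope with easily - the page subject and wording are invented purely for illustration:

  <!DOCTYPE html>
  <html lang="en">
  <head>
    <meta charset="utf-8">
    <title>Handmade Oak Furniture - Example Workshop</title>
    <meta name="description" content="A short, plain-text summary of what the page offers.">
  </head>
  <body>
    <h1>Handmade Oak Furniture</h1>
    <p>Ordinary paragraphs of descriptive text are exactly what a spider is looking for.</p>
    <h2>How our tables are made</h2>
    <p>More plain text, organised under meaningful headings, keeps both users and spiders happy.</p>
  </body>
  </html>

Nothing clever is going on there - it is just well-formed text under sensible headings, which is the whole point.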
This has been very user-focused so far, hasn't it?
There's a reason for that - search engines exist to crawl websites designed for *people* rather than purely for search engines, so a document that works well for a user is something a search engine spider must be able to understand if its search engine is to have even the slightest chance of bringing it into the limelight.
As long as you don't try anything too bizarre, spiders really do try their best to crawl pages, because that's their objective in life - to crawl as many pages as they can handle so that their search engine can provide the best results it can.
I hope some of that has been helpful.
Welcome to WebmasterWorld [webmasterworld.com]!
Also, you may want to try a site search (link at top of screen) here on WebmasterWorld for robots and spider-related topics.
The only reason your site - or any site - wouldn't accept spiders would be if you either exclude them with a "Disallow" rule in your robots.txt, or you have placed access restrictions in your .htaccess file.
[baremetal.com...] (couldn't find a reference in WebmasterWorld Glossary)
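For anyone who hasn't met those two files before, here is roughly what each looks like - the directory path and bot name are just placeholders, and the .htaccess lines assume an Apache server with mod_rewrite enabled:

robots.txt (a polite request that compliant spiders stay out of part, or all, of the site):

  User-agent: *
  Disallow: /private/

.htaccess (actively refuses requests, for example by matching a crawler's User-Agent):

  RewriteEngine On
  RewriteCond %{HTTP_USER_AGENT} BadBot [NC]
  RewriteRule .* - [F,L]

Note that robots.txt only works on spiders that choose to obey it, whereas the .htaccess rules are enforced by the server itself.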