| 8:34 pm on Feb 4, 2007 (gmt 0)|
What determines the disallow?
Remember, Googlebot doesn't identify itself as Internet Explorer.
It identifies as:
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
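A minimal sketch of what that means in practice: any server-side logic keyed off the User-Agent header sees the string above, not an IE string. The helper name below is illustrative, not from the thread.

```python
def is_googlebot(user_agent: str) -> bool:
    """Return True if the User-Agent string claims to be Googlebot."""
    # Googlebot's UA contains the token "Googlebot"; an IE UA contains "MSIE".
    return "Googlebot" in user_agent

googlebot_ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
ie6_ua = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"

print(is_googlebot(googlebot_ua))  # True
print(is_googlebot(ie6_ua))        # False
```

So a site that only serves its real content when it sees "MSIE" in the User-Agent will never show that content to the spider.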
| 8:52 pm on Feb 4, 2007 (gmt 0)|
Personally, I would prefer that sites that don't support all browsers are kept out of the SERPs.
It isn't that your customer doesn't have the money to support multiple browsers. Your customer spent too much money trying to be fancy, and didn't want to spend any more to fix the mistake. Building a compatible site is easier and cheaper than building a browser-specific one.
| 10:20 pm on Feb 4, 2007 (gmt 0)|
Agreed, Google has to serve up results for all browsers, not just IE. HTML/XML/CSS are the basic nuts and bolts of any site!
IF YOU CANNOT EVEN GET THE BASICS RIGHT, THEN YOU HAVE NO ROOM TO COMPLAIN WHEN GOOGLE IS NOT INDEXING THE SITE!
| 12:16 am on Feb 5, 2007 (gmt 0)|
Check robots.txt for a possible culprit.
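One way to run that check yourself, sketched with Python's standard `urllib.robotparser`. The rules below are made up for illustration, not taken from any real site; a `Disallow` under a `User-agent: Googlebot` section like this would keep the spider out.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents: Googlebot is barred from /private/,
# everyone else is allowed everywhere.
rules = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "http://example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "http://example.com/index.html"))         # True
```

In the real case you would point the parser at the live file with `parser.set_url(".../robots.txt")` and `parser.read()` instead of parsing a string.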
| 3:09 am on Feb 5, 2007 (gmt 0)|
Nothing personal, but as one of the millions of Firefox users I can only say: haha
| 3:20 am on Feb 5, 2007 (gmt 0)|
The trouble with an IE-only site is that search engine bots are not running IE - so the very defects that make your site inoperable in Firefox could well be the same defects that make it unindexable.
|due to the poor finance he could not make a site multiple browser compatible |
A real false economy, as you can see. Even if it were true that a site that works well across browsers costs more than an IE-only site (and that is unproven at best), your site also needs to work for the spiders - which are just another kind of "browser" or user agent - in order to succeed. That fact alone should help your client reconsider his priorities.