|Spidered with an error for my title on AltaVista|
How can I fix the bad listing on AltaVista?
I just did a search on AltaVista for my company name, because I've been put in charge of our SEO.
The first result was our domain, but it had the following:
"Sorry! Your Browser is unsupported! The new <snip> website requires features of the latest versions of Internet Explorer, Netscape or Mozilla. Download the latest version of Internet Explorer: ..."
We've got people who supposedly QA'd the site (relaunched last Thursday) with all types of browsers. How did the spider end up with that as the title and description?
[edited by: Marcia at 6:23 am (utc) on April 3, 2003]
[edit reason] No specific references, please. Thanks! [/edit]
Or perhaps the browser detection is done server-side, and it did not recognize AltaVista's user-agent. (Try WannaBrowser to check this.)
Either way, the logic of your browser sniffing should have a useful default that will work with almost any browser. This would take care of "unsupported" older browsers as well as robots.
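To make the idea concrete, here's a rough Python sketch of that "useful default" - purely illustrative, and the function name and token list are my own assumptions, not anyone's actual code:

```python
# Illustrative browser sniff with a safe default (tokens are assumptions).
SUPPORTED_TOKENS = ("MSIE", "Netscape", "Gecko")

def pick_experience(user_agent):
    """Serve 'full' to recognized modern browsers, and a plain but working
    'basic' page to everything else: old browsers, text browsers, robots."""
    if user_agent and any(t in user_agent for t in SUPPORTED_TOKENS):
        return "full"
    return "basic"  # the useful default - never an "unsupported" error page
```

The point is the last line: an unrecognized user-agent (including AltaVista's spider) falls through to a working page instead of an error.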
I believe a Web site should offer a minimum of basic functionality to all browsers, including text-only browsers and even the screen readers used by the blind. It doesn't have to be pretty or perfect, but it should work.
Besides, the very next site listed in the search engine results works just fine... Sure, it's a little ugly in my old browser, but at least it doesn't kick me out...
(See it's not just a rant, there IS an SEO connection) ;)
According to one of our tech guys, if they are given the User-Agent strings for the indexers used by the various search engines, we can allow them as acceptable browsers.
Does anybody know this information?
Yes, but the problem is that these user-agent strings can and do change over time. The minute one changes unexpectedly, your site gets dropped or indexed as "Your browser is unsupported..."
Again, I strongly suggest that what is needed is a change in philosophy. Allow all non-malicious user-agents, give the modern browsers the full experience, and provide a downgraded-but-useful experience to all other user-agents.
Over time, the maintenance costs and problems associated with your current approach are going to hurt you - badly.
I know, because I have tried this approach in the past. The list of "allowable" user agents quickly grows out of hand, errors can get you dropped from important search engines and wreck your site traffic for several months, and you alienate users who have no option to upgrade their browsers. This is throwing away a large portion of your site's potential.
Even if the tech guys insist on this approach to support user-agent redirection and/or cloaking, there are much better ways of implementing it, with much lower risks of disaster.
Now, your answer... AltaVista's user-agent string will contain "Scooter" - unless Overture makes them change it soon.
You make a lot of sense, and I shared it with the guys here - they are insistent on going the hard route.
They say that the new site is very heavy on CSS and DHTML, and consider the percentage of unsupported browsers to be negligible. Ugh - the classic clash of tech vs. marketing.
They said they'd rather parse a page of spiders on a regular basis to make sure we have the most up-to-date list. I haven't been able to find such a list - any ideas?
"They" are making the wrong decision- good luck!
If you want that spider list, you may as well harvest your own logs first...
I was thinking it would make sense to regularly harvest our own logs, but how to tell the difference between good spiders and bad spiders?
Some tips to be found in another WW thread:
Search Engine Spider Identification [webmasterworld.com]
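One rough heuristic for a first pass over your own logs: well-behaved spiders request /robots.txt, which ordinary browsers almost never do. A hedged Python sketch, assuming Apache combined log format (the regex and function name are my own, and this won't catch bad spiders that skip robots.txt):

```python
import re

# Matches Apache combined-format log lines and captures the request path
# and the User-Agent field. An assumption - adjust to your own log format.
LOG_LINE = re.compile(
    r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|HEAD) (\S+)[^"]*" \d+ \S+ "[^"]*" "([^"]*)"'
)

def spider_agents(log_lines):
    """Collect the user-agent of every client that fetched /robots.txt."""
    agents = set()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m and m.group(2) == "/robots.txt":
            agents.add(m.group(3))
    return agents
```

Run that over a week of logs and you have a starting list to compare against the published ones.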
|They say that the new site is very heavy on CSS and DHTML |
Stuff like this makes me cry.
Your tech guys should be fired.
Good luck when you need them, because your hands are tied.
This makes me angry and I don't even know you :)
Your boss might as well have said, "We have break-ins and we need your help," but the maintenance department refuses to use anything but beaded curtains.
Also, I doubt anyone else in the entire universe does it this way. OK, maybe a couple - but they are doing it wrong - way way way way wrong.
I will go now to get a drink and calm down.
Here's a relatively thorough user-agent list [icehousedesigns.com]. There are many such lists, and I'd suggest using several to get the best coverage.
If your tech guys are determined not to serve a default page for unidentified browsers, then you would be better off identifying search engine bots by IP rather than by user-agent. UAs can change from crawl to crawl; IPs rarely do.
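One way to do the IP check without hand-maintaining a raw IP list is reverse DNS plus a forward-confirming lookup. A Python sketch - the hostname suffixes below are assumptions on my part, so confirm them against each engine's own documentation:

```python
import socket

# Assumed crawler hostname suffixes - verify against each engine's docs.
BOT_DOMAINS = (".av.com", ".googlebot.com")

def looks_like_bot_host(hostname):
    """Pure check on a reverse-DNS name; this part is easy to unit-test."""
    return hostname.endswith(BOT_DOMAINS)

def is_verified_bot(ip):
    """Reverse-DNS the IP, check the domain, then forward-confirm that the
    name maps back to the same IP (so nobody can fake the PTR record)."""
    try:
        host = socket.gethostbyaddr(ip)[0]
        return looks_like_bot_host(host) and ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        return False
```

The forward-confirmation step matters: anyone can set a reverse-DNS record claiming to be a crawler, but they can't make the crawler's real domain resolve back to their IP.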