Forum Moderators: phranque


How to craft server error codes to optimize for Google

Determine use of server error codes to avoid pitfalls


bendecko

12:14 pm on Jan 19, 2006 (gmt 0)

10+ Year Member



How would you seasoned SEOs handle this little funball...

Our site is dynamically generated from a database so users can add and remove Listings.

The majority of pages are actually formed from a single physical page script and a parameter

E.g. /propertysearch.asp?townCity=London or /propertysearch.asp?location=London+Bridge

Google appears to be caching a lot of outdated pages - probably because even though the page is effectively 'wrong' it is still returning a 200 status code. E.g. /propertysearch.asp?organisation=Company+That+Has+Gone+Bust still appears in the SERPs.

It IS possible to return a different server status code depending on the parameter

E.g. /propertysearch.asp will return 200
/propertysearch.asp?location=London+Bridge will return 200
/propertysearch.asp?organisation=Company+That+Has+Gone+Bust could return 404
/propertysearch.asp?organisation=Typed+In+Wong+by+User could return 500
/propertysearch.asp?outdatedparameter=value could return 301 (moved to subsidiary site)

Can Google differentiate between e.g. a 200 response on /propertysearch.asp?county=Essex and a 404 response on /propertysearch.asp?propertyID=244, or does it just see various responses from a single page, /propertysearch.asp?

That is the major concern... Let me summarise in one line:

If various parameters have different effects, does Google see the different codes as coming from different 'pages', or just from the single /propertysearch.asp (with no parameters) page?

What is the recommended best practice here?

Thanks

Bendecko

Demaestro

5:11 pm on Jan 23, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The best way to deal with this is to use a rewrite rule that rewrites the URL so that Google thinks it is looking at a dedicated page. That means removing the query string; I have had to do this in a few places. The rule takes the URL

www.foobar.com/dynamic_page?this_page=houses

and changes it to

www.foobar.com/dynamic_page/this_page/houses

Really the URL hasn't changed and it still requests the dynamic_page. You can then teach the dynamic_page to parse off the 'directories' and use them as variables. From there you should be able to raise whichever response code you wish.
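A sketch of that kind of rule in Apache mod_rewrite syntax, using Demaestro's example names (the original poster's ASP site presumably runs on IIS, where an ISAPI rewrite filter such as ISAPI_Rewrite offers a similar rule format):

```apache
# Map the 'directory' style URL back onto the real dynamic page,
# passing the path segment along as a query string internally.
RewriteEngine On
RewriteRule ^dynamic_page/this_page/([^/]+)$ /dynamic_page?this_page=$1 [L,QSA]
```

The visitor and the bot only ever see /dynamic_page/this_page/houses; the script still receives this_page=houses as before.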

Just remember that when you return a 404 Not Found you are encouraging the bot to try again. For content that is gone for good you want to return a 410 Gone (or a 301 Moved Permanently if it has a new home) so that bots will stop trying after some time.

mattglet

3:08 pm on Jan 26, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Your current setup will work just fine, you don't NEED to rewrite anything.

To answer your question, Google (and other SE's) see different URLs as different pages. So, page.asp is different than page.asp?city=whatever, is different than page.asp?property=foo, etc. If the URL is different, the page is different.

You need to code the site so that when a bad querystring parameter is passed, it sends the correct response status to the crawler (Response.Status in classic ASP). If you do it correctly, you will be good to go. It's not that hard; just look into the Response.Status property and have at it.
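For the record, a minimal classic-ASP sketch of the idea; the parameter name matches the original poster's URLs, but the OrganisationExists lookup is a hypothetical helper standing in for the real database check:

```asp
<%
' Sketch only: decide the status code before writing any body content.
Dim org
org = Request.QueryString("organisation")

If Len(org) > 0 And Not OrganisationExists(org) Then
    ' OrganisationExists() is a made-up helper for this example.
    ' Listing removed for good: 410 tells bots to stop retrying.
    Response.Status = "410 Gone"
    Response.Write "This listing is no longer available."
    Response.End
End If

' ...otherwise fall through and render the normal 200 page...
%>
```

Note that Response.Status takes the full status line as a string ("404 Not Found", "410 Gone", etc.), and it must be set before the response body is flushed.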