Forum Moderators: open
For about 8 months I have not been listed in Google. For a while the Google toolbar used to show a completely white bar for my site, and now it is grey. I know I have some links pointing to my site and they are still there.
I have been using AdWords and still do.
I have read all the rules on this forum and am 100% sure that I have not done any spamming or anything else that would justify a ban. My site is a legitimate business with no wrongdoing. I do have a Flash intro, but so do many other sites that are listed in Google, and I have links outside of the Flash to other pages on my site. It just doesn't seem right. Surely Google knows my site exists; after all, they take over $40 a month off me for the click-throughs on AdWords.
What is going on?
I have decided to change my domain name, which is going to be costly after all the marketing I have done, and now I am worried about something else. If my site has been blacklisted for some reason (and I can't understand why it would have been), what is going to happen if I put an automatic redirect from my old site to the new domain? (I don't want to lose customers because they can't find me.) Will Google pick this up and suspiciously ban my new site as well?
Is it my domainname.com that could have been banned, or is it just the IP address supplied by my web host? Would it help to change web host and get a new IP?
Many thanks
Richi
When I tried to check for your robots.txt I ended up on a custom error page ... maybe Googlebot thought that you were trying to cloak and didn't like it ...
Just a thought ... you need a 'senior' guy who may have experience with that, or try the WW search ..
Good luck
Leo
Upload a valid robots.txt file, or at least a blank one. You can validate your robots.txt file here [searchengineworld.com].
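If you have nothing you need to block, a minimal permissive robots.txt is only two lines (an empty Disallow means "allow everything"):

```
User-agent: *
Disallow:
```

Save it as plain text named exactly robots.txt and put it in the root of the site, so it is reachable at /robots.txt.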
Jim
However, it is only this site that has the problem. robots.txt files don't come as standard with hosting packages, and I am sure that only experienced users know about them, so how can this be the problem?
I am going to take your advice and get one uploaded very soon; I just can't see why this hasn't affected my other sites.
Many thanks
Richi
Make sure that when your custom error page is returned, the status code is the right type (404) and not 200 or 302.
This could be an issue if your custom error page is served when the robots.txt file is requested.
If you're using the ErrorDocument directive, make sure the custom error file is mentioned with a local URL and not an absolute one.
ErrorDocument 404 /error404.html -> OK, returns a 404 header
ErrorDocument 404 [domain.tld...] -> WRONG, as the absolute URL makes the server redirect and return a 200 or 302 header
Dan
What server response code is returned with the error page when you request robots.txt? (Use the server header checker [webmasterworld.com].) If it returns a 301 or a 302 along with the custom 404 page, that would be very bad: it would be interpreted not as a missing robots.txt, but as an invalid one.
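If you would rather script the check than use the online tool, here is a rough Python sketch of the same idea (the domain in the commented example is a placeholder, not the poster's site). The point is that a plain 404 for a missing robots.txt is harmless, while a redirect or an error page served with a 200 is what confuses a crawler:

```python
import urllib.error
import urllib.request


def robots_status(url):
    """Return the HTTP status code for a robots.txt request,
    without following redirects (so a 301/302 is reported, not hidden)."""

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # refuse to follow; the 301/302 surfaces as an error

    opener = urllib.request.build_opener(NoRedirect())
    try:
        return opener.open(url).getcode()
    except urllib.error.HTTPError as e:
        return e.code


def interpret(code):
    """Classify the status code the way a search engine crawler would."""
    if code == 404:
        return "fine: no robots.txt, crawler assumes everything is allowed"
    if code == 200:
        return "check the body: it must be a real robots.txt, not an error page"
    if code in (301, 302):
        return "bad: a redirect instead of a plain 404 looks like an invalid robots.txt"
    return "unexpected status %d" % code


# Example usage (placeholder domain):
# print(interpret(robots_status("http://www.example.com/robots.txt")))
```

The redirect handler is the important part: a normal fetch silently follows a 302 and shows you the final page, hiding exactly the problem you are trying to detect.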
Jim
I have 3 sites and work on another 4, and none of them has a robots.txt file.
robots.txt files don't come as standard with hosting packages
robots.txt has nothing to do with hosting or servers; it is for search engines. It's the first file they request when they arrive at your site. It tells them which folders to index and which not (if any), and it can also allow certain robots to crawl (genuine ones like Googlebot) while blocking others (such as email-harvesting agents; sorry, not sure that's the right term).
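For example, a robots.txt that welcomes everyone but keeps crawlers out of one folder, and blocks a particular bot entirely, looks like this (the bot name and folder are illustrative, not from this thread):

```
# Block one specific user-agent completely (name is just an example)
User-agent: EmailCollector
Disallow: /

# Everyone else may crawl everything except /private/
User-agent: *
Disallow: /private/
```

Note that rules rely on the robot honouring them; genuine crawlers like Googlebot do, but the 'fake' agents usually ignore robots.txt altogether.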
You can see whether your other sites return a custom 404 error for robots.txt by typing their URL followed by /robots.txt, e.g. www.myotherdomain.com/robots.txt
Leo