As a matter of fact, I would be embarrassed if I was (G). This isn't a small glitch; it has been going on for more than 6 months now, and it should not be a hard problem to fix if they wanted to. At least in 6 months they should have fixed it. That should reflect on their IPO value, oh, I'd say about 6 months later as investors start seeing what is going on. DUH!
You denied the images folder and your rankings dropped in the SERPs? Did you test this? I often modify clients' robots.txt files to exclude the images folder from the spiders, to keep hijackers off the site as much as possible. Plus, I have even seen issues with slightly corrupted image files causing the search spiders to freeze.
Well, that's the way it looks. No, I did not test it, because (G) takes too long to index pages, blah, blah, blah, and since I'm not an SEO, I have too many other things on my plate to figure it out and remember verbatim what I did. With (G) 'playing' with their software, can anything really be certain, with or without testing? Something that tests true one week may not test true the next. (I wonder if they do that on purpose to confuse SEOs? ;-)
However, that was one of about 5 things I changed after the infamous FL update, none of them anything major. It took me from the bottom of the 2nd page to the top for a specific 2-KW search, one word being a common word, six. Of course it could also have been DMOZ updates, etc., etc., etc. I would be at the top of the 1st page if it weren't for Amazon taking 2 places and a portal that link-loaded DMOZ to get on top having 2 pages. That is 2 results of the 10 total on the default page that are the same domains, or 1/5 of the SERPs on (G) for those KWs. Remember when you would get 100 pages from the same domain on a search? It looks like that. (I think (G) came full circle.)
It may be moot anyway, because I get several visitors a day who show up via the image search. Maybe some end users find their SERPs are better if they search images? So if I am getting traffic from the images, then I want some of them there.
>>to exclude the images folder from the spiders to keep hi-jackers off the site as much as possible
Yea, that was my way of thinking as well. But it may be just a waste of time in the long run. I used to try to keep the images out, but I have found that putting our URL in the images, using a gray see-thru font, works the best. Locks ONLY keep honest people out, and it seems to be a human trait that if humans can get away with something, they will. It is indeed sad.
If they want to invest the hours it would take to get our URL out of a line or bar chart without messing it up, well, then there isn't anything I'm going to do on the server that will stop them.
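For anyone wanting to try the same trick, here is a minimal sketch of stamping a site URL onto an image as a translucent gray overlay, using the Pillow library. The function name, coordinates, and opacity value are my own choices, not anything from the thread; swap in a real TrueType font and position for production use.

```python
# Sketch: stamp a translucent gray URL onto an image (assumes Pillow).
from PIL import Image, ImageDraw

def watermark(img: Image.Image, text: str, opacity: int = 80) -> Image.Image:
    # Work in RGBA so the overlay's alpha channel blends with the image.
    base = img.convert("RGBA")
    overlay = Image.new("RGBA", base.size, (0, 0, 0, 0))
    draw = ImageDraw.Draw(overlay)
    # Default bitmap font; use ImageFont.truetype(...) for anything real.
    # Gray fill with low alpha = the "gray see-thru font" effect.
    draw.text((10, base.size[1] - 20), text, fill=(128, 128, 128, opacity))
    return Image.alpha_composite(base, overlay).convert("RGB")
```

Usage would be something like `watermark(Image.open("chart.png"), "www.example.com").save("chart-marked.png")` as part of the image-publishing step.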
In theory. However, based on what I have seen with (G) since the FL update, I personally would NOT make such a bold statement.
In theory, a 301 redirect should not have a 'negative effect on the ranking of your pages', but it did and still does, because of either a recurring bug, or because they keep going back to old code, or something else.
I posted about my 301 problems with (G) in several messages here. Theory and practical application do NOT always go hand in hand.
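For anyone following along, the 301s being discussed are the usual server-side permanent redirects; on Apache with mod_alias that is a one-liner in .htaccess (the paths and domain below are placeholders, not from this thread):

```apache
# Permanent (301) redirect from an old URL to its replacement.
Redirect 301 /old-page.html http://www.example.com/new-page.html
```

The complaint above is not about how to issue the 301, but about how (G) treats the redirected pages afterward.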
If there are certain images you want to protect, you can always ban bots from only those images, while allowing access to the rest.
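To make that concrete: assuming a hypothetical directory layout, a robots.txt like this keeps Google's image spider out of one folder while leaving the rest of the site, and the rest of the images, crawlable:

```
User-agent: Googlebot-Image
Disallow: /images/private/

User-agent: *
Disallow:
```

Note this only targets the image bot; the regular crawler is unaffected, and robots.txt is advisory, so it only stops well-behaved spiders, not hijackers.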
I have blocked the imagebot for the last two years and enjoy healthy rankings.
GG has stated in the past that blocking the imagebot does not have any impact at all.
I just can't see any logical reason why they would punish sites for not wanting to be included in the image search.