Forum Moderators: Robert Charlton & goodroi
www.mydomain.com/
Similar pages
In addition, my Yahoo listing disappeared as well. I then did a Yahoo search for pages with my domain included and found most of my interior pages indexed but not my home page.
What happened? Is it possible my site was not ready to be crawled when the Googlebot and Slurp robots visited my site - simultaneously?
This is a very "white hat" site - no tricks at all, just good content...
I went and manually requested my site be spidered on both Google and Yahoo, and sent an email to Yahoo requesting any explanation as well.
Is there anything else I can do? Any ideas of why this happened?
Searching for (www.mysite.com -site:www.mysite.com) on Google and looking for links with all-small-caps titles that use our descriptions, I have found that all of them are from for-sale domains registered mostly through www.directNIC.com, www.domaincontender.com and some others.
Is there any sense in contacting these companies?
Thanks a lot!
If you look at it, you have NEVER seen Google in such a shape before, and it all really kicked off when they added all those fake sites.
A solution? Let's face it, GoogleGuy or others won't reply to this post because it is a HUGE problem and could affect their stock. The only way we can (fix) this is by posting the problem to other internet/business news sites; if that has no effect, there are the papers. We have such a big bunch of creative webmasters here that there should be no problem, and everyone has the right to their comments; it's up to the news outlets whether they want to publish.
I had about 7 redirect links show up in my allinurl command last Nov when we were discussing this matter. I wrote Google about it using the special email that was provided especially for this problem, and also contacted those offenders who had email addresses. About 1/2 of those links are no longer effective (server could not be found error), the tracker2 redirects have totally disappeared, and the others are directories with a link to one of my sites. So while all but the tracker2 links still appear in my allinurl command, 1/2 of them no longer work and there are no new links there either. So something is changing.
PS. Does anyone know if that special email to Google still works? I've forgotten what the subject line was supposed to say.
"PS. Does anyone know if that special email to google still works. I've forgotten what the subject line was suppoed to say."
What it does is cause the user's browser to loop, thus making it a problem to click on the redirect link.
I don't know how googlebot would respond to the script, my guess is that it simply ignores it and continues indexing the content as if it were the pagejacker's but at least it makes the pagejacker's site a real nuisance.
What I am wondering about is how Googlebot would interpret the script and if it can be guided to another page (which could be excluded by a robots.txt file) full of spam, perhaps excessive repetition of a keyword or something similar. This way the hijacker's site would be penalized.
Think it might work?
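The looping idea above can be sketched roughly like this. This is only an illustration, not the script from the thread: the function name, the host string "www.mysite.com", and the example referrer URLs are my own placeholders. The gist is that a visitor who arrives via a foreign referrer (i.e. through the hijacker's redirect) gets bounced straight back, so clicking the hijacked link becomes a nuisance.

```javascript
// Hypothetical sketch of the "make the redirect loop" idea.
// Decide whether a visitor arrived through someone else's redirect:
function shouldBounce(referrer, ownHost) {
  // No referrer at all: a direct visit, or a bot that sends none - leave it alone.
  if (!referrer || referrer === "") return false;
  // Referrer from our own site is fine; anything else gets bounced back.
  return referrer.indexOf(ownHost) === -1;
}

// In a real page this would be wired up roughly like:
//   if (shouldBounce(document.referrer, "www.mysite.com")) {
//     location.replace(document.referrer); // send the browser back -> loop
//   }

console.log(shouldBounce("http://tracker2.example.com/redirect", "www.mysite.com")); // true
console.log(shouldBounce("http://www.mysite.com/page.html", "www.mysite.com"));      // false
console.log(shouldBounce("", "www.mysite.com"));                                     // false
```

Note that a bot fetching the URL directly usually sends no referrer, so (as guessed above) it would sail past the check and index the content as usual; only human clicks through the hijacker's link get trapped.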
In order to do this, you check for the URL of the page the redirect is on, right? Googlebot wouldn't ever notice that script; see msg #54 of this thread [webmasterworld.com].
If not, please post a few more details.
>> claus, how is yours coming along?
See the link above; so far it's only there. I know it's a pain to link to a specific forum post deep inside a thread, but it seemed like the best place to post it, as we had a lot of those hijack threads already and I didn't want to start yet another.
(I'll sticky you on the progress, if any - I'm a busy man, you know, not a lot of time for creating content for myself *lol* ADDED: That post does need a little editing for more clarity, and the last 2 points were perhaps a little too simplified.)
[edited by: claus at 10:32 pm (utc) on Mar. 10, 2005]
In order to do this, you check for the URL of the page the redirect is on, right?
Actually it checks two things: the length of the script (over 100 characters), because this 'network' of hijackers all use the same formula, and the referrer.
if (document.referrer && document.referrer != "")
if (poshost + lunref > 100)
Still, making the redirect unusable for real users might accomplish something by itself, but only if the publisher of the redirecting site cares if his/her links work for users or not.
The script is easy to place in any HTML, ASP or PHP file; the question is whether or not it's possible to penalize the hijacker without doing damage to your own site. That's why I suggested creating a "spam" page just for cases where a hijacker's formula can easily be identified. The robots.txt file should, in theory, keep the bots from indexing the spam.
What we need now is someone who can enlighten us as to how Googlebot would react to the script, would it consider the new redirect part of the hijacker's page or not? My guess is it would, especially if you use the same redirect script as they are using. If it works for them it will work for you.
Just to be extra careful you could put the spam page in another folder with a robots.txt telling the bots not to index the page.
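A minimal robots.txt along those lines might look like the following; the folder name is a placeholder, and the file has to sit at the site root since robots.txt disallows by path prefix:

```
User-agent: *
Disallow: /spam-folder/
```

Keeping the page in its own folder and disallowing the whole folder is the usual pattern; bear in mind, though, that robots.txt only asks well-behaved crawlers not to fetch the page - it is a request, not an enforcement mechanism.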