Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Lost in Google


rshandy

6:39 pm on Feb 27, 2005 (gmt 0)

10+ Year Member



My site has been on the first page of the SERPs for many years. Just a few weeks ago, my listing went from a title and description display to just this:

www.mydomain.com/
Similar pages

In addition, my Yahoo listing disappeared as well. I then did a Yahoo search for pages with my domain included and found most of my interior pages indexed but not my home page.

What happened? Is it possible my site was not ready to be crawled when the Googlebot and Slurp robots visited my site - simultaneously?

This is a very "white hat" site - no tricks at all, just good content...

I went and manually requested my site be spidered on both Google and Yahoo, and sent an email to Yahoo requesting an explanation as well.

Is there anything else I can do? Any ideas of why this happened?

Emmett

8:10 am on Mar 10, 2005 (gmt 0)

10+ Year Member



The best way I've found to find a 302 to your URL is to search for some unique text from your page in quotes. The offending page will show up, and you can view the Google cache of the page to see if it matches yours.

arras

10:58 am on Mar 10, 2005 (gmt 0)



"My conjecture is that G knows (and cares) about this issue, but is trying to find an algo type solution to it. Expect great fanfare and praise when the problem is finally solved"
That's probably why the small post-Allegra update is bringing back, on most DCs, the pages lost on February 5?

milivoye

11:49 am on Mar 10, 2005 (gmt 0)

10+ Year Member



Hello guys!
Some of my company's sites are affected by this problem (as far as I understand it from your posts).

Searching for (www.mysite.com -site:www.mysite.com) on Google and looking for the all-small-caps-in-the-title links listed using our descriptions, I have found that all of them are from for-sale domains registered mostly by www.directNIC.com, www.domaincontender.com and some others.

Is there any sense in contacting these companies?

Thanks a lot!

zeus

11:55 am on Mar 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think you can see in the SERPs how big a problem this hijacking is. If you have watched over the last 6 months, you see more omitted results, which could be because there are now so many duplicated results from the googlejacking, plus supplemental results and URL-only listings - all of which is in a way related to this hijacking/redirecting problem, or could be; I'm not 100% sure.

If you look at it, you have NEVER seen Google in such a shape before, and it all really kicked off when they added all those fake sites.

A solution? We have to face it: GoogleGuy or others will not reply to this post, because it is a HUGE problem and could have effects on their stock. The only way we can (fix) this is by posting the problem to other internet/business news sites; if that has no effect, there are the papers. We have such a big bunch of creative webmasters here that there should be no problem, and everyone has a right to their comments - it's up to the news outlets whether they want to publish.

diddlydazz

12:05 pm on Mar 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



posting the problem to other internet/business News sites

We will run this on one of our tech related sites (we already have an article about Allegra and the MIAs).

If anyone has something written or already live sticky me.

(claus, how is yours coming along?)

Dazz

Lorel

4:19 pm on Mar 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hi All,

I had about 7 redirect links show up in my allinurl command last Nov when we were discussing this matter. I wrote Google about it using the special email address that was provided especially for this problem, and also contacted those offenders who had email addresses. About half of those links are no longer effective ("server could not be found" error), the tracker2 redirects have totally disappeared, and the others are directories with a link to one of my sites. So while all but the tracker2 links still appear in my allinurl command, half of them no longer work, and there are no new links there either. So something is changing.

PS. Does anyone know if that special email to Google still works? I've forgotten what the subject line was supposed to say.

crobb305

7:04 pm on Mar 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Diddlydazz,

I am working on something as well... I will sticky you when we are done.

[edited by: crobb305 at 7:27 pm (utc) on Mar. 10, 2005]

walkman

7:21 pm on Mar 10, 2005 (gmt 0)



Save yourself the trouble. It will not make a difference; many of us sent it months ago.

"PS. Does anyone know if that special email to google still works. I've forgotten what the subject line was supposed to say."

zeus

7:25 pm on Mar 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Drop an email to webmaster [at] google.com with the keyword "canonicalpage"

Bobby

8:50 pm on Mar 10, 2005 (gmt 0)

10+ Year Member



A couple of nights ago a programmer friend of mine came over, and we talked about various solutions, one of which involved aiming the redirect script right back at the pagejacker's site. Well, I tried it and it works.

What it does is cause the user's browser to loop, thus making it a problem to click on the redirect link.

I don't know how Googlebot would respond to the script; my guess is that it simply ignores it and continues indexing the content as if it were the pagejacker's, but at least it makes the pagejacker's site a real nuisance.

What I am wondering about is how Googlebot would interpret the script and if it can be guided to another page (which could be excluded by a robots.txt file) full of spam, perhaps excessive repetition of a keyword or something similar. This way the hijacker's site would be penalized.

Think it might work?

claus

9:35 pm on Mar 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>> one of which involved aiming the redirect script right back at the pagejackers site

In order to do this, you check for the URL of the page the redirect is on, right? Googlebot wouldn't ever notice that script; see msg #54 of this thread [webmasterworld.com].

If not, please post a few more details.

>> claus how is your's coming along?

See link above; so far it's only there. I know it's a pain to link to a specific forum post deep inside a thread, but it seemed like the best place to post it, as we had a lot of those hijack threads already and I didn't want to start yet another.

(I'll sticky you on the progress, if any - I'm a busy man you know, not a lot of time for creating content for myself *lol* ADDED: That post does need a little editing for more clarity, and the last 2 points were perhaps a little too simplified.)

[edited by: claus at 10:32 pm (utc) on Mar. 10, 2005]

Bobby

9:50 pm on Mar 10, 2005 (gmt 0)

10+ Year Member



In order to do this, you check for the URL of the page the redirect is on, right?

Actually it checks 2 variables: the length of the script (over 100 characters), because this 'network' of hijackers all use the same formula, and the referrer.

if (document.referrer&&document.referrer!="")

if (poshost+lunref > 100)
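Pulling those two checks together, the idea might look something like this. This is a reconstruction from Bobby's description, not his actual script: the function name, the exact length test, and the 100-character threshold are all illustrative (his `poshost` and `lunref` variables are never defined in the thread).

```javascript
// Sketch of the "bounce the hijacker" idea: if the visitor arrived via a
// referrer, and that referrer URL is suspiciously long (the tracker-style
// redirect URLs this network uses run well over 100 characters), send the
// browser straight back to it, so the hijacker's redirect link just loops.
function bounceTarget(referrer) {
  if (!referrer || referrer === "") {
    return null; // direct visit or no referrer: leave the user alone
  }
  if (referrer.length > 100) {
    return referrer; // looks like a long redirect-script URL: bounce back
  }
  return null; // ordinary short referrer (search result, normal link)
}

// In a real page this would run on load, e.g.:
//   var target = bounceTarget(document.referrer);
//   if (target) window.location.replace(target);
```

Note this only loops the browser for human visitors who click the hijacker's link; as claus points out below in the thread, Googlebot does not execute JavaScript, so it would never see the bounce.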

zeus

9:52 pm on Mar 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Bobby, keep us updated, but I still hope we can get Google to fix their problem, because not everyone can handle scripts.

claus

10:01 pm on Mar 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Okay, then I can confirm that Googlebot will simply ignore it. Also, "document.referrer" looks like JavaScript to me, and Googlebot does not execute JS (it wouldn't work if you made it PHP, ASP, Perl or whatever either, though).

Still, making the redirect unusable for real users might accomplish something by itself, but only if the publisher of the redirecting site cares if his/her links work for users or not.

Bobby

10:06 pm on Mar 10, 2005 (gmt 0)

10+ Year Member



Zeus,

The script is easy to place in any HTML, ASP or PHP file; the question is whether or not it's possible to penalize the hijacker without doing damage to your own site. That's why I suggested creating a "spam" page just for cases where the hijacker's formula can easily be identified. The robots.txt file should, in theory, exclude the bots from indexing the spam.

What we need now is someone who can enlighten us as to how Googlebot would react to the script: would it consider the new redirect part of the hijacker's page or not? My guess is it would, especially if you use the same redirect script they are using. If it works for them, it will work for you.

Just to be extra careful, you could put the spam page in another folder, with robots.txt telling the bots not to index the page.
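For the robots.txt side of that, the exclusion would look something like this (the `/bounce/` folder name is just an example; nothing in the thread names an actual folder):

```
User-agent: *
Disallow: /bounce/
```

Keep in mind that robots.txt only asks well-behaved crawlers not to fetch pages under that path; it does not guarantee the URL stays out of the index if other sites link to it.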

This 206-message thread spans 14 pages.