Forum Moderators: Robert Charlton & goodroi
Many unethical webmasters and site owners are already creating thousands of templated, ready-to-go "skyscraper" sites fed by affiliate companies' immense databases. These companies, which hold your website's details in their databases, feed snippets of your pages to vast numbers of skyscraper sites without your permission. A carefully tuned PHP-based redirection script then goes to work: it issues a 302 redirect to your site and includes an affiliate click counter. What is very sneaky is the randomly generated meta-refresh page, which can only be detected with a good header-interrogation tool.
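For anyone who wants to inspect a suspect page themselves, here is a minimal sketch in Python (my own illustration, not any particular tool) that fetches raw response headers without following redirects, so a 302 and its Location header stay visible, and scans HTML for a meta-refresh target:

```python
import http.client
import re

# Matches <meta http-equiv="refresh" content="0;url=..."> and captures the target URL.
META_REFRESH_RE = re.compile(
    r'<meta[^>]+http-equiv=["\']?refresh["\']?[^>]*'
    r'content=["\']?\s*\d+\s*;\s*url=([^"\'>\s]+)',
    re.IGNORECASE,
)

def find_meta_refresh(html):
    """Return the meta-refresh target URL if the page contains one, else None."""
    m = META_REFRESH_RE.search(html)
    return m.group(1) if m else None

def fetch_head(host, path="/"):
    """HEAD-request a URL and return (status, headers) WITHOUT following
    redirects, so a 302 and its Location header can be examined directly."""
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("HEAD", path)
    resp = conn.getresponse()
    headers = dict(resp.getheaders())
    conn.close()
    return resp.status, headers
```

A 302 status plus a Location header pointing at your site, or a meta-refresh target found in the body, is exactly the pattern described above.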
Googlebot and MSNBot follow these PHP scripts, either to an internal sub-domain containing the 302 redirect or to a server-side redirect, and "bang", down goes your site if its PageRank is below the offending site's. Your index page is crippled because Googlebot and MSNBot now consider your home page, at best, a supplemental page of the offending site. The offending site's URL that contains your URL is indexed as belonging to the offending site. The offenders know that Google does not reveal all links pointing to your site and takes a couple of months to update, so an inurl:yoursite.com search will not be much help in tracing them for a long time. Note that these scripts usually apply your URL stripped down or without the www, making detection harder. This also causes Googlebot to generate another URL listing for your site, which can be seen as duplicate content. A 301 redirect resolves at least the short-URL problem, sparing Google from having to decide which of your site's two URLs to index higher (usually the one with the higher-linked PageRank).
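On the www/non-www point: you can confirm your own server is doing the right thing by checking that the short hostname answers with a 301 whose Location carries the www host. A rough sketch of the check, my own illustration working from just the status code and Location header:

```python
from urllib.parse import urlparse

def is_canonical_301(status, location, canonical_host):
    """True if a response permanently (301) redirects to the canonical
    hostname. A 302 here would leave Google with two competing URLs
    for the same site instead of consolidating them."""
    return status == 301 and urlparse(location).netloc == canonical_host
```

Feed it the status and Location you get back from requesting yoursite.com without the www; anything other than a 301 to the www host means the duplicate-URL problem described above is still open.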
Your only hope is that your PageRank is higher than the offending site's. Even that is no guarantee, because the offending site will have targeted many higher-PageRank sites within its system on the off chance that it strips at least one of them. This is reinforced by hundreds of other hidden 301 permanent redirects to sites of PageRank 7 or above, again in the hope of stripping a high-PageRank site, which would then empower their scripts to hijack more efficiently. Sadly, supposedly ethical big-name affiliates are involved in this scam; they know it is going on, and Google AdWords is probably the main source of revenue. Though I am sure Google does not approve of its AdSense program being used in such a manner.
Many such offending sites have no e-mail contact, hidden WHOIS data, and no telephone number. Even if you were to contact them, you will find in most cases that the owner or webmaster cannot remove your links from their site, because the feeds come from affiliate databases.
There is no point in contacting Google or MSN, because this problem has been around for at least nine months; only now is it escalating at an alarming rate. All sites of PageRank 5 or below are susceptible; if your site is a 3 or 4, be very alarmed. A skyscraper site need only create child-page linking to reach PageRank 4 or 5 without stripping other sites.
Caution: trying to exclude these scripts via robots.txt will not help, because they change their URLs almost daily.
Trying to remove a link through Google that looks like
new.searc**verywhere.co.uk/goto.php?path=yoursite.com%2F will result in your entire website being removed from Google's index for an indefinite period, at least 90 days, and you cannot get re-indexed within that time.
I am working on an automated 302 rebound script to trace and counteract an offending site. This script will spider and detect all pages, including sub-domains, within an offending site, and blast all of its pages, including dynamic pages, with a 302 or 301 redirect. Hopefully it will detect the feeding database and blast it with as many 302 redirects as it contains URLs. In essence, a program in perpetual motion, creating millions of 302 redirects for as long as it stays on. Since every page is a unique URL, the script should continue to create requests and bombard a site that generates dynamic pages via PHP, ASP, or CGI redirecting scripts. A skyscraper site that is fed this way can have its server totally occupied by a single efficient spider that requests pages split seconds apart, continually, throughout the day and week.
If the repeatedly spidered site is depleted of its bandwidth, it may then be possible to remove it via Google's URL removal tool. You only need a few seconds of a 404 or 403 from the offending site for Google's URL console to detect what it needs: either the site or the damaging link.
I hope I have been informative and of help to anybody whose hijacked site's natural revenue has been unfairly treated. Also note that your site may never regain its rank, even after the removal of the offending links. Talking to offending site owners often results in their denying that they are causing problems and claiming they are only counting outbound clicks. And they seem reluctant to remove your links.... Yeah, pull the other one.
[edited by: Brett_Tabke at 9:49 pm (utc) on Mar. 16, 2005]
this is old news--I can't get caught up with you guys--go take a holiday :o)
Hey japanese, is that your page about 302 hijacking at Loris web? A page that talks all about how to detect the various methods of 302 hijacking and what to do about it.
It is a very comprehensive page, showing various hijack methods and how to detect them; the solution, however, is along the lines of WHOIS searches and contacting hosts about TOS violations.
It's my own page. I have 25+ clients, several of which have been affected by hijackers, including myself, and I wrote that partly out of experience and from reading these threads and other research.
I wrote it simple enough for a newbie to understand. I'm not a programmer so I don't get into that part of the problem although I do link to sites that do.
Submitting to sites and later finding your site hijacked and/or framed even happens to me, despite all my cautions about avoiding bad links on another page I authored; and when finding links for 25 clients, it happens often--too often.
I wish google would fix this. I spend several hours a day researching this matter or chasing bad links instead of earning a living.
*edited spelling
[edited by: Lorel at 7:08 pm (utc) on Mar. 17, 2005]
Clearly, there is a large random factor at play (or a factor that is simply not understood). When I checked a few days ago, there were three clone sites each trying to hijack one of my pages, but my site is fine. That's been the case for several months.
I'm all in favour of experimentation, but if a workaround is found, it will require a great deal of luck.
Kaled.
Time-Dependent Rewriting:
RewriteEngine on
RewriteCond %{TIME_HOUR}%{TIME_MIN} >0700
RewriteCond %{TIME_HOUR}%{TIME_MIN} <1900
RewriteRule ^foo\.html$ foo.day.html
RewriteRule ^foo\.html$ foo.night.html
How about, instead of just night and day, you set your page to have minor changes every hour or two? Actually create 12 or 24 copies of your homepage and change minor things in the coding and content--not so much that any casual visitor would notice, but enough that Google might not see it as duplicate content. For example, replace a few tables with divs and vice versa.
Not a fix by any means but until a fix is found it could be something to try.
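The rotation could be driven by a tiny bit of server-side logic. A sketch of the selection step (my own illustration, assuming 12 copies named home.0.html through home.11.html, one per two-hour slot):

```python
def variant_for_hour(hour, copies=12):
    """Map an hour of the day (0-23) to one of `copies` homepage variants,
    switching to the next copy every 24/copies hours."""
    slot = (hour * copies) // 24
    return "home.%d.html" % slot
```

The same arithmetic could be dropped into whatever serves the page, so each two-hour window delivers a slightly different copy.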
Jim
http:URL....../go.php?id=aHR0cDovL3d3dy5zcHllcXVpcG1lbnRndWlkZS5jb20vcGVvcGx
He discovered 2 of my links that looked like this when he did a Google search, inurl:www......com, for his site. I was completely unaware my script was doing this until he brought it to my attention.
I corrected the problem and apologized to him. I also thanked him for bringing this to my attention.
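Incidentally, ids like the one quoted a few posts up are usually just Base64-encoded URLs, so you can see exactly where a go.php-style link points. A small sketch (the sample URL in the test is my own example, not the one from this thread, which is truncated):

```python
import base64

def decode_redirect_id(encoded_id):
    """Decode a Base64 'id' parameter back into its target URL.
    These ids are often quoted without padding, so re-pad to a
    multiple of 4 characters before decoding."""
    padded = encoded_id + "=" * (-len(encoded_id) % 4)
    return base64.b64decode(padded).decode("utf-8", errors="replace")
```

Decoding the id is often the fastest way to confirm which site a redirect script is really targeting.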
Redirects showing in an inurl: search prove nothing. Are those redirects showing when you search site:mysite.com? The site: command should show only the pages that Google thinks are truly part of your site. If any of those unrelated URLs are showing, then you know there is a problem.
The inurl: search will show any page that has the search phrase in its URL. It says nothing about hijacking. If you are redirecting to him, it could be harmless. But I suppose it's better not to risk it, since Google has become increasingly incompetent and has proven its inability to rank sites. Original-content sites are being replaced by 302s, tracker2s, scrapers, and directories.
Any chance someone wants to start a new thread that begins with a complete and detailed description of the problem, and then only allows people to post their ideas or possible fixes? A person looking for solutions to this problem would be lost looking through this monstrous thread.
I operate about a half dozen related, loosely interlinked web sites which have been page one on Google for years for various competitive and non-competitive phrases. Last summer things got real goofy -- pages which had ranked well fell down or out, only to return to high listings sometime later, then drop out again...you know the drill.
I couldn't figure out why Google loved me, then hated me, then loved me. But this 302/duplicate-content issue seems to make a lot of sense of these otherwise unexplainable yo-yo-like SERPs. I am quite convinced that this issue is responsible for my traffic being down about 2 million page views per month.
Here's the thing -- I have found only 5 URLs in Google that were "hijacked" versions of my pages. That seems like a lot of damage for 5 lousy pages. I found zero hijacked pages for my biggest site, which is nevertheless suffering from the G-loves-me/hates-me yo-yo syndrome. So I suspect there may be additional false URLs somewhere in the system that are hurting me duplicate-content-wise, but which don't show up in any of the "how to find the hijacker" searches discussed here. That seems reasonable considering G's inability to list proper backlinks for the past year or two.
So what do you all think? Is it possible that some sites are affected by this problem even if the owner finds very few, or even no, "hijacked" pages via the methods listed here?
Twist - There is no solution for this; Google has to fix the way Googlebot indexes redirecting links.
You're saying there is no solution, protection, or precautionary measure -- nothing a person can do to make it harder for someone to hijack your page? As a (wanna-be) programmer, I find it hard to accept that every possible angle on dealing with this problem from our side has already been tried.
It feels like jim (aka jdMorgan) is Clark Kent (aka Superman) in Superman II: the (webmaster)world is in trouble and he is off with Lois Lane somewhere, completely unaware of the problem. The hijackers have taken over and we need jdMorgan to save us.
For those who don't know, jdMorgan is the moderator of the Apache forum and a programming GOD!
Welcome.
Now, since those injected pages are scripts and Google thinks they are part of your site, what happens if:
<a href="http://www.example.com/badsitewithevilthings/">site1</a>
<a href="http://www.example.com/badsitewithbadthings/">site2</a>
etc., etc., gets presented to Google when its bot visits?
In most of these cases it is a script under the control of another party (though it doesn't have to be).
This is an arbitrary executable code-injection bug.
Look past the dup content issue for a moment.
You completely lost me with the code-injection thing and looking past dupe content -- can you restate, please?
While digging deeper into one of the hijacker sites I did see some questionable adult content...