|This 713 message thread spans 24 pages|
|302 redirects continue to be an issue|
| 6:23 pm on Feb 27, 2005 (gmt 0)|
recent related threads:
It is now 100% certain that any site can damage low- to mid-range PageRank sites by causing Googlebot to pick up a 302 redirect served by a script (PHP, ASP, CGI, etc.) backed by an unseen, randomly generated meta-refresh page pointing at an unsuspecting site. In many cases the encroaching site actually writes your website's URL, with a 302 redirect, inside its own server. This is a flagrant violation of copyright and a manipulation of search engine robots, geared to exploit and destroy websites and to artificially inflate the ranking of the offending sites.
Many unethical webmasters and site owners are already creating thousands of templated, ready-to-go "skyscraper" sites fed from affiliate companies' immense databases. These companies, which hold your website's info in their databases, feed your page snippets, without your permission, to vast numbers of the skyscraper sites. A carefully adjusted PHP-based redirection script then goes to work, issuing a 302 redirect to your site with an affiliate click-checker built in. What is very sneaky is the randomly generated meta-refresh page, which can only be detected with a good header interrogation tool.
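The "header interrogation" the poster mentions can be sketched in a few lines of Python: request the suspicious URL without following redirects and look at the raw status and `Location` header. The host and path below are hypothetical placeholders, not sites from this thread.

```python
# Sketch of a header interrogation check: request a suspicious URL
# without following redirects and report the raw status code and
# Location header. The host/path used here are hypothetical.
import http.client

def check_redirect(host, path):
    """Return (status_code, location_header) for a single HEAD request."""
    conn = http.client.HTTPConnection(host, timeout=10)
    try:
        conn.request("HEAD", path)
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location")
    finally:
        conn.close()

# Usage (hypothetical host):
#   status, loc = check_redirect("example.com", "/goto.php?path=victim.com%2F")
# A 302 status with someone else's URL in `loc` is exactly the
# pattern described above.
```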
Googlebot and MSNBot follow these PHP scripts, either to an internal sub-domain containing the 302 redirect or server-side, and bang, down goes your site if its PageRank is below the offending site's. Your index page is crippled because Googlebot and MSNBot now consider your home page, at best, a supplemental page of the offending site. The offending site's URL that contains your URL is indexed as belonging to the offending site. The offending site knows that Google does not reveal all links pointing to your site and takes a couple of months to update, so an inurl:yoursite.com search will not be much help in tracing it for a long time. Note that these scripts mostly apply your URL stripped, or without the www, making detection harder. This also causes Googlebot to generate another URL listing for your site, which can be seen as duplicate content. A 301 redirect resolves at least the short-URL problem, relieving Google of deciding which of your site's two URLs to index higher.
Your only hope is that your PageRank is higher than the offending site's. Even this is no guarantee, because the offending site will have targeted many higher-PageRank sites within its system on the off chance that it strips at least one of them. This is reinforced by hundreds of other hidden 301 permanent redirects to PageRank 7 or above sites, again in the hope of stripping a high-PageRank site, which would then empower its scripts to hijack more efficiently. Sadly, supposedly ethical big-name affiliates are involved in this scam; they know it is going on, and Google AdWords is probably the main source of revenue. Though I am sure Google, at least, does not approve of its AdSense program being used in such a manner.
Many such offending sites have no e-mail contact, a hidden WHOIS record and no telephone number. Even if you were to contact them, you will find in most cases that the owner or webmaster cannot remove your links from their site, because the feeds come from affiliate databases.
There is no point in contacting Google or MSN, because this problem has been around for at least nine months; only now is it escalating at an alarming rate. All sites of PageRank 5 or below are susceptible; if your site is a 3 or 4, be very alarmed. A skyscraper site need only create child-page linking to reach PageRank 4 or 5 without stripping other sites.
Caution: trying to exclude these sites via robots.txt will not help, because the scripts change almost daily.
Trying to remove through Google a link that looks like
new.searc**verywhere.co.uk/goto.php?path=yoursite.com%2F
will result in your entire website being removed from Google's index for an indefinite period, at least 90 days, and you cannot get re-indexed within that timeline.
I am working on an automated "302 rebound script" to trace and counteract an offending site. This script will spider and detect all pages, including sub-domains, within an offending site and blast every one of its pages, including dynamic pages, with a 302 or 301 redirect. Hopefully it will detect the feeding database and blast it with as many 302 redirects as it contains URLs: in essence, a program in perpetual motion, creating millions of 302 redirects for as long as it stays on. Since every page is a unique URL, the script should continue to bombard any site that generates dynamic pages through PHP, ASP or CGI redirecting scripts. A skyscraper site that is fed this way can have its server totally occupied by a single efficient spider that requests pages in split seconds, continually, throughout the day and week.
If the repeatedly spidered site is depleted of its bandwidth, it may then be possible to remove it via Google's URL removal tool. You only need a few seconds of a 404 or 403 from the offending site for Google's URL console to detect what it needs: either the site or the damaging link.
I hope I have been informative and of help to anybody whose hijacked site's natural revenue has been unfairly affected. Also note that your site may never regain its rank, even after the removal of the offending links. Talking to offending site owners usually results in denial that they are causing problems; they say they are only counting outbound clicks, and they seem reluctant to remove your links... yeah, pull the other one.
[edited by: Brett_Tabke at 9:49 pm (utc) on Mar. 16, 2005]
| 10:29 pm on Mar 8, 2005 (gmt 0)|
Hi japanese, Welcome to WebmasterWorld :)
I've just seen this thread today, as well as this one [webmasterworld.com] and this one [webmasterworld.com], which are, I believe, all three threads you have posted in so far, all on the same subject (at least with that name; no offence intended).
I must say that you have done your homework, which is rare for a new poster. It's a subject I've been following for a few years, and I do appreciate that awareness is being raised. No doubt you have seen some posts of mine on this subject; otherwise I can point to a few:
Here's one from December 2003 [webmasterworld.com] (#36) - note Brett Tabke's comment in msg #31; he's 100% right there and has been proven even more right since.
Here's another, from May 2004 [webmasterworld.com] (#1) - at the bottom you will find a collection of no fewer than 24 different related threads dating back to June 2003.
So, this is not new, and there have been plenty of discussions and complaints, literally for years now. So far Google has done absolutely nothing to remedy it.
Anyway, I just wanted to say hi, and that I do appreciate you raising more awareness of these issues. I haven't read all the posts in the three threads mentioned yet, but I'm getting there.
| 10:58 pm on Mar 8, 2005 (gmt 0)|
This is a major topic that affects many site owners. I thank you and the other senior members here for allowing me to make sure that the problem yet to be solved is still very much on the agenda.
A democratic exchange of ideas is by far the best way to seek answers to a major problem such as this: a dilemma that has left thousands of website owners in a quandary. If only for the people who cannot understand why their sites have disappeared into total oblivion, we should raise the stakes.
You and I know almost exactly why. But I can assure you that most website owners simply cannot work out the intricacies and the illusory and deceptive methods deployed against their vulnerable websites.
Claus, this problem has really killed off many website owners' aspirations for the internet. They are at a complete loss as to why their site is in total oblivion. Google is a very secretive company, and very reticent with it. Google sheds no tears for the lady who spent $10,000 on her site and many, many hours building it, complied with every ethical doctrine Google outlines, only to be swallowed up by a simple 302 directive that adulterates her dreams of being a website owner.
I know why her site is in oblivion and I know exactly how it got there. I am not prepared to let this issue go. I will debate things on her behalf.
| 11:04 pm on Mar 8, 2005 (gmt 0)|
I say the only way to combat this issue is to take it straight to Google's bottom line.
There are plenty of copyright lawsuits currently being won against Google. Here's what I'm proposing:
A company (there are now many) whose site no longer shows up for its trademarked company name has a few offshore friends buy AdWords for that name. A copyright lawsuit follows and costs Google a few hundred thousand. Not only are they making money off your brand name, they are allowing confusion by not ranking your site even in the organic results.
There are undoubtedly a few hundred, if not a few thousand, people in this situation. This could make quite a dent in the bottom line, and would probably force Google to rectify the situation.
| 11:27 pm on Mar 8, 2005 (gmt 0)|
I'm actually getting traffic from Google via a site that has their 302 redirect listed in the SERPs. They have a higher PageRank, so in a way it's actually a benefit because of the extra search traffic. In this case, the link does a direct jump to a sub-page on my site when clicked. However, I think this will hurt my site in the long run, once Google decides that my page is a duplicate of theirs, but I can't say for sure.
* I did notice one thing that suggests it would be easy for Google to work out which page is the original. *
When you click on the Google cache of the redirecting site's link and check the properties of the images or links on the page, they resolve to referringsite/uniqueimage.jpg or referringsite/dir/link.html.
Those links would all return 404 Page Not Found errors when Google indexed them, so it seems to me an easy fix for Googlebot would be a formula like this:
if most of page.links return 404:
    page is not the original; don't credit the referring URL with the content
else:
    page is probably the original
Even if Googlebot didn't keep the variables needed, you could run some sort of process over the index to check for invalid links. I don't see why a search engine would want to list pages with tons of 404s anyway.
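The heuristic suggested above can be made concrete. Assuming a crawler has already resolved each on-page link against the referring domain and collected the HTTP status codes (the status lists below are made-up data, and the threshold is an arbitrary choice), the decision rule is just a broken-link ratio:

```python
# Sketch of the suggested heuristic: if most of the links scraped from
# a cached copy of a page return 404 when resolved against the
# *referring* domain, that domain probably isn't the content's true home.
# `link_statuses` would come from a crawler; here it is just a list of
# HTTP status codes (hypothetical data).

def probably_original(link_statuses, threshold=0.5):
    """Return True if the page looks like the original source."""
    if not link_statuses:
        return True  # no evidence either way
    broken = sum(1 for status in link_statuses if status == 404)
    return broken / len(link_statuses) <= threshold

# A hijacker's copy: its relative links resolve on the wrong host.
print(probably_original([404, 404, 404, 200]))  # False
# The real site: its links mostly resolve.
print(probably_original([200, 200, 200, 404]))  # True
```

The 0.5 threshold is only illustrative; a real implementation would have to tolerate ordinary link rot on legitimate pages.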
A less process-intensive fix would be to just give us a new robots.txt entry that says "contentdomain=validsite.com".
Is it just me or wouldn't it be just that easy?
I'm sure that every webmaster who's been affected by this problem would be more than happy to add a line to robots.txt.
My 2 Cents
| 11:32 pm on Mar 8, 2005 (gmt 0)|
The only way to force Google to fix this is with bad publicity. If this story made it onto CNN or whatever, the problem would be fixed within 72 hours.
| 12:06 am on Mar 9, 2005 (gmt 0)|
But this story by itself is not newsworthy. Perhaps some sort of protest action could get enough publicity and draw attention; maybe a sit-in or something at the Googleplex?
|The only way to force Google to fix this is with bad publicity. If this story made it onto CNN or whatever, the problem would be fixed within 72 hours. |
| 12:07 am on Mar 9, 2005 (gmt 0)|
A good point above
Google's public engine, software and hardware are tools being used to assist users in violating copyright laws.
I am certainly far from a legal rep, but my question is: would Google not be an accessory to helping someone violate copyright law?
If I went to the public library and used their computers and hardware to reproduce copyrighted material in mass quantities for profit, don't you think action taken against the library would also result in action against me?
It would be an interesting argument to make; however, I don't have deep pockets, so I will sit back for the ride.
| 12:08 am on Mar 9, 2005 (gmt 0)|
I swear I read somewhere that Google does no evil, and that there's almost nothing someone can do to hurt your rankings.
| 12:11 am on Mar 9, 2005 (gmt 0)|
"The only way to force Google to fix this is with bad publicity."
Agree... it seems Google is only worried about their bottom line, and not search results...
Need a weather forecast, a map/address, or some silly blog? Then Google's your site. Outside of that, I'd personally focus my attention more on MSN or Yahoo.
It's kind of ironic that GoogleGuy can comment on Google Desktop Search (quickly, might I add) but seems to be incognito when it comes to Google's faults... especially since this subject has been discussed for six or more months now, and yet there appears to be nothing done.
But heck... GoogleGuy says "People have already written some cool plug-ins" for Desktop Search... how about some cool code to fix the 301/302 problem?
| 12:43 am on Mar 9, 2005 (gmt 0)|
First, excellent post, Japanese; it was well and succinctly written for the intended audience. It is a problem, and not only for mid-range PR 4, 5 and 6 sites.
|but this story by itself is not newsworthy |
This is actually not the case. If the story is put in a way that doesn't mention robots.txt, skyscraper sites, 301 and 302 redirects, PHP scripts and so on, then the journalist can decide whether it is a story.
To make it a story, some people from different disciplines who know what the story consists of could outline it for the relevant journalists.
There is a story in everything and everything is newsworthy somewhere!
| 12:49 am on Mar 9, 2005 (gmt 0)|
Well, I am getting the feeling that Google doesn't want to fix the problem for as long as possible, because this is surely good for AdWords revenues. I can't see another reason from a company like Google; they have always tried to serve good and fair results.
| 1:10 am on Mar 9, 2005 (gmt 0)|
Certainly anyone can send press releases to technical reporters at news organizations, and that would be a good thing to do.
The info given to journalists needs to give a clear and easily understandable overview of the problem, some examples and consequences of the problem, along with what happened when Google was notified. Bear in mind, the tech reporter might be unavailable and someone else without the needed skills might be filling in.
The person sending the press release needs to include his or her full name, address, phone number and email info in case the reporter wants more info. It would also be a good idea to provide Google's phone number so the reporter can contact them for their side of the story.
Info on how to write a press release: [lunareclipse.net...]
Let's see where this goes.
| 1:19 am on Mar 9, 2005 (gmt 0)|
After I saw some new posts in this topic, I got a little scared; I didn't know this had been happening since 2003.
This means I don't see any solution to this, so maybe we must face the music and start to build pages with redirecting scripts of our own pointing to good sites and then our own content; that way the scripts become a form of SEO.
I will try to look for some tech news outlets that have posted something about Google and have also been shown on CNBC and other business news channels. I will start now. Then we could give them the links to the WebmasterWorld threads about this topic, plus some good text, which should be written by someone here who is better with words than I am.
I could post some names here that would be worth e-mailing.
As I said, I don't think we will see a solution; failing that, we can also go to the Googleplex with some banners.
| 1:21 am on Mar 9, 2005 (gmt 0)|
If someone can put a dollar/euro amount on this, that will get some press: in particular, how many of those dollars pass through Google.
| 1:26 am on Mar 9, 2005 (gmt 0)|
"As said I dont think we will see a solution,..."
Apparently Yahoo, and to a certain extent MSN, aren't as susceptable to this problem.
Play up that angle..)
| 1:28 am on Mar 9, 2005 (gmt 0)|
OK, here are a few good news sites/papers in the broker business. I'll bet they would like some news about this; anything about Google is newsworthy.
Who knows the search engines better than webmasters? And now we are journalists.
Now we need a good writer to e-mail them:
The Wall Street Journal Online
Many are hurt by this situation, and we must face it: if you are not on Google, you are not on the net. I don't see any other solution for this. I'm sorry, Google, but you had the chance to let us know you were working on this problem.
| 1:31 am on Mar 9, 2005 (gmt 0)|
CNET should be the first to write about this, but they've been too busy kissing Google's a-s lately...
| 1:34 am on Mar 9, 2005 (gmt 0)|
Walkman - exactly my words.
| 1:39 am on Mar 9, 2005 (gmt 0)|
Seriously, are there other solutions to this? I don't see any, and we must do something now, since it has been going on for so long.
| 2:01 am on Mar 9, 2005 (gmt 0)|
|Apparently Yahoo, and to a certain extent MSN, aren't as susceptible to this problem.|
Play up that angle..)
I think one of the reasons this affects Google more than the other search engines is that Google has done much deeper crawling of the web (I seem to remember some press last year about doubling the total pages indexed, or something like that), so now that they're indexing the generated PHP scripts etc., the problem has ballooned.
I read somewhere that search engines used to stop indexing a site immediately when they ran across a 302. Interesting.
There has to be a solution, and I believe it will come with time, I just wish someone would tell us they are working on it.
On the other hand I think this whole 302 redirect thing may be a conspiracy to keep me from writing content. ;)
| 2:21 am on Mar 9, 2005 (gmt 0)|
Once the 'bad' site captures your ranking from G, is there any need for them to continue 302'ing your page? It seems to me that once they knock the good site off and become acknowledged as the 'official' version, they can change the content to anything they want. Is that correct? And if you were 302'd long enough ago, would it then be impossible to detect, or do they need to maintain a detectable index entry indefinitely?
| 2:21 am on Mar 9, 2005 (gmt 0)|
"Now we need a good writer who posts emails to them
The Wall Street Journal Online
Don't forget that Wall Street is currently in the midst of making a fortune selling Google shares to the public. Once the stock is in the hands of the public, the story can come out, the stock can tank, and Wall Street can buy the company back dirt cheap. Bad press on Goog from Wall St isn't going to happen now.
Why don't webmasters use their own resources? Build a site explaining this problem and Google-bomb it.
| 2:44 am on Mar 9, 2005 (gmt 0)|
A quick question:
if entry is adwords then
    record ip to database
    record date/time to database
else if entry is other ad then

Will the above method harm my-site-B.com? As far as I know, PHP's header() function also sends a 302 header.
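For what it's worth, the poster's recollection is right: a bare `Location:` redirect defaults to status 302 in PHP's `header()` and in most server APIs, and a permanent 301 has to be set explicitly. A minimal sketch of the distinction using Python's standard-library server (hostnames and paths are hypothetical, echoing the my-site-B.com name above):

```python
# Sketch: the difference between a default 302 and an explicit 301.
# PHP's header("Location: ...") behaves like the 302 branch unless a
# status code is set explicitly. Hostnames/paths are hypothetical.
from http.server import BaseHTTPRequestHandler

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/tracked"):
            # Temporary redirect: engines may keep the *redirecting*
            # URL in the index, which is the behaviour this thread
            # is about.
            self.send_response(302)
        else:
            # Permanent redirect: engines transfer the listing to
            # the target URL.
            self.send_response(301)
        self.send_header("Location", "http://my-site-B.example/")
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the sketch quiet
```

Whether such a redirect harms the target is ultimately Google's call, not the script's; the status code only tells the crawler which URL to treat as canonical.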
| 2:50 am on Mar 9, 2005 (gmt 0)|
Just a thought on fixing a 302 problem yourself.
I don't know if it would work, but what if you took the page on your site that's being redirected to, let's say "widget/bluewidget.html", and did a 301 redirect to "newlocation/bluewidget.html"?
Wouldn't that tell Googlebot that the source page is now at newlocation/bluewidget.html, and take away the referring site's listing of your page, since theirs is not the 301 location?
It's late, so forgive me if I make no sense. The more I think about it, the more I think it won't work, but I'll throw it up for review.
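For reference, a page-level 301 like the one described would typically be done in Apache with mod_alias; a sketch, with hypothetical paths and domain:

```apache
# .htaccess sketch (paths and domain hypothetical):
# permanently move a single page to a new location.
Redirect permanent /widget/bluewidget.html http://www.example.com/newlocation/bluewidget.html
```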
| 3:18 am on Mar 9, 2005 (gmt 0)|
|Certainly anyone can send press releases to technical reporters at news organizations, and that would be a good thing to do.|
Probably not a good idea. Anyone can, but would it get picked up or even noticed? There is a reason that major companies, governments, etc. hire press officers and press professionals. They can get the story published; they know the journalists; they know how to present the story; they know the audience of the publication as well as, or better than, the journalist who would write the article. The best press people, in reality, probably write most of the article before the journalist even sees it. The other information in Beachboy's post is valid.
|tools being used to assist users in violating copyright laws |
Rewritten, this also attracts legal journos.
"Is your business suffering from dodgy webmasters stealing your content and putting it on Google" - this will get the business journalists interested.
"Webmaster steal's whorehouse blog and surrounds with advertising" will get the Enquirer interested.
To get publicity, writing a press release, even an extremely good, well written, easily understandable press release is only the start. There is the pitch, the follow-up, the futher follow-up.
Then when the NYT, CNN, BBC all pick up the story, Google have their own professional press people to counter everything your press release says.
It is not a coincidence that Google's press rep was able to come to the defence of the new version of Google's toolbar last week. Whether webmasters believe all the statements given last week or not, does the average New York Post, Sun, Daily Record reader believe it, that is the question.
| 3:36 am on Mar 9, 2005 (gmt 0)|
The main problem is there aren't a whole lot of people who can even understand this problem, even when it is explained to them. As long as no major site is hijacked (and I believe they are whitelisted in some way, or it would have happened by now), most of the general public won't care. If someone got a class action going, it might gain some publicity; once again, you would need a law firm with somebody capable of understanding this. Remember, over a year ago when we were talking about this, even senior members here were practically berating those of us who said this was happening, or even that it could happen.
| 3:51 am on Mar 9, 2005 (gmt 0)|
The issue is not simply 301/302 redirects causing Googlebot to choke. I have seen a redirect from a hijacker like this:
Are there two redirects here?
[edited by: kwngian at 4:23 am (utc) on Mar. 9, 2005]
| 4:05 am on Mar 9, 2005 (gmt 0)|
I believe we all have to understand that this problem is not a "Google error." Google is doing exactly what it's supposed to do with respect to 302s. Example: when www.BigUniversity.edu issues 200,000 302s for a few of its files, it wants, and *expects*, Google to do exactly what Google is currently doing with the 302.
It appears to me that the basic problem is a quirk in the http/server protocol that is being exploited (intentionally or otherwise) by an increasing number of sites.
And, yes, the increase is partly attributable to AdSense, which, by the way, is one of the prime generators of $$$ in G's pocket, so expecting Google to slam on the brakes by going *against the published protocol* is sort of a pie-in-the-sky dream.
And that's why I think this will continue to be a severe problem.
| 4:16 am on Mar 9, 2005 (gmt 0)|
Hey, Kwngian... any chance of editing your post and getting rid of the massive horizontal scroll bar? I can't read the page.