
Google SEO News and Discussion Forum

302 Redirects continues to be an issue
japanese

5+ Year Member



 
Msg#: 28329 posted 6:23 pm on Feb 27, 2005 (gmt 0)

recent related threads:
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]



It is now 100% certain that any site can destroy a low- to mid-pagerank site by causing googlebot to snap up a 302 redirect served by a script (php, asp, cgi etc.), backed by an unseen, randomly generated meta refresh page pointing at the unsuspecting site. In many cases the encroaching site actually writes your website's URL, wrapped in a 302 redirect, inside their server. This is a flagrant violation of copyright and a manipulation of search engine robots, geared to exploit and destroy websites and to artificially inflate the ranking of the offending sites.

Many unethical webmasters and site owners are already creating thousands of TEMPLATED (ready-to-go) SKYSCRAPER sites fed by affiliate companies' immense databases. These companies, which have your website's info within their databases, feed your page snippets, without your permission, to vast numbers of the skyscraper sites. A carefully adjusted php-based redirection script then goes to work: it issues a 302 redirect to your site and includes an affiliate click checker. What is very sneaky is the randomly generated meta refresh page, which can only be detected with a good header interrogation tool.
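
For anyone who wants to see this for themselves, a basic header interrogation takes only a few lines of php. A rough sketch, assuming php 5 or later; the URL below is made up, so substitute the redirect script you suspect:

-------------------------
<?php
// Show the status line and Location header a suspect redirect
// script returns. The URL is hypothetical.
$url = 'http://offending-site.example/goto.php?path=yoursite.example%2F';

// get_headers() follows the whole redirect chain; element 0 is the
// status line of the first response, which is the one that matters.
$headers = get_headers($url, 1);

echo $headers[0] . "\n"; // e.g. "HTTP/1.1 302 Found"
if (isset($headers['Location'])) {
    print_r($headers['Location']); // where the redirect points
}
?>
-------------------------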

Googlebot and msnbot follow these php scripts to either an internal sub-domain containing the 302 redirect or serverside, and "BANG" down goes your site if it has a pagerank below the offending site. Your index page is crippled, because googlebot and msnbot now consider your home page at best a supplemental page of the offending site. The offending site's URL that contains your URL is indexed as belonging to the offending site. The offending site knows that google does not reveal all links pointing to your site and takes a couple of months to update, and thus an INURL:YOURSITE.COM will not be much help in tracing them for a long time. Note that these scripts mostly apply your URL stripped, or without the WWW, making detection harder. This also causes googlebot to generate another URL listing for your site that can be seen as duplicate content. A 301 redirect resolves at least the short-URL problem, relieving google of deciding which of your site's two URLs to index higher (more often the one with the higher-linked pagerank).

Your only hope is that your pagerank is higher than the offending site's. Even this is no guarantee, because the offending site will have targeted many higher-pagerank sites within its system on the off chance that it strips at least one of the targets. This is further backed by hundreds of other hidden 301 permanent redirects to pagerank 7 or above sites, again in the hope of stripping a high-pagerank site, which would then empower their scripts to hijack more efficiently. Sadly, supposedly ethical big-name affiliates are involved in this scam; they know it is going on, and google adwords is probably the main revenue target. Though I am sure google does not approve of its adsense program being used in such a manner.

Many such offending sites have no e-mail contact, hidden WHOIS data, and no telephone number. Even if you were to contact them, you would find in most cases that the owner or webmaster cannot remove your links from their site, because the feeds come from affiliate databases.

There is no point in contacting GOOGLE or MSN, because this problem has been around for at least 9 months; only now is it escalating at an alarming rate. All sites of pagerank 5 or below are susceptible; if your site is a 3 or 4, be very alarmed. A skyscraper site need only create child-page linking to reach pagerank 4 or 5, without the need to strip other sites.

Caution: trying to exclude them via robots.txt will not help, because these scripts can change almost daily.

Trying to remove through google a link that looks like
new.searc**verywhere.co.uk/goto.php?path=yoursite.com%2F will result in your entire website being removed from google's index for an indefinite period, at least 90 days, and you cannot get re-indexed within this timeline.

I am working on an automated 302 REBOUND SCRIPT to trace and counteract an offending site. This script will spider and detect all pages, including sub-domains, within an offending site and blast all of its pages, including dynamic pages, with a 302 or 301 redirect. Hopefully it will detect the feeding database and blast it with as many 302 redirects as it contains URLs; in essence, a programme in perpetual motion, creating millions of 302 redirects for as long as it stays on. As every page is a unique URL, the script will hopefully continue to create and bombard a site that generates dynamic pages through php, asp or cgi redirecting scripts. A SKYSCRAPER site that is fed this way can have its server totally occupied by a single efficient spider that continually requests pages in split seconds, throughout the day and week.

If the repeatedly spidered site is depleted of its bandwidth, it may then be possible to remove it via google's URL removal tool. You only need a few seconds of a 404 or a 403 from the offending site for google's url console to detect what it needs: either the site or the damaging link.

I hope I have been informative and of help to anybody whose hijacked site's natural revenue has been unfairly treated. Also note that your site may never regain its rank, even after the removal of the offending links. Talking to offending site owners often results in their denying that they are causing problems and saying that they are only counting outbound clicks. And they seem reluctant to remove your links.... Yeah, pull the other one.

[edited by: Brett_Tabke at 9:49 pm (utc) on Mar. 16, 2005]

 

kilonox

10+ Year Member



 
Msg#: 28329 posted 5:10 pm on Mar 15, 2005 (gmt 0)


someone posted this 302 thing on Slashdot
[slashdot.org...]
I hope the link is OK

Great news. I'm sure G is enjoying the PR.

ciml

WebmasterWorld Senior Member ciml us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 28329 posted 5:14 pm on Mar 15, 2005 (gmt 0)

kilonox, many sites have sessions in URLs. Some do use redirects to help assign those sessions, and it is not uncommon for such a site to suffer from crawling and link-assignment problems.

> useful imo if your site is suffering

Useful for a webmaster who wants even greater confusion, IMO.

theBear

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28329 posted 5:22 pm on Mar 15, 2005 (gmt 0)

grits,

A symptom of this situation is as follows:

1: a site:yourdomain.com search in google returns pages that are not your pages, i.e. the green url at the bottom of a listing isn't one of yours.

and/or

2: your site has been split into a combination of:

www, non-www, IP addy, and/or parked domain forms.

The script being used can of course present Google with one thing and you with another based on UA and/or ip addy.
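
If you suspect that kind of cloaking, you can at least test the UA half of it yourself: fetch the script's URL once as a normal browser and once claiming to be Googlebot, and compare. A rough php sketch (the URL is hypothetical; IP-based cloaking can't be caught this way, since you can't crawl from a Google IP):

-------------------------
<?php
// Fetch the same URL with two different User-Agents and compare the
// bodies. Catches UA-based cloaking only.
function fetch_as($url, $ua) {
    $ctx = stream_context_create(array('http' => array('user_agent' => $ua)));
    return file_get_contents($url, false, $ctx);
}

$url     = 'http://suspect-site.example/goto.php?path=yoursite.example';
$browser = fetch_as($url, 'Mozilla/5.0');
$bot     = fetch_as($url, 'Googlebot/2.1 (+http://www.google.com/bot.html)');

echo ($browser === $bot)
    ? "Same content for both agents\n"
    : "Different content - possible cloaking\n";
?>
-------------------------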

idoc

10+ Year Member



 
Msg#: 28329 posted 5:23 pm on Mar 15, 2005 (gmt 0)

"the information from boredguru is probably some of the most helpful in the thread sofar."

I would have to concur with that. Rereading the Sept. 2004 post also helped me a lot.

Boredguru, your suggestion is very plausible, and my *only* concern with it is being found by G to be serving them alternate content *if* they come around with a stealth bot. That is why I am leaning toward giving them their own site altogether, in a subdomain or even an alternate domain. I don't want to tarnish the main domain site in any way, and I believe G would see the subdomain as a totally separate, unique domain. Alternatively, I also have a couple of seasoned alternate domains to play with here as well. I really plan on implementing something along these lines and would appreciate feedback as to why one way would be better than another. I am still weighing the factors myself.

kilonox

10+ Year Member



 
Msg#: 28329 posted 5:44 pm on Mar 15, 2005 (gmt 0)

kilonox, many sites have sessions in URLs. Some do use redirects to help assign those sessions, and it is not uncommon for such a site to suffer from crawling and link-assignment problems.

> useful imo if your site is suffering

Useful for a webmaster who wants even greater confusion, IMO.

I don't think many people have the luxury of waiting on Google to fix it, ciml, if they are suffering. If they don't know how to code or manage code, something like this will make them do whatever it takes to try possible solutions. If it comes at the expense of a short-term serp hit because of dynamic urls, and it works, then it's a good thing.

I know you understand what it means to be in a mc situation where time and money are flowing down the drain and you are forced to try every route to get to a solution --

nolen1

10+ Year Member



 
Msg#: 28329 posted 5:45 pm on Mar 15, 2005 (gmt 0)

I have noticed something that might protect a site from pagerank loss even if the hijacker has a higher pagerank. Two sites of mine that this happened to had 301 redirects in the .htaccess file. They were hijacked for several months and lost their Google rankings, but they never lost their pagerank, and two weeks after the redirects were removed the sites went back to ranking as well as before.

One site of mine did not have a 301 redirect, and the index's pagerank went to 0 (white bar). Before the site was hijacked the pagerank was a five. After the 302 redirects were removed the pagerank finally came back; when it came back it was a two. It has been a two for six months now, and the site has never ranked as well in the serps as it did before the hijacking.

It looks like a 301 redirect will protect your Google pagerank and stop the hijacker from stealing it until you can get the 302 redirects removed. It won't stop you from being penalized for duplicate content and losing your rankings, but I think it will prevent permanent damage.
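
For anyone who can't touch .htaccess, the same kind of canonical 301 can be done at the top of a php page. A minimal sketch, assuming www.example.com is the preferred host (the host names are placeholders):

-------------------------
<?php
// Force the canonical host with a 301 before sending any content.
// Adjust the host names to your own preferred form.
if ($_SERVER['HTTP_HOST'] === 'example.com') {
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: http://www.example.com' . $_SERVER['REQUEST_URI']);
    exit;
}
?>
-------------------------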

I can't get the redirect removed this time. An seo company from Thailand owns the domain that is 302ing me. I have emailed them several times at several different addresses. They won't reply or remove the redirect. I have also contacted the offending site's hosting company and Google. No replies.

Msn is having the same problem handling 302s. We need to get a thread like this going in the Msn forum. I hope Msn is aware of this problem.

I know Google is aware that they are allowing hijackers to get other sites penalized. They just don't care. How hard can it be to fix? Yahoo fixed it. Google claims that there is nothing a competitor can do to hurt your site. A flat-out lie.

Reid

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28329 posted 6:32 pm on Mar 15, 2005 (gmt 0)

Boredguru - your script sounds good, but I'm not sure I want my precious site to be a guinea pig for testing googlebot. I'm not totally in the sandbox; I've just got a couple of bloodsuckers to get rid of.

How about a script to target specific 302 links pointing at your site? .htaccess would be a good option for me. I'd rather just deal with them as they come.

Reid

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28329 posted 6:49 pm on Mar 15, 2005 (gmt 0)

Maybe check the referrer string, and if it matches a specific domain, send the request via a 301 moved permanently to the page. Would that cancel the 302 effect? Hey, maybe we can even suck some PR out of them by reversing it: their domain has permanently moved to my domain, hehe. That would discourage the use of 302s.

zeus

WebmasterWorld Senior Member zeus us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 28329 posted 7:03 pm on Mar 15, 2005 (gmt 0)

I hope we soon see more about this googlejacking in the news or on the internet.

boredguru

10+ Year Member



 
Msg#: 28329 posted 7:09 pm on Mar 15, 2005 (gmt 0)

Hi Reid
Using the referrer string was an idea I tried before, but unfortunately Gbot does not send referer strings.

Okay, Ciml. It might look confusing, but how many big sites do you know that have urls the length of your hand, and which keep changing?

Lots of big sites do it.

And you know what, a year and a half back I remember laughing at some medium-to-big news site that had placed in their terms and conditions that linking to their site without written authorization from them was not allowed. That is, if I wanted to link to them from my site, I needed to get written authorisation from them and verification of my page with the link. I thought what fools they were. Guess they are pretty safe now from all this.

And if anyone wants to sell a hijacked domain that is not that close to them, I am willing to buy.
Seriously, does anyone have an offer?
And please don't rip me off. I don't have that kind of money!

[edited by: boredguru at 7:11 pm (utc) on Mar. 15, 2005]

gregdi

10+ Year Member



 
Msg#: 28329 posted 7:10 pm on Mar 15, 2005 (gmt 0)

I have a question that may seem silly to some, but here goes...

One of my sites has been hijacked to the point that I get absolutely no organic search engine traffic whatsoever from Google. I was planning to re-design the site anyway, going from static pages to dynamically generated PHP pages. Once Google spidered the new pages, wouldn't I work my way back into the serps anyway? Google still spiders my site now, so my thought is that fresh pages with new URLs would partially solve the problem for me right now.

twist

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28329 posted 7:16 pm on Mar 15, 2005 (gmt 0)

RewriteEngine on
# (the redirect targets below were auto-shortened by the board;
#  presumably they all pointed at the canonical http://www.example.com/$1)
RewriteCond %{HTTP_HOST} ^192\.168\.0\.0
RewriteRule ^(.*)$ http://www.example.com/$1 [R=permanent,L]
RewriteCond %{HTTP_HOST} ^www\.example\.com
RewriteRule ^(.*)$ http://www.example.com/$1 [R=permanent,L]
RewriteCond %{HTTP_HOST} ^example\.com
RewriteRule ^(.*)$ http://www.example.com/$1 [R=permanent,L]

If it is even allowed, what would happen if you put a 301 to yourself, and let's say you added the current time, date and maybe a quote of the day to your homepage?

Every time googlebot comes across your page, it will see that it has been permanently moved. Would this cause google to recheck your page against the offending hijacker's page, at the very least?

boredguru

10+ Year Member



 
Msg#: 28329 posted 7:32 pm on Mar 15, 2005 (gmt 0)


One of my sites has been hijacked to the point that I get absolutely no organic search engine traffic whatsoever from Google. I was planning to re-design the site anyway, going from static pages to dynamically generated PHP pages. Once Google spidered the new pages, wouldn't I work my way back into the serps anyway? Google still spiders my site now, so my thought is that fresh pages with new URLs would partially solve the problem for me right now.

I have had dynamic php urls from the time the site was built, and I don't have a problem ranking well. The only problem is that you won't be able to see your PR. That does not mean you don't have one, just that you won't be able to see it in the toolbar.

And do something on the double. If G still visits your site, you do have the power to decide what to feed her when she comes.

theBear

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28329 posted 7:55 pm on Mar 15, 2005 (gmt 0)

twist,

It would be called a redirect loop.

twist

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28329 posted 8:48 pm on Mar 15, 2005 (gmt 0)

to theBear

In the words of Homer Simpson: doh! (wasn't thinking)

How about this,

Make your site reference like this, ht*p://example.com/site/

i.e.
Homepage ht*p://example.com/site/
Search ht*p://example.com/site/search.html
content ht*p://example.com/site/content.html

If a person comes to your website under the normal ht*p://example.com/, then use a RewriteCond to 301 them to ht*p://example.com/site/.

Is it possible to use a few RewriteConds with [OR] and ! to create this?

I have no idea if this is possible, but I'll throw it out there.

ciml

WebmasterWorld Senior Member ciml us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 28329 posted 8:51 pm on Mar 15, 2005 (gmt 0)

> how many big sites do you know that have urls the length of your hand, and which keep changing

Those sites tend to have an enormous amount of link power, continuously pointed at one of those URLs. I tend to assume that Google's decision is easier in those cases.

Reid

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28329 posted 11:05 pm on Mar 15, 2005 (gmt 0)

I'm getting nervous now, after reading those earlier threads and after getting a few of those 302 links to my site removed.

Here is a quote from a post last September in another webmasterworld thread:
They took over a week to answer my first email. I sent it to webmaster@google.com and the replies were coming from help@google.com. I tried a different address for them and the reply still came from help@google.com. I implored them to please refer my questions to somebody higher up and put ATTN:Googleguy in the message title. I started getting responses from googlebot@google.com.
I can tell you that the responses make me want to laugh, cry, and scream at the same time.

In the meantime, my hijacked index page has moved up from number seven to number three in the SERPS even though they did remove the redirect and the link now goes to a 404 error page.


claus

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28329 posted 11:21 pm on Mar 15, 2005 (gmt 0)

twist and others, here's an .htaccess code you could use:

-------------------------
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteRule (.*) http://www.example.com/$1 [R=301,L]
-------------------------

Make it eg. the first rule (or the last rule) in your .htaccess file. It does this:

If Googlebot requests a file (any file), redirect that request one time only to the exact same URL with a 301 status code, and do no more. What happens after this is that Googlebot will get the file with a code "200 OK" or whatever code your webserver would otherwise throw at it (eg. if it's a dead link it will of course get a 404). [NC] means that the spelling of "GooGLeBot" is not case sensitive.

It also makes sure that Googlebot will always just see the domain with "www." in front of it (if you don't want this, just remove "www." from the rule).

This way, each and every URL that Googlebot requests will get some sort of "extra verification stamp" saying "the right URL for the file you requested is the same URL as the one you used"

(actually it says: "the URL you requested has been moved permanently to the exact same place - ie. to the location you already requested once". So, if there were no hijackers this would be pure nonsense. The "www." part adds a small bit of real and useful functionality.)

It is a bit similar to the (second part of the) method posted by boredguru, but it does not change any URLs and it does not use 302 status codes, so it will not create extra duplicate content for you.

>> slashdot

yeah, i noticed that ;) Too bad the slashdot crowd only needs to see the word "adult" once in an article to be talking about pr0n for hours. However, the point was picked up after a few screens of off-topic posts.

[edited by: claus at 11:33 pm (utc) on Mar. 15, 2005]

Stefan

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28329 posted 11:23 pm on Mar 15, 2005 (gmt 0)

Googleguy is here to help with the small things website/google related and a big thanks for that.

Respect, Zeus, but I'm unsure of what that help could be. Other than advice on whether I should post a pic of my dog on the site, I don't know what might be forthcoming. It ain't like the good old days, when one could almost believe they meant that "Do no evil" stuff (and GG would actually check on obvious injustices). IPO or not, there's little credibility left if they can't even comment on this.

Reid

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28329 posted 11:55 pm on Mar 15, 2005 (gmt 0)

Thanks Claus - that sounds like an excellent solution.

My only worry now is one link that I requested a directory to remove. It was hijacking my homepage, and they removed it at my request, but when I went to the URL removal tool it responds "www(dot)othersite.com/go.php?id=58585 returns 302 found but the HTTP response header is empty".
In other words, they removed my link, but the php redirect now sends to an empty url; it resolves to a 404 on their server, but google can't seem to figure that out.
I e-mailed them back but they don't seem to care anymore.

claus

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28329 posted 12:04 am on Mar 16, 2005 (gmt 0)

Reid, did you check that URL with a server header checker? It is more important that the script url itself returns a 404 than it is to get the link removed from a physical page.
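
For anyone following along, the header check is the same couple of php lines as earlier in the thread, pointed at the redirect script itself (Reid's URL, with a placeholder TLD):

-------------------------
<?php
// Check what the redirect script returns now that the link is gone.
// You want a 404 or 410 status line here, not a 302 with an empty
// Location. The URL is hypothetical, per Reid's example.
$headers = get_headers('http://www.othersite.example/go.php?id=58585');
echo $headers[0] . "\n"; // status line of the script's own response
?>
-------------------------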

Reid

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28329 posted 12:07 am on Mar 16, 2005 (gmt 0)

It should also be noted, Claus, that this would cancel out any META refresh 302 pages on your own site as well.

theBear

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28329 posted 12:09 am on Mar 16, 2005 (gmt 0)

Claus,

Have you tested that rewrite setup?

I added that to my .htaccess file on a test site and I get a redirect loop detected .... but the site has tons of things that get remapped ....

I also changed the agent to test .... and I have a tool that allows ua changes ...

claus

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28329 posted 12:15 am on Mar 16, 2005 (gmt 0)

>> you tested that rewrite setup?

theBear, I have red ears. No, I did not - I just wrote it like that.

Don't use it. I'm sorry, it will loop of course. Back to the drawing board.

boredguru

10+ Year Member



 
Msg#: 28329 posted 12:33 am on Mar 16, 2005 (gmt 0)

Claus
I think this could be it. But I have one doubt (which I posted elsewhere, and I guess you did not notice).

How do you know whether, when Gbot visits, it thinks it is fetching yourdomain.com, or it thinks it is fetching hijacker.com/url.php?domainname.com?

Because when you redirect it, Gbot could really have come asking for yourdomain.com, but the next time (more like the next day) it could be asking for the hijacker.com page, which it thinks has moved to your homepage.

And as your homepage will be visited more often than some page three levels deep on your hijacker's site, we would have to be pretty lucky to catch the bot at the right time to make it think that the hijacker's page has permanently moved.

This is the only flaw, but it is countless times safer and cooler (no, ciml, I really do think cool urls don't change :)) than what I suggested. I think taking this idea a step further will bring us closer to realizing our goals.

How about doing it once every day for google bot alone.

That is
Day1 : Gbot asks for yourdomain.com. You redirect it once that day to yourdomain.com. No harm done today and no gain also.
Day2 : Gbot asks for yourdomain.com. You redirect it once that day to yourdomain.com. No harm done today and no gain also.
Day3 : ditto
Day4 : ditto
Day5 : ditto
Day6 : Gbot asks for yourdomain.com thinking it is fetching hijacker.com/url.php?url=yourdomain.com. Today no harm done, but lots of good done.

I need to refine this. So I am planning to look at my logs for the past year to see how Gbot has requested my pages, starting from the homepage, and how many times a day.

I will post if I think I see any pattern, and ask for your ideas.

boredguru

10+ Year Member



 
Msg#: 28329 posted 12:37 am on Mar 16, 2005 (gmt 0)

Claus
But that idea can be implemented without looping by using a server-side language like php.

I will take a shot at writing it and post it later on in the day.

But it will involve using a database, to have a sort of memory of what happened in the recent past.
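
Something along these lines, perhaps. A rough sketch of the once-a-day idea, using a flag file instead of a database (the host name is a placeholder); it 301s Gbot back to the canonical URL at most once per day, so the follow-up request gets a normal 200 and it never loops:

-------------------------
<?php
// 301 Googlebot back to the canonical URL at most once per day.
// A flag file stands in for the database mentioned above.
$flag = '/tmp/gbot-301.stamp';
$ua   = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

if (stripos($ua, 'Googlebot') !== false
    && (!file_exists($flag) || filemtime($flag) < time() - 86400)) {
    touch($flag); // remember that today's redirect has been issued
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: http://www.example.com' . $_SERVER['REQUEST_URI']);
    exit;
}
// otherwise fall through and serve the page normally
?>
-------------------------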

twist

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28329 posted 12:48 am on Mar 16, 2005 (gmt 0)

claus - Don't use it. I'm sorry, it will loop of course. Back to the drawing board.

I made the same mistake in msg #:552

I wonder if jdmorgan (jim) from the apache forum could come up with a workable solution, he's good.

-------

Another idea,

For example, when people link to your site ask them to use www and then have this in your htaccess (or vice versa),

# (the target was auto-shortened by the board; presumably the non-www
#  canonical http://example.com/$1)
RewriteCond %{HTTP_HOST} ^www\.example\.com
RewriteRule ^(.*)$ http://example.com/$1 [R=permanent,L]

That way even if they use a 302 to www.example.com it would be corrected automatically when it was 301'ed in your htaccess file. Although this could potentially mess with your backlinks. Thoughts?

theBear

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28329 posted 12:59 am on Mar 16, 2005 (gmt 0)

I know of several ways to harden your site.

1. random content shuffle
2. programmed 1 shot 301 redirects (kinda like the random content).
3. massive invasive insertion of code.

Now, I already do a lot of 1; 2 is on the boards; 3 I just did for another reason (related to this) and had a sore wrist for a week or so afterwards (not going to do it again).

jk3210

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28329 posted 1:11 am on Mar 16, 2005 (gmt 0)

<<Back to the drawing board>>

Keep'em coming, Claus. We appreciate your efforts.

boredguru

10+ Year Member



 
Msg#: 28329 posted 1:35 am on Mar 16, 2005 (gmt 0)

I have got this one sinking feeling, because whatever we do depends on how G is treating 302 redirections.

You stated, claus, that Gbot sees the 302 redirection, goes "yipee, one more url", and indexes it again as the hijacker's url. Are you certain? Because if this is the way it's done, then we can get over it.

But... and this is a big But... what if Gbot does not go "yipee, one more new url"? It already knows that the redirected url exists in its index. It just assigns that url to the hijacker's url by default, without doing a fetch.

We can get to know this with the help of gregdi & idoc & other victims (no, too strong a word... more like casualties).

I suggest this: gregdi & idoc, check for the hijacker's page in the index and check the cache date. If you have more than one hijacker, then check all their cache dates, and then check your logs for gbot activity on those dates. Is there any difference, like your hijacked page being fetched twice, etc.? If there are any peculiarities that you see, please post them here concisely. Also, don't be afraid to use your gut instincts; after all, no-one knows your site better than you. Because in those peculiarities lies our answer. Really.

<edit reason> Corrected typos</edit>

[edited by: boredguru at 1:36 am (utc) on Mar. 16, 2005]

twist

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 28329 posted 1:36 am on Mar 16, 2005 (gmt 0)

Alright, looking around at some random htaccess examples I noticed one possible(?) solution although I wouldn't know where to begin to create the code for it.

Rough example (don't actually use this, anybody):

RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteCond %{If no TIME_MIN at end of url}
RewriteRule {Append TIME_MIN to end of url}

RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteCond %{If TIME_MIN on url == current TIME_MIN}
RewriteRule ^(.*)$ [example.com...] [R=permanent,L]

Then remove the appended TIME_MIN in a php script.

You could of course set it for 5 or 10 seconds instead of a full minute.
