Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 123 message thread spans 5 pages; this is page 2.
Can Others Hurt Your Rankings - Part zillion...

 9:27 am on Apr 2, 2008 (gmt 0)

I don't see this with any other search engine. Whereas other SEs will simply ignore any inbound links or content that seem dodgy, Google actually damages your rankings if it sees anything untoward.

Currently competitors can damage you in a variety of ways (probably best not to get too specific as you'll give some people ideas) but general things that Google lets your competitors affect your rankings with are:

1. Inbound links from "bad" places.
2. Hundreds of links from one IP address.
3. Duplicate content:
3a) from competitors scraping your entire site and creating their own from it
3b) from competitors stealing your articles and submitting them to article submission services
4. 301/302 redirect hijacking (after all this time it STILL happens!)

and more. There is a new one that is very quick to work using 301's that I daren't put in here but a black hat "associate" of mine said she's dropped quite a few of her client's competitors within days!
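To make the redirect items above concrete, here's a toy simulation (my own illustration; the URLs and scores are invented, and this is not Google's actual algorithm) of how a 301 can let one URL displace another: the engine collapses a redirect chain into one canonical document and keeps whichever URL looks "stronger".

```python
# Toy model of 301-based canonicalization (illustrative only; not
# Google's real behavior). A 301 from URL A to URL B tells a crawler
# that A and B are the same document; if the engine then keeps the
# "stronger" URL as canonical, a high-authority attacker page that
# 301s to a victim can displace the victim's own URL.

def pick_canonical(index, redirects, authority):
    """Collapse each redirect chain and keep the highest-authority URL."""
    canonical = {}
    for url in index:
        target = url
        seen = {target}
        while target in redirects:          # follow the 301 chain
            target = redirects[target]
            if target in seen:              # guard against redirect loops
                break
            seen.add(target)
        # all URLs that resolve to `target` compete for the listing
        best = canonical.get(target)
        if best is None or authority[url] > authority[best]:
            canonical[target] = url
    return canonical

index = ["victim.example/page", "attacker.example/hijack"]
redirects = {"attacker.example/hijack": "victim.example/page"}
authority = {"victim.example/page": 4, "attacker.example/hijack": 7}

# The attacker's URL 301s to the victim's page but has higher authority,
# so this naive engine lists the attacker's URL for the victim's content.
print(pick_canonical(index, redirects, authority))
# -> {'victim.example/page': 'attacker.example/hijack'}
```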

Why do you think Google has this mentality of allowing this war between webmasters? They could simply ignore the bad stuff rather than filter sites; doing so would not make the results any less relevant.



 11:20 pm on Apr 8, 2008 (gmt 0)

This 301 problem isn't just about "other webmasters".

It's about the confusion Google has in handling 301s - just take the above examples and imagine applying the issues to your own internal links and website network.

Then you can start to get a sense of the problems. Google may not handle 301s perfectly. And that can potentially eliminate your site from the SERPs.


 2:57 am on Apr 9, 2008 (gmt 0)


I think that depends on the forum. For example, if you have a link in your profile on WebmasterWorld it does not hurt, because it is a very trusted site. On the flip side, if you have a link in your profile at iamablackhatseo.com, that could hurt.

Marvin Hlavac

 8:46 am on Apr 9, 2008 (gmt 0)

Thanks, trinorthlighting. So the (high) number of links in itself is not relevant, and I'm fine as long as backlinks are coming from trusted sites. Thanks, I'll sleep better tonight ;)


 9:06 am on Apr 9, 2008 (gmt 0)

Thanks, trinorthlighting. So the (high) number of links in itself is not relevant, and I'm fine as long as backlinks are coming from trusted sites.

Until the great Google changes the rules that is.




 9:35 am on Apr 9, 2008 (gmt 0)

As I sit here reading through this, I'm amazed that not a single comment has been made by a Google rep. Where are you, G? This is a serious issue, so how about one of you guys from the inside give us some feedback on it.


 10:04 am on Apr 9, 2008 (gmt 0)

Making a comment here would be a great mistake on Google's part. It's like throwing yourself in front of a killer mob.

They could comment once the heat of the discussion is gone. Too many people point fingers at Google too easily. You gotta understand what they are facing from a coder's perspective. Otherwise ... you can't get it.


 10:23 am on Apr 9, 2008 (gmt 0)


What could a Google rep say?

We're sorry!

We forgot to think about side effects when we started handing out penalties.

I'm too busy making videos for my blog about stuff that is peripheral to worry about core issues like this. (Only joking Matt.)

Or what?



 10:29 am on Apr 9, 2008 (gmt 0)

Ah-ha, I've been a firm believer in competitive sabotage such as this for many years. I've even made mention of a few exploits more than once around here. :)

You know how people say "SEO is not Rocket Science"? Well, I find that statement to be totally untrue. Today's SEO is a bit more complex than yesterday's. Being able to secure your house to prevent most of these exploits from taking place is the key. There are probably a few exploits that you cannot prepare for, such as the 301 being mentioned. For many, all you can do is "hope" that someone doesn't target your website next. It really sucks. And if you don't have your own servers and know everything that is going on with those servers, you're in trouble...

Import Export

 10:55 am on Apr 9, 2008 (gmt 0)

In theory... maybe Google Analytics & Webmaster Tools are a partial answer? The more data you give Google about your website via these services, (one would think) the more Google would be able to understand whether your site is of value to others. This could then be partial resistance to a malicious attack on your website via Google's properties.


 11:00 am on Apr 9, 2008 (gmt 0)

@IE: So you strip naked so Google can see if you've got birth marks? I don't think so.

My site is my personal space. Giving others access to your analytics is not smart... I'm paranoid.


 11:57 am on Apr 9, 2008 (gmt 0)

Some people keep mentioning that there ought to be a GWT 'filter' of inbound links, a list of domains or even IP ranges that a webmaster could ask Google to completely disregard. A win-win situation as they will not only have clean results, but fresh data on possible offenders as well.

That way Google wouldn't need a new algo, and both bad neighborhood linking and redirects would be ineffective. As long as webmasters can spot them.
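As a sketch of what such a GWT filter might look like: assuming a hypothetical plain-text list of `domain:` and `ip:` rules (the file format and the helper names here are my invention, not a real Google feature), inbound links could be screened like this:

```python
# Hypothetical inbound-link exclusion list, in the spirit of the GWT
# feature proposed above. The file format and this parser are
# illustrative assumptions, not a real Google tool.
import ipaddress

def parse_exclusions(text):
    """Split an exclusion file into a set of domains and a list of IP networks."""
    domains, networks = set(), []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        kind, _, value = line.partition(":")
        if kind == "domain":
            domains.add(value)
        elif kind == "ip":
            networks.append(ipaddress.ip_network(value))
    return domains, networks

def link_excluded(domain, ip, domains, networks):
    """True if an inbound link should be disregarded entirely."""
    if domain in domains:
        return True
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in networks)

rules = """
# links Google should disregard for my site
domain:bad-neighborhood.example
ip:192.0.2.0/24
"""
domains, networks = parse_exclusions(rules)
print(link_excluded("bad-neighborhood.example", "203.0.113.9", domains, networks))  # True
print(link_excluded("friendly.example", "192.0.2.77", domains, networks))           # True (IP range)
print(link_excluded("friendly.example", "203.0.113.9", domains, networks))          # False
```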



 12:36 pm on Apr 9, 2008 (gmt 0)

I am sure that this has already been covered, but how would one know if one was on the receiving end of a competitor's black SEO eye?

Other than the sudden loss of 25-50% of traffic, of course?


 12:37 pm on Apr 9, 2008 (gmt 0)

Miamacs ... how many people with websites have GWT accounts? There are a bunch of fanatics like the ones here on WebmasterWorld, but the vast majority don't have accounts, nor do they know what those are.

This is a win-win for you. Not for everyone.
History's worst mistakes started with the best intentions.


 1:04 pm on Apr 9, 2008 (gmt 0)

Google allowing other webmasters to damage your site.

I'd like to point out that it is not only Google that is subject to this type of competitive manipulation. The others are too. So, maybe we should change the title of this topic to...

Search Engines allowing other webmasters to damage your site.

I am sure that this has already been covered, but how would one know if one was on the receiving end of a competitor's black SEO eye?

I wouldn't necessarily call it an "SEO Black Eye" although that's a pretty cool description for it. ;)

No, I believe the scraper sites are responsible for a good portion of one's loss in rankings. Who is behind those scraper sites is the question. Is it a competitor? Or, is it some rogue offshore server that happened to latch on to your website?

I do site: searches for websites that appear to have indexing challenges. There is one thing in common amongst all of them: they have a high number of scraped entries being indexed. It's almost as if someone purposely made sure that each particular site appeared in every "bad neighborhood" they could find. From MFAs to arbitrage sites, they were everywhere. They were also included in hundreds of directories, blogs and social media sites. I'm tellin' ya, that whole network of scraper sites is damaging to one's Internet presence. I don't care what the search engines say; there is cause and effect taking place.

A proactive approach at the server level is required to combat most of this.


 1:35 pm on Apr 9, 2008 (gmt 0)

uh... 5ubliminal...?

fanatics...? wheee... here we go again.


... my question then is as follows:

you expect the free service of Google Search Engine to work for you.

For this, in the past when there weren't much competition in your niche, you had to do very little.

Then as competition grew more fierce, you actually had to invest some effort in the so-called free search engine traffic, either by learning basic SEO on your own or trusting this task to a specialist ( individual or company ).

For those who spent the approx. 3 weeks it takes to find the most credible source of information on SEO (for example the 'fanatics' of WebmasterWorld), GWT is a matter of keeping up with the latest trends; namely, investing yet another 5 minutes in creating a free GWT account and administering their site there. Btw, the information in GWT is known to Google anyway, so paranoid people are advised to hold their comments regarding privacy.


If you chose the latter method of hiring professionals (you know, those 'fanatics'), they ought to know about the situation, and as soon as Google rolls out yet another free tool to let them fight back against the ever-increasing black-hat economic opportunism on the net... heck, they should use it. If not, you should turn to someone else, for those you pay for a service are clearly not providing it. SEO is evolving; keeping up with the technology is an SEO/SEM's duty in order to stay in business.

Alternatively, one can always start here, asking professionals, enthusiasts, freelancers and mom'n'pop-turned-SEOs, to get help and keep in touch with the realities.


What people still don't get is that this is not a game.

For me it is, because I'm well versed in it, but since 'website owner' doesn't translate to 'webmaster', it needs to be said again. There's no fair play in SEO. There are rules, ethics, guidelines and laws, some enforced, some not. But there's no fair play.

This Internet Search thingie, while it may be free, is actually a kind of business.

Supplementary or not, it's a model to gain profits, and as such, is regulated - or not - by the entities that make it possible, i.e. those who provide the infrastructure: ICANN, ISPs, Google, Yahoo... etc. There are a lot of interests to keep in mind with every update in their policies/methods, but in the end, it is in their own interests as well to keep the net, and search as credible, safe, profitable and usable as can be.

with that said.

A simple tweak in the algo to eliminate the effects of ill-willed redirects might, with a good chance, also negate legitimate 301s (migrating link power from an old site to a new one), which probably outnumber black-hat instances by 5,000-10,000%.


So... IF Google did include this tool in GWT...

The tool to forcibly 'exclude' domains, IP ranges from your own link profile. Similar to the way they make filtering possible in AdWords and AdSense...

If all you needed to do to turn down the dial on your unwanted penalties... coming from unwanted redirects and inbounds ( that deindexed, -950'd, penalized your free listings ) is to provide some data on your findings of such instances to Google (your free SE traffic provider) ...

Where on Earth isn't this the fastest, most efficient, most reliable method - of course next to Google doing the same algorithmically, which however will take time to develop. And even if they did react with an algo update, there hasn't been a *single* change in their system that didn't rely on *thorough profiling* and *case studies* of websites to elevate, lower or ban listings... to which, again, this temporary, makeshift solution would also contribute.

I don't see your point... GWT a tool for fanatics?
... not at all.


( unless Google's departments and ICANN - and any other 'we don't care who you are' click and affiliate advertising agencies - stop funding cr@p sites on the net, it'll be up to Google's search / spam / product team to solve such problems. )

[edited by: Miamacs at 1:38 pm (utc) on April 9, 2008]


 2:03 pm on Apr 9, 2008 (gmt 0)

IF Google did include this tool in GWT ... Where on Earth isn't this the fastest, most efficient, most reliable method...

To my eyes, GWT is one of the best things to happen to search in years -- I applaud Google for making this tool freely available. What I like about Miamacs' argument is that it puts the decision in the hands of the webmaster/site owner (via a feature in GWT), as opposed to it being imposed unilaterally from above (an algo change). When given the choice, I'll always go for self-empowerment.



 2:21 pm on Apr 9, 2008 (gmt 0)

I'm a blackhat and I see this totally different. Sorry:)

Google should fine-tune their algo to fix this problem, and should not put the solution in the hands of the crowds. I don't believe in the wisdom of crowds. People as individuals are smart; mobs are not, and this has been proven over and over again.

It's Google's game so stop whining and robots.txt block their crawler if y'all hate'em so much.

I {heart} GOOGLE!
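For the record, the robots.txt block mentioned above really is this short (this tells Googlebot not to crawl anything, which of course also removes you from free search traffic):

```
User-agent: Googlebot
Disallow: /
```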


 2:22 pm on Apr 9, 2008 (gmt 0)

This is not the old usual 301 hijack; this is an entirely new variant, and deadly to every site regardless of age or rank..... I will not reply to stickies about it with details of the exploit..... although it is becoming common knowledge on the blackhat boards.

Thrall, I take issue with this, as sharing the details can only serve to benefit us if this is already common knowledge within the blackhat community.

If it then became common knowledge on the 'whitehat' boards (I like to think of WebmasterWorld as a whitehat community), we would be able to detect it.

How did you detect it?
What are the symptoms?
How are blackhat competitors using this exploit?
What is the exploit?

At the very least, collectively we could come up with a change suggestion for Google, and they could implement it. The only thing we accomplish by not showing what this new sabotage is, is that Google will just make a change without our input.


 2:55 pm on Apr 9, 2008 (gmt 0)

I'm just wondering when the mafia will come a-knocking to offer me protection. Now who that mafia will be I don't yet know - could it be Google themselves offering an annual fee to deflect this sort of thing from my site? Or will it be a true mafia who has power over the BHs. Only time will tell...


 3:59 pm on Apr 9, 2008 (gmt 0)

Here's my own take on using GWT - not necessarily Google Analytic, just plain old Webmaster Tools.

You're giving away nothing! The instant you validate your domain, you see data that Google already knows. They don't take a while to build a history for your site, as happens with GA. All you've given away to Google is the association of your domain with an email address that you choose.

Oliver Henniges

 4:11 pm on Apr 9, 2008 (gmt 0)

Without knowing the details of this new 301 thing, I dare say that the core of the problem does NOT have to do with links, backlinks and redirecting links, BUT with detecting ownership of original content of any kind, and distinguishing it from its scraped copies.

The scrapers' spiders are reported to be much faster than Googlebot or other SE spiders, so it should be clear that the job for the SE engineers is far from easy.

Is this thread brainstorming ideas on how to overcome this issue?

The base tag may easily be substituted within the scraping process and is thus only limited protection. But it is, at least in principle, possible to generate some sort of "watermark link": a link to a separate page, delivered only to specific IPs, which contains some information on the basis of which SEs might identify the original author of the content. Dunno if feasible; maybe only in combination with some automatic submission of the timestamp of the generation/upload of any new content via Webmaster Central or other interfaces. And note that this would automatically exclude new SEs unless positively defined to have access to the watermark URI.

Scraping and duplicate content are only part of the more general problem of copyright infringement. There ARE means to overcome this. What we need is a scalable approach for SEs without the need to cooperate with ISPs or the ICANN. And on SE's part the will to do something.
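Oliver's watermark-link idea could be sketched roughly like this (everything here is hypothetical: the crawler IP range is only an example and the watermark URL is invented). The page emits a verification link only when the requesting IP falls inside a known search-engine range, so a scraper fetching from any other IP never sees it:

```python
# Sketch of the "watermark link" idea: serve a hidden verification
# link only to requests coming from known search-engine IP ranges.
# The range and the watermark URL below are example placeholders.
import ipaddress

TRUSTED_CRAWLER_RANGES = [ipaddress.ip_network("66.249.64.0/19")]  # example range

def watermark_html(remote_ip, site_id):
    """Return the watermark link for trusted crawler IPs, else nothing."""
    addr = ipaddress.ip_address(remote_ip)
    if any(addr in net for net in TRUSTED_CRAWLER_RANGES):
        # only the SE's spider ever sees this link, so a scraper
        # fetching from another IP cannot copy it
        return f'<a href="/watermark/{site_id}">author-verification</a>'
    return ""

print(watermark_html("66.249.66.1", "site-1234"))  # crawler IP: link served
print(watermark_html("203.0.113.9", "site-1234"))  # ordinary visitor: nothing
```

The obvious weakness, as noted in the thread, is that the IP list must be kept current, and any SE not on it never sees the watermark.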

[edited by: Oliver_Henniges at 4:12 pm (utc) on April 9, 2008]


 5:09 pm on Apr 9, 2008 (gmt 0)

This whole thread is a bunch of CRAP!

I'm convinced that there is NOTHING that can be done to a competitor's site (short of hacking into the site) from an off-page standpoint to mess with an established site... what happened to the mantra...

"You can't control who links to you"


 5:18 pm on Apr 9, 2008 (gmt 0)

I do not agree with your first point. Inbound links from bad neighborhoods cannot hurt you. I know this was actually stated on Matt Cutts' blog, and really it's only common sense!


 5:44 pm on Apr 9, 2008 (gmt 0)

>> 1. Inbound links from "bad" places.

Does this really attract a penalty? Surely this only applies to outbound links.

2. Hundreds of links from one IP address.

Again, is there a penalty here? Seems to me that all that happens is that little or no 'PR' (or whatever we are dealing with) is transferred. If you have a few dozen other decent links, how would this harm your site?


 5:54 pm on Apr 9, 2008 (gmt 0)

Oliver wrote:
it is - at least in principle - possible, to generate some sort of "watermark-link:" A link to a separate page only delivered to specific IPs, which contains some information on the basis of which SEs might identify the original author of the content.

Cheers for that! Oliver, what a great idea. It shouldn't be difficult to implement, and there'd be no way for the bad guys to get past that unless they hijacked the SE's IP.

Tedster wrote:
Imagine going back some years and hearing that someone would intentionally spend money to build lots of backlinks to their competition. That would have sounded insane!

I seem to recall making comments to that effect in years gone by. Now I just sit here and tremble. [:-)]

Oliver Henniges

 8:34 pm on Apr 9, 2008 (gmt 0)

> I'm convinced that there is NOTHING that can be done to a competitors site

maherphil, despite your self-assuredness I think many people in here will disagree, and the key point is

>to mess with an established site

If I understood the analysis correctly, "establishedness" for Google is mainly a matter of PageRank. 301 and 302 hijacks only work if the scraper site has stronger PR than the scraped one.

There is a lot of money and experience in the PPC industry; those guys may easily build a PR6 or even PR7 site from scratch, "balancing a glass of beer on their nose" as Brett would say.

But I cannot. I don't have the experience, and I don't have the necessary money, contacts and ruthlessness for intensive backlink campaigns, because the limits of my market niche won't justify an open-ended investment.

I can make a comfortable living from my PR4 page, because I mainly care for my customers' needs, but I still rely heavily on new customers coming from search engines. If the PPC industry decides to grab my 3k uniques for the turnover of 3 packages of pills a day, they will ruin my income and that of my employees and my family. This is ethically perverse, but - to my knowledge - technically possible.


 6:01 pm on Apr 10, 2008 (gmt 0)


I've changed my mind. No point in writing that last post. You either believe Google when Google tells you something or you don't.


 7:56 pm on Apr 10, 2008 (gmt 0)

> I'm convinced that there is NOTHING that can be done to a competitors site

You may be convinced but I'm convinced you are wrong and here's why.

I've now found a few sites that break every rule in the Google "don't do that or you'll get banned" book, and that have pages on them where the only outbound link is to my site.

My home page was #1 forever (except for a period after Florida) for the most important 2-word term in our market. Now it is #10 on Google.com, and strong anchor text brings it up to #3 on Google.co.uk. I've spent weeks analysing other results around me, tracing their backlinks etc., and this is the only explanation I can find for why I've lost that lucrative top slot. If my page were over-optimised or under-optimised, or needed more backlinks or different anchor text in the links, I might in time have a chance of getting back - but what do I do when some very dark characters are being paid to scupper me?




 8:29 pm on Apr 10, 2008 (gmt 0)



EXACT same story here. Gobs of scraper sites linking to us, directory links we didn't request and that had NEVER been there before.

Some of the sites actually "appear" as though we did some sort of paid-link thing, which we do not.


 8:38 pm on Apr 10, 2008 (gmt 0)

I will not elaborate beyond the following:

1. The exploit is being used by large in-house corporate SEO teams.
2. The exploit involves using a SERIES of redirect chains.
3. The exploit involves DUPLICATE content in these chains.
4. The URLs used to initiate this exploit reside on powerful TRUSTED websites acquired by the blackhats and then used for this very precise purpose.

This takes big - and I do mean big - money to pull off, but it is quite impressive watching your page be destroyed by it. It took us several hundred hours to figure out what's going on; it is very complex, and if you are the target of it, you are dead.

My peers are having the same thing happen to them, we actually have started talking to each other to try to address it.
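Short of knowing the exact exploit, one defender-side sketch (my own toy illustration; the redirect map stands in for data you would assemble from crawling or log analysis) is to walk chains of redirects and flag long ones that terminate at your own pages:

```python
# Toy detector for suspicious redirect chains of the kind described
# above: several hops of 301s terminating at the target site. The
# redirect map here stands in for data gathered from crawling the
# suspect URLs or from your referrer logs.

def chain_to(target, start, redirects, max_hops=10):
    """Return the redirect chain from `start` if it reaches `target`
    within max_hops, otherwise None."""
    chain = [start]
    url = start
    for _ in range(max_hops):
        url = redirects.get(url)
        if url is None:
            return None
        chain.append(url)
        if url == target:
            return chain
    return None

redirects = {
    "trusted-a.example/x": "trusted-b.example/y",
    "trusted-b.example/y": "trusted-c.example/z",
    "trusted-c.example/z": "victim.example/page",
}
chain = chain_to("victim.example/page", "trusted-a.example/x", redirects)
print(len(chain) - 1, "hops:", " -> ".join(chain))
# A multi-hop chain ending at your page, from domains you don't
# control, is worth investigating.
```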


 9:01 pm on Apr 10, 2008 (gmt 0)

My peers are having the same thing happen to them, we actually have started talking to each other to try to address it.

There is only one way: take the fight offline. If they were doing this to my store front, I'd be round their house with a baseball bat. No rules say that you have to keep the war they are waging online, online. Right now I'm finding out the home addresses of all the directors of big company name 1, big company name 2 and big company name 3, and sorting out some alibis for myself ... ;)

The anonymity of the web has given cowards a way to seem brave. I'm sure 5ubliminal wouldn't be spouting "I'm a blackhat and I see this totally different. Sorry:)" so cheerfully if we were having this discussion in one big room ...


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved