Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

    
The Saboteurs of Search - Part II (one year later)
pageoneresults




msg:3679576
 2:47 pm on Jun 20, 2008 (gmt 0)

2007-07-01 - The Saboteurs of Search Part I
[webmasterworld.com...]

From the above topic...

Some search marketers question whether tactics like Google bowling even exist. Google's Webmaster Central site, designed to answer search marketers' queries, says merely, "There's almost nothing a competitor can do to harm your ranking or have your site removed from our index." But Duke, and many search marketers, take that "almost" as a concession from Google that negative SEO does occur.

I'm reading a bit of press lately that Google is becoming more vocal in its claims that there is "almost" nothing a competitor can do to harm your ranking. But my research to date, along with findings from clients, indicates that is not the case.

There is one additional part to the above that leaves a huge gap between the lines...

Your rank and your inclusion are dependent on factors under your control as a webmaster, including content choices and site design.

Okay, the second part negates the first part in the instances I'm referring to. For example, one client had a hole that allowed a saboteur to take control of their DNS without their knowledge. Over the course of who knows how long (years), the site continually declined in the SERPs to the point where closure might have been considered.

In another example, a client had a major flaw in how their URIs were structured. That flaw led to a heap of other issues which in turn brought the site to its knees. In this case, they were actually sabotaging themselves and, at the same time, third party saboteurs caught on to this and joined in the fun, which "I feel" escalated the self-sabotaging effects already at play.

The list goes on and on. While me "Tin Hat" has all sorts of dents in it, I continue to sport that puppy! It's helped me to "ting" my way to a few findings that were actually correct. I had to pat myself on the back. :)

So, it is one year later. Technology has probably made another 360 turnaround and there is a new round of techniques coming into the picture. And for many, they will never know what hit them, ever. I had someone explain a cloaked sabotage technique to me. Dude, you don't want to be the victim of cloaked sabotage; that's really where the major damage "I personally think" is being done. Many of us, including myself, won't be able to find and/or backtrack much of the really "high-tech" stuff. I'm not too certain the SEs can either, hence the term "cloaked sabotage".

Oh, and DNS Hijacking appears to be at the core of some pretty damaging campaigns.

Don't worry though, I'm sure many will come to our rescue and tell me that the above just isn't so and that it's time for me to retire the damn Tin Hat!

 

tedster




msg:3680316
 5:11 pm on Jun 21, 2008 (gmt 0)

Yes, there is still liability for search sabotage - both ranking sabotage and reputation sabotage. Google clearly works to neutralize it, but the marketplace is there and vendors continue to get more "creative" in their technology.

I hope that Google has a dedicated sub-team for researching the footprints of the latest trends. I know that many tricks that worked last year have been neutered this year.

pageoneresults




msg:3680366
 6:25 pm on Jun 21, 2008 (gmt 0)

I hope that Google has a dedicated sub-team for researching the footprints of the latest trends. I know that many tricks that worked last year have been neutered this year.

I believe much of that stuff from the past is just the "upper layer" of it all. Those are the "visible footprints".

Let me see if I can get some heated discussion going. Let's paint a picture and you tell me what the end result would be. Hey, if you are worried about what I'm going to share, then you shouldn't read any further. This is rated #*$! for SEO. Don't worry, I won't go into details but I'll throw some theories out there...

301 Cloaking
302 Cloaking
DNS Hijacking
Garbitrage

And I won't even start the discussion as to what is happening within the "domainers" space. Whew! Those guys are making money hand over fist in some instances. Big, BIG, bucks passing hands.

Tell me, for those of you who fully understand this, would the bulk of what is really happening out there today be easily detected? And what percentage of it is "truly cloaked"? I mean, I'm aware of strategies that utilize groups of throwaway domains, all via some sort of proxy, etc. I've tried backtracking some of this stuff and it takes time. And I'm literally clueless in some instances and have to bring others in to explain to me what I'm finding. But, I'm finding "some" of it.

I would imagine those who are "in tune" with their logfiles by the second have a clear picture of what might be happening out there. They of course have made the changes to combat that which they have control over. The rest of the stuff is a continual battle in finding out ways to stay ahead of whatever sabotage efforts are at play. If you read the Google Webmaster Guidelines, they clearly state this...

Your rank and your inclusion are dependent on factors under your control as a webmaster, including content choices and site design.

I'll say it again. That part of the guidelines negates this part of the guidelines...

There's almost nothing a competitor can do to harm your ranking or have your site removed from our index. If you're concerned about another site linking to yours, we suggest contacting the webmaster of the site in question. Google aggregates and organizes information published on the web; we don't control the content of these pages.

For me, the above two statements paint a very clear picture. If your platform is open to the various exploits outlined in multiple topics here and abroad, you are open to sabotage, and Google really can't do much about it other than their normal routine of finding it and developing an algorithmic solution to combat it. But at that point it may be too late; the damage has probably already been done for who knows how many thousands (or millions) of websites. The recovery periods may be far too long for many to stay afloat and they will shut their doors.

That was the intended goal of the original sabotage campaign.

Ting, ting, ting...

And, the more data Google makes available publicly, the easier it is for those performing the sabotage...

Google Trends
[webmasterworld.com...]

Receptional Andy




msg:3680373
 6:40 pm on Jun 21, 2008 (gmt 0)

If your platform is open to various exploits outlined in multiple topics here and abroad, you are open to sabotage and Google really can't do much about it

I think this is the essence of the matter: you have to protect your site. No-one else (including Google) is going to do it for you.

I think we could class most sabotage attempts as Search Engine Denial of Service (SEDOS, catchy, huh?): crowd out, or push down the opposition.

To me, it's just a variant of hacking: mostly affects high profile targets, no one is ever totally safe, there are large networks out there.

If you're concerned about another site linking to yours, we suggest contacting the webmaster of the site in question.

The wording of (and edits to) Google's guidelines have always been a subject of interest to me. The recent change: Google are getting a lot of emails from people worried about who links to them.

pageoneresults




msg:3680386
 6:56 pm on Jun 21, 2008 (gmt 0)

Ya know, it's so cool that the two of you injected some life into this topic, thank you very much. Kudos to the WebmasterWorld Admins and Mods, y'all do a bang up job!

I think this is the essence of the matter: you have to protect your site. No-one else (including Google) is going to do it for you.

There is an inherent flaw with that statement. Many site owners don't know how to protect themselves. Many application developers are releasing platforms that are open to these types of exploits. Put the two together and it's a disaster waiting to happen. Mind you, the percentages are small, but I think if you look at the effect those small percentages are having on the targeted domains, the numbers are huge, in the billions!

I think we could class most sabotage attempts as Search Engine Denial of Service (SEDOS, catchy, huh?): crowd out, or push down the opposition.

DUDE! That is an awesome Acronym for this particular discussion. In fact, I'm going to quote you and give credit to you when I use that. SEDOS < How brilliant is that? ;) First thing I thought of were my times on a SEA*DOO when they first came out. That was years ago! I know, it should be pronounced CDAS.

The wording of (and edits to) Google's guidelines have always been a subject of interest to me. The recent change: Google are getting a lot of emails from people worried about who links to them.

Me too, although I don't follow them as much as I used to, except in instances such as this. Then I'll read them from start to finish and make sure that I match up all the conflicting statements, "from my perspective" that is.

Receptional Andy




msg:3680389
 7:07 pm on Jun 21, 2008 (gmt 0)

you have to protect your site. No-one else (including Google) is going to do it for you

There is an inherent flaw with that statement. Many site owners don't know how to protect themselves

I would say 'problem' rather than 'flaw'. There may well be a flaw in Google's ability to handle organised sabotage, but the statement itself is sound :)

For me, Google is a computer system (hackable) directed by people (with strong opinions). So, it breaks a lot, and you won't always agree with it ;)

If you're an SEO in a competitive area, you have to be aware of the implications of this.

[edited by: Receptional_Andy at 7:08 pm (utc) on June 21, 2008]

tedster




msg:3680581
 3:52 am on Jun 22, 2008 (gmt 0)

Many of us understand that parasite hosting is an issue today. Once a hacker has access to a server, they can plant whatever they want on the hacked site. Today's saboteur is not often going to waste that opportunity and just visibly deface the site.

They may hide links, for instance. If those unwanted links are on your site, it can hurt your Google rankings. If they hack into another site and hide links to your site, that's an opportunity for another kind of mayhem. They might also cloak some negative text, to mess with your reputation in the search engines.

There are further evolutions in parasite hosting today that are making it harder to research, whether it's your website or another. There are hacked pages that will cookie you and then change what you see if you come back a second time. There are hacked pages that will send the average user nothing suspicious, but send cloaked content only to spiders. There are hacked pages that will do something different for a user with a search results referer, such as hijacking that traffic with a redirect. Some will only steal a percentage of your traffic - and that can be a lot less noticeable than a total hijack of all your SE traffic.

So if you're researching a suspect page, whether it's on your domain or someone else's, you should be inventive. For instance, you may want to turn off cookies, or meta-refreshes, or referers, or scripting. You may want to switch user agents. You may combine these tactics in various ways. If you see some behavior one time but you cannot reproduce it, that doesn't mean your first observation was wrong.

If you're trying to uncover cloaking by switching your user agent to that of a search engine spider, then remember that spiders do not send referers, or accept cookies, or request images. Make your footprint as close to a real spider as you can.

I've found mischief with these tactics that would not have been visible through more casual approaches.
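The spider-emulation comparison tedster describes can be sketched in a few lines. This is just an illustration: the Googlebot UA string and the 0.90 similarity threshold are assumptions, not anything the engines publish, and a real check would need per-site tuning.

```python
import difflib
import urllib.request


def spider_request(url):
    """Build a spider-like request: spider UA, no cookies, no Referer.
    (The UA string is illustrative; real spider UAs vary.)"""
    return urllib.request.Request(url, headers={
        "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                      "+http://www.google.com/bot.html)",
        "Accept": "text/html",
    })


def similarity(html_a, html_b):
    """Ratio in [0, 1]; low values mean very different content was served."""
    return difflib.SequenceMatcher(None, html_a, html_b).ratio()


def looks_cloaked(spider_html, browser_html, threshold=0.90):
    """Crude flag: did the 'spider' fetch get substantially different HTML?"""
    return similarity(spider_html, browser_html) < threshold
```

Usage would be: fetch the page once via `urllib.request.urlopen(spider_request(url))` and once with a normal browser UA (cookies and a search referer enabled), then compare the two bodies. Dynamic pages will produce some natural difference, which is why the threshold is a guess.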

Receptional Andy




msg:3680649
 8:33 am on Jun 22, 2008 (gmt 0)

Good tips, Tedster :)

It seems to me that 'defending' a site will occur naturally if you're doing thorough SEO. Perhaps the increase in 'anti SEO' just reinforces the idea that 'best practice' areas cannot be overlooked.

So, on-site issues like canonicalisation, accidental duplicate content and instant UGC are also part of the 'attack surface'. And any instances of dubious linking practices can be more easily reinforced by potential saboteurs.

I don't think it's as easy as some make out to affect another site's performance: the attractive targets are higher profile sites that can withstand a lot. But I also suspect that the overall level of sabotage attempts is still relatively low, and that most sites are protected by their low profile, rather than through deliberate effort.

jaffstar




msg:3680719
 12:25 pm on Jun 22, 2008 (gmt 0)

Just seen a weird thing happen... I discovered this happening to a friend's blog.

You google his name or any item on his blog, and results are found in the serps.

You then click any result, and it redirects to a "mega search" site after showing his URL.

At first I thought it was spyware on my PC. But it did the same thing from multiple PCs and the redirection continued.

Now for the interesting part. If you type his URL directly, no redirection. Therefore the redirection only takes place when traffic enters through Google.

The logic with this is that the owner would always use the direct URL to access the site, so would never pick this up.

Anyone heard of this? I think his version of WP is not the latest, which could have opened the exploit. But, very clever!

[edited by: tedster at 1:30 pm (utc) on June 22, 2008]
[edit reason] moved from another location [/edit]
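The referer-conditional redirect described above can be probed for mechanically: fetch the page once with no referer and once pretending to arrive from a Google search, and compare where you land. A rough sketch - the `fetch` callable and the hacked-blog simulation below are hypothetical stand-ins, not a real exploit:

```python
def referer_conditional_redirect(fetch, url):
    """'fetch' is any callable taking (url, referer) and returning the
    final URL after following redirects. Returns True when arriving
    'from Google' lands somewhere other than a direct visit does."""
    direct = fetch(url, referer=None)
    via_google = fetch(url, referer="http://www.google.com/search?q=example")
    return direct != via_google


def hacked_blog(url, referer):
    """Simulates the behaviour jaffstar observed: direct visits look fine,
    search-referred visits get bounced to a 'mega search' site."""
    if referer and "google.com/search" in referer:
        return "http://mega-search.example/"
    return url


def clean_site(url, referer):
    """A site with no referer games: everyone lands on the same URL."""
    return url
```

In practice `fetch` would wrap an HTTP client that follows redirects and also executes any meta-refresh or JavaScript redirect, since the sneakier hacks use those.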

tedster




msg:3680754
 1:33 pm on Jun 22, 2008 (gmt 0)

Ah, you caught one of those critters that I mentioned above. Yes, this kind of hack is one of the newer tricks, and Wordpress seems to be a common target for it. If a site uses Wordpress today, the webmaster really needs to stay up on the latest versions.

pageoneresults




msg:3680806
 2:47 pm on Jun 22, 2008 (gmt 0)

Today's saboteur is not often going to waste that opportunity and just visibly deface the site.

Ah, those are the Veteran Saboteurs. The ones that have been there through thick and thin and figured out the best way to do this way back when. Heck, if I'm correct in one of my theories, one particular site has been hacked for years without the owner's knowledge. And it was a slow death!

There are hacked pages that will do something different for a user with a search results referer, such as hijacking that traffic with a redirect.

Oh, that's a more common strategy if you ask me.

Some will only steal a percentage of your traffic - and that can be a lot less noticeable than a total hijack of all SE your traffic.

Those are the ones that are most damaging and the most difficult to find. It's website cancer. It starts out small and, over an extended period of time, gets progressively worse until who knows what happens. Does your website need chemotherapy?

It seems to me that 'defending' a site will occur naturally if you're doing thorough SEO.

We could also look at that another way. SEO may be the root of your challenges! Here's the conversation I see taking place with the saboteurs...

Call to Russia...

"Dude, did you notice that little regional site popping up in the top ten lately for one of our primary keyword phrases?"

"Ya, vat vould u vike me to doski?"

"Relegate them to a slow death please, throttle it so they slowly drop back into the depths of The Gorg"

So, methinks it's actually the SEO that puts you on the radar of the saboteur. And if you are thinking I'm talking out my ying yang, that's fine, just take everything you read from me with a grain of salt. ;)

So, on-site issues like canonicalisation, accidental duplicate content and instant UGC are also part of the 'attack surface'.

The above are probably among the biggest "visible" culprits.

I don't think it's as easy as some make out to affect another site's performance: the attractive targets are higher profile sites that can withstand a lot.

I don't think many of the high profile sites need to worry about this unless they have some major flaws within the foundation. It's the other way around; it's those one man shows that start to take the limelight that are at high risk. Heck, it could be the high profile site(s) that are issuing the directives to sabotage. You'll most likely never find out; the SEO Underground runs deep.

But I also suspect that the overall level of sabotage attempts is still relatively low, and that most sites are protected by their low profile, rather than through deliberate effort.

Agreed that low profile sites are probably protected by default due to their "very low" visibility. But let one of those low profile sites start to encroach on a high profile site's space. If you are in that type of industry, let the games begin.

You then click any result, and it redirects to a "mega search" site after showing his URL.

Arrrggghhh, that's a common one too.

I think his version of WP is not the latest, which could have opened the exploit.

WP is an SEO's nemesis right now. That last exploit was rather severe and I'm sure it left quite a few WP folks wondering if they chose the right platform. If I were a serious publisher, I'd take a close look at building my own damn platform. Who the heck wants to sit there with a platform and wonder when the next exploit may be coming down the pipeline? How many WP installations are left out there with that gaping hole? There's an entire bad neighborhood for the SEs to filter out.

If a site uses Wordpress today, the webmaster really needs to stay up on the latest versions.

Even staying on top of the latest versions may not be enough. You'll probably need to install some rather intense monitoring software to make sure changes are not taking place to your platform, and to your pages once served at the browser, without your knowledge.
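That kind of change monitoring can be approximated with a simple file-integrity check: hash everything under the web root, then diff snapshots over time. A minimal sketch, not real monitoring software (it won't catch changes injected at serve time, only changes to the files themselves):

```python
import hashlib
import os


def snapshot(root):
    """Map every file under root to its SHA-256 digest, keyed by
    path relative to root."""
    digests = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                rel = os.path.relpath(path, root)
                digests[rel] = hashlib.sha256(f.read()).hexdigest()
    return digests


def changed_files(before, after):
    """Paths added, removed, or modified between two snapshots."""
    return {p for p in before.keys() | after.keys()
            if before.get(p) != after.get(p)}
```

Run `snapshot()` on a schedule, compare against the stored baseline with `changed_files()`, and alert on anything you didn't change yourself - hidden-link injections into template files show up immediately this way.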

For me, commercial blogging platforms are an accident waiting to happen!

And, since we are talking about blogs, Blogger is rife with this type of stuff. Yes, Google's own properties are involved too. My Google Alerts have uncovered all sorts of stuff when backtracking references within those Alerts. There are a slew of Blogger sub-domains serving up absolute crap. I've not spent the time to fully backtrack it all; it's a bit tedious as I'm still learning. But when you have a company name being mentioned on a blog, and you visit that blog and find no immediate visible references, just some Russian text along with affiliate links to various products, something is not right. Google is seeing one thing, the visitors are seeing something totally different. Cloaking 101.

Reno




msg:3680848
 4:01 pm on Jun 22, 2008 (gmt 0)

I find this to be a fascinating discussion, probably because I know zero about any of the "tricks/techniques/hacks" that you all are referring to in the various posts.

So I have a question...

Are these hacks only detectable by a knowledgeable webmaster/siteowner?

OR,

As Google does a deep crawl, could the algorithm "suspect" that a site has been hacked with the sort of things that have been listed in this thread, because it sees many of the tell-tale signs?


Receptional Andy




msg:3680854
 4:11 pm on Jun 22, 2008 (gmt 0)

could the algorithm "suspect" that a site has been hacked

In some instances, I imagine Google would be able to detect problems. But frankly, there's a difference between 'good' hacking (which is all but impossible to detect) and 'bad' hacking - the more amateur efforts.

A lot of SEO sabotage attempts involve trying to trick Google into thinking that a site should be penalised and does not meet guidelines. That can be by directly modifying the site (through legitimate mechanisms to do so, or by finding vulnerabilities and exploiting them) and by modifying or setting up external references to a site. The goal being for this to be seen as malpractice on the site owners part, not as a result of a third party.

In some cases, it may not even matter: if your site is hacked in the 'old school' sense of the word and contains, for instance, malware, your site can be dropped anyway - even if you are the victim of a third party.

Of course, if someone else can literally gain control of your site, you have major problems any way you look at it.

pageoneresults




msg:3680861
 4:38 pm on Jun 22, 2008 (gmt 0)

Of course, if someone else can literally gain control of your site, you have major problems any way you look at it.

Oooh, a nice lead-in for some more scary stuff...

DNS Hijacking

Back in 2000, I believe it was, I had a telephone conversation with a very respected peer of mine. At the time, we were discussing DNS Hijacking. They wouldn't tell me how it was done but, I was made an example of during the phone conversation. The schmucks managed to hijack "my" DNS on a personal domain and serve their content. To this day, I still don't fully understand how they did it, and I've always had this "nagging" suspicion in the back of my mind that there is a bit more of this going on than we may realize. They did it in a matter of 10 minutes. And you know what, I was so freakin' excited that I forgot all about the hijack. I had to call them back a week later to "get out of jail".

I've really immersed myself into this and I have some highly educated people assisting too. We are building applications to alert us when the potential for this arises. I know, some of it we cannot fight, yet. And, when we figure out how to do it, I'm retiring for good!

Are you a potential victim of DNS Hijacking? Let's outline some things you may want to look for "first" before assuming that you've been jacked.

1. These days, the first thing I'm suggesting is that you mine your logfiles. Hire a professional log analyzer and have them scour those files for anomalies.

2. The next on the list would be traffic patterns. You'll need to be "totally" in tune with the day-to-day patterns of your traffic and sales.

Those two suggestions should get you started down the right track to see if there is anything unusual going on with the "underpinnings" of your website.

For example, if you were viewing your traffic and sales over an extended period of time, do you notice any anomalies in the graphs? Things such as "no sales" during specific time periods and/or "sales" during specific time periods? Does it seem as though your traffic is being throttled at certain times?

Another example, "what the heck are all those entries in the logfiles for queries that are not part of your taxonomy?" Could it be that someone was probing for technical flaws? They may have even generated a few 404s (by mistake) prior to all those 200 queries, a signal to be on the lookout for. And guess what? That long, long, long, list of queries may find its way on to a cloaked 301 page somewhere and you'll never see it coming or know what hit you.
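The logfile-mining idea above can be roughed out in a few lines: flag IPs that pile up 404s, one crude signal of someone probing for holes. This sketch assumes Apache combined log format, and the 404 threshold is an arbitrary assumption; real probing patterns are messier:

```python
import re
from collections import Counter

# Combined-log-format line, e.g.
# 203.0.113.9 - - [22/Jun/2008:16:38:01 +0000] "GET /probe?x=1 HTTP/1.0" 404 512
LOG_RE = re.compile(
    r'(\S+) \S+ \S+ \[[^\]]*\] "(?:GET|POST) (\S+) [^"]*" (\d{3})')


def probing_ips(log_lines, min_404s=3):
    """IPs that rack up at least min_404s 404 responses - a rough
    sign of someone hunting for technical flaws before the '200 queries'."""
    misses = Counter()
    for line in log_lines:
        m = LOG_RE.match(line)
        if m and m.group(3) == "404":
            misses[m.group(1)] += 1
    return {ip for ip, n in misses.items() if n >= min_404s}
```

A professional log analyzer would go much further - clustering query strings against your site's taxonomy, correlating with traffic dips - but even this catches the clumsy probes.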

There are a few around here who know what I'm on to. It's only a matter of time...

Ouch, argh, oooh, stop with the rocks already! Me Tin Hat has enough damn dents in it!

[edited by: pageoneresults at 4:55 pm (utc) on June 22, 2008]

jaffstar




msg:3680864
 4:44 pm on Jun 22, 2008 (gmt 0)

Are these hacks only detectable by a knowledgeable webmaster/siteowner?

The way these sites are hacked makes it very difficult to detect for the average Joe.

On another note, I recently reverse engineered an entire spam network. At first glance, I thought they sold the links or the spammer owned each site.

But the tactic was clever... As said above, instead of defacing the sites as a hack for glory, they embedded text links hidden from the naked eye, only detectable through the source code.

In this instance, the site owner could easily be banned by Google for hiding these links (even though they didn't).

Google's approach is guilty until proven innocent....
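Hidden links like those can sometimes be surfaced with a crude source scan. A sketch that only catches the simplest case - inline display:none styling; real injections are often better disguised (external CSS, off-screen positioning, tiny fonts):

```python
import re


def find_hidden_links(html):
    """Return hrefs found inside elements styled inline with
    display:none - the sort of injected links invisible in a
    browser but sitting right there in the source."""
    hidden_blocks = re.findall(
        r'<[^>]*style="[^"]*display\s*:\s*none[^"]*"[^>]*>(.*?)</',
        html, re.I | re.S)
    links = []
    for block in hidden_blocks:
        links += re.findall(r'href="([^"]+)"', block, re.I)
    return links
```

Regex-on-HTML is fragile; a real audit would parse the DOM and compute effective visibility. But as a quick pass over templates you didn't write yourself, it beats eyeballing.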

pageoneresults




msg:3680873
 5:05 pm on Jun 22, 2008 (gmt 0)

Oh, I almost forgot, a topic without good solid references doesn't hold the same amount of water as one with. :) This particular article should be of interest to many (and on topic) as it provides a scanning tool (Goolag) along with complete instructions for the novice hacker. And, you get to use Google to uncover the vulnerabilities of the target domain(s)...

2008-02-23 - Goolag Tool Lets Google Aid Hackers
[pcworld.com...]

Please do be very careful in how you interpret the above. The Goolag tool is not for the "average" Joe and could get you into a pickle. I do not advocate the use of Goolag. Downloading and installing Goolag is an experience in itself, it does things to your system that I've not seen before all while listening to a female version of HAL. I've since uninstalled it as my scanning software flagged it as "unsafe" and there was no need for me to have it on the system, no use for it.

Apparently though there are over 100,000 other users who may have found a use for it.

Download Goolag Scanner (> 100000 downloads to date)

You'll have to excuse my long windedness but, this topic requires a lot of whoosh...

To understand Goolag Scanner, it is important to understand how "dorks" work (see 1.4) and with that, to establish the use of dorks as an acceptable tool for information security experts, penetration testers, and practical paranoids.

Also this below referenced website is an absolute must read. The stuff you will find here is going to open your eyes to a whole new world, I promise. I'm tellin ya right now, very few around here are going to be using the types of advanced search queries you will find at the below reference. Look at all those freakin' googledorks!

Google Hacking Database
[johnny.ihackstuff.com...]

Admins, Mods, I think the above should stay as it is a solid reference point for this topic. It provides a nice outline of those things that are being scanned for and exploited. People really need to see this.

tedster




msg:3680891
 5:43 pm on Jun 22, 2008 (gmt 0)

Sure, pageone, we will allow that link just this one time. In the future we can just point to this thread instead of linking over and over. I think the message is clear: even if your commitment is to be 100% white hat (whatever that is), you still need to spend some education time on the other side of the tracks.

Google's approach is guilty until proven innocent....

Google's approach has more to do with providing the best experience for their search users, and webmasters are probably #2, behind that. Is an obviously hacked website something you want to serve to your users?

This is one of many reasons that I urge people to have a Webmaster Tools account. You don't need to use GA or XML sitemaps if you choose not to, but just by setting up an account you give Google a chance to communicate directly with you as a person authenticated to be responsible for the site. If you've always been a good guy in their book, you may well get a message in your account if they detect a hack. And if you fix it, you can be restored rather quickly.

You also get to see a lot of other information that Google already has for your site anyway. All you've given up is an email address they can associate with the site.

[edited by: tedster at 6:19 pm (utc) on June 22, 2008]

Receptional Andy




msg:3680906
 6:13 pm on Jun 22, 2008 (gmt 0)

very few around here are going to be using the types of advanced search queries

And get used to seeing the Google "sorry" page when you do ;)

Note that those resources are about using Google to find vulnerabilities, to actually hack into a site or expose resources that were not intended to be public (it is truly shocking what some organisations 'accidentally' get indexed on Google). Which, of course, is likely to be illegal.

Attempting to sabotage performance is a much greyer area. Could it be deemed illegal? I'm not sure. IANAL, of course ;)

Reno




msg:3680942
 7:20 pm on Jun 22, 2008 (gmt 0)

In some instances, I imagine Google would be able to detect problems

So I come back to a question we've asked before at WebmasterWorld -- is it to Google's benefit, to the siteowner's benefit, and to the user/visitor's benefit for Google to give a heads up when it finds some of these malicious types of coding?

For example, in GWT, under "Diagnostics", would everyone benefit if Google indicated that malware (or many other types of suspected hacks) had been detected during a crawl?

I don't see a downside to that...


pageoneresults




msg:3680980
 8:46 pm on Jun 22, 2008 (gmt 0)

I need to stay out of the DNS Hijacking stuff, tis not my forte. Since I don't fully understand it, I don't want to cause any undue alarm. I did that last time I donned my Tin Hat and discussed DNS Recursion, something way above my head. So, I'll stick to those that I have a better understanding of and allow those with the DNS experience to share what "might" be able to happen if there are certain holes open. So please, don't run off and blame your loss of traffic on a DNS Hijacking; that may well not be the case. :)

This is one of many reasons that I urge people to have a Webmaster Tools account. You don't need to use GA or XML sitemaps if you choose not to, but just by setting up an account you give Google a chance to communicate directly with you as a person authenticated to be responsible for the site. If you've always been a good guy in their book, you may well get a message in your account if they detect a hack. And if you fix it, you can be restored rather quickly.

Man, if that isn't a solid piece of advice! Who knows, Google may come out one day and say if you don't have your site "registered" with Google, we can't assist in any challenges you may be faced with. That would put a damper on things. :)

They embedded text links hidden from the naked eye, only detected through source code.

And tell me, how many small website owners are going to detect that? How many commodity hosted websites are vulnerable to someone just coming in and freely taking advantage of your website like that?

[edited by: tedster at 3:49 pm (utc) on June 23, 2008]
[edit reason] grammar fix - by poster's request [/edit]

drall




msg:3682046
 2:06 am on Jun 24, 2008 (gmt 0)

As I mentioned in a thread here [webmasterworld.com...] a few months ago, we were the target of a large scale negative SEO exploit in Google's algo, and so were many of our competitors. We brought this to Google's attention, but to date nothing has been done.

The company doing it was a publicly traded company. The bigger the site you own, the more attention it draws and our site brings stuff out of the woodwork like you literally cannot believe.

For those who think this is tinfoil hat material, well, I wish I could allow myself to take your URLs and show you just how fast you can be taken out. This would only further the knowledge of the exploit, so I hold my mouth shut with great angst and hope Google fixes the problem.

Sadly I am becoming more convinced with each passing month that the rankings we spent 8 years to achieve on these pages through all white hat methods will never recover.

301 cloaking, 302 cloaking, meta refresh and DNS Hijacking are very real things, people, and they can take out even the most powerful of websites' rankings.

Whitey




msg:3682089
 3:27 am on Jun 24, 2008 (gmt 0)

Who is Google trying to kid? Absolutely a competitor can harm another site.

If we were able to knock out our own site with offpage activity - what then ?

If our site was hacked and robots.txt applied across our site - what then ?

The list goes on and on.

Now on the flip side, surely Google's WMT is the means to communicate with webmasters to counteract that. But do they do this?

IMO it would be good for Google to monitor QA through its WMT console. And maybe, to improve the quality of that communication, webmasters could be graded for compliance and reliability so that there is meaningful support - or maybe the task is too big.

pageoneresults




msg:3682351
 1:21 pm on Jun 24, 2008 (gmt 0)

drall, I followed that topic reply by reply. And the many others that occur around here when someone mentions stuff like this. It gives me more ammunition for my research. That totally SUCKS that you are a victim. A somewhat competitive space, I would presume? Or maybe you have a section of products/services that encroached on another competitive space?

Let me share some more stuff with ya! Did you know that some of you are being watched at this very moment? Yes, your website's position is being monitored regularly by someone else. You've now appeared in the report of websites that they are tracking. They've seen that you've done everything right and are a potential threat to their existence. Before you even get to page one, if you make it that far, someone knows. There's a good chance the sabotage will occur before you even make it to page one; that way there is less of a chance you will cry "foul play" - you'll chalk it up to the algo instead, which most people do anyway. These days, I am more apt to chalk it up to sabotage before I look at the algo angle.

Google's technology, of course, has improved by leaps and bounds. Unfortunately, webmasters' technology has not kept up with it - well, some of us have kept up and/or are dealing with this. Stuff that was impervious to indexing 5 years ago is now a cakewalk for many bots, Googlebot being the smartest of them all. Schit, the damn thing submits forms now; that can't be good for the webmaster, can it?

It's great that we can all sit here and discuss this. How many webmasters are out there, though, who are not discussing this? How many Mom and Pops who used to get enough orders from the Internet to justify their existence are totally unaware of this and have fallen prey to something? They have their site hosted at some commodity shop, and maybe the server they are on has some vulnerabilities. And maybe they just don't know the very first thing to look for. They visit their site and it "looks" fine, and that is all they need to see. Little do they know that someone got in through their FTP port using a default username/password combination and inserted hidden links here and there. Nothing overboard, but just enough to do the deed. That's the amateur stuff.
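
One low-tech defense for that FTP scenario: keep a baseline of file hashes somewhere off the server and diff it against a fresh download of the site now and then. A minimal sketch, assuming you can pull the files down to a local directory:

```python
import hashlib
import os

def snapshot(root):
    """Map each file under root to its SHA-256 digest."""
    digests = {}
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            with open(path, 'rb') as fh:
                rel = os.path.relpath(path, root)
                digests[rel] = hashlib.sha256(fh.read()).hexdigest()
    return digests

def changed_files(baseline, current):
    """Files added, removed, or altered since the baseline snapshot."""
    added_or_removed = set(baseline) ^ set(current)
    altered = {f for f in baseline
               if f in current and baseline[f] != current[f]}
    return sorted(added_or_removed | altered)
```

Take a snapshot right after you publish, and any file that later shows up in changed_files() without you touching it deserves a look at its source.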

Where do I see a great opportunity here? Definitely at the Host Level. Boutique Hosting is going to become a booming market, actually it is already. Some of the larger hosting outfits may be open to all sorts of exploits by default, just because they have to install all the bells and whistles to make that environment totally automated. Well, those bells and whistles may provide the back doors to your websites. I surely wouldn't want any of my websites hosted on a server with 3,000 others - ain't no way, Jose!

"Run, Forrest, Run!"

2006-04-23 - THE - Trusted Hosting Environments
Are you at risk in your current hosting situation?
[webmasterworld.com...]

tedster




msg:3682408
 2:25 pm on Jun 24, 2008 (gmt 0)

One of the potential hosting vulnerabilities is the hacks that exploit holes in VDeck, cPanel and the like. If a host provides this kind of software but doesn't keep it patched and upgraded (which can be a nasty job for them), then the little guy with a site sitting on this big webhosting service has an exposure he didn't bargain for.

pageoneresults




msg:3682410
 2:39 pm on Jun 24, 2008 (gmt 0)

tedster, I hate to "walk" on this topic but I've come across some stuff in regards to commodity hosting that is pretty scary and I'm chock full of writing energy!

I'm willing to go as far as to say that some, just some, of the less reputable hosts are involved in the exploits themselves. I would never point a finger, but there are a few telltale signs sitting out there. One of the most common appears to be what I like to call the non-www parasite. It happens frequently in shared hosting environments, and it typically happens to the unsuspecting small guy/gal.

Another major culprit? AWStats. I'll point fingers there, as our websites get probed every day for AWStats vulnerabilities and we've never used it. Apparently there are all sorts of holes in that free log-analysis program.

Check your logfiles. Huh? Just what I thought. I would have said the same thing 10 years ago. Check what? How do I get to those? Where are they? Can I use Word to read them? Can you send them to me via email? :)
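
If you do dig into your access log, here's the sort of thing to look for. A sketch that flags AWStats probe attempts in a combined-format log - the sample lines use documentation IPs, and the probe pattern is illustrative, not exhaustive:

```python
import re

# Requests for AWStats' CGI script on a server that never installed it
# are a classic probe signature (e.g. the old awstats.pl "configdir"
# remote-execution holes).
PROBE = re.compile(r'awstats(\.pl)?|configdir=', re.I)

def probe_lines(log_lines):
    """Yield (client_ip, request line) for entries that look like probes.
    Assumes combined log format, where the client IP is the first field."""
    for line in log_lines:
        if PROBE.search(line):
            ip = line.split(' ', 1)[0]
            yield ip, line.strip()

sample_log = [
    '203.0.113.9 - - [24/Jun/2008:02:06:01 +0000] '
    '"GET /awstats/awstats.pl?configdir=|echo;echo| HTTP/1.1" 404 512',
    '198.51.100.4 - - [24/Jun/2008:02:07:14 +0000] '
    '"GET /index.html HTTP/1.1" 200 4096',
]
for ip, req in probe_lines(sample_log):
    print(ip, '->', req)
```

Point it at your real access log (one line per entry) and you'll likely be surprised how often you're being scanned for software you don't even run.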

Do you know how much information is publicly available to your visitors about your website? Why are you allowing people to view your statistics like that? You are providing the map for would-be saboteurs. You've given them a fairly good overview of your traffic: where it comes from, which pages are the most visited, all sorts of stuff. What are you thinking? Turn that access off right now, and if you can't, remove it and use GA for now. At least you can place a certain level of trust in the GA data not being made publicly available. Oh wait, didn't they just do that?

Google Trends for Websites
[webmasterworld.com...]

Reno




msg:3682603
 6:09 pm on Jun 24, 2008 (gmt 0)

Where do I see a great opportunity here? Definitely at the Host Level. Boutique Hosting is going to become a booming market

As a non-network/non-server guy, here's what I'd like to see -- a group of street-smart hack-savvy experts who formed a company which would be hired by hosting services for the explicit purpose of finding exploits in their servers, along the lines of the things being discussed here (cPanel, VDeck, AWstats, etc).

If the ISP passed the ongoing tests, the group would provide some sort of certification/verification icon which the ISP could display, and (more importantly) this company would list on their own website those hosting services that appeared safe ("safe" being a relative term).

While it's probably impossible to say with 100% certainty that any server is problem free, something like this would at least provide some level of confidence. Right now, it feels like we're all taking our chances every time we set up a new site, and THAT gets old.

.........................

netmeg




msg:3682644
 6:46 pm on Jun 24, 2008 (gmt 0)

Definitely at the Host Level. Boutique Hosting is going to become a booming market, actually it is already.

That's good to hear.

As a non-network/non-server guy, here's what I'd like to see -- a group of street-smart hack-savvy experts who formed a company which would be hired by hosting services for the explicit purpose of finding exploits in their servers, along the lines of the things being discussed here (cPanel, VDeck, AWstats, etc).

Hunh. I had a friend who started a business fairly similar to this, but in the end, he shut his doors a few years ago; he claimed the ISPs were convinced they were all totally secure with these programs. Maybe he should give it another shot.

ken_b




msg:3682654
 6:50 pm on Jun 24, 2008 (gmt 0)

As a non-network/non-server guy, here's what I'd like to see -- a group of street-smart hack-savvy experts who formed a company which would be hired by hosting services for the explicit purpose of finding exploits in their servers, along the lines of the things being discussed here (cPanel, VDeck, AWstats, etc).

Maybe make the service available to the host's customers.

I'd sure like to know if what the host says is at all related to reality, and if not, what I could do to make my site safe.
