|please do a favor to the web community. Knock out techcrunches, expedias, wikipedias, facebooks, apples, etc. |
Ah, you're forgetting - Google can make manual fixes behind closed doors for individual sites (it's 11pm, so hopefully someone else will point out some examples from over the years). Even if someone did spend the money and time it would take to do damage to such big brand names, Google could simply flip a switch from behind the curtain. If you spend hundreds of thousands on AdWords, your rankings are pretty secure, as you'll have the Goog Tech Team on the case within the hour.
I cannot get my head around this hijacking stuff. Our outbound clicks are of a 302 response.redirect nature. Can someone tell me step by step how an attacker would exploit this, so that I can understand how it works, and then tell me how to prevent it?
[edited by: Robert_Charlton at 1:05 am (utc) on April 20, 2008]
[edit reason] moved here from another source [/edit]
You can find the basics on w*kipedia.
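For anyone in the same position as the poster above: a 302 "response.redirect" click tracker is just a handler that answers with a temporary-redirect status, and the classic hardening advice of the time was to answer 301 instead and keep the redirector itself out of the index. A minimal sketch - the function name and shape are mine for illustration, not anything from this thread:

```python
def click_redirect(target_url: str):
    """Build status and headers for an outbound click redirect.

    A 302 says "temporary", which historically let search engines keep
    the redirecting URL in place of the target. Answering 301
    (permanent) and marking the redirector noindex/nofollow leaves
    crawlers far less room to misattribute the target page.
    """
    status = "301 Moved Permanently"
    headers = [
        ("Location", target_url),
        # keep the redirect script itself out of search indexes
        ("X-Robots-Tag", "noindex, nofollow"),
    ]
    return status, headers
```

Disallowing the redirect script's path in robots.txt is a common belt-and-braces addition to the header above.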
|With dup content is the originator penalized as well? Surely all google has to do is compare its cache, or creation date of the page, to see which site has the original content and not penalize that version. |
Yes, and it's happened to me. The story was: if your page was cached first, you got the credit - which is normally the case when you add original content to your own site.
I had such a thing happen on one of my sites. The page got cached, and my site was #1 for it. But the page somehow became de-cached by Google, and another site that used it [without permission] is now #1, even though my page is now cached again.
I suspect the page with the most page rank is considered the original.
|In the last few days my website started to disappear from Google's index. The only thing which happened during those days is that my forum was "attacked" by spammers - a few hundred posts with spammy links, mostly adult. |
I just want to add that a few days after I removed the spam, the site is back in the index.
konrad, in that situation the bad links were on your own website, correct? Google will penalize a website for outbound links to bad neighborhoods - they've been clear about that for a long time. A more disturbing issue is that some people report problems that seem to come from inbound links - links that the webmaster would have no control over.
I noticed in the last couple of weeks some one-off referrals coming from multiple cruddy directory submissions that look like they were simply the editor checking the link just added. It's as though someone has begun an active campaign to get me into every unrelated rubbish directory out there, including adult ones.
Now, ordinarily, I wouldn't have blinked... but a couple of weeks ago, I noticed in my logs that someone was doing detailed site: operator queries on my site, and checking the SERPs for industry/niche specific search terms... I think my site may have grown to a point where my competitors are taking notice.
How worried should I be?
[edited by: tedster at 6:31 pm (utc) on May 25, 2008]
[edit reason] moved from another location [/edit]
After reading through this thread over the last few minutes, we have found the same problem with some of our websites. We have seen traffic coming from adult sites/directories where we never placed our links, as well as URL hijacking/proxy traffic, and some IP addresses which are not search engines but which read every new piece of content posted on our website. Our content used to be indexed within 30 minutes, but now it takes more than 8 hours to get indexed and listed in the SERPs. After those 8 hours we find the majority of the articles we post duplicated word for word on other .info, .biz, etc. domains.
What can we do to prevent this? I think there is no solution yet to this problem. It is getting very scary, because sources like us are the ones getting de-listed.
As someone stated above in this thread, this can be prevented or taken care of, but only at the server level. Yet from reading through this entire topic, according to some members, even after taking the utmost care at the server level competitors can break us overnight - which is hard to believe unless proper evidence is provided.
So we can all work out some counter-tactics to fight back.
One question: I've had a big traffic drop for 3+ weeks now. My traffic dropped from 40k daily uniques to 10k uniques.
Now I've detected, through Webmaster Tools, another site that has 500+ links pointing to different topics on my website.
Could that cause my traffic drop? Oo
Too bad if that is possible -_-
smc1, it takes in-depth research on your part to determine if that caused your drop.
For us it is very clear that certain tactics can take out your pages in Google, due to well-defined technical exploits of Google's algos.
If your site doesn't have enough trust, someone can depress your traffic by 70% for as little as $200.00, for at least 6 months.
When you get into our range of traffic, you draw the attention of the Fortune 100 crowd and the in-house SEO teams they employ. The tactics they use against our biggest authority property would make the average webmaster dazed and confused in short order.
I don't blame Google - they try harder than any other engine to give every webmaster a fair shake - but as with all things, some people will take advantage.
You can only do so much though; at some point you just realize you are not playing on an even field.
|I don't blame Google - they try harder than any other engine to give every webmaster a fair shake |
I agree with your general sentiment, but realistically, isn't Google's obsession with penalties part of the problem?
Ideally, if an established site had high quality content and no blackhat tactics, and the pages within that site continued to satisfy specific queries, then they could do well in the SERPs no matter what an external party was doing. It seems to me that it's Google's own penalty structure that makes this sort of exploitation possible. I would hope they are working to correct that, but in the meantime, a lot of people are going down.
|Google's own penalty structure that makes this sort of exploitation possible. |
I wonder if the switch from ignoring bad links to penalizing bad links was based on results, or on principle. Some of the things Matt Cutts has said almost give me the impression that the war on paid links is as much about preventing others from profiting in a market that Google created as it is about the integrity of the SERPs.
This is why very methodical pre-planning on website architecture, delivery systems, redirection, and a whole new list of elements are necessities in the toolbox of today's SEO - especially where you have active input into the creation and lifecycle of development.
I agree with CainIV - SEO should always be part of the initial plan. Going back afterwards is a waste of time and can create all sorts of unnecessary hassle that could be avoided. That's for 'public' sites, obviously; secure apps have no real need for SEO work.
"You've got a point there - perhaps you overstated it a bit, but still, today's sabotage situation does seem to be a side effect of some of Google's spam prevention approaches."
As you pointed out previously in another thread, Google's Spam Team is not its SERP Team. Therein lies the root of the problem. The division is causing cracks in the system through which sabotage falls.
Because of this artificial dichotomy, the two teams fail to detect the sabotage, and right now the SERP Team is deferring to the Spam Team in ways it shouldn't.
You can imagine some of the dynamics between these two teams at the GooglePlex. Who is to say which team's idea takes precedence? Somebody's got to give in.
I think you Pub Con folks need to really talk to Google staff about this, because it's a blind spot. Being so close to the trees, I seriously doubt they can see the forest.
In my opinion there needs to be a third team besides the Spam Team and the SERP Team: a Sabotage Team.
Evil triumphs when good men do nothing. (Women, too.) This discussion needs to go beyond this site--sooner rather than later.
So the exploit is still working against you? Google hasn't patched things up yet?
The exploit is still in full effect, and to date Google has done nothing (that I or the competitors I talk to can see).
I would post examples here that are within the TOS, but I do not want to spread it any further. All the big SEO houses know about it now, though, and many are using it.
It does not matter how much PR you have, nor trust or age; this exploit can and does decimate your page in Google's SERPs if targeted, because it exploits Google's own core crawling and redirection-handling technology.
I have run it by some very brilliant people, longtime friends of mine at MIT, and it's very clear that this is a problem only Google can solve.
My hunch is that the average webmaster doesn't even know this is happening to them; they just see a huge traffic loss and can't figure out why the page dropped out of the index, because it is a very well designed exploit.
Do the sites drop out of the index, or do they just rank badly?
Is there an easy way to detect this exploit on our sites?
The site will not drop out; it is a page-specific exploit taking advantage of how Google treats redirects, and in particular chains of redirects.
To you, the exploit will mimic a Google penalty.
You will see a PR 0, not a gray bar, on the toolbar for the page.
Checking the cache for your page and seeing another website's page listed is the primary detection method, although this doesn't last long: within 30 days Google will drop it out, and there will be no Google cache for either your document or the document the blackhat has tricked Google into treating as your document.
I can't give out more than that at this point, because I'm certain people we don't want knowing about this also read here.
We have sent many detailed breakdowns to Google's security team and website feedback form. And many of my competitors (yes, it's that bad) have as well.
|I can't give out more than that at this point, because I'm certain people we don't want knowing about this also read here. |
At this point you probably have more than enough people who don't care if you post the exploit; they want to see it and make sure they are not victims of the same thing. Keeping it under your hat is not going to do the rest of the board any good. Also, it won't "force" Google into fixing it. They seem to react much quicker to stuff that goes public and viral. You detected this what, 4-6 months ago, and Google has done nothing? I think it's time to publish it so they act. That's how things get done with other software flaws.
Based on my initial impressions, it appears to be a "cache" exploit of some sort. At this point, I'd recommend placing noarchive on those pages and getting them out of the cache. That would be my first response.
What are the potential risks of Google Cache?
<meta name="robots" content="noarchive">
I'd then backtrack the cached version and see what I could find out about its owners. Probably not much but there is always that one "slim" possibility that they slipped up somewhere in the process. Apparently they did or you wouldn't see what you are seeing. ;)
I do believe that recent changes in Google caching procedures have uncovered some things that were not visible before. I can't be 100% certain on quite a bit of it as this stuff is really time consuming and tedious to backtrack. Instead of spending the time doing that, I find it more beneficial to just stop what I can at the source. Dog Zebra! Ban 75% of the Planet. Lock down your house!
[edited by: tedster at 1:17 am (utc) on Aug. 18, 2008]
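The noarchive meta tag shown above only covers HTML pages. The same directive can also be sent as an HTTP response header, which additionally covers PDFs, images, and anything else a meta tag can't go into. A hedged sketch of one way to do that server-wide - a generic WSGI wrapper of my own, not anyone's actual setup from this thread:

```python
class NoArchiveMiddleware:
    """WSGI middleware that adds X-Robots-Tag: noarchive to every
    response - equivalent to <meta name="robots" content="noarchive">,
    but applied to all content types, not just HTML."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        def sr(status, headers, exc_info=None):
            # append the directive without disturbing existing headers
            return start_response(
                status,
                list(headers) + [("X-Robots-Tag", "noarchive")],
                exc_info,
            )
        return self.app(environ, sr)
```

Wrapping an existing WSGI app is a one-liner: `app = NoArchiveMiddleware(app)`.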
I'm going to suggest that all of us run DNS reports for our domains and make sure there are no FAILs within DNS. Don't ask me where to run one - search Google and choose one. Some are subscription-based, others are free. I find the subscription-based versions like DNS Report to be more detailed in the instructions given when a FAIL or WARN is present. Plus, we have a variety of WebmasterWorld topics to reference in regards to DNS challenges.
If any of us get a FAIL in the DNS Recursion area, we can't complain any further. Oh wait, we can - but the complaints should now be directed towards your hosting provider, not Google or Yahoo! or Live. They can't do anything about your back door being wide open.
Once you've got DNS squared away - it is the very first step - you can move on to other methods of locking down your house, or what I like to call Dog Zebra, which is a Naval term (a Material Condition) that provides the greatest degree of subdivision and tightness to the ship.
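A full DNS report service checks dozens of records (SOA serials, open recursion, mismatched NS entries, and so on). As a rough first-pass self-check, you can at least confirm your name resolves at all; this sketch is illustrative only and is no substitute for a real report:

```python
import socket

def dns_resolves(hostname: str):
    """Rough self-check: does the name resolve, and to what addresses?

    This only catches the most basic FAIL - a name that doesn't
    resolve. A proper DNS report covers far more than this.
    """
    try:
        infos = socket.getaddrinfo(hostname, 80)
    except socket.gaierror:
        return {"resolves": False, "addresses": []}
    # collect the unique addresses out of the (family, ..., sockaddr) tuples
    addrs = sorted({info[4][0] for info in infos})
    return {"resolves": True, "addresses": addrs}
```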
The addition of the noarchive metadata element to your pages will most likely deter a large percentage of what "may" be happening. There is major activity within the caches of the SEs; it is not just Google. Ask Google why they made the recent changes they did to their cache. ;)
After you've done the above, now comes the really tedious task of "blocking" unwanted visitors. That is the goal. Don't leave your websites open for indexing by anything. That is where the "majority" of the challenges lie for many of us.
DOG ZEBRA < Remember...
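"Blocking unwanted visitors" in practice usually starts with IP ranges harvested from your own server logs. A sketch of the lookup side of such a blocklist - the ranges here are RFC 5737 documentation ranges standing in for real offenders, since the thread names none:

```python
import ipaddress

# Hypothetical blocklist: in practice these ranges come out of your
# own server logs. RFC 5737 documentation ranges stand in here.
BLOCKED_NETS = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def is_blocked(remote_addr: str) -> bool:
    """Return True if the visiting IP falls inside any blocked range."""
    ip = ipaddress.ip_address(remote_addr)
    return any(ip in net for net in BLOCKED_NETS)
```

The same check is commonly done in server config (deny rules) rather than application code; this just shows the logic.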
> 1. Inbound links from "bad" places.
Under rare circumstances only. Then, only if the "bad" links outweigh the good links. E.g., this is a good thing: if a site is so weak that a couple of bogus links could hurt it, then I think it is a good hoop that sites must jump through, as it keeps the suspect junk out of the SERPs and lets the legitimate sites rise. I bet Google feels the same. I doubt there have been many cases where Google has found that a "bad neighborhood" link has knocked a good page/site out of the rankings.
> 2. Hundreds of links from one IP address.
Are ignored. Simple as that. There is clearly a weighting criterion in the algo that devalues multiple links from one site to another. They do it for nav templates; they do it for IPs.
> 3. Duplicate content:
I do not believe it is an issue with Google.
> 3a) from competitors scraping your entire site and creating their own from it
The dupe site will always be penalized. I would make sure your content is snagged by Google before your competitors.
> 3b) from competitors stealing your articles and
> submitting them to article submission services
Is not an issue with Google. The first page found is almost universally marked as the original content and all dupes penalized or buried in the algo.
If you are worried about it, make sure to "brand" your content with something related only to your site. A URL, a mention of a specific person, a specific product mention, or anything else that points to your site will put off a lot of content rippers.
> 4. 301/302 redirect hijacking (after all this time it STILL happens!)
I don't believe that. We have challenged and challenged anyone to show us an example - and no one has come up with one.
|We have challenged and challenged anyone to show us an example - and no one has come up with one. |
Brett, would you take examples of what may or may not be 301/302 exploits? I'm sure there are a few following this topic who will be happy to provide you with some examples of what they feel are having a negative impact on their presence.
|And no one has come up with one. |
I think many don't want to expose either their own properties or the potential exploit. Not at the public level anyway. I've seen a few examples that leave me wondering. I'm through backtracking this stuff, it just isn't worth it anymore. At some point the trail becomes broken and/or something that I or others don't fully understand comes into the equation.
Since you have "many years of experience" in this area, maybe you can help those of us with challenges determine what may be happening with our websites?
|Can Others Hurt Your Rankings - Part zillion... |
Heh! Noticed the title change. Let me go start the zillionth and one discussion on this. ;)
[edited by: tedster at 1:19 am (utc) on Aug. 18, 2008]
I believe I have just successfully thwarted page hijacking that had been going on with my site for over 6 weeks (specifically proxy hijacking). You can read about it in this thread [webmasterworld.com].
I'll be at the Google Dance tomorrow night at the plex in Mountain View and I hope to be able to talk to one of the crawl engineers to get some answers to some of the questions that I have which are still unanswered. If anyone has a specific question you would like me to bring up, please let me know asap.
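One countermeasure that came out of the proxy-hijacking discussions: a hijacking proxy re-serves your pages by fetching them from its own IP, often relaying a Googlebot user-agent. Google's own advice for verifying a self-declared Googlebot is a reverse DNS lookup followed by a forward lookup. A sketch of that check - the function name is mine:

```python
import socket

def is_real_googlebot(remote_addr: str) -> bool:
    """Reverse-then-forward DNS check: a genuine Googlebot IP
    reverse-resolves to a googlebot.com or google.com name, and that
    name resolves back to the same IP. A proxy relaying a Googlebot
    user-agent from its own IP fails one of the two steps."""
    try:
        host = socket.gethostbyaddr(remote_addr)[0]
    except OSError:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # forward-confirm: the claimed name must map back to this IP
        return remote_addr in socket.gethostbyname_ex(host)[2]
    except OSError:
        return False
```

Requests claiming a Googlebot user-agent that fail this check can then be refused, which stops the proxy from obtaining a crawlable copy of your page.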
Well I just wanted to add something to this thread.
I'm not sure how much effect this is having, but links have started to appear to us on adult sites - pretty nasty ones as well. The links use our home page title as anchor text.
I have only found this out by seeing one or two referrer sources in the logs. Last week I tracked down 40 or so, now they are springing up everywhere.
On non adult sites we've got a similar problem. Part of our homepage copied with a link back to us with the anchor text as our home page title.
Ranking is down, but I don't know if this is the result.
We've also been victims of proxy hijacking as well.
You forgot widgetbaiting. That is a widget that funnels targeted keyword links to your site.
The competitor then waits a few weeks, then reports you to Google.
You then get heavily penalized or banned!
This needs to be fixed. No fair! No fair!
Google is very picky about widgetbait.
Well, today it appears our site was penalized; we were shuffled back 10 pages (we used to be on page 1). It seems odd, because a few months ago I got a few hate mails from our competition. I have no doubt they hired cheap offshore labourers with SEO skills to destroy us.
Why wouldn't Google just devalue a link, rather than choosing to "penalize" the site instead?
"It does not matter how much PR you have, nor trust or age; this exploit can and does decimate your page in Google's SERPs if targeted, because it exploits Google's own core crawling and redirection-handling technology."
I think at this point, considering the level of authority here at WBW, it would make sense to post it. The more people talk about it, the quicker it gets fixed.
In a quick and simple reply; Yes, others can hurt your rankings.