
Google SEO News and Discussion Forum

33 000 backlinks in 2 weeks... disavow is manual or automatic?

 5:12 am on Jun 18, 2013 (gmt 0)

I have a WordPress blog, and although I never approve any of the spam comments...

The spammers still link to the post in the hope that I will approve...

So now my IT site gets searches like "Cialis" and "Canadian Pharmacy"...

I want to ditch all 33,000 links in one shot. Now, will the disavow work automatically, or will it need human review?



 6:03 am on Jun 20, 2013 (gmt 0)

Does anybody know at all?


 8:01 am on Jun 21, 2013 (gmt 0)

You need to be more specific. Are all the 33,000 links from one domain?


 9:33 am on Jun 21, 2013 (gmt 0)

Generally speaking, you need to disavow each individual domain.

Example of file for submitting:

and so on...
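For reference, Google's documented disavow file format is plain text, one entry per line: a `domain:` prefix disavows a whole domain, a bare URL disavows a single page, and lines starting with `#` are comments. The domain names below are placeholders:

```text
# Spam domains found in the Webmaster Tools backlink report
domain:spam-example-1.com
domain:spam-example-2.net
# Individual pages can also be listed as full URLs
http://spam-example-3.org/some-page.html
```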


 9:35 am on Jun 21, 2013 (gmt 0)

Are the 33k links from one site?


 10:35 am on Jun 21, 2013 (gmt 0)

I think what the OP is asking is not how to disavow, but will the disavow work automatically or will he have to wait for human review for it to have any effect?


 11:28 am on Jun 21, 2013 (gmt 0)

It is automatic. The manual side is only required if you have a manual action imposed on your site.


 3:03 pm on Jun 21, 2013 (gmt 0)

33k from different sites, but I have the full list in Webmaster Tools, so it's no problem to know what they are.

Question is:

A. First disavow, and then if no results, ask for a review

B. Ask for a review and THEN disavow


 5:25 am on Jun 24, 2013 (gmt 0)

Did you receive a message from Google in Webmaster Tools saying that they have detected unnatural external links? If not, then you don't need to submit a reconsideration request. Just disavow, and it is automatic.
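Since the linking domains are all listed in Webmaster Tools, building the disavow file can be scripted. A minimal sketch in Python, assuming the backlink export is a simple list of URLs (the URLs below are hypothetical placeholders):

```python
# Collapse a list of backlink URLs into one "domain:" entry per linking host.
from urllib.parse import urlparse

def build_disavow(urls):
    """Return disavow-file text with one domain: line per unique host."""
    hosts = {urlparse(u).hostname for u in urls}
    return "\n".join("domain:" + h for h in sorted(h for h in hosts if h))

backlinks = [
    "http://spam-example-1.com/page1",
    "http://spam-example-1.com/page2",
    "https://spam-example-2.net/blog/",
]
print(build_disavow(backlinks))
# prints:
# domain:spam-example-1.com
# domain:spam-example-2.net
```

Note that for subdomain-heavy spam you may want to collapse hosts to the registrable domain rather than disavowing each full hostname separately.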


 9:45 am on Jul 10, 2013 (gmt 0)

Thanks morpheus83

Nope, didn't get any message from Google, so will disavow then.


 10:23 am on Jul 10, 2013 (gmt 0)

So what are they linking to? A page that doesn't exist? Then I would hope a 410 response would be useful.


 6:39 pm on Jul 10, 2013 (gmt 0)

This kind of hack caused me a lot of trouble, since I got about 1 million links to about 55k pharma pages created on my hacked blog.

Disavow all those domains, serve those pharma pages with HTTP status code 410, and do not file any reconsideration request. Check GWT crawl errors to see which pages to serve with 410.
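On Apache, serving 410 Gone for the deleted spam pages takes only a couple of lines in .htaccess; the URL patterns here are hypothetical and would need to match whatever paths the hack actually created:

```apache
# Return 410 Gone for an entire directory of deleted spam pages
RewriteEngine On
RewriteRule ^pharma-pages/ - [G,L]

# Or mark individual URLs as gone with mod_alias
Redirect gone /old-spam-page.html
```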

Robert Charlton

 7:20 pm on Jul 10, 2013 (gmt 0)

The question of what they are linking to is potentially important. Conceivably, if they've added something like pharma pages, the pages may be cloaked, and you will need to fetch as Googlebot to see them.

If you're hacked, you should of course get rid of the target pages. I would certainly also disavow after removing the pages, and I'd send Google a reconsideration request, whether or not I'd received a notice, just as a matter of course to get it on the "record". I wouldn't necessarily expect any change, but I'd want Google to note that these weren't my pages.

If the site is clean (i.e., no hacked pages), try some of the URLs to make sure you are getting 404s.

[edited by: Robert_Charlton at 7:35 pm (utc) on Jul 10, 2013]


 7:32 pm on Jul 10, 2013 (gmt 0)

I don't think the original poster was hacked.

I think that what happened is:

1) Spammers TRIED (but failed) to post a bunch of links in the COMMENTS section of one of his LEGITIMATE WordPress posts.

2) Even though their spam comments did NOT get published on his blog, those same spammers started linking to that LEGITIMATE page on his blog, just in case the spam comments did get posted.

So now he has 33,000 low-quality links pointing to a LEGITIMATE page on his site where spammers thought they could drop links.


If this were the case, I would do a disavow AND I would do a reconsideration request, even if there is no manual penalty.

I would spell out quite clearly what is happening to your site.

Google says they LOOK at every reconsideration request. This should at least help them figure out how to improve their Penguin algo so that innocent webmasters such as yourself don't get nailed.

Robert Charlton

 7:42 pm on Jul 10, 2013 (gmt 0)

Planet13 - I was editing as you were posting. Your assessment of the situation, that these links are targeting unapproved comments, makes a lot of sense.

Yes, I would do the reconsideration request here in any event, for reasons you describe.


 3:18 am on Jul 11, 2013 (gmt 0)


That is EXACTLY what I was asking, thanks so much for understanding!

This is such a dangerous problem nowadays that I recommend turning off commenting altogether.

NOBODY is safe from this kind of attack now.

This is a new spam tactic that nobody can control, thanks to Google's new policy of punishing webmasters for actions that are totally out of their control.


 5:30 am on Jul 11, 2013 (gmt 0)

This exact thing happened to us en masse: over 100K bad links targeting blog entries etc. over the course of several months. We have tried the disavow tool. Doesn't matter... we've been hammered by Penguin updates over the last year. Maybe we should submit a reconsideration request?
The bad links keep pouring in (presumably they have set a whole army of bots linking to us) and are diluting our link profile, I am sure.
So depressing to feel that our rankings have gotten hammered because of stuff that is totally out of our control.
Negative SEO is very, very real.


 7:32 pm on Jul 11, 2013 (gmt 0)

@ kanetrain:

I hope you don't mind me making a few suggestions.

First, do make sure that you don't have any spam links or hacked pages on your site.

Then do submit a reconsideration request. Can you maybe get your logs and show how the bots first tried to spam your site, and then show how the links were being built to those pages that they thought / hoped they had successfully spammed?

I hope this helps.


 9:04 pm on Jul 11, 2013 (gmt 0)

I too have had a similar problem.

I redirected a forum that I owned to my main site, as I no longer needed it, and because it only got a few hits a day I thought it would be fine.
The forum had gotten a LOT of spam (like a million spam posts) over a few years (the forum was abandoned). I deleted the DB and thought that was that.

Unfortunately, on 6 June my main site got hit, and I found out I had 900,000 backlinks pointing to the forum. As I had redirected the forum to my main domain, this is why I got hit.

I disavowed, and redirected the forum to another domain with a click-to-access link.


 11:57 pm on Jul 11, 2013 (gmt 0)

We had both: spam comments, and then spam external links to good blog entries. Then we had a hack where they posted 40,000 spam blog entries (hidden) and then generated tens of thousands of spam external links to those spam blog entries without us knowing or realizing it. Once we saw them we cleaned it up, deleted and 410'd the spam blog entries, but the damage was done.

@ Planet13 - We never got the "manual penalty" email but were hit by algorithmic penalties (Penguin and Panda) soon after. Are you saying we should submit a reconsideration request even if we didn't get a manual penalty notice?
I think we still have the logs saved. The backlink profile speaks for itself too. We have a craft site with an audience of mostly women... and then our backlink profile is littered with tens of thousands of backlinks with anchor text of "viagra", "seo software", "#*$! site", etc.
We used the disavow tool for as many as we could track down.
I will certainly submit a reconsideration request if experts here think it might actually get looked at.


 12:14 am on Jul 12, 2013 (gmt 0)

@ kanetrain

Are you saying we should submit a reconsideration request even if we didn't get a manual penalty notice?

If everything else you are doing on your site is legitimate, I would say yes, go ahead and submit a reconsideration request.

First, even though you might not have received any messages about a manual penalty, the only way to be CERTAIN that you don't have a penalty is to submit a reconsideration request.

Second, since Google at least READS every reconsideration request, I think it is important to let at least SOMEONE at Google know that spammers are trying to build links to legitimate blog pages - probably as an effort to keep their main money sites from being hit by Penguin.

Third, of the Penguin recoveries I have heard of (and there seem to be only a few posted), the recoveries happened either when there was a Penguin refresh (i.e., people who were hit by Penguin 1 recovered when Penguin 2 was launched), OR when the site filed a reconsideration request, EVEN THOUGH IT DIDN'T HAVE A MANUAL PENALTY.

So, in short, there are a few people who were hit by Penguin only and who claim to have recovered by submitting a reconsideration request (and NOT by having to wait around for a Penguin refresh).

Oddly enough, I read one account of a Penguin recovery via link removals / the disavow tool, but the recovery happened with a PANDA update, not a PENGUIN refresh.

So if your site is NOT doing anything wrong:

There is no harm in filing a reconsideration request.
There is no benefit to NOT filing a reconsideration request.



One more thing: I filed a reconsideration request myself on July 6th, and I was just notified today (July 11th) that there was no manual spam penalty applied to my site. So maybe now is as good a time as any to file a reconsideration request.

I will check back in as soon as I know whether filing that reconsideration request actually helps ME recover from Penguin 2.0. (I lost about 35% of my impressions, clicks, and traffic due to Penguin 2.0 - oddly enough, my ecommerce revenue is UP despite having 35% fewer visitors.)


 12:49 am on Jul 12, 2013 (gmt 0)

@ Planet -
Similar situation here. We got hit partially by one of the later Panda updates, but not enough to really hurt ecommerce sales too badly. Penguin 2.0 was not kind, though. Wish I could say sales were up.
Thank you so much for sharing your information and story. We certainly don't do anything wrong or black hat, but Google honestly keeps changing the "rules", so you never know. There are things that Matt Cutts advised were OK or even encouraged 7 years ago (focus on relevant anchor text backlinks in good content) that are now "against the rules" if you were too good at them, and those same techniques can cause over-optimization penalties if you did "too much". Been at this for 13 years, and the site is even older, so what was once encouraged and as white hat as can be might now look like gray hat.

Robert Charlton

 2:02 am on Jul 12, 2013 (gmt 0)

I lost about 35% of my impressions, clicks, and traffic due to Penguin 2.0 - oddly enough, my ecommerce revenue is UP despite having 35% less visitors.

Planet13 - Although what you're experiencing could be coincidental, it reminds me of the kind of traffic shaping that Shaddows first discussed back in Oct 2010....

Google & Traffic Shaping - a hidden method to the quality madness?
http://www.webmasterworld.com/google/4222996.htm [webmasterworld.com]

...Then the biggy. 12th October, huge referral shift. Traffic-neutral, but conversions back at pre-Sept level. In other words, we are now 20% up on sales. The referrals are NOT the same (or even particularly similar) to the pre-September level

Robert Charlton

 2:43 am on Jul 12, 2013 (gmt 0)

...Google honestly keeps changing the "rules" so you never know. There are things that Matt Cutts advised that were OK or even enouraged 7 years ago (focus on relevant anchor text backlinks in good content), that now are "against the rules"

kanetrain - Are you talking about internal nav text or external inbounds?

Matt discouraged artificial anchor text inflation and paid backlinks from the beginning. So did Google's guidelines. I fault Google for not cracking down on it sooner. I suspect that Penguin didn't come earlier because of a concern about false positives, which is perhaps what created the algorithmic "loophole" that many tried to capitalize on. It sounds like you came to have a lot of anchor text issues that eventually did hurt you, and I'm sorry about that.

I've spent a good part of my efforts over the years keeping clients from shooting themselves in the foot, and I've often had to argue about it. It amazes me how often webmasters chose not to believe that Google meant what it said in its guidelines. Possibly they didn't read them, but they did manage to learn about the loopholes.

Matt wrote this post in 2005, but posted in 2008 (he explains in the article)...

SEO Advice: Getting Links
Posted March 11, 2008 in Google/SEO, Leftovers
http://www.mattcutts.com/blog/seo-advice-getting-links/ [mattcutts.com]

...here are some ways to get high-quality links without emailing, paying, or even paying attention to search engines...

I suppose it's difficult to tell people not to pay attention to search engines, or at least not in ways that are too obvious, but I don't think that Google is penalizing for things that they once encouraged.


 1:35 am on Jul 28, 2013 (gmt 0)

Seems like the spam linking is actually helping my site now...



 5:13 am on Jul 28, 2013 (gmt 0)

Seems like the spam linking is actually helping my site now...

WordPress sites are a target of webspam in part because, if you approve a comment once, you give the person making the comment permission to post future comments without being moderated, so long as they don't post more than one link in future comments (default WordPress settings).

Thus they try to sneak in one real comment and come back later with their spam tools to post lots of comments on old posts, presumably to boost their 'tiered link networks', which is another fad of the moment. A fad that Matt Cutts has said will be dealt with this summer.

I suspect that if Google successfully banishes the 'main' sites that employ tiered link schemes, the pressure on WordPress comment systems might lighten up a little. Personally, I don't allow WordPress comments, and I send automated requests for the comment file straight to Project Honey Pot. I prefer third-party JavaScript solutions when comments are needed.

I also recommend you moderate all comments before approval and/or change the default setting to allow no links without moderation.


 4:02 pm on Jul 28, 2013 (gmt 0)

I actually have learned to like the comments facility of WordPress.

There is a plugin (don't know if I am allowed to mention it by name), but basically, it will require a captcha if certain criteria are met (like JavaScript being turned off, or cookies not being allowed).

If they fail the captcha within a certain time frame, the comment is moved to the trash.

I combine this with a lot of words in the spam filtering system as well.

I like it a lot. The nice thing is that the captcha is ONLY required when there is no JavaScript or cookies are not accepted, which is pretty common for bots.


 5:30 pm on Jul 28, 2013 (gmt 0)

Would someone explain to me what benefit spammers get from linking TO your content thousands of times?


 7:06 pm on Jul 28, 2013 (gmt 0)

Would someone explain to me what benefit spammers get from linking TO your content thousands of times?


First, what they TRY to do is post some spam comments on your blog that link to their money-making site.

Then they make thousands of spam links to the page on your site that they THINK has a link to their money-making site.

In the case of the original poster, his spam catcher caught the comments they were originally posting. So basically, they are building thousands of links to a page that DOESN'T link to their money site after all.


 7:10 pm on Jul 28, 2013 (gmt 0)

Actually, I think they get that first comment on your site pointing to another site, and that site points to their main site. This way, if your page gets penalized, they break the chain and protect their main site. You get all of the ranking problems, and the chain no longer reaches their main site, so theoretically they don't lose rank.

It takes a lot of energy to be lazy! Thankfully, Matt Cutts suggested that these 'tiered' linking methods will meet their demise later this summer, in an update he expects all SEOs to talk about.

I wish I could see the next generation of link evaluation technology in action at Google; it would be quite something to see. SEOs used to be able to know what a 'good' link was when it was based on two pages, but now that links are evaluated in a systemic manner, it becomes virtually impossible for SEOs to get 'the big picture'.
