This 47 message thread spans 2 pages.
|Penguin recovery not in the least about removing links?|
Disclaimer: This school of thought might already have been proposed in some threads; if so, please remove my post.
Assumption: My shot at Penguin recovery is based on the premise that Google can't penalize your site based on inbound links unless there is concrete evidence that you own the sites that are linking.
If the above premise is true (which I personally believe it to be), then what good is going to come from removing links to your site that were not the cause of a "penalty"?
Penguin mainly affected sites that relied heavily on keyword anchor text backlinks from sites where links can be obtained without breaking a sweat (read: bookmarking, forum signatures, blog comments and so on). In all likelihood, Google merely discounted those links, as if they didn't exist. Google has, in effect, done the "link removal" for you. Since those sites relied heavily on such links to enjoy the ranks they did, when the links were discounted in a bunch, the fall was obvious. Penalty is not the word.
Now, if those links no longer count and neither do the ranks, what is to be done? Get new links that are Penguin-proof: from sites where not just anyone can get links, with relevant and natural anchor text. It is the addition of such links that matters (IMO), not the deletion of crap links.
None of the sites I deal with have been affected by Penguin (touch wood) and I do not have the first hand experience of pulling through any site from Penguin, but I have studied a good deal of Penguin affected sites.
So... sorry for being a little confused on all this Panda update. Does this mean that inbound links from blogs, sigs etc. with keyword text are now a "no-no"?
I always thought that a site that has a considerable amount of age to it, as well as having a reputation for being white hat (good rep) in all the years it's been around, was pretty much immune to negative SEO.
Google’s position on negative SEO has been basically this:
Matt Cutts: “Because you can’t really control who links to you and how they link to you, that’s something that, being out of your control, we try to be very careful about to try to make sure it doesn’t affect your site’s reputation or hurt your site in some way.”
Even well known SEOs have defended Google’s position about negative SEO, including Rand Fishkin of SEOmoz. Fishkin issued a negative SEO challenge to take down SEOmoz or RandFishkin.com using negative SEO tactics. Fishkin issued the challenge interestingly close to but prior to the Penguin update in April. When asked in May about the challenge, Rand said, “It’s still ranking well, though!”
What most people missed with the results of SEOmoz’s negative SEO challenge is that negative SEO is a threat to smaller sites that do not have a solid link profile and reputation like SEOmoz. In fact, SEOmoz and big brands like them are largely immune to the effects of negative SEO.
But when Penguin hit in late April and sites with an unnatural link profile fell, it became clear that some of those unnatural links were beyond the site owner's control (no way to remove them at all), and that they did indeed affect the site's reputation and ranking.
And although Google has not said anything officially about it… Google’s disavow link tool proves that negative SEO is real.
|Google’s disavow link tool proves that negative SEO is real. |
As I understand it, the disavowal tool was created in response to webmaster requests to help them dig themselves out of unnatural links/Penguin penalties. If I'm wrong then please link to a statement from Google that supports another reason. Thanks!
Google already discounted questionable links prior to Penguin, especially links that trigger the “unnatural link” filter, but Penguin has taken the discounting or discrediting of links to such a new level that Google feels obligated to provide webmasters with a tool to help clean up link profiles (perhaps Google even believes they have gone too far with their campaign to discredit questionable links).
In other words, G’s algorithm is not capable of detecting whether a link is natural or unnatural in all cases, and it is dependent on webmasters to keep track of all links to their website, natural or unnatural, and to take appropriate action to either clean up their link profiles or not.
Google provides general guidelines on what links violate their Webmaster Guidelines rather than getting into specifics. Webmasters are left to interpret those guidelines as best they can so they can clean up their link profile appropriately, but many times webmasters are left wondering which links are ok and which ones need to be removed or disavowed. As you can see, interpretation of the guidelines is a tiny bit important.
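For those who do decide some links need disavowing: the file the tool accepts is plain UTF-8 text with one entry per line, either a full URL or a `domain:` directive, with `#` starting a comment. A minimal sketch of assembling one (the domains and URLs below are hypothetical examples, not recommendations):

```python
# Build the text of a disavow file in the format Google's tool accepts:
# one URL or "domain:example.com" per line; lines starting with "#"
# are comments. All domains/URLs here are hypothetical.

bad_domains = ["spammy-bookmarks.example", "paid-links.example"]
bad_urls = ["http://forum.example/thread?id=123"]

lines = ["# Links reviewed; removal requests to these sites failed"]
lines += ["domain:" + d for d in bad_domains]   # disavow whole domains
lines += bad_urls                                # disavow single URLs

disavow_txt = "\n".join(lines) + "\n"
print(disavow_txt)
```

A `domain:` line throws out every link from that domain, so it is the blunter instrument; single URLs are the scalpel.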
G has highly developed and intelligent algos that can take a set of sites known to be spammy, slightly spammy, credible or highly credible, learn the characteristics of those sites, and apply them to very similar cases the algo identifies – known as a machine learning algorithm. Panda and Penguin involve machine learning algos which are accurate in many cases but not in every case.
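As a toy illustration of that idea (this is emphatically not Google's actual algorithm), a classifier can learn an average feature vector from each set of labeled example sites and then label a new site by whichever centroid it sits closest to. The feature names and numbers here are invented:

```python
# Toy nearest-centroid classifier: learn a per-label average feature
# vector from labeled example sites, then label a new site by the
# closest centroid. Purely illustrative; features/values are made up.

def centroid(rows):
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

# Hypothetical features: (share of exact-match anchor text,
#                         share of links from comments/signatures)
training = {
    "spammy":   [(0.9, 0.8), (0.8, 0.7)],
    "credible": [(0.1, 0.05), (0.2, 0.1)],
}
centroids = {label: centroid(rows) for label, rows in training.items()}

def classify(site):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lbl: dist(site, centroids[lbl]))

print(classify((0.85, 0.75)))  # a profile resembling the spammy examples
```

Real systems use far richer features and models, but the shape is the same: generalize from known-good and known-bad examples, which is exactly why the edge cases get misjudged.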
Google has Matt Cutts's Webspam Team, or whatever they call themselves now, who make accurate interpretations and judgements about natural and unnatural links every time. The algo is designed to do the same job, except on a much larger scale and therefore without as much accuracy.
In May, after Bing Webmaster Tools came out with their disavow link tool, Google Webmaster Tools provided a way to see when new links have been discovered by their algo. From within Google Webmaster Tools, go to "Links to Your Site", click on "Download latest links", and you get a spreadsheet of links with the date Google discovered them.
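Once you have that spreadsheet, sorting by discovery date makes a sudden burst of new links easy to spot. A minimal sketch, assuming the export is a CSV with `Links` and `First discovered` columns in ISO date format (check the headers and date format against your actual export):

```python
# Sort a "latest links" CSV export by discovery date, newest first.
# Column names and date format are assumptions about the export.
import csv
from datetime import datetime
from io import StringIO

# Stand-in for the downloaded file; swap in open("latest_links.csv").
sample = StringIO(
    "Links,First discovered\n"
    "http://blog.example/a,2012-05-03\n"
    "http://forum.example/b,2012-06-10\n"
)

rows = list(csv.DictReader(sample))
rows.sort(
    key=lambda r: datetime.strptime(r["First discovered"], "%Y-%m-%d"),
    reverse=True,
)
for r in rows:
    print(r["First discovered"], r["Links"])
```

Grouping the same rows by week or month would also show whether new-link discovery spiked around a suspicious date.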
|As I understand it, the disavowal tool was created in response to webmaster requests to help them dig themselves out of unnatural links/Penguin penalties. If I'm wrong then please link to a statement from Google that supports another reason. |
My viewpoint is that the "disavow links" tool was not specifically created to "help webmasters dig themselves out". What financial benefit is there to Google by spending time and man-hours upon this?
I personally feel that the "disavow" tool is a mass-snitch report that Google can use to further punish those with "low quality" links. Now on the surface one might think this sounds great.....no more spam!?
But in reality, not every single website out there can get links from the NY Times and great linkbait pieces.
e.g.: Mike owns a small mom-and-pop factory that manufactures an electronic circuit board that controls a specific kind of CNC machine, used to manufacture products in America. Correct me if I'm wrong... but I don't see the NY Times, or any other "high powered" website, giving Mike a link back to his site for his "great content". I also don't foresee millions of people "liking" Mike's Facebook page because of the hilarious article about the various sizes and shapes of internal circuit boards for CNC machines.
This "disavow" tool in my humble opinion is nothing more than to help Google do what they have failed repeatedly to do with their algorithm: stop spam.
I have said this many times before: I personally know several spammers who are laughing their asses off at Google and their updates.....
If you're a mom-and-pop webmaster, by using the "disavow" tool, you're only serving to dig your own grave.
To those who may disagree with me, I say unto you this:
The Penguin algorithm first came into existence in April of 2012. Immediately everyone started spending thousands of dollars and thousands of hours "cleaning up links". If memory serves me correctly, there have been at least 20 revisions of "Penguin" over the past 5 months.
How many of you regained your rankings and traffic by dancing like a good little monkey as Cutts & Co. suggested you should?
I understand that some may have alternative viewpoints and opinions on secret motives behind the disavowal tool. But let's not confuse opinions with the facts. The fact is that the tool was created for dealing with unnatural link warnings. Any other motive attached to the tool should be considered opinion, suspicion, assumption, or even theory. But let's first acknowledge the fact that we do know. This is important because not doing so will lead to confusion. We don't want that, right? ;)
From Danny Sullivan's Q&A with Matt Cutts about the Disavowal Tool [searchengineland.com]:
|Question: Who should do this? |
Answer: The post [Google's announcement post [googlewebmastercentral.blogspot.com] last week] says anyone with an unnatural link warning. It also mentions anyone hit by Penguin, but I keep getting asked about this. I’m going to reiterate that if you were hit by Penguin and know or think you have bad links, you should probably use this too.
From Google's Link Disavowal Tool Announcement Post [googlewebmastercentral.blogspot.com]:
|Today we’re introducing a tool that enables you to disavow links to your site. If you’ve been notified of a manual spam action based on “unnatural links” pointing to your site, this tool can help you address the issue. |
To underscore that the tool is not for any other reason, the post goes on to say:
|If you haven’t gotten this notification, this tool generally isn’t something you need to worry about. |
|If I'm wrong then please link to a statement from Google that supports another reason. |
I am not convinced any statement from Google is relevant in this discussion. Google releases statements based on their own self-interest, like any big corporation. As webmasters, we need to look at what is actually going on, not what has been spoon-fed to us.
|The fact is that the tool was created for dealing with unnatural link warnings |
Why is this a fact? The truth is you have no idea why it was created. It is a fact that that is the reason Google gave for the tool, which is a long way from it being a fact that that is the reason it was created.
|The truth is you have no idea why it was created. |
You can quibble about whether what we perceive as blue really is blue or just an aberration of how our eyes function. You can quibble about what chicken tastes like or is supposed to taste like. Really, just about everything is fair game for quibbling.
However, what can be considered ground zero for fact is what was announced by Google and reiterated by Matt Cutts. Everything else is quibbling, suspicion, opinion, viewpoint, etc.
What I am saying is, don't confuse what was officially announced by Google with what you suspect. I'm fine with alternate motives and such. In fact, if you read the article, Matt Cutts admits there may be other uses down the line for the tool.
If those with an alternate opinion had given themselves the benefit of reading the post they would have cited this:
|We haven’t decided whether we’ll look at this data more broadly. Even if we did, we have plenty of other ways of determining bad sites, and we have plenty of other ways of assessing that sites are actually good. |
|We may do spot checks, but we’re not planning anything more broadly with this data right now. |
Come on people, I'm trying to raise the bar on this discussion and delivered to you on a silver platter the opportunity to rebut me with an official quote from Google and everyone missed it because none of you read the post I linked to. So I have to do it myself?
Ok, now that I have properly rebutted myself with an official link and citation, you now know there is a possibility that in the future Google may use the disavowal tool in other ways than has been stated to date. But that's just the possibility, a door that has been left ajar.
|Ok, so now we know there is a possibility that in the future Google may use the disavowal tool in other ways than has been stated to date. |
The quote above from your post is a good example of how people approach this type of thing differently. You see the admission from Google as proof that it might happen. I already knew that it might happen before Google told me.
I understand that you are trying to keep the discussion focused on data points that are quantifiable. But I do not think that treating Google statements as fact helps the discussion.
Any progress that is going to be made in determining whether the disavow tool will help people recover from Penguin needs to start with a clean slate, not with what Google has decided to share, claim, or release (however you want to say it).
Google's disavow tool brings up a number of other questions as well, like: will this tool become a new form of link sculpting? And if someone submits links from my site to be disavowed without contacting me first, is that going to affect my site's reputation?
Just thought I'd mention something interesting I noticed: in Webmaster Tools there is no link to be found for a Disavow Links tool for our site, not even under "Health". Mind you, we have had no messages of critical issues. And yes, we have quite a number of links coming back to our site over the years (all white hat and natural). I just think it's rather strange, albeit probably a good thing, that we don't see this tool, yet everyone else seems to.
Would it be that it only shows up for sites that are considered "questionable" as well as sites that receive that dreadful "critical issues" message?
|Just thought I'd mention something interesting I noticed; in Webmaster Tools there is no link to be found for a Disavow Links tool for our site; |
There's no link to the disavow tool in my WMT either. I'm guessing they just don't want it being used when it shouldn't be... or maybe they just haven't included a link yet. I'm just guessing.
For anyone looking, the disavow tool is here:
and I found it on the webmaster central blog:
|.. or maybe they just haven't included a link yet I'm just guessing. |
No link in mine either
Can you tell me:
(1) How many internal links of the "hotel name" and "hotel name city name" variety did you have?
(2) How many of these links did you change and/or remove at once? When did you make the next set of changes? After the first set was picked up by the SERP? Or did you just choose any day?
(3) Are you seeing improvements when there are Penguin runs or are your rankings just improving gradually? After you made your first set of changes, how long did it take to start seeing an improvement?
|Slight rant, but I digress....links are always going to be king |
That's their patented process of analyzing the WWW presence of websites, and this PageRank patent made Google what it is today. So yes, some of the link building techniques might have diluted impact, but it cannot be ignored. Completely agree with you.
- lalit kumar