| 7:40 pm on Jun 4, 2012 (gmt 0)|
Isn't this something that every web marketer would actually want?
|the most obvious SEO anchor text |
Regarding removal, I can only say that Google has scored here, as some of the links being removed are ones Google could never have classified as paid or manipulated. Penguin got webmasters to do Google's job.
In the end, if I promote red widgets on my example.com site, and I also run a blog at widgets.com where I post articles about widgets and link "red widget" back to my site, is this spam?
What I'm doing is posting articles to promote a product, and linking to the site that sells that product. Same as when you see an ad on TV. What's the difference? A clear disclosure about possible/certain bias?
| 8:22 pm on Jun 4, 2012 (gmt 0)|
|In my SERP I was the ONLY one offering something different and I got hit |
Same with me, @Jez123. Unfortunately, I think that all the “machine learning” and statistical comparisons between sites in the same niche may well cause Google to throw out any sites that seem to be outliers, and to only promote sites that conform to the norm. That’s a recipe for boring search results, and a disaster for those trying to offer something different.
| 9:16 pm on Jun 4, 2012 (gmt 0)|
About the false positive problem and Penguin, Google has made a report form available for anyone who feels they are in that boat:
| 9:33 pm on Jun 4, 2012 (gmt 0)|
I think the reality of it is that there are not that many false positives. Google could come out with an update next week that demotes all websites that use a green page background. Sites with a green background would get demoted; that's not a false positive, it just is what it is. Speaking for myself, we did build links to get better placement on Google. At the end of the day we did what we had to do, and Google did what they had to do. I don't believe that Google is better for it, but my stock price is not sitting at $578/share, so who am I to say.
| 9:59 pm on Jun 4, 2012 (gmt 0)|
|I don't believe that google is better for it |
The search results, that is. I think Google the business is right on the money.
| 12:32 am on Jun 5, 2012 (gmt 0)|
The troubling factor in all this is that usability suffers through non-use of anchor text. I was reading an article today where CLICK HERE replaced the anchor text. It was an obvious SEO article (pretty poor, to be honest), but the keyword anchor text had been removed from the link.
| 2:46 am on Jun 5, 2012 (gmt 0)|
< moved from another location >
Some of my sites were affected and some not... all sites used the same article marketing strategies: not blatant spam, but inadvertently using too much anchor text...
The sites with a PR3 survived; the others didn't.
My question is this.
For the PR3 sites that were unaffected, should I go back and fix the excessive anchor text, or just leave it as is and build more diverse anchor text in the future?
I could fix this, but I'm afraid it might raise a red flag on the sites that were unaffected...
[edited by: Robert_Charlton at 4:16 am (utc) on Jun 5, 2012]
| 6:09 pm on Jun 5, 2012 (gmt 0)|
Although I'm not entirely positive we were affected by Penguin, we did see about a 20% drop in traffic from Google since their update; conversely, we got a 30% gain from Bing. This tells me that more people are going to Bing to find what they are looking for... at least in our realm of business. (FWIW, we are a PR4 site that's been around for nearly 15 years.)
Personally, I'm using Bing more and more because I'm having problems getting relevant results. I get tired of searching for answers and only coming up with ehow and sites like it, which only list more questions and rarely answer my question to begin with.
| 5:05 am on Jun 6, 2012 (gmt 0)|
How are we defining false positives? And for that matter, false negatives?
It's been confirmed by a skeptical third party with SEO experience that my site was a false positive: it's clear to a human that I'm writing (however badly) for visitors, but a couple of things I did (like including a LOT of editorial links in some pages) could look to an algo like an aggressive SEO tactic.
The very fact that Google created a special Penguin reinclusion form, after not doing that in any other update I can remember, suggests they were expecting more false positives this time than in past updates. It may be that what Penguin's meant to do is right on the edge of being too ambitious: you need context to distinguish SEO from plain ol' marketing, or to tell whether something that technically looks spammy to a bot is actually just what searchers are looking for in a particular case. Those are just two examples.
| 6:54 am on Jun 6, 2012 (gmt 0)|
|after not doing that in any other update I can remember |
This is the first time I remember a Google Doc form. However a special email address or subject line has been offered for major updates for many years. In fact, years ago GoogleGuy would often offer that kind of avenue specifically for WebmasterWorld members. Even Panda offered a dedicated thread on Google's own forums.
I think this form input is simply a step toward giving the webmaster input more structure so that the information they are gathering can scale more easily.
| 7:57 am on Jun 6, 2012 (gmt 0)|
|The very fact that Google created a special Penguin reinclusion form |
It isn't a reinclusion form. Its description is "Feedback on our recent algorithm update ("Penguin")".
|the information they are gathering |
They are asking for feedback, not offering to take any action.
If I think a page has been wrongly penalised and someone at Google looks at it and agrees, how likely is it that they will manually overrule Penguin for that page?
Has anyone here reported a false positive? If so, what happened?
| 10:03 am on Jun 6, 2012 (gmt 0)|
The "fact" that Penguin requires intensive computation, coupled with other things I'm seeing, leads me to the hypothesis that some form of ratio is involved. If you are at, or just past, the tipping point, then working to correct on-page/within-site over-optimisation OR backlinks/backlink anchor text should have a positive effect.
If you are in this situation, as I believe I am, then as I've alluded to earlier, it may be best to focus on your own site first, then sort out any slightly dodgy backlinks or anchor text.
I'm worried that hard won backlinks with KW anchor text may not be the thing tipping me over and once I get rid of them I might do irreparable damage.
If you are badly affected there may be no alternative but to aggressively do both on site and backlink anchor reduction. In this situation I'd like to bet you already know why you've been hit and can relatively easily decide what you are going to do.
I've found it most telling to analyse competitor sites that have been affected either positively or negatively. It is easier to understand why those that have been hit got hit. Those that have benefited are harder to understand as some are similar to sites that were hard hit.
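Nobody outside Google knows whether Penguin actually uses anchor-text ratios, so treat the tipping-point idea as a hypothesis. Still, you can measure your own profile before deciding what to prune. The sketch below is purely illustrative (the data, the `money_terms` set, and any "safe" share are assumptions, not known thresholds); it just computes what fraction of a backlink list uses exact-match keyword anchors:

```python
from collections import Counter

def anchor_profile(backlinks, money_terms):
    """Summarize the anchor-text distribution of a backlink list.

    backlinks  : list of (anchor_text, source_url) tuples, e.g. assembled
                 by hand from a WMT links export.
    money_terms: set of exact-match keyword phrases you target.
    Returns total link count, the share of links whose anchor is an
    exact money term, and the most common anchors.
    """
    counts = Counter(anchor.strip().lower() for anchor, _ in backlinks)
    total = sum(counts.values())
    exact = sum(n for anchor, n in counts.items() if anchor in money_terms)
    return {
        "total_links": total,
        "exact_match_share": exact / total if total else 0.0,
        "top_anchors": counts.most_common(5),
    }

# Hypothetical sample data for illustration only
profile = anchor_profile(
    [("red widgets", "blog1.example"),
     ("red widgets", "blog2.example"),
     ("example.com", "dir.example"),
     ("click here", "forum.example")],
    money_terms={"red widgets"},
)
print(profile["exact_match_share"])  # 0.5
```

A skewed distribution (most links sharing one commercial phrase) is at least a number you can track over time while diversifying, rather than guessing.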
| 1:27 pm on Jun 6, 2012 (gmt 0)|
I finally heard back on my reconsideration request. It was a canned response saying that my site violates Google's quality guidelines.
They said to look for unnatural links. They also said to let them know if there are links that I can't get removed.
At least I know now, that they still see "unnatural links" pointing to my site. Since I have been constantly getting more removed, I filed a new reconsideration and listed the ones I haven't been able to get removed yet, like they asked.
Has anyone been through this process before and gotten your rankings back? (pre or post penguin?)
The weird thing is that I never got any of the warnings that other people got earlier this year about unnatural links.
| 1:46 pm on Jun 6, 2012 (gmt 0)|
Here's a link to some coverage of a talk Matt Cutts gave about Penguin and penalties vs algorithms yesterday at SMX. I'd advise anyone who thinks they are affected to read it. And everyone else too. It may or may not shed some light for you.
| 1:47 pm on Jun 6, 2012 (gmt 0)|
|Has anyone been through this process before and gotten your rankings back? (pre or post penguin?) |
Matt Cutts touches a bit upon what to do in this interview:
He alluded to some sites removing 90% of their inbound spam links and scanning emails or letters that webmasters have sent requesting link removals.
He also implied that one reason they are punishing link sellers (and possibly buyers) instead of just ignoring link spam is because they want to stop people being hurt by wasting their money buying links.
| 3:50 pm on Jun 6, 2012 (gmt 0)|
"He also mentioned — and this will be good news to many search marketers — that Google is considering offering a tool that allows web masters to disavow certain links, but that may be months away if it happens."
- that would be nice if you are going to penalize us for something beyond our control :)
| 4:08 pm on Jun 6, 2012 (gmt 0)|
|they are punishing link sellers (and possibly buyers) |
That may be what they think they are doing. I am neither seller nor buyer, and I know my site provides quality content from the amount of time I have to spend pursuing copyright infringers. Similarly, having returned first-page results for a high number of sector-specific terms for the best part of a decade, my site has acquired a lot of backlinks (mostly - because I have been careful to ensure titles, descriptions, urls, etc, accurately reflect content - using page topics as anchor-text). If staying in Google listings means scrapping that way of working, then I, for one, will not recover from Penguin.
One might forgive the odd mistake if the results were now filled with better quality, higher relevance, or less spam. They are not. Penguin doesn't seem to be able to tell a genuine link from a stuffed aubergine, or spot the difference between good content and no content at all.
| 4:11 pm on Jun 6, 2012 (gmt 0)|
So, reading this: if you got the WMT inorganic links message, you get a way to come back. If you didn't, you get no clues at all? Does that mean that ONLY inorganic links were involved if you got that message? If so, does it follow that you may have inorganic links and other factors involved if you didn't?
It seems unfair that the links people get warned, while others just get wiped out with no clues.
How many have now recovered from Penguin? I have only heard of a few.
Could we have a summary please? What do we actually know so far?
| 4:14 pm on Jun 6, 2012 (gmt 0)|
Would high quality article submissions, with a plain URL link in the footer (no anchor text), be considered an unnatural link by Google?
This is something we still do, but it is quality content written by a licensed professional in the field.
We also recently submitted a press release to PRWEB (no free PR sites), with a plain URL link and no anchor text. I'm wondering if that is frowned upon as well?
These are things that make sense to me for marketing purposes regardless of SEO, but I am worried Google might look at them as an attempt to manipulate their results now.
| 4:48 pm on Jun 6, 2012 (gmt 0)|
|So, reading this. If you got the WMT inorganic links message you get a way to come back. If you didn't you get no clues at all? |
If you get the inorganic link message, then a manual action is about to be imposed / is being imposed (from what I can tell).
If you don't get the message, or if you file a reconsideration request and you are told that no manual action was taken, then you are being affected by one of the algo changes (Panda, Penguin, or another).
|Would high quality article submissions, with a plain URL link in the footer (no anchor text), be considered an unnatural link by Google? |
I would assume yes, it would be considered inorganic.
I am assuming that ANY link that you can control placement of, whether paid for or not, would be considered inorganic.
Will there be a manual action? Who knows. Will it be "punished" by Panda/Penguin? Again, who knows?
Just my two cents: If it were high quality, I would rather have it on my OWN site than on someone else's site. Maybe I am drinking the "just have great content" Kool Aid though.
Also, I think it is important to note that Matt Cutts said that they are not using the term "penalty" any more, but instead are using the words "manual action."
I DON'T think this change was made because there were no more penalties being applied.
I think this might be so as to clarify whether a "penalty" was initiated by the algo, or whether it was initiated by a manual action.
To me, the fact that they are using the words "manual action" instead of "penalty" means that Google employees themselves are confused about what the algos are capable of / responsible for doing, and need some clarification.
| 7:04 pm on Jun 6, 2012 (gmt 0)|
|Also, I think it is important to note that Matt cutts said that they are not using the term "penalty" any more, but instead are using the words "manual action." |
I got quite a kick out of what Cutts said. Maybe the word "spam" isn't used because, guess what, it isn't spam. Maybe the more accurate definition would be "things we decided we didn't like this week", and your site happened to be one of them this update.
| 7:59 pm on Jun 6, 2012 (gmt 0)|
|I finally heard back on my reconsideration request. It was a canned response saying that my site violates Google's quality guidelines. |
seoArt, how long did it take for you to get a response?
I regret the way I did my request. I wish I had provided a link to a text document with more detail. I wonder if I should resubmit, or just wait it out. It has been 4 days since I submitted.
[edited by: crobb305 at 8:07 pm (utc) on Jun 6, 2012]
| 8:06 pm on Jun 6, 2012 (gmt 0)|
|“things we decided we didn’t like this week” |
Panda and Penguin in Google SERPS introduce negative factors to the overall calculation of relevance.
Instead of returning sites with the highest positive relevance scores (by whatever criteria these are measured) in top positions, things that previously simply didn't count towards a positive score are now given a negative value.
Also, instead of adding to a positive score on an ad hoc basis (i.e. whenever new elements entered the index), Panda and Penguin actively seek out factors that add to the negative score. That, in my view, has all the hallmarks of a Witch Hunt. I don't know of a Witch Hunt that has ended well.
Saying that a negative factor isn't a penalty is denial by semantic inexactitude.
| 8:34 pm on Jun 6, 2012 (gmt 0)|
|Maybe the more accurate definition would be “things we decided we didn’t like this week” and your site happened to be one of them update. |
Unfortunately, that's pretty much exactly what he's saying here, IMO:
|Matt: We look at it something designed to tackle low-quality content. It started out with Panda, and then we noticed that there was still a lot of spam and Penguin was designed to tackle that. |
In that snippet, he uses "spam" and "low quality content" interchangeably, which means he sees them as the same thing. I thought spam meant "stuff that exists only to cash in on high Google rankings." But now they're including content that may exist for another purpose - such as, promoting one's business so one can sell things - but just happens to rank well.
They seem to be losing what grasp they had on the fact that all promotion is by definition an attempt to manipulate your traffic. Getting inbounds was valuable long before Google, because there was basically no other way to get seen by the 200 people who were online, LOL.
I don't think we should give up legitimate means of promoting TRAFFIC just because those means also happen to offend Google. Because there is a possibility Google actually WANTS us to stop promoting our sites so that their control over the web is complete.
| 8:35 pm on Jun 6, 2012 (gmt 0)|
Trying to continue business as usual by adding unique fresh content each day, but to no avail. I even did a reconsideration request, and they informed me there has been no manual spam penalty.
Losing my will to work further on this site, and considering going for a brand new site if things don't change soon. More than 500 original articles of content. Traffic used to be around 1500 uniques prior to Penguin; now it barely passes the 100 mark.
Two years of hard work down the drain.
| 10:43 pm on Jun 6, 2012 (gmt 0)|
@pelizden, I sympathize... we were at 10,000 uniques a day for 6 years and then BAM... down to 2500 now.
| 12:14 am on Jun 7, 2012 (gmt 0)|
@pelizden, you said "Losing my will to work further on this site, and considering to go for a brand new site if things don't change soon."
The question is whether even working on a new site will make a difference. Perhaps the new search world will always be a tough one to navigate if you're a small business owner. Google is showing great preference for certain sites, larger and more authoritative sites -- sites that have more to offer. I'm not convinced that a new site would work out any better unless you're going to add something to the table.
| 9:11 am on Jun 7, 2012 (gmt 0)|
|DS: What about tweets earlier today about using bounce rate? You don’t look at how quickly someone bounces from a search result and back to Google? |
MC: Webspam doesn’t use Google Analytics. I asked again before this conference and was told, No, Google does not use analytics in its rankings.
From the Matt Cutts / Danny Sullivan interview
How does this fit with Google supposedly being able to judge great content?
| 9:28 am on Jun 7, 2012 (gmt 0)|
We were on 30,000 uniques a day slapped down to 12,000 - ouch.
We found some really dodgy links in WMT that we think killed us.
We build websites for a niche business sector, and when we first kicked off the business in 2004 we linked from our customers' websites back to our main website in a noscript tag. Not just one link, though: about 70. Very dodgy, I know, but as our business grew and the SEO world evolved, we had our web designer backtrack the dodgy links and take them off all our customer websites, as we realised you just couldn't get away with that anymore. We have adopted a very clean SEO method ever since.
The backtrack was in about 2007, but guess what: our designer missed links from about 12 websites (out of about 350), and the pages those links were on are dynamic, so they have been multiplying ever since.
I found about 31,000 of these links in WMT, all of which were deleted on 27th April but are still sitting in WMT.
The WMT refresh could take months and I will update here when the links are gone, but it just goes to show that even some old sneaky stuff that worked years ago can come back to bite you in the a$$.
The confusing thing is that we were fine through Panda and have never had a links warning from Google.
I have been a bit of a watcher on forums since Penguin, not being able to talk about it due to deep depression and various bouts of kicking myself around the room.
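A situation like the one above (a template link multiplying across a handful of dynamic sites until it dominates the backlink profile) is easy to spot by grouping a WMT links export by linking domain. A minimal sketch, with made-up URLs for illustration:

```python
from collections import Counter
from urllib.parse import urlparse

def links_per_domain(source_urls):
    """Count backlinks per linking domain.

    Tens of thousands of links concentrated in a handful of domains
    usually indicates a sitewide/template link that has been
    multiplying across dynamically generated pages.
    """
    return Counter(urlparse(u).netloc for u in source_urls)

# Hypothetical export rows; a real WMT export would have thousands.
urls = [
    "http://customer-a.example/page1",
    "http://customer-a.example/page2",
    "http://customer-a.example/page3",
    "http://customer-b.example/about",
]
for domain, n in links_per_domain(urls).most_common():
    print(domain, n)
```

Sorting the counts descending surfaces the "12 missed websites" pattern immediately: a few domains carrying wildly disproportionate link counts compared to everything else.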
| 9:48 am on Jun 7, 2012 (gmt 0)|
@BBonanza. I notice WMT is very fast to add new links but very slow to remove old ones. I commented on a WordPress plugin blog with a question about the plugin soon after Penguin hit me. I removed links from one of my own sites the same day, but I now have around 200 links showing in WMT from the plugin blog (new comments get featured on every page, it seems, and it's a very slow-moving blog). The links I removed that same day from my own blog are still showing. It's very frustrating.
| 10:18 am on Jun 7, 2012 (gmt 0)|
|Matt Cutts: We've done a good job of ignoring boilerplate, site wide links. In the last few months, we've been trying to make the point that not only is link buying like that not doing any good, we're turning the dial up to let people know that certain link spam techniques are a waste of money. |
I'm starting to wonder if ignoring sitewide links from satellite sites is what has affected my main site. If that is the case, I may need to find ways to vary the links and find which areas of templates they are ignoring. If it is the footer, for example, I'd like to know how they are identifying it. If I call the div something other than "footer", would that help? I can feel an experiment coming on.
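Renaming the div probably wouldn't help. Nobody outside Google knows how boilerplate is identified, but a plausible guess is that it doesn't depend on element names at all: a link that appears on nearly every page of a site is sitewide regardless of its markup. A hypothetical sketch of that kind of detector (the `threshold` and sample data are assumptions):

```python
from collections import Counter

def sitewide_links(pages, threshold=0.9):
    """Identify links appearing on at least `threshold` of a site's pages.

    A detector built this way ignores the element name entirely, so
    renaming a 'footer' div would not hide a template link.

    pages: list of sets, each holding the outbound link URLs found on
           one crawled page of the site.
    """
    counts = Counter(url for page in pages for url in set(page))
    n = len(pages)
    return {url for url, c in counts.items() if c / n >= threshold}

# Three hypothetical pages, all carrying the same template link
pages = [
    {"http://main.example/", "http://a.example/"},
    {"http://main.example/", "http://b.example/"},
    {"http://main.example/", "http://c.example/"},
]
print(sitewide_links(pages))  # {'http://main.example/'}
```

If detection works anything like this, the experiment worth running is varying *where and whether* the link appears across pages, not what the containing div is called.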