This 116 message thread spans 4 pages.
|Penguin Recovery Tips - think tank part 2|
| 10:06 am on Jun 20, 2012 (gmt 0)|
< continued from: [webmasterworld.com...] >
|Let's go back to Penguin recovery tips, of which there are no easy, logical ones for the average small business. Except maybe to develop more good links where Google found bad ones, quickly and in numbers. Which, apparently, Google dislikes. And at this point black hats are reporting being able to successfully nuke sites at will through simple link farms. |
I don't think it's quite so simple. I had / have a number of sites in dmoz, the open directory (remember that!?), whole categories of which were targeted by spammers for some nefarious purpose. The end result: a sudden gain of hundreds of spammy inlinks to my sites.
While I'm not quite sure what they were up to (possibly testing the idea of nuking sites, if I had to guess), and I can't really be bothered to find out, what I do know is that of my 5 sites which fell into the category they attacked, Google has treated each differently.
Three were deindexed to varying degrees: one has been completely deindexed, obliterated; another shows just the URL in Google when I search for the domain; a third shows a couple of pages.
The two that survived this attack and appear not to have been affected, as far as I can tell, already had lots of solid links garnered over 12 years.
One possible conclusion is that having solid links already (or a history) gives protection against spam attacks.
[edited by: tedster at 2:34 pm (utc) on Jun 24, 2012]
| 10:14 am on Jun 20, 2012 (gmt 0)|
How interesting! I'm not sure if the .co.uk domain is the same as .com, but in the UK there is basically "no" distinction between ads and organic results (you'd have to be web savvy to spot it).
I've seen the future.....and I don't like it!
| 12:04 pm on Jun 20, 2012 (gmt 0)|
If I were to serve up results like that, my AdSense account (not that it's worth anything these days) would, I feel sure, be cancelled immediately.
Yet more fuel for regulators surely?
| 2:19 am on Jun 22, 2012 (gmt 0)|
tedster, it's not a site penalty, as you suggested above. We have a clear example: one site with top-10 rankings for most money keywords had bad SEO done on it (not my doing, others') - targeted anchor texts and very poor quality link placements, shockingly low quality when I was finally forced to examine the problem because of the horrendous advice the client was getting from SEOs. I do agree that it's almost certainly a machine learning algo, which is now being taught to do a crude emulation of spam detection via patterns that are sadly going to make black hats laugh very soon, in my opinion. Machine learning has always been a joke, though, but Google persists and will do it through sheer computational power, just like they did in the end with their translation stuff.
Studying the matter was made easy because we had several sites in the top 10, only one of which was hit, and two of which had the same SEO firm working on them. I realized: the site still ranks for moneyword + keyword1, which was not anchor-text targeted, but is totally gone from the rankings for moneyword + keyword2, which was anchor-text targeted. My personal guess is bad SEO, bad link farms and bad anchor text generation, with a highly probable negative SEO campaign by two total scum sites whose link campaigns reek of black hat SEO to an absurd degree; maybe they used black hats to dump higher-ranking sites like ours, which was no innocent maiden in this game, sad to say. That site also had the fewest natural, high quality organic inbounds, which weakens it substantially.
In other words, as some others here have noted, I believe the penalty is applied to anchor text backlinks to site + search phrase, not to the site alone. Which helps explain why Penguin cannot run live: it's too complicated to do that much calculation live. Almost certainly only AdWords-type phrases too, probably drawn from AdWords I'd guess, which is why the -amazon.com trick worked; it roughly bypassed the filter. Which means the phrases are what get thrown to the algo, which then must flag each site based on some scaling factor or filter or penalty, but only per phrase, not for the site itself. I still can't figure it out exactly, but it's along those lines.
Just for what it's worth, there's no way a site penalty was applied. This site was hit by Penguin one and then even worse by Penguin two, May 25, and it was still ranking for that high-value money phrase; the only thing was, that phrase had not been targeted. Now it's dipping out of the rankings, and I'm wondering if black hats are negative-SEOing it, or if our dumping of backlinks is making the normal algo lower it naturally until the next Penguin update.
A very difficult one. It proves to me what I've always believed: the only people who benefit from black hat are the black hat SEOs themselves - long term, anyway, though short term you can pump sites up, or down. Sadly, much as Google is pretending that negative stuff isn't possible, it clearly is. If our main money terms were knocked out by our own actions, why couldn't someone else do the same thing, or help it along, given that our sloppy SEO work, plus the lack of quality inbounds in the first place, left us vulnerable? Food for thought. I think Google has some thinking to do here, or else take down the "don't be evil" plaque and just admit they are another media company striving to be as rich as possible.
To make it even more complicated, Google deindexed huge link networks as well, which also causes natural drops simply from links being removed. All in all, Google has been thinking on this one, but they have made mistakes; that's obvious to me. They may have gone too far, and probably have. Driving sales to AdWords is, sadly, to their advantage, and they are a business.
For the record, I strongly dislike gray and black hat SEO, and had ignored that part of the site for years, deciding just not to deal with it. But my feeling was always: scum in, scum out, long term anyway, and that is precisely what we have experienced.
As more black hat SEO forums have noted, however, negative SEO only works on pages and sites that do not have a strong natural organic backlink presence, such as our site. Which means, as usual: long term, quality sites and quality content, for all their flaws and weaknesses, are the only chance out there, and it is just not that hard to do, but it does take commitment. I know black hats laugh at this, as well they should in a sense, since they mess with the SERPs day in and day out and know full well that's just not really true (it certainly isn't in our site's case: the site that beat it reeks top to bottom of pure low-end black hat link work, some of it almost comically black hat when I examined it). Plus, they actually copied our layout roughly in some ways, and updated parts of their site to copy our core product when they jumped into the top ten and we dropped out. But the point is, the only way to generate quality inbounds is by having a quality product or content, and if that's not happening, you're just going to bob up and down with these algo and SEO developments, which is a bad strategy for any business.
Good to see leosghost, hissingsid still making good sense, glad some things don't change.
| 3:48 pm on Jun 23, 2012 (gmt 0)|
I would appreciate your thoughts on this as this is on a couple of sites that I am working on.
I read that if you have an internal link going from Page B to Page D in the body text, for example, and the anchor text used is "build your abs" while the title tag for Page D is also "build your abs", that is considered over-optimization.
Has anyone seen this or heard about it?
Could this be seen as too focused?
Also, if this is something that is done for several keywords (between different pages on the site but in the way mentioned above), what kind of effect can it have?
On one site that I am working on, about 20 pages, this pattern is there for over 50% of the pages. When I did this, I did not think this was over-optimization, I thought it helped to strengthen the themes of the pages, but now I am not sure what this is considered.
Your help would be appreciated.
| 6:06 pm on Jun 23, 2012 (gmt 0)|
|On one site that I am working on, about 20 pages, this pattern is there for over 50% of the pages. |
I'm guessing that's the part that might trigger a flag as over-optimization. I don't think anyone knows for sure on this, but I'd try mixing it up a bit. After all, it is something that "only an SEO would do."
| 3:28 am on Jun 24, 2012 (gmt 0)|
|On one site that I am working on, about 20 pages, this pattern is there for over 50% of the pages. |
|I'm guessing that's the part that might trigger a flag as over-optimization. I don't think anyone knows for sure on this, but I'd try mixing it up a bit. After all, it is something that "only an SEO would do." |
I agree with you.
When I did this, I did not think that this was over optimization but when I look at it now, I think that it might be seen as a pattern and something that is too focused.
| 10:17 am on Jun 24, 2012 (gmt 0)|
|I think that it might be seen as a pattern and something that is too focused. |
I have lots of original articles related to the content of my site that used to carry lots of contextual internal linking. I removed the links a long time ago (pre the recent algo changes), I think because I realised they weren't that helpful to a visitor who, halfway through an article they were interested in, would get whisked away to another page if they clicked a link.
I am however considering adding a final paragraph to the articles along the lines of "If you enjoyed this article about How to Ripen Fruit you may be interested in our selection of Green Bananas" with the last keyphrase linked to my green banana page.
This is helpful to the user, who has reached the end of the article, and helpful to me because it represents an appropriate "call to action" for a likely conversion to a sale. It is a marketing tactic, not (purely) an seo tactic. But is Google clever enough to spot the difference?
| 2:24 pm on Jun 24, 2012 (gmt 0)|
That is a common approach, and it is done for visitors. I use it on a magazine-style site I've been publishing for 13 years and I've never had a problem. I don't use exact match between the anchor text and target page, however.
| 4:16 pm on Jun 24, 2012 (gmt 0)|
|I don't use exact match between the anchor text and target page, however. |
Thanks for the intel.
| 10:24 pm on Jun 24, 2012 (gmt 0)|
|I don't use exact match between the anchor text and target page, however. |
I was hoping to ask:
Are you saying that if the title tag of Page D is build your abs and there is an internal link in the body text of Page B going to Page D, you wouldn't use build your abs as the anchor text on Page B but maybe a variation such as develop great abdominals?
If Page D is trying to rank for the phrase build your abs, you wouldn't use build your abs as the anchor text for the internal link on Page B (maybe you would use develop great abdominals) and the title tag of Page D might be some phrase such as How to Develop Great Abs or How to Build Great Abs?
| 2:19 am on Jun 25, 2012 (gmt 0)|
I'm saying I'm very relaxed about creating any kind of signals. I don't always avoid exact matches, but I don't intentionally create them either. Instead, I consider the context and purpose of the link and how the anchor text will read to the visitor.
My purpose for any link is to get the visitor to click, so I write the anchor text (and surrounding text) to entice that click - to make it as attractive as possible. Similarly, I know that the title element of a page is likely to appear in the search results, so I consider how attractive the title will be in that very different context - different because it is competitive, whereas on-page it's all my site.
The whole thing is much more artful than it is rigidly technical, and I trust search engines today to wrap ALL the relevance signals they can find into one ranking decision. That's why I include "surrounding text" in my writing decisions. The surrounding text is also part of the link's influence on the target page - and it is clearly important to lead the visitor into clicking, too.
It's been years since we needed to scream our keywords at the top of our lungs just to make the point.
| 12:51 pm on Jun 25, 2012 (gmt 0)|
Thanks for the response.
I apologize for asking a lot of questions about this, but I feel that this might be contributing to my over optimization.
|I don't always avoid exact matches, but I don't intentionally create them either. Instead, I consider the context and purpose of the link and how the anchor text will read to the visitor. |
Are you referring to exact matches between the anchor text of the internal link on a page and the title tag of the page being linked to? For example, the anchor text, build great abs, of the internal link on Page B being the same as the title tag of Page D if that is the page being linked to from Page B?
|That's why I include "surrounding text" in my writing decisions. The surrounding text is also part of the link's influence on the target page - and it is clearly important to lead the visitor into clicking, too. |
When search engines look at the “surrounding text” of anchor text in an internal link, are they looking at some of the words before and after it, a couple of the sentences before and after it or maybe the whole paragraph to see if the link fits within the context of the paragraph?
| 1:00 pm on Jun 25, 2012 (gmt 0)|
|Are you referring to exact matches between the anchor text of the internal link on a page and the title tag of the page being linked to? |
|When search engines look at the "surrounding text" of anchor text in an internal link, are they looking at some of the words before and after it, a couple of the sentences before and after it or maybe the whole paragraph to see if the link fits within the context of the paragraph? |
All of that is possible - so is the title tag of the linking page! It's just one of MANY small details that do get noticed by search engines - and they're always looking for correlations they can use.
If you overdo it and try to "SEO" every small detail, then you might well trigger a Penguin demotion. If you want to rank well today, I'd think a lot more about old school marketing to directly attract more visitors and less about these technical details.
| 3:11 pm on Jun 25, 2012 (gmt 0)|
|If you overdo it and try to "SEO" every small detail, then you might well trigger a Penguin demotion. If you want to rank well today, I'd think a lot more about old school marketing to directly attract more visitors and less about these technical details. |
Thanks for the advice. I agree.
| 9:55 pm on Jun 25, 2012 (gmt 0)|
I don't understand the [easily agreed upon here] notion of "only an SEO would do it" in relation to anchor text matching the title and H1 tags.
It is very common in CMSes, including forum and blog platforms: every time you have a list of pages, be it a category, a tag, forum posts or just about any kind of list, the pages are linked using their title as the anchor text. That's, after all, why it's called a "title" - it's the best way to address a page, because it's literally the name of the page. Since <title> goes to the top of the browser window and is not rendered on the page, the H1 is often simply the same phrase - it would be silly not to show a page its own name, and the top-level header H1 is the best place to put it.
Compounding the "pile up" on the title phrase is the fact that many CMSes include so-called "search engine friendly" URLs which take that very same phrase, URL-encode it and make it not only the name but also a part of the address of the page. I have to agree, it does sound like an SEO-related approach, but it is a very common, very basic and a natural one.
I propose a dissenting view: you don't have to be an SEO (whatever the definition) to have multiple links, on your site and on others, that use the very same phrase for part of the URL, the anchor text, the title and the H1 for most of your content pages.
In addition to that, the same phrase may also surface in link titles (href title=""), image "alt" and "title" attributes, rich snippets, RSS feeds (of several different formats), the sitemap (through "SE-friendly" URLs), FB and Twitter updates, and I'm guaranteed to have missed a few other places - all without you trying to "game the system", as it were, just using either the natural way of addressing a page, as you would in a (paper) book, or simply the default settings of your CMS.
Don't you guys think the word "over-optimization" has been severely "over-used" lately?
| 10:45 pm on Jun 25, 2012 (gmt 0)|
1script, I think they were talking about links within the text body of the page, not links set off the way CMSes typically do ("related posts" in the sidebar, for example). I agree that what CMSes do automatically is not something Google's likely to blame webmasters for. The problem is when you're writing a paragraph that says "You may also want to check out Widgets: the feeding, care and ultimate disposal thereof" (wherein that's the exact title of the linked page) instead of something like "You may also want to learn more about feeding your widget."
At least, that was my understanding.
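As a rough illustration of the pattern being discussed, here is a minimal sketch (the page titles, URLs and helper name are all hypothetical) that flags in-body links whose anchor text exactly matches the target page's title:

```python
# Hypothetical pages and in-body links; titles and URLs are illustrative only.
page_titles = {
    "/widgets-care": "Widgets: the feeding, care and ultimate disposal thereof",
    "/green-bananas": "Our Selection of Green Bananas",
}
body_links = [
    # Anchor text exactly equals the target page's title - the risky pattern.
    ("Widgets: the feeding, care and ultimate disposal thereof", "/widgets-care"),
    # Natural-sounding anchor that merely overlaps the title - fine.
    ("feeding your widget", "/widgets-care"),
]

def exact_title_anchors(links, titles):
    """Return in-body links whose anchor text matches the target title exactly."""
    return [
        (anchor, target)
        for anchor, target in links
        if anchor.strip().lower() == titles.get(target, "").strip().lower()
    ]

flagged = exact_title_anchors(body_links, page_titles)
# flagged contains only the first link, the exact-title match
```

Whether a search engine weighs this signal exactly this way is speculation, but it shows how trivially such a pattern can be detected once a crawler has both the link graph and the page titles.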
| 10:46 pm on Jun 25, 2012 (gmt 0)|
I think that for a computer it is quite easy to spot whether the page title element, H1, link anchor, link title attribute and URL align just because a non-SEO-aware user is using a CMS with these "SEO features", or whether the site has been over-optimised by an SEO (a person).
In the first case, the alignment between these elements will be the same across the site. In the second case, you are more likely to see the internal anchor text being a subset phrase of the whole page title element, and the URL of an over-optimised page often dropping the stop-words (which would be there if the URL were created from the CMS page name purely by replacing spaces with dashes or similar).
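That difference between a raw CMS slug and a hand-trimmed SEO slug can be sketched like this (the stop-word list and function names are illustrative, not taken from any particular CMS):

```python
import re

# Illustrative stop-word list; real SEO tools use much longer ones.
STOP_WORDS = {"how", "to", "your", "a", "an", "the", "and", "of"}

def cms_slug(title):
    """Naive CMS-style slug: lowercase, runs of non-alphanumerics become dashes."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

def seo_slug(title):
    """Hand-tuned slug: same transformation, but stop-words dropped first."""
    kept = [w for w in title.lower().split() if w not in STOP_WORDS]
    return cms_slug(" ".join(kept))

cms_slug("How to Build Your Abs")  # "how-to-build-your-abs"
seo_slug("How to Build Your Abs")  # "build-abs"
```

The point above is that the first form matches the title element word for word across the whole site, while the second form betrays a human deliberately tuning URLs.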
| 10:54 pm on Jun 25, 2012 (gmt 0)|
All of my sites have gone back to where they were before the Penguin update, just by removing all blogroll links - except one site. I can't figure out why all its pages are back but the homepage is gone; once a week it will show up, then disappear for an hour or so. Oh well.
| 2:24 am on Jun 26, 2012 (gmt 0)|
|I think they were talking about links within the text body of the page, not links set off the way CMSes typically do ("related posts" in the sidebar, for example). |
This is what we were referring to. Great example in your post.
| 5:58 pm on Jun 26, 2012 (gmt 0)|
Years ago, I think that it was considered okay or good SEO to have the same phrase in the title tag (e.g. How to Increase Your Bench Press) and h1 tag (e.g. How to Increase Your Bench Press).
I read this on articles giving SEO advice and SEO forums.
I am wondering if that might now be considered over optimization.
Would some variation between the two be better?
Any opinions and/or results from analysis that you may have done would be appreciated.
| 6:51 pm on Jun 26, 2012 (gmt 0)|
|I am wondering if that might now be considered over optimization. Would some variation between the two be better? |
I'd say varying them would be a sign of "SEO activity", not the other way around. Like I said in my previous post in this thread, it's very common and natural to have the <title> coincide with the <h1>, as it is very common and natural to see the same exact text in the anchors of links to that page (and elsewhere).
If Google is now taking that as a sign of trying to game the system, they are missing the mark. I'm not saying it's impossible that they are wrong; I'm just saying that talk of over-optimization of anchor text may have merit, but not when the anchor phrase is the same as the <title> and/or <h1>. Just as it would make no sense to penalize someone for over-optimizing the word "here" or "read more" or "this site".
| 7:45 pm on Jun 26, 2012 (gmt 0)|
|If Google is now taking that as a sign of trying to game the system, they are missing the mark. I'm not saying it's impossible that they are wrong; I'm just saying that talk of over-optimization of anchor text may have merit, but not when the anchor phrase is the same as the <title> and/or <h1>. |
But could having your target phrase in a page's title tag and h1 tag, and as the anchor text of an internal link from another page on the site to that page, be seen as being too aggressive in trying to rank for the phrase?
I think this is the way many SEOs would go about trying to rank for a phrase, but now Google seems to be saying that by having some variety, you show that you want to rank for the phrase while, at the same time, not overdoing it.
| 12:55 am on Jun 27, 2012 (gmt 0)|
|I'd say varying them would be a sign of "SEO activity", not the other way around. Like I said in my previous post in this thread, it's very common and natural to have the <title> coincide with <h1> , as it is very common and natural to see the same exact text in the anchor of the links to that page (and elsewhere). |
Actually, I'd say this is true only in a small number of cases. If your article is called "Macaroni and Cheese Recipe", because recipes are just not the sort of topic we create cutesy titles for, then of course it sounds natural to say "You might want to check out my macaroni and cheese recipe." But if your titles are more like "Are green widgets dangerous to small pets?", then it would sound really bizarre to a reader to say "You might want to read more about Are green widgets dangerous to small pets?" That would set off huge "SEO" alarm bells. A much better approach, at least IMO, in that case would be "green widgets and pets." You've got your important keywords there in your anchor text, but it sounds natural.
| 8:16 am on Jun 28, 2012 (gmt 0)|
How is everyone doing? I heard that some people had some kind of recovery or at least improvement last weekend. Myself, I have seen nothing but I have more going on than just Penguin - I have duplicate URL issues that are clouding the water for me.
I am interested because, as far as I know, Penguin has not been updated since 1.1. Have you all recovered? Seen some gains? Nothing?
| 9:38 am on Jun 28, 2012 (gmt 0)|
Tip: Don't use robots.txt to deny Google from crawling pages that you want removed. Serve a 410 instead, since it's better than a 404. I tested this: the pages denied in robots.txt weren't removed from the index, but the ones returning 410 are getting removed pretty fast.
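The likely reason behind that tip: a page disallowed in robots.txt is never crawled, so Google never sees the 410 (or 404) you serve for it. A minimal sketch with Python's standard urllib.robotparser (the example.com paths are placeholders):

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that blocks the page you want removed (paths are placeholders).
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /old-page.html",
])

# Googlebot obeys the Disallow, so it never requests /old-page.html
# and therefore never sees the 410 configured for it.
blocked = not rp.can_fetch("Googlebot", "https://example.com/old-page.html")

# An unblocked URL can still be fetched, so its 410 status is visible.
visible = rp.can_fetch("Googlebot", "https://example.com/gone-page.html")
```

In practice this suggests removing the robots.txt block first, letting the crawler fetch the 410, and only then, if at all, re-blocking the path.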
| 8:12 pm on Jun 28, 2012 (gmt 0)|
Mine's getting worse, but like you, I have a lot else going on. I decided since things were so lousy anyway, I would go ahead and make some big changes I'd been wanting to make for ages. That included changing my domain name, just to give you an idea of the scope here. :D
| 12:55 pm on Jun 29, 2012 (gmt 0)|
Then wikipedia is over-optimized with lots of anchor keywords from the start to the end of articles. Or G sees wiki differently.
| 3:34 pm on Jun 29, 2012 (gmt 0)|
A lot of this thread is just panda talk, not penguin as the thread title suggests.
With respect to penguin, recovery factors are all just pure speculation for now. We will know more if penguin updates and some sites actually recover. Then, word can get out on what factors were successful in driving recovery.
| 5:59 pm on Jun 29, 2012 (gmt 0)|
So, what is the best possible strategy for links when it comes to Penguin?
From the way I look at it, removing a lot of links can trigger something either way: it can confirm to Google that you were actually participating in link schemes, or you can accidentally put a foot wrong by removing links which are actually good ones. Removing good ones can cause a further drop in rankings; removing bad ones can tip Google off.
What I recommend is: only remove links when you are 100% sure they are not worth keeping, but at the same time keep building better, higher-quality links and let them fight your case in Google's court.
So far, for one of my sites, I have concentrated more on building better links, and I got links from a couple of pretty big websites, newspapers, and sites with an Alexa rank under 400, but apparently nothing has made an impact; I see no improvement.
My next phase will be to start removing a couple of links which I deem a bit spammy, or I might try changing their anchor text! It would have been a lot better if there were regular monthly Penguin updates!