May I suggest taking a step back and asking: why does it matter?
When would it be smart to build poor backlinks or weak inner pages? If your site has any of these problems, you need to fix all of them.
Panda, Penguin, and the hundreds of other Google updates mean you need quality everywhere for long-term success. Taking shortcuts simply leads to a shorter business life.
As for what to fix first? Every site is different, but I tend to fix content first. If you have crap content, that just makes attracting quality backlinks even harder.
Penguin is stricter on spammy or low-quality backlinks. You can't remove every backlink from other sites, but you can earn natural links, which only come from valuable content on your site. Either way, this update once again makes content more powerful, and low-quality link-building tricks like posting the same article everywhere have just about lost their effectiveness. :)
It's clear, based on reports in this forum, that Penguin is looking at external links and exact-match domains. I'm sure it is looking at other factors as well.
I use two "just for SEO" practices on a website that was not hit by Penguin.
First, I have a list of 20 links to other pages on the site, so that Googlebot can crawl the entire site even without a sitemap, and so that more important pages get more internal links and a variety of important anchor text.
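To make the "crawl helper" idea above concrete, here is a minimal sketch of generating such a block of internal links with varied anchor text. The page URLs and anchors are made-up examples, not the poster's actual list:

```python
# Sketch: build a static list of internal links so a crawler can reach
# every important page and each gets descriptive anchor text.
# The pages below are hypothetical examples.
pages = [
    ("/widgets/", "Widget Catalog"),
    ("/widgets/blue", "Blue Widgets"),
    ("/about", "About the Company"),
]

links = "\n".join(
    f'<li><a href="{url}">{anchor}</a></li>' for url, anchor in pages
)
html = f'<ul id="site-links">\n{links}\n</ul>'
print(html)
```

In practice the same effect can come from a sitemap plus normal navigation; this only illustrates the pattern the poster describes.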
Second, I have about 50 pages (on a 20,000-page site) targeting synonyms and misspellings of my main keywords.
I implemented the list of links years ago, when Google was often sending users to the wrong page. By pointing internal links with different related anchor text at each page, Google crawled the site and then started sending users to the correct pages much more often. Today, I'm not sure I need it: Google puts a lot of emphasis on synonyms, and with all my pages in a sitemap, the site should be crawlable anyway. It's clear that users don't navigate the site using this list.
My 50 keyword-targeted pages are now explicitly against Google's quality guidelines. When I implemented them (well before Panda), that wasn't so clear. Google users get a very relevant landing page and a good user experience, and they are very unlikely to know that multiple pages like this even exist. Since Google wasn't ranking the site for very relevant synonyms, this seemed to be the way to make it happen.
So either these two practices are not part of Penguin, or my site uses them in moderation such that they fly below the radar.
As much as I personally get frustrated by Google's use of synonyms in web search (I'm much happier with Verbatim search), it is clear that it makes the results much better in some cases. Maybe I don't need to hit Google over the head with a mallet to get them to understand the broader context of my site nowadays.
How can you deal with external links that you have no control over?
You can request they be removed from the linking source (unlikely to work, but it doesn't hurt to try).
You can move or rename the page they're linking to.
That's about it.
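As a small illustration of the second option: if you rename a page, the unwanted inbound links start pointing at a dead URL. This sketch checks that an old URL now 404s; the URLs are hypothetical, and the status-fetcher is injectable so it can be tested without a network:

```python
# Sketch: after moving/renaming a page to shed unwanted backlinks,
# verify the old URL now returns 404 (i.e. the spammy links point at nothing).
# The example URL below is hypothetical.
from urllib.request import urlopen
from urllib.error import HTTPError

def is_dead(url, opener=urlopen):
    """Return True if fetching the URL raises a 404."""
    try:
        opener(url)
    except HTTPError as e:
        return e.code == 404
    return False

# Usage (would hit the network with the default opener):
# if is_dead("http://example.com/old-page"):
#     print("old URL is gone; inbound spam links are broken")
```

Note this deliberately does not 301-redirect the old URL, since a redirect would pass the unwanted links along to the new location.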
So I could quite easily create spammy links, get them indexed by Google, and point them at my competitors to hurt their site? Surely this isn't the approach Google is taking.
You can see a (somewhat long) discussion about this here:
The general consensus seems to be that Panda is about on-page factors, so it would be logical to assume that Penguin is off-page. Anyone agree/disagree?
There's an interesting blog post going around today that analyzes anchor text on backlinks and its relation to Penguinized (?) and non-Penguinized sites. I'm not sure I agree with all the steps it recommends (no more microsites, please), but it did present some interesting data.
Basically, the more money-term anchor text you have, particularly from link sources not directly related to your niche, the more likely you were to be hit.