Forum Moderators: open
The results look really good. Hopefully this will be reflected across all the datacenters soon. The spammier sites I compete against seem to have moved down, while the good, honest pages have moved up.
Unnatural, maybe. Obviously, if it were "good" anchor text it would be generating positive effects :)
Anyway, if you know how to use the allinanchor command it should be pretty evident that this is where the majority of the "algo tweak" took place.
Of course, we all knew the heavy weight of anchor text was bound to change, but this is pretty extreme. Methinks the "knob" was turned WAY too far the other way - sending many valuable pages spiraling into oblivion in the process.
Of course, this only furthers my belief that having hundreds of sites in Google is a good "insurance policy".
BTW, I am not talking about my observations for a few sites, but rather around 1,000 keywords, ranging from DVDs to loans to software, etc. (yes, I had a late night)
Well... if all the Google data is actually present... there are not really that many factors that could cause an index page's decline for specific terms only.
You look at stuff like keyword density (KWD), and see sites ranking well with all sorts of densities. Ditto titles and all the other general stuff.
You look at the aforementioned anchor text, however, and yes, there are many reports from those affected that their incoming anchor text spread is fairly tight. So yes, I agree, there is decent evidence to suggest that this may well be a factor... but definitely along with other factors.
If that evidence turns out to reflect reality, it is certainly a very clumsy way indeed for Google to proceed. Talk about collateral damage... it has all the finesse of the sledgehammer-to-crack-a-nut metaphor... but to the detriment of Google itself as well.
My feel is still that in the main what we generally see now will stick for a while. I base this solely on the evidence of the previous two problem dances. Google seems to need the public to start asking questions before they act to sort these matters.
Perhaps they need the competition of the Ink (or whatever) change at Yahoo. That would give the searching public somewhere obvious to go when Google lets them down in the quality stakes. It would also help resolve the eggs-in-one-basket issue for many people.
I never thought I'd find myself rooting for Ink, but it's hard not to when Google throws wobblers like this one.
Ref: detrimental anchor text:
I am not talking about my observations for a few sites, but rather around 1,000 keywords, ranging from DVDs to loans to software, etc. (yes, I had a late night)
But there may be many other factors at work here - if these are huge networks of sites, all using a fixed anchor text, they may have been hit for being a huge network / cross-linking etc. In my small niche I haven't seen any such effect.
Anyway, if you know how to use the allinanchor command it should be pretty evident that this is where the majority of the "algo tweak" took place.
There's absolutely no doubt whatsoever that Google would want to filter out unnatural anchor-text linking (good), but whether or not that has happened in this update remains to be seen. Only Google knows what weighting is currently applied to allinanchor and what it's actually showing.
But looking for evidence now is like looking at fingerprints before a crime has been committed.
This update is still under construction.
TJ
I did not/do not research particular sites. Just ranking trends for analysis across keywords (top 10).
Well, I made my posts in forum3 for the next 2 months :)
Perhaps you should define the razor-sharp line between "good" and "unnatural" - then I think I could live with your statement.
Perhaps Google is trying to draw the distinction between solicited inbound links (which would tend to have keyword-optimised anchor text) and unsolicited links (where the anchor text would tend to vary).
An unsolicited link is a genuine vote of approval while a solicited link is just SEO.
Hope that's right, because most of my inbound links are unsolicited and the anchor text sometimes bears no relation to the site. :)
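If anchor-text variation really is the tell, you could in principle approximate it by measuring how diverse a page's inbound anchors are. A minimal sketch, assuming Shannon entropy as the diversity measure (my own choice for illustration - nothing here is a confirmed Google signal):

```python
from collections import Counter
import math

def anchor_entropy(anchors):
    """Shannon entropy (in bits) of an inbound anchor-text distribution.

    Low entropy = nearly identical anchors (looks solicited/optimised);
    high entropy = varied anchors (looks like natural, unsolicited links).
    """
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return sum(-(n / total) * math.log2(n / total) for n in counts.values())

# Hypothetical link profiles:
solicited = ["blue widgets"] * 50                      # one fixed, optimised anchor
natural = ["click here", "widget.com", "this site",
           "blue widgets", "great widget resource"] * 10

print(anchor_entropy(solicited))  # 0.0 - completely uniform
print(anchor_entropy(natural))    # ≈ 2.32 - varied
```

Unsolicited links like the ones described above ("anchor text sometimes bears no relation to the site") would score high on a measure like this, which is exactly why they'd look like genuine votes.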
All the data isn't being factored in yet. It is slowly getting factored in. When it is all factored in, the results won't be much different from what they were before: top sites will remain top sites, except for fewer spammers/cheaters.
For the keywords and keyphrases that I monitor, changes haven't yet been significant. A few of my pages (and a few competitors' pages) have gone up or down a bit, but I haven't seen any dramatic flipflops.
I do get the impression that titles (more specifically, word order within page titles) may have slightly greater weight than usual in this update.
I have been quite worried about the future of Google in the last few months, with more and more serps being taken over by spammy affiliate sites. I was wondering how long it would take before someone else stepped in to take Google's place. However, this update has given me renewed confidence in their ability to weed out spam. I haven't seen as clean results as these for the last 18 months, and that goes for all the areas that I check.
Every single one of the blog and guestbook spam sites appears to have been wiped off the first two pages, and in general the results seem to be very relevant. I realise that some (many?) innocent sites have probably been hurt, but in general I think it would be hard for anyone to argue that the results are not cleaner, more relevant, and better for the searcher now.
"www.mydomain.com/index.php" is an old copy which has been around for months, but "www.mydomain.com/" has regularly been updated to the latest version.
Yesterday it was a version less than a week old. But today in www, www-ex, and www-in, the cache has reverted to a version prior to that. Is there a database I should look at which would show the version likely to be the post-update version?
However, some searches are decidedly smaller. A search that used to have 12 pages of results (at 50 results per page) now has 5. I hope this means that pages are still being added to it (since I definitely have some very important pages missing).
As for the over-optimized anchor text issue, I'm torn. Some of my missing pages get almost all their PR from internal site links through the site's menu, so the anchor text is the same for all their incoming links.
However, other pages that are the exact same way are listed fine where they used to be.
Also, because a site's internal menu usually doesn't change, I think a change that penalizes pages with large numbers of identical incoming links would be a bad move by Google: it would unfairly prejudice search results against anything other than site homepages.
Quite possible. For Google to work, PR has to be able to transfer through internal links. For many sites, all links are to the home page. However, it doesn't make sense that internal anchor text be given much weight. Too easy for a webmaster to manipulate that.
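One way to read that suggestion: keep counting internal links for PR flow, but give their anchor text little or no weight in the relevance signal. A toy sketch of that separation (the 0.1 internal weight and the whole scoring scheme are invented for illustration, not anything Google has disclosed):

```python
from urllib.parse import urlparse

def anchor_score(page_url, inbound_links, query, internal_weight=0.1):
    """Toy anchor-text relevance score.

    Each inbound link is (source_url, anchor_text). External anchors count
    fully; internal anchors (same host) are heavily discounted, so a site
    menu repeating the same keywords can't dominate the signal.
    """
    host = urlparse(page_url).netloc
    score = 0.0
    for source, anchor in inbound_links:
        if query.lower() in anchor.lower():
            internal = urlparse(source).netloc == host
            score += internal_weight if internal else 1.0
    return score

links = [("http://widget.com/about.htm", "blue widgets")] * 200   # site-menu links
links += [("http://other-site.com/", "blue widgets")] * 3         # external votes

print(round(anchor_score("http://widget.com/", links, "blue widgets"), 6))  # 23.0
```

Under a scheme like this, 200 identical menu anchors add less than a couple dozen outside votes, while PageRank itself could still flow through those internal links untouched.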
Otherwise, the rest of the datacenters seem to be fairly balanced across several dozen sites I've been monitoring. The biggest problem areas appear to be where total results have increased substantially (i.e. one went from 3.5 million to over 5 million results, and dropped me a couple positions).
For example:
On the search 'blue widget', you would think that the most relevant page from the site widget.com would be its index.htm page, or at least its Blue-Widget.htm page, but Google seems to go to the deepest level possible and rank the Round-HalfInch-Tapered-Smooth-Blue-Widget.htm page.
This seems to be the case for most sites, and these chosen pages are then ranked by relevance.
If someone wants to dig through the 450 posts or so to get the exact quote, be my guest.
-mc may not be worth the investigation it's starting to get at this point in the discussion.
Here's a thought.........
Perhaps the sites that took a hit were somehow making it more difficult to implement the new algo(s) across the majority of sites indexed and were put on the "back burner". Perhaps these sites are easier to place once the new serps are in place. I have noticed a heightened interest by googlebot on a site of mine that tanked. Anyone else?
Quite possible indeed...
From our viewpoint, the question is: are those links being discounted completely, or are they actually causing *damage*?
I know, I know, this smacks of the semi-penalty threads. Still, we have 6 sites: one uses keyword text rather than "HOME" to link back to the homepage (because the name of the site is also the two keywords).
The site in question was knocked from #3 on page one to somewhere on page 6 in the SERPs for a competitive two-word phrase. If it is reasonable to assume that those keyword backlinks were not helping very much (since they are internal links), then merely discounting them wouldn't lead to this sort of pain. Looks to me like they actually hurt us.
Two other similar sites using HOME to link back to homepage did not see much change.
Small sample size here, of course. Not reaching conclusions yet, but this, combined with other comments, has me fearing it's a red flag. If linking back to a homepage with keywords is a problem (when the keywords equal the site name), then yes, the dial has been turned too far. Some will say that is spam, but if my site is about blue widgets, then having bluewidgets.com as a site name seems entirely reasonable to me...
This really wouldn't make much sense because of innocent collateral damage. Why should a site be penalized for a "Back to Widget World Home Page" anchor text on a search for "widget?" More sensible is just to ignore the anchor text.
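The difference the last few posts are probing - discounting versus actively penalizing - can be made concrete with a toy scoring model. All the numbers and the three modes are invented for illustration; this is a thought experiment, not Google's algorithm:

```python
def rank_score(on_page, optimized_anchor_links, mode):
    """Toy model: how repeated-keyword inbound anchors affect a page's score.

    mode "count":    each matching anchor adds to the score (old behaviour)
    mode "discount": matching anchors are simply ignored
    mode "penalize": matching anchors actively subtract
    """
    per_link = {"count": 1.0, "discount": 0.0, "penalize": -1.0}[mode]
    return on_page + per_link * optimized_anchor_links

# A page with decent on-page relevance and 40 identical-anchor menu links:
for mode in ("count", "discount", "penalize"):
    print(mode, rank_score(10.0, 40, mode))
# count 50.0
# discount 10.0
# penalize -30.0
```

A pure discount would drop the page only to whatever its on-page relevance supports - a modest slide. A fall from #3 to page 6, as reported above, looks more like the "penalize" row, which is exactly why merely ignoring the anchor text seems the more sensible design.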