Forum Moderators: Robert Charlton & goodroi
I agree!
From what I see it's not much related to on-page optimization; I see only a preference for CSS sites.
For the sites I monitor, it's something related to links.
Sites that get too many links in a short period of time seem penalized.
Sites that get the same amount of links per month don't seem to be suffering.
I've noted old sites that haven't gotten any links in the last two years that are maintaining the same rank.
Does that mean Google is correcting site-wide on the amount of anchor text for those phrases?
Does anybody have the same experience?
> are we talking about Over Optimization (and if so, by what criteria? on-page keywords, HTML optimization, titles, meta tags, off-page irrelevant links, anchor text repetition), some sort of CIRCA/LSI stuff, duplicate content, a little of everything, or something else entirely?
I have the impression that my site is hit by the filter on keywords (incl. anchor text repetition). Duplicate content wouldn't explain why some other search phrases on another page are still doing well, even though that page is scraped a lot; on it I even found the meta description being the same as the introduction on the page itself.
I've been seeing this kind of thing at work for a while, but I especially agree that this "filter" or "tweak" or "update" is definitely all about off-page stuff. Initially I thought it was a devaluation of internal links vs external links, but now I'm not so sure.
So perhaps this "filter" is looking at homogeneity of links, i.e. are all the incoming links coming from the same or only a few domains? This would explain my previous internal vs external links observations.
Well I still think it has a lot to do with the home page not being thought of as the canonical page and the rest of the site suffering as the home page has lost some power. :(
Tests that I do to see if the Homepage is a bit scewiffy:-
Using Webmasterworld as an example :)
[google.com...]
A search in the above format (domain name as a phrase): are you returned top? Well - hmmmz - I think if the homepage is OK, you should be :) If you are outranked on this type of search, then surely there are problems with G?
[google.com...]
A search in the above format: is your homepage returned top? Hmmmmz - I think it should be, unless you have really optimised an internal page with your domain name in the body text.
Would love a bit of feedback by folks to see if their homepages look scewiffy.
As EFV is interested in this thread - when your site had problems, you were not top for either of the above searches; now you are :) - and I assume that your site is problem-free at the moment.
Sites that get too much links in a short period of time seem penalized.
That's true. GG has recommended that websites build links organically, and those that don't - meaning they build too fast - get slammed by the algo.
I've seen it happen to me and others enough to know that it's very real. It's sort of a sandbox for links if you build them too fast. One has to go slow and steady when building links.
[edited by: Freedom at 3:13 pm (utc) on Sep. 29, 2005]
Yep - well, I would think the homepage should come in first, unless there is a very good reason not to - e.g. with Matt Cutts' blog, all the homepage power is redirected to the blog pages.
Although I do think that Matt's blog could easily get a canonical URL problem.
Hmmmz - just got stickied by someone who passes that test but has split PR on the non-www and the www - so that test does not encompass all the symptoms. However, the site stickied to me is at an early stage of the problem and, left untouched, would probably end up with the homepage failing the above test.
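The split-PR symptom is easy to check for mechanically: if backlinks (or your own internal links) point at more than one host variant, a crawler can treat them as links to different pages and divide the link equity between them. A minimal sketch in Python - the domain and link list below are made up, and in practice you'd pull the URLs from a backlink report:

```python
from collections import Counter
from urllib.parse import urlsplit

def host_variant(url):
    """Return the exact host a link targets; www vs non-www matters to a crawler."""
    return urlsplit(url).netloc.lower()

# Hypothetical backlink list for illustration only
backlinks = [
    "http://www.example.com/",
    "http://example.com/",
    "http://example.com/index.html",
    "http://www.example.com/about.html",
]

split = Counter(host_variant(u) for u in backlinks)
print(split)  # more than one key here means link equity is being divided
```

If the counter shows more than one variant, the usual fix is a 301 redirect from the non-canonical host to the canonical one, so all links resolve to a single homepage.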
Hi Jon,
I'm glad to hear that you're not affected by this new filter, but I don't think I'd call 608 posts in this thread (or for that matter the 663 posts in the previous thread about this) just a few sites affected. Plus I'm sure there are a lot of sites affected that either don't know about WebmasterWorld or do know about it and for whatever reason aren't posting. (I normally fall into the latter group.) In what market are the site/terms you're monitoring located?
I'm seeing a wide range of markets affected from ecommerce to travel to educational to affiliate marketing and so on... In my market (Online Coupons/Shopping) I'm seeing that tens of thousands of terms have been affected by the application of this filter with major changes in the Top 50 sites for each term.
In general the results look very much like pre-February 2005 when the results in this market were littered with scraper sites (and other Black Hat SEO sites), sites with little or no relevance, sites with dubious reputations, and outdated sites.
Again I'm happy that you're not affected. Hopefully it stays that way. Take care.
-- T
Has that other site only just had problems, or had problems for a while?
I have a couple of sites that now pass the test - only since the last week or so - but previously failed. Crawl not deep enough to know if rankings will return.
Which makes me think that it is something Google is working on.
I did ask on Matt's blog about non-www and www issues, and he did say he could believe that the algo might not handle it as well as it could, but that it would not result in a penalty - and it is something they are working on.
Ok - bit vague, and rumour has it G has been working on a fix for ages......
But some of us have waited for ages with little progress ;)
Would also love to know what is going on with the UK PPC Search Engine Mirago:-
[google.com...]
According to the above search, the page mirago.co.uk does not exist.
[google.com...]
But top of the rankings on this search is that very page with a fresh cache date.
1) It is site wide. I normally receive traffic from hundreds or thousands of search results - not just from targeted keywords but also phrases that I would never think of (nor necessarily want to target, since they're wordy phrases that a viewer types in), but none of my pages on my site get any of those results now. There are like 3 phrases where my site comes up on Google, nothing more. The whole site has been filtered.
2) If you type in the name of one site (name, not domain name or URL), it still comes up #1. The other site (my cat site) comes up at the bottom of page 1 (interestingly, the Cafe Press store for the cat site comes up as #1 for that name). Both sites are in the Google index; they have just been filtered out of the results.
3) For my main site, this is really similar to what happened with Bourbon - in fact, the only key phrases I'm still appearing for are the very same three phrases I was appearing for during Bourbon, from late May to late July. The only difference is I am getting even less traffic than I was during Bourbon.
4) My cat site has been particularly hard hit by this algo change, or filter, or whatever you want to call it. I get maybe one or two people from Google traffic and that's it.
5) Like many others here, my main site is a well-respected, content-based site. I do reviews of exercise videos, among other things, and the reviews have been quoted on DVD boxes along with the likes of Shape magazine! Publicists with huge retainers send me DVDs for review and beg me to set up interviews with their clients.
6) My main site is 3 1/2 years old; the cat site is 2 1/2. Until Bourbon, I had no problems with Google. Traffic was up a little, down a little, with each algo change, but nothing that couldn't have been solved by a little effort - writing more articles, etc. But as of now, Google has pretty much killed this site, since I can't spend more than a couple of hours a week, if that, on something that only brings in pennies on the dollar. This is the second time this year that traffic to my sites has been decimated, and I'm done with my main website. It's not worth it anymore. My cat's site can pretty much survive, with or without Google, although I don't think she'll be getting a second cat tree this year. :-)
BTW, all this talk about whether it's an update or not - LOOK, if it affects your site the way it's affecting mine, who gives a flying rat's behind? Bottom line, whatever is going on, it sucks for a whole lot of us.
Haven't posted here in a while (I'm a foul-weather poster and tend not to bother when things are going OK :))
I have experienced the same thing others reported: a solid, original content-driven site suddenly losing almost all PR (homepage down to zero; some sub-pages retaining a PR of 1 or 2) and my backlinks down to zero. SERPs dropping from top 5 to unknown (in the 200+ range, surely), but virtually all pages intact in the Google index.
One thing I am wondering as I read this thread: I see other travel sites mentioning this, and I wonder if using one of the proprietary booking engines factors into it.
For me, reservations are handled through (made-up URL) reserve.mysite.com, as is hotel page info (info.mysite.com), etc. This results in a lot of pages (a theoretical maximum of 100,000+) that Google indexes (tho only about 3,000 are in) but has always assigned a 0 PR to (presumably because they are, in essence, redirects using a CNAME).
Each of these pages (not unreasonably, in my view) has my logo and a link back to www.mysite.com. What I wonder is whether the effect of all these 0 PR pages, which Google may or may not see as internal links, swamps my legit links and causes PR to plummet.
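For what it's worth, the swamping effect described above can be illustrated with a toy power-iteration PageRank - the textbook formulation, not Google's actual system, and the node names are stand-ins for the booking-engine setup described:

```python
def pagerank(graph, damping=0.85, iters=50):
    """Power-iteration PageRank over {page: [outlinks]}; ranks sum to 1."""
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in graph.items():
            if not outs:
                for q in pages:  # dangling page: spread its rank evenly
                    new[q] += damping * rank[p] / n
            else:
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# "www" is the homepage; sub1..sub3 stand in for reserve.mysite.com-style
# pages, each carrying a logo link back to the homepage.
graph = {
    "www": ["sub1", "sub2", "sub3"],
    "sub1": ["www"], "sub2": ["www"], "sub3": ["www"],
    "external": ["www"],  # one legit outside link
}
ranks = pagerank(graph)
```

In this toy model the homepage's score is dominated by the recirculated subdomain links rather than the single external link - which at least shows how a flood of low-value internal pages could outweigh the legit signal, whatever Google actually does with them.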
I note that this was NEVER a problem until now. Are travel sites using this method (which is a lot) getting caught inadvertently in a spam filter?
Ironically, the top site (appearing in 2 of the top 5 position for one of my best key phrases) is a spammy pseudo-directory site containing only auto generated links out and zero original content.
I do, although I don't use a CNAME but just link or do an iframe to them. The sites didn't lose PR, though, but are just filtered out. One of my URLs even ranks higher now than my page with the same phrase (because of my linking). If you use &filter=0, are you back in the SERPs?
Is Sparkle causing trouble again ;-).
Ya know you just can't keep us critters down.
I took a look at some of what is going on.
There is more than one thing happening.
A lot of links to the cat's site are on pages that are now marked as supplemental (whatever that means - I haven't a clue).
I've seen a lot of this along with various possible canonical page problems.
There appears to be a rising use of scripts that do IP delivery; which page wins out, and what the impact is on a given page, I also can't figure out.
Tho in a strange way.
When I do:
[google.com...]
I am all over the top, BUT it is all the sub-domain pages (e.g. hotel.mysite.com) for the first 100+ results, NOT my actual site pages.
Without the filter, none of my pages are there.
Oh oh
2) Orphaned pages
It might be worth noting that pages can appear orphaned to Google when they are not actually orphans.
If Google doesn't crawl a site in any depth, over time pages can drop out of the index. The links to deeper level pages therefore disappear, and if the deeper pages have no other incoming links, they can start showing up as Supplemental. Eventually they too will drop out of the index.
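That decay can be reasoned about by treating the site as a link graph: breadth-first search from the homepage, and anything unreachable is an orphan from a crawler's point of view. A sketch with hypothetical paths:

```python
from collections import deque

def reachable_pages(link_graph, start):
    """BFS over internal links; pages never reached are effective orphans."""
    seen = {start}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for nxt in link_graph.get(page, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

site = {
    "/": ["/about", "/products"],
    "/about": [],
    "/products": ["/products/a"],
    "/products/a": [],
    "/old-page": [],  # nothing links here any more
}
orphans = set(site) - reachable_pages(site, "/")
print(orphans)  # {'/old-page'}
```

Note that this only models what a crawler can see: a page with external incoming links isn't a true orphan even if nothing on the site links to it, which matches the point above about pages merely *appearing* orphaned.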
I was more concerned for the impact that happens to the links and thereby the site linked to when they come from a page marked supplemental.
I looked into this just last night. I selected a random phrase unique to my site and searched for it. I think I may have spotted a pattern. One page that came up in the listings, when viewed via Google cache, had scraped my entire title and meta tags, and then linked to my site. It was listed as supplemental.
When I clicked on the link, I got a blank page.
I kept searching, and found that if the page on my site being linked to in this manner had a higher PR, it didn't seem to affect it. However, if it was an internal page with a lower PR, Google only had URLs and no descriptions listed.
I'm not sure if this helps anyone make a connection or not, but I thought it was interesting. Of course, it could mean absolutely nothing.