This could be because I have 1015 pages on the site.
steveb - I've started a new topic on this very subject. I think we've got something in common here from what you're describing. My topic is still under review.
I have been pushed down 12 positions from #1 for my brand term, and above me Google has placed 11 sites that reference me or are about me.
So Google is preferring the sites that link to me - which really means it is ranking directories and link farms.
This is making the outbound link more important than the inbound link:
higher PR for inbound links, but higher SERPs for outbound links.
I'm quickly going to link to Google, Amazon and Yahoo :)
Everflux has never been this odd
There are some strange goings-on; some sectors have changed, others have not.
From what I have seen so far, this is not an update, IMO.
Good morning Folks
ONE WEEK has passed since this update/no-update started on 22nd September 2005.
It is sad to see sites with original content dropped out of the index or stripped of a high degree of their rankings in the SERPs. Fellow members are losing revenue. Searchers are presented with more spam at the top of the SERPs.
A kind fellow member stickied me a keyphrase yesterday, a German one related to the travel sector, and asked me to run the query on google.de as well as google.com and see for myself how spam has emerged victorious as a result of this update (so far).
A week after the start of this update/no-update thing, there are maybe 1000s of duplicate pages, while at the same time original pages of our fellow members have been removed.
[edited by: lawman at 10:19 am (utc) on Sep. 29, 2005]
very nice comments,
My first post here.
I have noticed that aolsearch.aol.co.uk (for which Google provides the results) looks to be giving out the old results. I typed a keyword into google.co.uk and got zero results, yet for the same keyword on aohell I get 4 pages of results?
Dear GoogleGuy and Matt... Is it right to keep silent?
Listen ;) - I will say this only another 95 times. The canonical url for my site is the homepage with the www - I have done the 301 - this is the page with the most backlinks - it is the page that should rank for the company name search. Etc.
I painstakingly went through with Xenu and made sure there were no broken links. I also validated all 248 pages as XHTML 1.0.
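For anyone doing the same audit by hand, the first half of a Xenu-style check (collecting every link on a page before testing each target) can be sketched with nothing but Python's standard library. This is only an illustration; the sample markup is invented, and a real crawl would also fetch each collected URL and flag anything that returns a 404:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag so each target can be checked later."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

# Invented sample page; each extracted URL would then be fetched
# (e.g. with urllib.request) and non-200 responses flagged as broken.
page = '<p><a href="/about.html">About</a> <a href="http://example.com/">Home</a></p>'
print(extract_links(page))  # ['/about.html', 'http://example.com/']
```

That only covers link collection, of course; a dedicated tool like Xenu also handles redirects, timeouts, and retries.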
This site is so clean it squeaks. But I really don't feel that these changes we're seeing are related to anything we're doing.
I'm seeing the same results on both AOL UK and Google UK for my sector.
I have been up a creek without a paddle since last Thursday, like everyone else posting in this thread.
I have made several changes and will keep making daily updates to my site, which has lost 14,000 uniques a day from Google, though I will focus on other projects for Yahoo and just see what happens to my main site in Google. I feel there is nothing more I can do.
One last thing; I'm still astonished that Brett hasn't yet added a post to this thread. My reason for saying so is that he owns the board, has sites other than this one, has been around the block a few times, and would have seen threads of this sort when Google has changed things in the past.
[edited by: lawman at 10:24 am (utc) on Sep. 29, 2005]
I am still staring at the SERPs trying to figure out what kind of logic might be behind it. Two sites of mine are very much down, but not for all keywords. Some search-word combinations are placed 70 positions down (or more) compared to when I use &filter=0; other combinations containing one of the same search words are still at number one. I assume every page on my two troubled sites has a certain number of filter points.
This is what I see on my pages:
The pages with current filter problems have a lot of the keywords on them compared with more successful pages (they are also the most relevant, but that is something else). The pages with fewer troubles have far fewer of the keyword combinations on them, but have internal links with those search words in them. Maybe the anchor-text links carry more weight now, disturbing the balance on every page that was optimal for the search words before?
I have two possible hypotheses to explain my filter problems in the SERPs at the moment:
1. The filter is different per page (not site-wide). Some pages have more filter points, some have fewer. The density of the search words on the pages with a lot of filter problems is too high according to Google. The search words used in links are added to the ones on the page itself, so the most relevant pages on your site are filtered out (total result minus filter points). Pages with fewer search words on them but with some text links have less success in the unfiltered SERPs, but since they carry fewer penalty points, their score is better in the filtered version.
2. The filter is site-wide. All pages on a site get the same amount of filter points. Some pages are still doing fine because they are so optimal for the current Google that, after deducting the filter points, enough points are left to place them high.
I have the impression that the filter problem is per page, because situation two can't explain the huge difference for some pages after filtering while others with the same search words are left nearly unfiltered.
What do you think?
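The per-page hypothesis can be made concrete with a toy model. This is obviously not Google's actual scoring; all the numbers and page names below are invented. The point is just that when filter points vary per page, the filtered ordering can flip relative to the unfiltered one, whereas subtracting the same site-wide amount from every page would never change the order:

```python
# Toy illustration of hypothesis 1 (per-page filter points).
# All figures are made up for the sake of the example.
pages = {
    # page: (raw relevance score, per-page filter points)
    "most-relevant.html": (90, 50),   # keyword-dense page, heavily filtered
    "sidebar-linked.html": (60, 5),   # fewer keywords, mostly anchor-text links
}

def rank(pages, filtered=True):
    """Order pages by relevance minus filter points (or by raw relevance)."""
    score = lambda p: pages[p][0] - (pages[p][1] if filtered else 0)
    return sorted(pages, key=score, reverse=True)

print(rank(pages, filtered=False))  # ['most-relevant.html', 'sidebar-linked.html']
print(rank(pages, filtered=True))   # ['sidebar-linked.html', 'most-relevant.html']
```

Under hypothesis 2, both pages would lose the same number of points, so the most relevant page would stay on top, which is exactly why the observed flips point toward a per-page filter.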
>>>>One last thing; I'm still astonished that Brett hasn't yet added a post to this thread.
Brett has known about canonical URL problems for ages (can't find the thread at the moment).
So he has protection in place for his older sites. With the newer ones, I guess he is more patient.
Brett has probably been here, seen it before and is chilled about the situation.
Looking at things, it seems a little test crawl may have gone out on Friday night / Saturday morning (UK time). It only scratched the surface of a couple of my sites; however, it looks like Googlebot handled things correctly on that crawl. hmmmmz
'Brett has probably been here, seen it before and is chilled about the situation.'
Exactly, so even more reason to drop a post here.
1:47 pm on Sept 22, 2005 (utc 0)
"Lets put it on ice till something more definative comes up."
And then the filter hit.
Maybe Brett has already spoken and you just weren't listening?
I personally think the ice has melted :-)
Just before Brett's post, I was excited about one of my sites that was reindexed by Google, and since that day it gets about 100 referrals from Mr. G (before that it was 0 for a while...).
So that brought me to WWW to post something about that great news. And yes, there were more websites with updates.
But after being put in the fridge by Brett I took some rest :) and thought about this:
Let's call this an update...
Let's wait for the huge SHAKE-up! :)
A lot of queries have the same page 1 results as a month ago, so IMO this was a little update to the algos and not a cewl shakeup like Florida etc.
Just thinking. Bye.
Very few people are reporting in this thread. This algo thingy doesn't seem to affect many sites at the moment. The 50 sites and 250 terms I monitor closely show only minor tweaks among the top-ten sites - nothing to talk about.
>>>>A lot of queries have the same page 1 results as a month ago, so IMO this was a little update to the algos and not a cewl shakeup like Florida etc.
Hmmmz - perhaps it was a little test crawl that has caught and sorted out some sites. As I say, I can only see Gbot scratching the surface.
However, I'm not sure why other sites have gone down - so this test, or whatever it is, might not get the green light.
Stabbing in the dark here, though.
What do the vanished/punished sites have in common? There must be a reason...
Maybe Google modified its "site is able to pay for a better listing rank" filter? Are you now expected to pay for Google's service? See [webmasterworld.com...]
But it's only a thought.
"Very few people are reporting in this thread. This algo thingy doesn't seem to affect many sites at the moment. The 50 sites and 250 terms I monitor closely show only minor tweaks among the top-ten sites - nothing to talk about."
I will agree with that; the "filters" or "tweaks" seem very targeted, and in the large majority of SERPs I'm not seeing any movement to speak of. But every once in a while I see a definite targeted change that I think is worth studying.
It seems, though, that a few "pet keywords" of a couple of members here have been affected.
> Filter is different per page (not site wide).
How about the possibility that the filter is site-wide but is applied through search phrases? Remember Florida, when all the changes were applied to 'money terms'. Also, the sandbox doesn't seem to affect ranking on really obscure terms. It may also mean that the filter has only been applied to a percentage of search terms, so entire themes have not been hit. A site-wide penalty may thus only have been applied to half of the phrases your site normally comes up for (and there could be worse to come if this is the case).
How can we speak of these filters and whether they are by page or by site, or by keyword, if we do not attempt to define what criteria they are filtering?
<filter is site-wide but is applied through search phrases >
That is a third possibility, and it fits with what I see on my two sites. A search term like "blue widgets" is doing the same without the filter, but "all widgets", which has the most relevant term in common, is filtered downwards.
It would mean that Google is calculating site-wide which search terms are used too much and tries to correct that with the filter. I use "all widgets" as a text link in my navigation bar, so there might be an overload. On the other hand, I also advertise on Google with most of the heavily filtered-out search terms, but I don't want to believe that is a factor yet.
I took five days off to change the filtered sites, but I still think it is too early to change text links and meta tags.
I still think the results we're seeing today began back in December 2004. I think that was just the first wave of whatever is going on.
I had a site hit hard back then that still hasn't fully recovered; if I'm lucky, this month it will see about 50% of the traffic it was getting this time a year ago, prior to whatever happened in December.
I've gone through and removed duplicate text from meta description tags, I removed a news article that was widely covered on the Internet in December, 2004 thinking Google might have hit me with a penalty because so many sites had the same thing, and I checked on anything major I had done prior to the crash in Google referrals.
I added a new forum in November, 2004 and had accidentally linked to it using http:/ /forum. mydomain.com in one spot, (spaces added to break up link) then using http:/ /my domain.com/forum in other spots. The forum linked back to my site as well. I finally deleted the subdomain link and did a 301 to mysite. com/forum to eliminate this from the possibilities.
The site does have some affiliate links on it, but over 90% of the site has no advertising whatsoever, including most of the content pages. Ads mostly appear on navigational pages. No black hat.
I'm at a complete loss as to why this site has been hit so hard. All of these things seem very trivial in my book, but with fussy Google these days, who knows. I think Google is forcing Webmasters who normally wouldn't think that much about SEO, and merrily go along creating new content pages to add to their sites, to now obsess about SEO.
Hey Google, ease up a bit, OK? We're spending so much time trying to figure out what we've done wrong, we can't add any new pages, which means they don't exist for you to index and grow your results. Think about it. It's hard to get bigger and better when the people who provide the material aren't adding as much new content...
What if it was relative to an average optimisation score for each search term? This would mean the less competitive the term, the easier it would be to get binned for over-optimisation. Imagine trying to work around that one...
Jon_king and Stargeek make a very good point that only a small number of us seem to be affected by this.
Those affected, feel free to sticky me your URLs so we can see if we can make any comparisons.
>>>Hey Google, ease up a bit, OK? We're spending so much time trying to figure out what we've done wrong, we can't add any new pages, which means they don't exist for you to index and grow your results.
I agree with the above statement.
We have been adding new content; however, nothing will rank. This supports the point that the penalty must be site-wide. I have tried adding pages in many different formats.
We get our results back when adding "&filter=0".
I am trying to change some of our meta descriptions and internal links. These have been scraped to death. Maybe making them different from the scrapers' copies will help.
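For anyone who wants to reproduce the "&filter=0" comparison mentioned above, here is a small sketch of building the two query URLs side by side. The parameter name follows this thread's usage; no claims are made about what Google's endpoint accepts beyond that, and the search phrase is just an example:

```python
from urllib.parse import urlencode

def google_query_url(phrase, unfiltered=False):
    """Build a Google search URL, optionally appending filter=0 to
    switch off the duplicate/penalty filter discussed in this thread."""
    params = {"q": phrase}
    if unfiltered:
        params["filter"] = "0"
    return "http://www.google.com/search?" + urlencode(params)

print(google_query_url("blue widgets"))
# http://www.google.com/search?q=blue+widgets
print(google_query_url("blue widgets", unfiltered=True))
# http://www.google.com/search?q=blue+widgets&filter=0
```

Running both queries in a browser and diffing the positions of your pages is the quickest way to see whether a drop is filter-related or a plain ranking loss.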
<an average optimisation score for each search term?>
A correction based on this site-wide optimisation score does fit with what I see on my sites. Then it is an overcorrection, because the text links within the site carry a lot of weight.
<These have been scraped to death. May making them different than the scrapers will help>
When that is the case it will help; it will bring you back to the first page, but then you will be scraped again and you have to start all over.
A lot of those scrapers have no problems with the filter on important search terms. Maybe a reason can be found somewhere there. But then all high-ranking pages would have to be changed every month, with the risk of losing the high rank. Running in front of the scrapers like that doesn't sound very appealing to me.
Those scrapers have something we don't have on our pages, something that lifts them above us in the SERPs.
I'm still not seeing any discussion about what the filters are actually filtering.
Are we talking about over-optimization (and if so, by what criteria? on-page keywords, HTML optimization, titles, meta tags, off-page irrelevant links, anchor-text repetition), some sort of CIRCA/LSI stuff, duplicate content, a little of everything, or something else entirely?
One obvious observation I can make: the allinanchor rankings have been lowered by about the same number of spots, in most cases, as the regular rankings. So if the ranking is number 78, the allinanchor rank is also similar.
> Are we talking about over-optimization (and if so, by what criteria? on-page keywords, HTML optimization, titles, meta tags, off-page irrelevant links, anchor-text repetition), some sort of CIRCA/LSI stuff, duplicate content, a little of everything, or something else entirely?
From what I see it's not much related to page optimization; I see only a preference for CSS sites.
For the sites I monitor, it is something related to links.
Sites that get too many links in a short period of time seem penalized.
Sites that got the same amount of links per month seem not to be suffering.
I have noted old sites that haven't got any links in the last two years and are maintaining the same rank.