So Google is preferring the sites that link to me - so really it is ranking directories and link farms.
This is making the outbound link more important than the inbound link:
so higher PR for inbound links but higher SERPs for outbound links.
I'm quickly going to link to Google, Amazon and Yahoo :)
ONE WEEK has passed since this update/no update started on 22nd September 2005.
It is sad to see sites with original content dropped out of the index or stripped of much of their ranking in the SERPs. Fellow members are losing revenue. Searchers are presented with more spam at the top of the SERPs.
A kind fellow member stickied me yesterday a keyphrase in German related to the travel sector and asked me to run the query on google.de as well as google.com and see for myself how spam has emerged victorious as a result of this update (so far).
A week after the start of this update/no update thing, there are maybe 1000's of duplicate pages, while at the same time original pages of our fellow members have been removed.
[edited by: lawman at 10:19 am (utc) on Sep. 29, 2005]
I'm seeing the same results on both AOL UK and Google UK for my sector.
I am up a creek without a paddle since last Thursday, like everyone else posting in this thread.
I have made several changes and will keep making daily updates to my site, which has lost 14,000 uniques a day from Google, though I will focus on other projects for Yahoo and just see what happens to my main site and Google. I feel there is nothing more I can do.
One last thing: I'm still astonished that Brett hasn't yet added a post to this thread. My reason for saying so is that he owns the board, has sites other than this one, has been around the block a few times, and would have experienced threads of this sort when Google changed things in the past.
[edited by: lawman at 10:24 am (utc) on Sep. 29, 2005]
This is what I see on my pages:
The pages with current filter problems have a lot of the keywords on them compared with more successful pages (they are also the most relevant, but that is something else). The pages with fewer troubles have far fewer instances of the keyword combination on them, but do have internal links containing those search words. Maybe anchor-text links have more weight now, disturbing the balance on pages that were optimal for those search words before?
I have two possible hypotheses to explain my filter problems in the SERPs at the moment:
1. The filter is different per page (not site-wide). Some pages have more filter points, some have fewer. The density of the search words on the pages with a lot of filter problems is too high according to Google. The search words used in links are added to the ones on the pages, so the pages most relevant on your site are filtered out (total result minus filter points). Pages with fewer search words on them but with some text links have less success in the unfiltered SERPs, but since there are fewer penalty points, their score is better in the filtered version.
2. The filter is site-wide. All pages on a site get the same amount of filter points. Some pages are still doing fine because they are so optimal for the current Google that, after deducting the filter points, enough points are left to place them high.
I have the impression that the filter problem is per page, because situation two can't explain the huge difference for some pages after filtering while leaving others with the same search words nearly unfiltered.
What do you think?
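The two hypotheses above can be sketched as a toy scoring model. To be clear, all names, thresholds, and point values here are invented for illustration only; nobody outside Google knows how the real filter works:

```python
# Toy illustration of the two hypotheses; every number is made up.

def per_page_filter(relevance, keyword_density, threshold=0.10, penalty=50):
    """Hypothesis 1: each page is penalised on its own keyword density."""
    filter_points = penalty if keyword_density > threshold else 0
    return relevance - filter_points

def site_wide_filter(relevance, site_filter_points):
    """Hypothesis 2: every page on the site loses the same filter points."""
    return relevance - site_filter_points

# Under hypothesis 1, the most optimised (most relevant) page can end up
# below a weaker page that stayed under the density threshold:
strong_page = per_page_filter(relevance=100, keyword_density=0.15)  # 100 - 50 = 50
weak_page = per_page_filter(relevance=70, keyword_density=0.05)     # 70 - 0 = 70
print(weak_page > strong_page)  # True: the less relevant page now outranks
```

Hypothesis 2 can't produce that reordering within one site, which is the poster's argument for the per-page version.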
Brett has known about canonical URL problems for ages (can't find the thread at the moment),
so he has got the protection in place for his older sites. The newer ones, I guess, he is more patient with.
Brett has probably been here before, seen it before, and is chilled about the situation.
Looking at things - it looks like a little test crawl may have gone out on Friday night / Saturday morning (UK time) - they only scratched the surface of a couple of my sites - however, it looks like Googlebot handled things correctly on that crawl. Hmmmmz.
"Let's put it on ice till something more definitive comes up!"
And then the filter hit.
Maybe Brett has already spoken and you just weren't listening?
So that took me to WebmasterWorld to post something about that great news... and yes, there were more websites seeing updates.
But after being put in the fridge by Brett I took some rest :) and thought about this:
Let's call this an update....
Let's wait for the huge SHAKE-up! :)
A lot of queries have the same page 1 results as a month ago... so IMO this was a little update to the algos and not a cool shake-up like Florida etc.
Just thinking out loud, bye.
Hmmmz - perhaps it was a little test crawl that has caught and sorted out some sites. As I say, I can only see Gbot scratching the surface.
However, not sure why other sites have gone down - so this test, or whatever it is might not get the green light.
Stabbing in the dark here though.
Maybe Google modified its "site is able to pay for better listing rank"-filter? Are you now expected to pay for Google's service? See [webmasterworld.com...]
But it's only a thought.
I will agree with that; the "filters" or "tweaks" seem very targeted, and in the large majority of SERPs I'm not seeing any movement to speak of. But every once in a while I see a definite targeted change that I think is worth studying.
It seems, though, that a few "pet keywords" of a couple of members here have been affected.
Filter is different per page (not site wide).
How about the possibility that the filter is site-wide but is applied through search phrases - remember Florida when all the changes were applied to 'money terms'. Also the sandbox doesn't seem to affect ranking on really obscure terms. It may also mean that the filter has only been applied to a percentage of search terms and so entire themes have not been hit. So a site-wide penalty may only have been applied to half of the phrases your site normally comes up for (there could be worse to come if this is the case).
That is a third possibility. It fits with what I see on my two sites. A search term like "blue widgets" is doing the same without the filter, but "all widgets", which has the most relevant term in common, is filtered downwards.
It would mean that Google is calculating site-wide which search terms are used too much and tries to correct that with the filter. I use "all widgets" as a text link in my navigation bar, so there might be an overload. On the other hand, I also advertise on Google with most of the heavily filtered-out search terms, but I don't want to believe this is a factor yet.
I took five days off to change the filtered sites, but I still think it is too early to change text links and meta tags.
I had a site hit hard back then that still hasn't fully recovered; if I'm lucky, this month it will see about 50% of the traffic it was getting this time a year ago, prior to whatever happened in December.
I've gone through and removed duplicate text from meta description tags, I removed a news article that was widely covered on the Internet in December, 2004 thinking Google might have hit me with a penalty because so many sites had the same thing, and I checked on anything major I had done prior to the crash in Google referrals.
I added a new forum in November, 2004 and had accidentally linked to it using http:/ /forum. mydomain.com in one spot, (spaces added to break up link) then using http:/ /my domain.com/forum in other spots. The forum linked back to my site as well. I finally deleted the subdomain link and did a 301 to mysite. com/forum to eliminate this from the possibilities.
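For anyone consolidating a subdomain onto a subdirectory like that, a minimal Apache mod_rewrite sketch of the 301 (the domain names are placeholders, and this assumes mod_rewrite is enabled and the rules sit in the subdomain's .htaccess; adapt to your own setup):

```apache
# Hypothetical .htaccess for forum.example.com:
# permanently redirect every request to example.com/forum/... so Google
# sees only one canonical URL for the forum.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^forum\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/forum/$1 [R=301,L]
```

The `R=301` flag is what makes it a permanent redirect rather than the default temporary (302) one.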
The site does have some affiliate links on it, but over 90% of the site has no advertising whatsoever, including most of the content pages. Ads mostly appear on navigational pages. No black hat.
I'm at a complete loss as to why this site has been hit so hard. All of these things seem very trivial in my book, but with fussy Google these days, who knows. I think Google is forcing webmasters who normally wouldn't think much about SEO, and who would merrily go along creating new content pages for their sites, to now obsess about SEO.
Hey Google, ease up a bit, OK? We're spending so much time trying to figure out what we've done wrong, we can't add any new pages, which means they don't exist for you to index and grow your results. Think about it. It's hard to get bigger and better when the people who provide the material aren't adding as much new content...
I agree with the above statement.
We have been adding new content. However, nothing will rank. This brings up the point that the penalty must be site-wide. I have tried adding pages in many different formats.
We get our results back when adding "&filter=0".
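For anyone who hasn't tried it: `filter=0` is a query-string parameter on the Google results URL that switches off the duplicate/similar-results filtering, so you can compare the filtered and unfiltered SERPs side by side. With a hypothetical query it looks like this:

```
http://www.google.com/search?q=blue+widgets            (default, filtered)
http://www.google.com/search?q=blue+widgets&filter=0   (filtering disabled)
```

If a page reappears only in the second URL, that is a strong hint it is being suppressed by a filter rather than missing from the index.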
I am trying to change some of our meta descriptions and internal links. These have been scraped to death. Maybe making them different from the scrapers' copies will help.
When that is the case it will help; it will bring you back onto the first page, but then you will be scraped again and have to start all over.
A lot of those scrapers have no problems with the filter on important search terms. Maybe a reason is to be found somewhere there. It would mean all high-ranking pages should be changed every month, with the risk of losing the high rank. Like running ahead of the scrapers - that doesn't sound very appealing to me.
Those scrapers have something we don't have on our pages to lift them above us in the SERPs.
Are we talking about over-optimization (and if so, by what criteria? on-page keywords, HTML optimization, titles, meta tags, off-page irrelevant links, anchor-text repetition), some sort of CIRCA/LSI stuff, duplicate content, a little of everything, or something else entirely?