Forum Moderators: Robert Charlton & goodroi
"I think some of us are seeing a different variation. I know my site, and at least one other site on this thread, ranks #1 if you type in the domain name, or domain.com. But actual pages, searched by title, that used to be in the top 10 SERPs are now about 10 pages back, in the 100+ range."
This is exactly what I'm seeing. I'm holding many keywords fine but all my big keywords are gone most of the time.
Our site is swinging up and down all day long. Sometimes I search for red widget and we are not on page 1 but are within the top 4 pages; the same search an hour later has us blasted back to the bottom of the results.
Other keywords seem stuck at the bottom.
As a result our Google traffic is down more than 80%.
We have lost many keywords that we have held #1 on for years. How can we go from #1 to the bottom, or not even showing, overnight?
I have done nothing to deserve this and can't seem to figure out why this happened.
[edited by: tedster at 7:17 pm (utc) on Mar. 2, 2007]
The problem with that theory is that the sites I am looking at seem not to rank for any term at all: every term across the board except the domain name (and sometimes not even the domain name). The sites are gone, and not always at the end of the results.
Also, this first appeared during Jagger in 2005, around the time of the duplicate content penalty. It is also quite possible that Google mixes filters to throw off any attempt at analysis.
Ouch! And I used to have "related widgets".
Maybe I should remove everything but that doesn't help the user navigation/experience.
And if that is one of the issues that triggers the filter/penalty all the blog-style sites should be affected.
[edited by: Biggus_D at 3:02 pm (utc) on Mar. 1, 2007]
Just saw this and it seems borderline AI if not totally cool AI.
Just did a search for:
java widget to other type of widget
and the first result is:
widget — String: ToString() Returns a string representation of this widget object.
According to ....
I do similar searches regularly and this has got to be new.
OK, yes, come to think it, this looks like the "950 penalty" happening to our site. Thanks for posting that link, I'll read it in more detail later.
EDIT: OK, wait. I'm not seeing ANY of our results as low as 950.
I do see the bit where 3/4 of the pages we used to rank for dropped down about 10-17 pages (in the 100-200 results range). Nothing at the "end of results" though as far as I can tell.
We have the "red, blue and yellow widgets" pages gone, but the orange widgets page is fine.
It's not the "Minus 30" penalty, either.
Just for clarification.
[edited by: Undead_Hunter at 4:34 pm (utc) on Mar. 1, 2007]
Especially if a site has a long "laundry list" style of menu, it can be easy to overlook excessive repetitions.
Point well taken, Tedster. On Google Groups I've seen cases of people complaining about Google not indexing all of their site, and it appeared they had duplicate content issues, like listing the same products in different permutations on different pages.
I would certainly expect Google to refrain from indexing duplicate content pages, instead of imposing a penalty. As Googlers have mentioned, penalties are becoming ever more rare, in favour of not indexing pages, or discounting links, etc.
[edited by: Martin40 at 5:11 pm (utc) on Mar. 1, 2007]
All those Linux distros are essentially duplicate content in large part, yet they can be so different.
I have seen a Wiki clone for mobiles that was completely useful. I have seen Wiki forks taken in particular directions, changing the content toward, say, psychology, explaining why such and such an event occurred.
Wikispecies is loads of duplicate content, but useful to systematic zoologists.
And duplicate text content can be enriched by media like video and Flash.
I think 90% of all duplicate content issues are entirely legitimate.
I have a lot of pages, I can't track all of them but I've seen:
- nº1
- the infamous last position
- nº1 and the last position when searching 3 keywords. This is with 2 different articles, though not always the most important/relevant ones for those keywords; I'm unable to locate the other articles via search.
- right now some keywords are not exactly in the last position, seems like a new victim has taken my place
- 500-700 range
- 200 range
And doing random searches:
- 41
- 81
BTW, I've also written to Google but I'm sure that I'll die before I get an answer.
[edited by: Biggus_D at 6:05 pm (utc) on Mar. 1, 2007]
Reading the main article I'm unable to find anything spammy.
The widget is named 3 times in a brief article of 107 words.
Does anyone know how many times you can use the widget name?
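For what it's worth, that ratio is easy to compute for your own pages. Here is a minimal sketch (Python, purely illustrative; Google has never published any density threshold, so the number by itself proves nothing, and the sample article is invented):

```python
import re

def keyword_density(text, phrase):
    """Count whole-phrase occurrences and return (hits, total_words, percent)."""
    words = re.findall(r"[a-z0-9'-]+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    percent = 100.0 * hits * n / len(words) if words else 0.0
    return hits, len(words), percent

article = ("The blue widget is the best widget around. "
           "Buy a blue widget today and tell your friends about blue widgets.")
print(keyword_density(article, "blue widget"))
```

For the article described above, a single-word phrase named 3 times in 107 words works out to roughly 2.8% of the text, which hardly looks like keyword stuffing.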
Unless of course they link to their Blue Widgets page with "Blue Widgets - Click here for Blue Widgets and the best in Blue Widgets Online from Blue-Widgets.biz" :)
Seems to work extremely well since last weekend. Try it out, I'd say. (No, just kidding, enough with this mess :) )
Anyway, even from the user perspective I am extremely frustrated, since I can't find what I am looking for. Especially if I search for 3-5 word keyphrases.
Since Google was the only way I could quickly find certain things, I'm left with asking on forums :) which is OK but wastes my time.
I still can't believe what I'm seeing but saying it won't help I guess.
As I said, I am a basic webmaster, BUT I would like someone to explain the following to me:
I find text link brokers very easily, and when I look around the search optimization industry I find mostly sites that sell text links (and certainly don't hide it). Some have been moved to page 2 but are still there; it's not always black and white, of course.
>> But why are text link brokers/buyers ranked at all? I alone can uncover whole networks in a few hours. Why not get rid of them once and for all?
Here is my logic: writing good content takes time and effort, becoming popular takes time and effort, even link exchange takes time and effort, while buying links takes only money. It's not fair to reward that with organic rankings. In addition, from a purely financial/AdWords perspective, those buying text links would have the cash to buy sponsored links.
Also, and again, I read MC's blog and he seems to believe that Google is very good at recognizing paid text links.
>> Maybe, but why is it that most buyers got a big jump ahead and rank great recently?
I'm not going to repeat what I said before, but the content of the pages that moved up is really not the best, and certainly not worth demoting other sites as "collateral damage".
When I see pages with fewer than 50 words that rank great for some competitive keywords, and that are not even popular, I don't see what the idea is behind that.
That's a shortcut; there are many, many results with pages that bring nothing to the searcher, in my opinion, except frustration.
Does MC take private consultations? I'd like to show 50-60 results since last week (moreover since January) and see if there are any justifications to that :)
I am just kidding of course... it's probably my way of releasing the last 2 hours of frustration at not being able to find some javascript/ajax article I was looking for! I probably typed 70 times almost the same thing, to finally give up and ask a real human on a forum.
I probably expect too much from Google, but I got used to the comfort of typing 2-3 words and finding what I am looking for on page 1. For a few days, for the first time, I browsed pages 3, 4, 5... which actually was refreshing before it became frustrating :)
The good part is that some sites remain stable and are really worth keeping. But overall it's a massacre in my opinion for lots of medium size websites that write their own content and try to be usable and interesting without cheating. Pretty depressing.
OK, not as bad as Yahoo or MSN, but we were used to more logic and quality from Google... hey, it's Google! We are used to better than that.
"I would certainly expect Google to refrain from indexing duplicate content pages, instead of imposing a penalty. As Googlers have mentioned, penalties are becoming ever more rare, in favour of not indexing pages, or discounting links, etc."
Yep, and it's a good move, but it needs to be done correctly. For a few weeks I have seen RSS feeds ranking higher than the HTML pages they are taken from; HTML pages which have gone supplemental at best.
Well, sorry, I (and probably most users) prefer reading in a browser to reading in an RSS feed reader.
Google is becoming good at one thing: refreshing. They are more dynamic than they used to be, a bit like MSN.
I hope that in such cases they will also be faster in understanding that a site doesn't deserve to be filtered when it contains useful information and pleases users. Isn't that the point? Pleasing users?
BTW, I've also written to Google but I'm sure that I'll die before I get an answer.
Many quality sites deserve better communication than what Google search is providing right now.
I also observe that for many keywords good content is sinking while less relevant sites get shoved to the top ranks.
I agree. IF this is phrase-based spam detection -- and we don't have certainty on that -- then the issue becomes how a false positive (as the site owner would see it) could be tripped. The patent appears to be looking for autogenerated pages (Markov chain stuff) and low-value MFA content....
Tedster, this is great info. Thank you. I do feel like I may have been caught in the crossfire and have been hit with a "false positive". My site has great internal linking, but perhaps this was my downfall and I went too far with it. I am going to go out on a limb and do a major overhaul; I figure at this point I don't have anything to lose.
It seemed that as of late Feb. things had settled down, and I even posted about it, but for our site at least it appears to have reared its head again. Anybody else? Let's keep our ears to the ground. Definitely track and store your SERP data, not just on Google but also on the smaller engines like Yahoo and Live. ;)
Has anyone else seen this trend again w/ Google?
[edited by: tedster at 7:22 pm (utc) on Mar. 2, 2007]
[edit reason] moved from another location [/edit]
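On the advice above about tracking and storing SERP data: even a flat CSV log is enough to make the day-to-day swings described in this thread visible. A minimal sketch in Python (the filename and column layout are my own invention; the rank observations themselves have to come from manual checks or an API you are actually permitted to use, since automated scraping of results pages violates the engines' terms of service):

```python
import csv
import datetime
import os

def log_rank(log_file, engine, keyword, position):
    """Append one dated rank observation; create the file with a header if new."""
    is_new = not os.path.exists(log_file)
    with open(log_file, "a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["date", "engine", "keyword", "position"])
        writer.writerow([datetime.date.today().isoformat(),
                         engine, keyword, position])

def rank_history(log_file, engine, keyword):
    """Return the recorded positions for one engine/keyword pair, in order."""
    with open(log_file, newline="") as f:
        return [int(row["position"]) for row in csv.DictReader(f)
                if row["engine"] == engine and row["keyword"] == keyword]
```

Logging once a day per keyword per engine is enough; after a couple of weeks the history shows at a glance whether a term is stable, oscillating between page 1 and page 10, or stuck at the end of the results.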
I don't know what was turned on last week but this is really wrong.
If the goal is to generate PPC based clicks only, please let me know...
Organic geo-targeting is totally wrong, anti-spam is badly wrong, CJK language filters are off, and anti text-link brokerage is not working at all.
This is all encouraging spam. I hate spam; it wastes my time.
I'm starting to think that the guy who sent me a PM on this forum telling me "he was a beta tester at Google, their algo is out of control" was damn right!
I even feel for some of our competitors that got their homepage hijacked or hacked and are nowhere to be seen anymore.
There are some niche markets where it's really looking bad now. But even for some extremely competitive keywords there are websites ranked on page one that make me feel dizzy; I am speechless.
(I would like to stop complaining, but Google hasn't been helping for a few weeks. I hope they don't take webmasters' comments as jokes and start facing what's going on.)
I absolutely share your frustration with the low-quality SERPs, but we do have to remember that Google is an advertising business and increasing revenue is now its top objective.
The fact that it controls the search market means its impact is felt more than it would be if search use were shared across all the search engines; quality would then matter more.
Currently, I notice that a lot of results no longer contain the keyword in the title tag, certainly far fewer than we used to see. Yet the PPC adverts all do!
Yesterday I was looking for information and costs on a certain type of equipment. Not one of the top ten results was relevant. Some were about the subject matter, but not one on-target match was listed. Meanwhile every PPC advert was "keyword keyword widgets here", "buy keyword keyword widgets", "low cost keywords here", "keyword keyword information here", etc.
Meanwhile on Yahoo the same search delivered on-topic results, and MSN search, to be fair, wasn't too bad either.
This is the first time I have ever seen this. In my mind Google needs to roll the SERPs back to last September; currently they are not delivering the quality, and it does look like the focus is on AdWords, not search quality, deliberate or not.
"I am just kidding of course... it's probably my way of releasing the last 2 hours of frustration at not being able to find some javascript/ajax article I was looking for! I probably typed 70 times almost the same thing, to finally give up and ask a real human on a forum. I probably expect too much from Google, but I got used to the comfort of typing 2-3 words and finding what I am looking for on page 1."
As a test I just looked at a post of Matt Cutts:
[mattcutts.com...]
There is a link to [adwords.blogspot.com...]
On Live.com you can find it [search.msn.nl...] by searching on "actionable information".
In Google you can't find Matt or adwords.blogspot.
[edited by: Markoi at 1:24 pm (utc) on Mar. 3, 2007]
I agree, and I am not sure if they do it on purpose or not.
My theory was that they were pulling up the cr*p for a few days in order to filter it later on.
Eventually they would monitor how users react to the low quality, then put the filters back on and use the "low quality site" visitor data to automatically re-order rankings in the coming months.
They would make sure that some human-reviewed authority sites stay in place to keep their market share in the meantime, so the mess would go on, especially in niche markets and/or on long-tail keywords (is 3 words really so long as to generate such a mess?).
But now it's been going on for some time. This weekend I was expecting things to be better, but no... nonono, it's worse.
So is this a test, and they need more data? If it's not a test, then I think things are going to be totally out of control for months, possibly until 2008.
It's weird... I feel like I am posting a message on the Yahoo or MSN forum a year ago, but nope, I am talking about Google here :)
Even when I look deeply into our own sites I still don't understand how our abandoned/less popular sites get great rankings while others are slammed and go to hell. I even looked at web analytics data and it doesn't make any sense.
It also looks like spam sites hit during the Jagger update (I think) are coming back. I am telling you guys, it makes no sense at all unless this is a test. (I'm starting to think it's not.)
Search engines can't be perfect, but this is far beyond the worst I could have expected from Google, which I had learned to like and respect over time until now.
Well anyway, it's their call; they will keep on making money anyway and they know it.
But from my perspective this is wrong and frustrating for both users and webmasters.
Oh well... who am I to tell PhDs what to do; they must be smarter than their users...
...
Looking at the number of "search results" for some of our keywords, there must be a whole lot of webmasters out there scratching their heads right now. We ranked number 2-5 for one search term that was delivering around 30 million results. Currently it's about 18 million results, and we are in the group that is not showing up. So did Google lose 12 million sites?
In my honest opinion (which is a little biased): my website is the very best, most up-to-date, most user-friendly, most feature-rich site in the industry.
Bad luck I assume...
Regards
itloc
site:www.example.com
returns say 45,000
but
site:www.example.com majorkeyword
returns 55,400
I will say this... nowhere in our major navigation is that major keyword. And why would the site: command show more results with a keyword than without? I've never seen it do that before, and there have been no navigational changes to our site at all.
My site has started to alternate between the Google UK and web searches. It disappears from one and immediately appears in the other. This lasts for a few days and then vice versa. Not being an expert, I'm completely baffled.
Personally I think Google has us in a different timezone sometimes... some days the phones are quiet most of the day until after a certain time, but last year the phones were ringing off the hook during peak traffic periods. Traffic is the same, but people aren't calling... weird.
On another note, I've noticed a lot of fresh cache dates... I mean A LOT, for March 3rd.
[edited by: Bewenched at 2:10 pm (utc) on Mar. 5, 2007]