We have collectively lurched between one conspiracy theory and another - got ourselves into a few disagreements - but essentially found ourselves nowhere!
Theories have involved AdWords (does anyone remember the 'dictionary' concept? Now past history).
A commercial filter, an OOP filter, a problem caused by mistaken duplicate content, theories based on the contents of the Directory (which is a mess), doorway pages (my fault mainly!) etc. etc.
Leading to the absurd concept that you might be forced to de-optimise, in order to optimise.
Which is a form of optimisation in itself.
But early on, someone posted a reference to Occam and his razor.
Perhaps - and this might sound too simple! - Google is experiencing difficulties.
Consider this: if Google is experiencing technical difficulties regarding the sheer number of pages to be indexed, then the affected searches will be the ones with many results to sort. And the searches with many results to sort are likely to be commercial ones, because there is so much competition.
So the proposal is this:
There is no commercial filter, there is no AdWords filter - Google is experiencing technical difficulties in a new algo due to the sheer number of pages to be considered in certain areas. On-page factors have suffered, and the result is Florida.
You are all welcome to shoot me down in flames - but at least it is a simple solution.
My thoughts exactly - the filter needs to be given many examples of the 'good' as well as the 'bad.' It may be very sophisticated, but the effects that I've seen could be as easily explained by simply switching off allinanchor for sites that match a pre-determined search term.
Perhaps we are giving them (the Google engineers) too much credit.
I'm still floundering about trying to rationalise the motivations and the methods being employed. There are two main motivations that I think could be at the bottom of all of this.
1. Overcoming a problem which would otherwise affect the flotation.
2. Pushing advertising products into growth and profitability so that a major opportunity can be demonstrated prior to public flotation.
Or a mixture of the two.
The problem is the ease with which results can be manipulated through anchor link text. All of those articles about Google bombing (the term 'miserable failure' ranking George Bush's resume on the White House web site #1) must make investors nervous. 'If the Google bubble is being filled with mischievous blogging, when will it pop?' might be a question in an investor's mind.
If Google wanted to overcome this problem in the commercial areas that really count then they could simply remove allinanchor from the algorithm for "protected terms" and Google Bombing would be defused for those terms.
If Google wants to maximise the opportunities that it has bought with Applied Semantics (broad matching of advertising, domain parks delivering AdSense ads), then it can only do this if it reduces all of the crap floating about by virtue of the ease with which sites could be pushed up SERPs by manipulating anchor text.
So if you remove allinanchor from the equation in search terms with commercial value, you address all of these things at the same time. You target this algo shift at the areas most open to abuse, while at the same time opening up an opportunity for yourself in these same areas, which coincidentally are also the terms with the biggest ad spend.
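The theory above can be restated as a toy scoring sketch. Everything here (the weights, the `PROTECTED_TERMS` set, the signal names) is invented purely to illustrate the idea of switching off the anchor-text signal for commercially valuable terms; it is in no way Google's actual algorithm.

```python
# Hypothetical sketch of the "drop allinanchor for protected terms" theory.
# All names and weights are made up for illustration.

PROTECTED_TERMS = {"cheap hotels", "real estate", "buy widgets"}

def score(page, query):
    """Combine on-page and anchor-text signals into one relevance score."""
    on_page = page["on_page_relevance"]      # body/keyword match
    anchor = page["anchor_text_relevance"]   # allinanchor-style signal
    if query in PROTECTED_TERMS:
        # The theory: for commercially valuable ("protected") terms the
        # anchor-text signal is simply switched off, defusing Google
        # bombing and anchor-text manipulation for those terms.
        return on_page
    return on_page + 2.0 * anchor            # anchor text normally dominates

optimised = {"on_page_relevance": 1.0, "anchor_text_relevance": 5.0}
plain     = {"on_page_relevance": 3.0, "anchor_text_relevance": 0.5}

# On a non-protected term the anchor-heavy page wins...
assert score(optimised, "blue widgets") > score(plain, "blue widgets")
# ...but on a protected term the rankings flip: "Florida" in miniature.
assert score(plain, "buy widgets") > score(optimised, "buy widgets")
```

A site that had invested everything in anchor text would drop overnight on exactly the terms with the biggest ad spend, which is the pattern the poster is describing.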
It doesn’t have to be more complicated than that, but it probably is.
[edited by: Hissingsid at 6:07 pm (utc) on Dec. 15, 2003]
I agree also. And this includes commercial searches. If I search for any randomly selected product that tends to be sold on the Net, I have no trouble finding sellers. Not that there aren't some lousy SERPs. However, some of those existed 2 months ago also. It may be that in any update, it is the people whose sites did badly who tend to post here. The winners don't see any reason to change what they are doing, while the losers post here trying to figure out what to do now.
What you're referring to is 'winners' justice' - and of course people who are happy are less likely to complain! It's an obvious argument, a facile argument, it is well understood - and I hope we don't have to listen to it any more! (Worth noting that it has also held many dictators in power: "people who complain - well, they're obviously the ones that are dissatisfied.")
And your main site rfgdxm1 - unmolested by Florida? - hardly surprising as it is about recreational drug use as far as I can work out. You're hardly in a position to make this an e-commerce site are you?
>I have no trouble finding sellers
I am not seeing the variety of quality sellers that I did before - which is frustrating. Google was the great "leveller" by letting us get a lot of great information in a compact area.
The shopping comparison sites, directories, etc. that are frequenting the SERPs now do not work for many products. They work great for brand name products with a specific model number - things you can buy at your local Wal-Mart - commodities, in other words.
But there is so much else out there in the world to be bought that can't be objectified and plunked into a Froogle-like format. And these are many of the sites that no longer appear.
Applying semantics to an algo may destroy the access to many quality niche sites.
But I was specifically *not* referring to how my sites are doing. Or even other non-commercial info sites. What I was saying is that playing the role of the hypothetical consumer with credit card in hand searching on "buy widgets", replacing "widgets" with commonly sold items on the Net, I am not having problems finding numerous widget sellers. A lousy e-commerce SERP is when the searcher enters "buy widgets", and the top 10 are dominated by sites with all the sonnets of Shakespeare, the genetics of slugs, and other crud that nobody looking to buy widgets would want to visit. I'm just not seeing many lousy commercial SERPs like that on Google.
Then what SERPs are affected? If you are right, that's odd, since I'd expect Google to try and tweak the algo for the most spammed-out SERPs. If it isn't highly competitive, typically it isn't heavily spammed.
>Then what SERPs are affected?
I think many of the commercial SERPs have been affected. It's just that on competitive terms, I think site owners are regrouping, bailing, etc. because, by definition, they were already in a highly competitive area.
But what does a niche commercial site owner do when her site is buried beneath marginally relevant, even pathetic, commercial SERPs for a multiple-keyword phrase? Complaining is a legitimate avenue of expression in this case.
Some commercial searches are equivalently good. As a consumer, I've had good, successful shopping experiences this December. As a curious Google user, I compared a couple of serps with the -foo -foo thing back when that was working reliably and found that the searches post-Florida were different but equally good or slightly better. I've also been stickied two serps that were awful post-Florida and had been more relevant previously. One of these has since recovered itself to a good place, the other still looks pretty bad.
My opinion is that overall the new algorithm has created slightly better results. In most cases, the new results are equally good or better, but there has been a lot of shuffling around, so some sites fell out of the top 20 and others newly entered the top 20. Obviously the fallers are angry and the risers are keeping their glee to themselves so as not to get lynched. And in some cases, there are exceptions where the results are very poor indeed and the webmasters stuck in those niche markets are understandably hysterical while Google tries to work things out. Unfortunately, I'm still worried that the volume of the webmasters crying wolf is going to drown out the frantic webmasters trying to report genuinely bad serps. A surprising number of people are holding up perfectly good, relevant, useful to the user search results as lousy ones, and thus diminishing the attention paid to the presumably-buggy ones. :-o
They seem to be filtering out so many sites with relevant "maybe affiliate" content that what's left is totally irrelevant at times. What I have always liked about Google was the amount of pages it listed. Seems a shame to filter most of it out.
[edited by: Marcia at 6:58 pm (utc) on Dec. 15, 2003]
[edit reason] No pointing out specifics, please. [/edit]
I have maintained, risen and come back from obscurity, but the serps around me are bad. Being one of the few relevant results is great for me, but in my industry, the serps are still populated with pseudo-directories.
>Unfortunately, I'm still worried that the volume of the webmasters crying wolf
What about those crying that there is no wolf?
>A surprising number of people are holding up perfectly good, relevant, useful to the user search results as lousy ones, and thus diminishing the attention paid to the presumably-buggy ones. :-o
True, if for several categories Google just wants to be the search engine for directories.
They won't be e-mailing Google in large numbers saying how much they like the SERPs. The truth is, Google is likely mostly worrying about, and looking for, people who are voting with their mouse and using competing search engines. If the searchers are unhappy, this should be visible in the trend of the number of daily searches. People do know about msn.com and ask.com, and if they think those are better than Google, they will switch. I remember when hardly anyone knew about Google, and Altavista and Excite were popular.
Did you see the 10th result on that search for "<snipped>"? The title is "Page Title" and the site itself explains that it is under reconstruction. It happens to be hosted here in Italy on Italia Online (IOL), sort of like Geocities, which offers free hosting for its members.
Try searching for "<snipped>" and check out the results. After the first result the next 5 are the exact same site but registered with different domains (they also change the background theme to make things really attractive).
In fact, about 80 of the top 100 results are the exact same site!
What does that say about the quality of results Google is providing?
Travel and accommodation are only the tip of the iceberg; many have pointed out the disastrous ecommerce SERPs and the real estate farce.
I can understand Google trying to weed out spam and provide good results but they haven't succeeded in my opinion. The "<snipped>" searches as one of our senior members pointed out last week are still coming up roses, but where do you turn to when you need to find a hotel?
[edited by: DaveAtIFG at 7:07 pm (utc) on Dec. 15, 2003]
[edit reason] Snipped specifics [/edit]
I think you hit the nail on the head.
Remember the public outcry when AV started integrating paid listings into its results? What a PR nightmare for them.
The Internet is not very forgiving, and change happens very quickly. It's probably on the leading edge of "What have you done for me lately?"
Everyone treated Google like they were God, that is until their traffic disintegrated at the peak of the most important traffic period of the year... :)
Now many have gone from "Google doesn't owe anybody anything", to "Google owes me because my traffic went away, and I might have to get a day job again"...
It's odd how affiliate-driven sites such as Epinions, Dealtime, Kelkoo etc. don't get hit? Maybe it's because of the advertising money they pay Google. Just a guess.
I mean, check out Kelkoo for example. Just about every backlink it has is from cross-linking with its German / UK sites. But because of its advertising money it never gets hit. Google doesn't manually alter the serps, dream on.
Take custom widget, where custom is the adjective and widget the noun.
Most of my phrases were set up this way and have noticeably gone down in ranking.
Now, if I use 2 nouns as a phrase, it seems my sites are ranking higher, as in birthday widget (2 nouns).
Could it be a grammar type of algo (filter) that is in place?
2 nouns (plastic widget) -- good
2 nouns 1 adjective (custom plastic widget) -- pretty good
1 noun 1 adjective (custom widget) -- not good
Just some thoughts. I am not an English major by any stretch. Just going by my personal kws and was wondering if anyone else has seen this. I know plastic can be used as a noun or an adjective, but this is what I am seeing...
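The pattern described above can be written down as a toy classifier. The part-of-speech lexicon and the pattern-to-outcome mapping below are invented solely to restate the observation (adjective-noun phrases down, noun-noun phrases fine); this is one poster's hypothesis, not Google's known behaviour.

```python
# Toy sketch of the "grammar filter" observation. The POS lexicon and the
# pattern-to-outcome mapping are made up purely to restate the hypothesis.

POS = {
    "plastic": "ADJ_OR_NOUN",  # can be either, as the poster notes
    "custom": "ADJ",
    "birthday": "NOUN",
    "widget": "NOUN",
}

def pattern(phrase):
    """Tag each word, defaulting unknown words to NOUN."""
    return tuple(POS.get(word, "NOUN") for word in phrase.split())

def predicted_effect(phrase):
    """Map the phrase's grammatical shape to the poster's observed outcomes."""
    tags = pattern(phrase)
    if tags == ("ADJ", "NOUN"):
        return "not good"        # 1 adjective + 1 noun: rankings dropped
    if len(tags) == 3 and tags[0] == "ADJ":
        return "pretty good"     # adjective + 2 nouns
    return "good"                # noun + noun phrases rank higher

assert predicted_effect("custom widget") == "not good"
assert predicted_effect("birthday widget") == "good"
assert predicted_effect("custom plastic widget") == "pretty good"
```

Of course, real part-of-speech tagging is context-dependent (plastic is both noun and adjective), so even if such a filter existed it would be noisy.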
>Truth is Google likely mostly is worrying about, and looking for, people who are voting with their mouse and using competing search engines. If the searchers are unhappy, this should be visible in the trend of the number of daily searches.
Just as important is keeping the webmasters happy. If we can't get results on Google then we will ban their robots and switch to Ink etc. Then Google will go the way of Alta Vista. Google relies on us to produce their results.
It does appear that there is a positive correlation between a webmaster's general satisfaction with Google SERPs, and how well their specific sites are doing in them. ;)
There is a large - or at least vocal - contingent of posters here who have no clue what happened in the travel industry with the most recent update. So I disagree with you about your interpretation of what people mean versus what they are saying. But that's life. I'll get over it :)
Brett's mention of the applied semantics purchase and the introduction of this technology into the algorithm is spot on.
While it is certainly not the only new change, many of the strange results can be attributed to it. The "commercial filter" for one, could easily be a direct result of the incorporation of the CIRCA technology into Google's algorithm.
If anyone hasn't read the white paper on this, here is a quote:
Currently, the Applied Semantics Ontology consists of more than half a million distinct tokens, over two million unique terms, and approximately half a million distinct meanings.
Terms = one or more words strung together with a unique meaning.
Applied Semantics was originally targeting the advertising market, so obviously this original db of terms and meanings is going to be geared toward profitable (read "commercial") terms.
The "commercial filter" wasn't a filter, but the direct effect of a limited db of meanings being applied through the CIRCA technology.
Anyone who hasn't read the white paper may want to, as it may well shed some light on some of the current oddities with Google.
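To make the "limited ontology" explanation concrete, here is a minimal sketch. The `ONTOLOGY` entries and the reranking step are entirely made up: the point is only that a term base geared toward commercial vocabulary would rescore those queries while leaving niche terms untouched, which from the outside would look exactly like a "commercial filter".

```python
# Hypothetical sketch of the "limited ontology" theory. The entries and the
# concept_match signal are invented for illustration only.

ONTOLOGY = {
    # Commercial terms were in Applied Semantics' original advertising db...
    "cheap hotels": {"meaning": "travel/accommodation"},
    "real estate": {"meaning": "property/sales"},
    # ...but obscure niche terms were not, so they have no entry here.
}

def rerank(query, results):
    """Apply semantic reranking only when the query has a known meaning."""
    if query not in ONTOLOGY:
        return results  # no meaning on file: the old rankings pass through
    # Commercial queries get rescored against the concept, shuffling results;
    # to an outside observer this looks exactly like a "commercial filter".
    return sorted(results, key=lambda page: page["concept_match"], reverse=True)

results = [{"url": "a", "concept_match": 0.1},
           {"url": "b", "concept_match": 0.9}]

# Niche query: untouched. Commercial query: reshuffled.
assert rerank("obscure niche term", results) == results
assert rerank("cheap hotels", results)[0]["url"] == "b"
```

Under this reading, the uneven results are not a deliberate commercial penalty but the side effect of a meanings database that simply has better coverage of commercial vocabulary.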