Google News Archive Forum

This 180-message thread spans 6 pages; this is page 4.
-nonsense stopped working now
zacsg
msg:148854
5:31 am on Dec 11, 2003 (gmt 0)

"kw -fwefer [webmasterworld.com]" now returns the same result as "kw". at least from here in Singapore. anyone else see it?

 

I_am_back
msg:148944
2:54 am on Dec 13, 2003 (gmt 0)

What causes 90% of sites to disappear?

Why not ask Google? Time after time I hear "My site is completely gone from Google", then when checking 'the site' it is NOT gone at all; they are simply not ranking where, and for what, they want to.

Where are all these unproven silly theories coming from? If someone has proven something, present it; otherwise make it clear it's only a HUNCH. Remember that there are 3.6 billion pages, so any proof must take that into account.

Filter and algorithm are 2 different things. While a page may be subjected to an algorithm to find out whether it meets the criteria of a filter, the algorithm is not the filter itself.

When you filter something, you do so in the *hope* of removing it. Just like any of the many filters on most cars, e.g. air filter, fuel filter, oil filter etc.

When Google filters pages (e.g. SafeSearch etc.) it too does so in the *hope* of omitting the page from the SERPs. If it does so successfully, the page WILL NOT be found on any page of the results. If it is not successful (not sure of the %), then the page will be on one of the result pages.

[edited by: I_am_back at 3:02 am (utc) on Dec. 13, 2003]

More Traffic Please
msg:148945
2:58 am on Dec 13, 2003 (gmt 0)

I'll say it again: Google is currently supplying on-line search results that are truest to the original intent of the internet: information sharing without a financial motive. None of the information sites that I look at have disappeared. If Google has suddenly raised the ante for commercial sites... so it goes. That would be your basic capitalist law of the jungle in action. May the best widget salesman win. You're probably all being herded into AdWords, my sympathies...

Now breaking that down ;-)

I'll say it again: Google is currently supplying on-line search results that are truest to the original intent of the internet: information sharing without a financial motive.

Without a financial motive for whom?
You're probably all being herded into AdWords, my sympathies...

Once again, :-)

At this point I don't buy into the conspiracy, but I do have a lot of questions about certain commercial searches.

[edited by: More_Traffic_Please at 3:09 am (utc) on Dec. 13, 2003]

frup
msg:148946
3:02 am on Dec 13, 2003 (gmt 0)

People are trying to play semantics. Whether it is a "filter" or an over-optimization "penalty" that excludes sites from certain searches, the effect is the same.

I_am_back
msg:148947
3:10 am on Dec 13, 2003 (gmt 0)

But there has always been an "effect", so nothing has changed in that respect.

Stefan
msg:148948
3:14 am on Dec 13, 2003 (gmt 0)

Without a financial motive for whom?

Yep, point taken. My apologies.

Remember, if you all stampede at the same time, in the right direction, you might break out of the corral.

(I have to stay out of these update threads, man. I'll get back on top of that starting now.)

steveb
msg:148949
3:42 am on Dec 13, 2003 (gmt 0)

"People are trying to play semantics"

No, this has nothing to do with semantics. It has everything to do with inexperienced webmasters blaming the wrong cause for an effect.

Blaming your tires for losing a race when you have no gasoline in the car is foolish and not a matter of semantics, and that is precisely what some folks are trying to do.

frup
msg:148950
4:39 am on Dec 13, 2003 (gmt 0)

steveb, you have no idea who the people on this board are, what they know, or where their sites rank for competitive keywords. And these people didn't get there by accident. But I guess we're all just inexperienced webmasters who don't get it...

europeforvisitors
msg:148951
4:48 am on Dec 13, 2003 (gmt 0)

What causes 90% of sites to disappear?

Is anyone seriously suggesting that Google's index has shrunk to 10% of its pre-Florida size?

Powdork
msg:148952
5:38 am on Dec 13, 2003 (gmt 0)

Is anyone seriously suggesting that Google's index has shrunk to 10% of its pre-Florida size?

My hunch is that they were referring to the fact that, for many searches, upwards of 90 pages are missing from the top 100 results that were there prior to the 'switch to Noogle'.

Can we just say 'the switch to Noogle' instead of referring to the algo/filter/feature? E.g. "the switch to Noogle seems to have been more prevalent for commercial queries".

Spica
msg:148953
7:07 am on Dec 13, 2003 (gmt 0)

Noo or Goo, new or goo, this switch has certainly eroded my trust in this search engine. I know that there are flower shops in my city, but none are listed in Google. Whatever the motive, the result is a form of data manipulation that does not serve the interest of the user.

Today, at work, I was searching for something about eukaryotic promoter databases. Out of old habit, I used Google first. I did not find what I was looking for, but I noticed that there were (obviously irrelevant) ads coming up on the right for some of my searches. I couldn't prevent myself from thinking that if Google was, for some stupid reason, treating this search as a commercial one, the absolute best results could be "filtered out". In the past, I would have simply decided that what I was looking for did not exist yet, but this time I had to use other search engines to be sure.

What I am saying is about trust as a user. I don't trust people who have lied to me. Lying by omission is lying. I can't trust Google anymore. Never will again. I know that there are flower shops in my city, and I expect a good search engine to find them for me. I don't care if their web sites say "flower shop mycity" too many times. I don't even care if they do massive link exchange (even though I think that it's a despicable practice). Let me, the user, decide whether I want to buy something from them. If I can't find these sites, it is censorship. And censorship is wrong. Very wrong.

[edited by: Spica at 9:04 am (utc) on Dec. 13, 2003]

I_am_back
msg:148954
7:16 am on Dec 13, 2003 (gmt 0)

Getting back to the original question, "Google finally decides to relax its filter?" Which filter?

Spica
msg:148955
7:32 am on Dec 13, 2003 (gmt 0)

The "filter" (or whatever you want to call it) that deletes from the SERPs the sites that are really on topic, so that you get to see instead the pages that link to them.

It is very strange to see some people here loving Google so unconditionally that they are ready to deny that there has been a drastic change in the way commercial searches are handled.

steveb
msg:148956
7:48 am on Dec 13, 2003 (gmt 0)

"It is very strange to see some people here loving Google so unconditionally that they are ready to deny that there has been a drastic change in the way commercial searches are handled."

It's strange that people like to make up their own reality. Who here has said there haven't been drastic changes? Why would you make such a post?

I_am_back
msg:148957
7:56 am on Dec 13, 2003 (gmt 0)

I doubt there is one. I would guess that it is more likely just an algo change. I have never really tried to look into this black box and probably will not start now.

It is very strange to see some people here loving Google so unconditionally that they are ready to deny that there has been a drastic change in the way commercial searches are handled.

I for one still find Google the best SE for getting what I want. I will also NOT subscribe to unproven conspiracy theories that state Google has drastically changed the way it handles commercial searches.

I'm in the software business and sell it online; if there was a "drastic change in the way commercial searches are handled" I believe I would notice it.

This whole forum seems to be rapidly filling with "You must hate Google or we'll hate you" types. I think the word "Professional" should be omitted from the forum's description.

Spica
msg:148958
8:29 am on Dec 13, 2003 (gmt 0)

"Who here has said there haven't been drastic changes? Why would you make such a post?"

"if there was a "drastic change in the way commercial searches are handled" I believe I would notice it."

Q.E.D.

allanp73
msg:148959
8:30 am on Dec 13, 2003 (gmt 0)

I am not trying to promote hatred of Google. I just feel they have lost perspective. When I said sites are filtered out, I meant ranked out of existence. My sites were informational, back-to-basics kind of sites, and they were wiped from their top ten spots to beyond searchable. My competitors joined me. I can say with all honesty that sites have been removed from the SERPs. If you look at the double-minus results, they include many more results than the search without the double minus. This shows that the current results are removing sites from the SERPs.
Content is no longer king in the new Google order. Instead, those who link to content are kings. My industry cannot afford the cost of AdWords. It takes 100 visitors to get an inquiry and 100 inquiries to get one sale. The profit on a sale is about $1000-2000, and AdWords visitors sell for $5-10 each. Fortunately, I prided myself on content. The loss of Google represented only a loss of 10-15% of my traffic, because many government and educational sites linked to my content, and other search engines are providing more and more traffic as Google makes it impossible for users to find useful resources in my industry.
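A quick back-of-the-envelope check of those numbers (a sketch in Python; the 100:1 ratios and the dollar ranges are allanp73's, the rest is just arithmetic):

    visitors_per_inquiry = 100
    inquiries_per_sale = 100
    visitors_per_sale = visitors_per_inquiry * inquiries_per_sale  # 10,000 visitors per sale

    for cost_per_visitor in (5, 10):
        cost_per_sale = visitors_per_sale * cost_per_visitor
        print(f"${cost_per_visitor}/visitor -> ${cost_per_sale:,} spent per sale")
    # $5/visitor  -> $50,000 spent per sale
    # $10/visitor -> $100,000 spent per sale
    # Either way, orders of magnitude above the $1,000-2,000 profit per sale.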

If Google believes this change will benefit them by selling more AdWords, it would be a very short-term gain. When the quality of the results suffers, then Google will suffer.

I_am_back
msg:148960
9:23 am on Dec 13, 2003 (gmt 0)

Content is no longer king in the new Google order.

I believe it still is. The more content you have, the better your chances of having a page, or more, in the SERPs, and the more pages you have in the SERPs, the better your chance of ranking well. It's just like going fishing with one rod or two. While your chances are not doubled with two, they are increased.

Instead those who link to content are kings.

Quite possibly correct. But this is all the more reason why content is still king: you link each of your content pages to others of your own content pages. Links like "See Also Red Widgets" work very well.
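A minimal sketch of that kind of "See Also" cross-linking (hypothetical page names; a real site would pick genuinely related pages rather than linking everything to everything):

    pages = {
        "red-widgets.html": "Red Widgets",
        "blue-widgets.html": "Blue Widgets",
        "widget-care.html": "Widget Care",
    }

    def see_also(current_page):
        # Link from the current page to every other content page.
        return "\n".join(
            f'<a href="{url}">See Also {title}</a>'
            for url, title in pages.items()
            if url != current_page
        )

    print(see_also("red-widgets.html"))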

My industry can not afford the costs of adwords. It takes 100 visitors to get an inquiry and 100 inquiries to get one sale. The profit on a sale is about $1000-2000 and Adwords sell for $5-10 per visitor.

WOW! I only make $13.00 profit on some of my AdWords sales. I would of course NEVER bid for the no. 1 position though! I bid the minimum 5c per click and bottom-feed on a *LOT* of keyword phrase combinations. This helps keep the CTR high enough that I do NOT get cut off. I also only target the 'richer' countries; $20.00 is a month's or even a year's wages in some countries. I also skip the option of having ads show on AdSense. All my ads omit words like Free, Shareware, Demo, cheap etc.

Fortunately, I prided myself on content.

I would say keep that pride, and do not let Google's latest changes take it away from you.

While I of course need to make sales to make a living, I by no means write content only for paid products and services. In fact, 80% of my content pages are simply free, relevant (to my industry) content pages. This, I believe, is what encourages bookmarks and repeat visitors. Forums are also another great way to get thousands of free content pages into Google; best of all, your 'potential' customers write them for you. We should also not forget the opt-in newsletters.

[edited by: I_am_back at 10:14 am (utc) on Dec. 13, 2003]

Bobby
msg:148961
9:28 am on Dec 13, 2003 (gmt 0)

I don't care if their web sites say "flower shop mycity" too many times. I don't even care if they do massive link exchange... Let me, the user, decide whether I want to buy something from them.

Well put Spica.

Who knows what Google's long-term strategy is, whether it's to move people over to AdWords, prepare for an IPO, or deliver better quality results to its users?

My feeling is that they need to take the short-term effects into account too, and that they have underestimated the impact this would have on webmasters and small fries (I say this because I am a small fry just trying to make ends meet) who count on part of their livelihood coming in from sales on the net.

I'm not suggesting they have any moral responsibility, just that they didn't do their homework and may have started a chain reaction which will have negative effects in the long run for their business.

When the quality of the results suffers, then Google will suffer

I agree with allanp73.

Perhaps we should add "along with lots of family-run online businesses and users".

Getting back on topic here...

I don't see them relaxing their filters at all; some DC results are different from others, but this has always been the case. On the contrary, everything they've done in this last month reinforces the conspiracy notions (Shopping? Try Froogle). The bottom line is that Google is in the search engine game for Google and will do what is in their own interest.

Hissingsid
msg:148962
10:13 am on Dec 13, 2003 (gmt 0)

Lots of sites have lost rank due to different algorithmic ranking. At the same time, exactly the same number of sites have gained ranking.

Then also at the same time, lots of sites have been affected by Google upping its attack on duplicate content. They have always tried to deal with duplicate content, looking for the canonical/real/main page. Clearly they have caught a lot of sites in that net this time that they shouldn't have.

Hi Steve,

I agree with your analysis here, plus you quite often bang on about how "no one's arguing that the previous allinanchor was good."

Perhaps the change in the algorithm is nothing more than a marked downgrading of the allinanchor component of the calculation, plus the other things that you mention.

I've gone back and looked at a comparison of allinanchor:widget insurance and the post-Florida plain search widget insurance. One of the sites that dropped out of the top of the SERPs was #1 for allinanchor and is now #331 in the plain search, and it is a www.generic-hyphenated-url.com site. Some of the others around the top 20 for allinanchor have probably 50 (just guessing) other similar sites, which are just hastily prepared template copies with a few links and bits of cobbled copy, every one using a hyphenated generic URL.
Others suffered similarly, and they are also hyphenated generic domain names. Except for the index.html page, every page in the site that was #1 has exactly the same text in the <title> tags, and that text has been copied from another www.generic-hyphenated-url.com site where every page likewise has exactly the same text enclosed by the title tags, hosted on a virtual host in the same IP range as the owner's…
I had previously been in an ongoing battle with these guys and had changed my internal site structure to incorporate widget-insurance into many of the internal links, both in the text and in the URL, and I'm still at #3 for allinanchor even though I have a www.brandname.com domain. Couple this with the duplicate issues that I've found, and I can accept that I've been sucked into the trap of the allinanchor bait provided by Google previously.

If Google had just removed all weight from allinanchor in their algorithm, then we would have seen something close to what we saw post-Florida even if they had left everything else in the algo constant. Removing this component of the calculation would make PageRank and other factors more important. It might look like a filter, because pages that relied heavily on anchor text for their SERP positions would drop down.
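A toy illustration of that point (invented pages and weights; Google's real scoring is unknown): zero out the anchor-text component of a two-factor score and the anchor-heavy page drops, with no separate filter involved.

    pages = {
        "anchor-heavy.com":   {"anchor": 0.9, "pagerank": 0.2},
        "balanced.com":       {"anchor": 0.5, "pagerank": 0.5},
        "pagerank-heavy.com": {"anchor": 0.1, "pagerank": 0.8},
    }

    def ranked(w_anchor, w_pagerank):
        # Score each page as a weighted sum of its two factors, best first.
        score = lambda p: w_anchor * pages[p]["anchor"] + w_pagerank * pages[p]["pagerank"]
        return sorted(pages, key=score, reverse=True)

    print(ranked(1.0, 1.0))  # ['anchor-heavy.com', 'balanced.com', 'pagerank-heavy.com']
    print(ranked(0.0, 1.0))  # ['pagerank-heavy.com', 'balanced.com', 'anchor-heavy.com']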

Perhaps the sites that have dropped out just had too much duplication; from the evidence of a couple of cases I have seen, this would be a very strong candidate as the reason for them not appearing in the top 1000.

I thought that we were supposed to be having a discussion here to help everyone concerned. I think that folks who have not been affected, or have been positively affected, by Florida have a different insight from those of us who have been badly affected. They notice things and see things differently from those of us who are trying to find the answer to what to do next. Some of us have got past looking for someone or something to blame and have started to look for low-risk things that we can do to try and get back in the next update.

Looking for duplicate content, examining where your sites are hosted and how that might look in terms of close links, and looking for links out to bad neighbourhoods of the web will not do you any harm, whatever the cause of the problem we are facing.

FWIW I think that folks like steveb have benefited from following the recipe for success to the letter. They have not overdone any one ingredient in order to fool Google; long term, that has to be a very good strategy. In the short term I'm going to wait to see what the next update brings, and if I'm not back in the top 10 then I'm going to have to do something.

Best wishes

Sid

nutsandbolts
msg:148963
10:13 am on Dec 13, 2003 (gmt 0)

I_am_back: That was a great post with some nice tips for people who follow your advice.

superscript
msg:148964
11:11 am on Dec 13, 2003 (gmt 0)

What causes 90% of sites to disappear?

The suggestion that Google may be broken isn't fashionable at the moment - but that doesn't mean it isn't a possible explanation - it is certainly one of the simplest contenders ;)

One big question is "Why are certain keywords and keyword combinations apparently 'penalised'?"

It's a tough one, because whenever a pattern is spotted, contradictions can always be found. They certainly look, in the main, like commercial terms, but they are not restricted exclusively to these.

Technical difficulties with indexing at Google could explain this unfathomable mix of 'penalised' terms - perhaps they are simply the most popular ones in terms of number of sites and pages to crawl. This would explain the preponderance of 'commercial' terms that appear to have been 'penalised'.

There was some speculation after all, a while back now, that Google was approaching the mathematical limit of pages it could index without a major overhaul.

Yours unfashionably :)

claus
msg:148965
11:44 am on Dec 13, 2003 (gmt 0)

Now that we are talking filters: if you want some insight into the efficiency of the duplicate filter, try typing "site temporarily unavailable" into Google (without quotes). (That search can't possibly harm anyone; it's as non-specific and general as possible.)

You'll see 380,000 results spread across almost as many domains, of which only 21 are displayed. That's a pretty efficient filter.
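For anyone wanting to reproduce the comparison: the "repeat the search with the omitted results included" link works by appending filter=0 to the results URL (as it behaved at the time), so the two searches can be built like this:

    from urllib.parse import urlencode

    BASE = "http://www.google.com/search"
    query = "site temporarily unavailable"  # typed without quotes, as above

    clustered = f"{BASE}?{urlencode({'q': query})}"
    unfiltered = f"{BASE}?{urlencode({'q': query, 'filter': '0'})}"

    print(clustered)   # duplicate-clustered results (the 21 that get displayed)
    print(unfiltered)  # the full list, omitted results included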

I wrote this in another thread a few days back, but that thread was cut a number of posts back and closed (long overdue, so no problem), so that post disappeared. At that time, the same search showed 100,000 fewer results.

/claus


Added: Nah, superscript (below), try the "repeat the search with the omitted results included" option. These pages are really quite indexable. And quite identical too. Some even have dmoz links and all.

[edited by: claus at 11:57 am (utc) on Dec. 13, 2003]

superscript
msg:148966
11:52 am on Dec 13, 2003 (gmt 0)

<le grande snip>

borisbaloney
msg:148967
12:17 pm on Dec 13, 2003 (gmt 0)

It's just like going fishing with one rod or two. While your chances are not doubled with two, they are increased.

Reminds me of a post from when the original OOP algorithm was tested for a couple of weeks, six months back.

To paraphrase the original poster:
If Google reduces my rankings for one site - I'll build 10 more. If they reduce my rankings for 10 sites - I'll build 100 more.

Some of the best advice I've seen here yet.

Hissingsid
msg:148968
12:19 pm on Dec 13, 2003 (gmt 0)

You'll see 380,000 results spread across almost as many domains, of which only 21 are displayed. That's a pretty efficient filter.

If you look at the source of the ones returned when you "repeat the search with the omitted results included", you will see that they have identical <title> text and body text.

This example, and others I've happened upon myself, lead me to believe that text in the <title> tag is the most important from the point of view of duplicate filtering. Duplicate text inside the body tags (though possibly not text in other tags within the body) can be ruled out as a strong factor in duplicate filtering, because Google is clearly not seeing syndicated content as duplication.

If you remember, the other example, the one that led to the thread on this topic being pulled, demonstrated the converse: almost identical text in the body, but title tags that were not formed correctly and so would not match what the filter was looking for; the pages would therefore be overlooked, i.e. not filtered.
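A minimal sketch of that title-first duplicate test (made-up data; a real duplicate filter would weigh many more signals than this):

    from collections import defaultdict

    pages = [
        ("http://site-a.example/", "Cheap Widget Insurance"),
        ("http://site-b.example/", "Cheap Widget Insurance"),
        ("http://site-c.example/", "Widget Insurance Quotes"),
    ]

    clusters = defaultdict(list)
    for url, title in pages:
        # Normalise case and whitespace in the <title> text before comparing.
        clusters[" ".join(title.lower().split())].append(url)

    for title, urls in clusters.items():
        print(f"{title!r}: display {urls[0]}, omit {urls[1:]}")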

Best wishes

Sid

PS When that thread was pulled I had the page open in another browser window and was able to save it to study further.

rainbow
msg:148969
12:25 pm on Dec 13, 2003 (gmt 0)

Claus,

for me your example does not look like the result of a better filter. I strongly believe that pre-Florida the search would have returned almost the same results, likewise displaying only a few pages (before clicking to display the omitted results). This is the 'duplicate content' filter that has been known for a long time.

I still believe that G is broken. Yesterday I had to try to find some relevant information about an antibiotic for my youngest daughter. I typed the antibiotic's name, 'augmentine', into the G toolbar. From #1 to #250, all spam, all redirecting to the same vendor. Could you explain to me what the benefit of G's new filter is supposed to be when the results are so f'''g bad? That's not a filter.

Either they are trying to make money through AW or it's simply a broken index. I don't believe the IPO or conspiracy theory. Think about the homepages having disappeared. Remember the jewelry / real estate searches. Why else would they have been refreshing the index hourly since Nov 15?

One more argument against the conspiracy theory: they know perfectly well that most Mum & Pop businesses can't afford paying for AW. They might try, but most will stop their campaigns after a month or so.

G's Nov 15 changes are a good example of a tech company investing more and more in engineering and programming instead of listening to users and webmasters (and other humans).

Dolemite
msg:148970
12:43 pm on Dec 13, 2003 (gmt 0)

I'm still not convinced that a better duplicate content filter is a major component of this update... at least not in the sense that it explains why so many of our sites dropped from lofty positions for major keywords.

As I see it, if an over-SEO filter catches one site in its net, then shouldn't all its duplicates go away as well?

superscript
msg:148971
1:00 pm on Dec 13, 2003 (gmt 0)

Regarding the <title> tag: I also figured this could be the key to the 'penalty', and made it far less descriptive on one of my low-income sites:

From

'Furry Blue Widgets'

to

'Welcome to our site' or wtte.

Result:

It has messed up my Alltheweb SERPs!

So I chickened out and changed it back, so there wasn't enough time to see what effect it had in Google. And besides, my site is about 'Furry Blue Widgets' - so it seems a perfectly reasonable title to me. My experiments with bending to the new 'algo' are over for at least a couple of months.

and I'm cross with myself, because I promised not to twiddle!

Bobby
msg:148972
1:21 pm on Dec 13, 2003 (gmt 0)

Superscript, I think most of us are reluctant to make changes to our sites in the hope that they will get re-indexed by Google, at the expense of the other search engines.

Someone who has a site which is spidered by Google every day, or at least very frequently, could "experiment" without risking his/her neck in the other search engines.

I'm considering a few very subtle changes in title tags, just to see what happens, and then I'll play it by ear.

To me, the filter Google is using focuses on repeated phrases throughout a web site, most often found in title tags.

superscript
msg:148973
1:33 pm on Dec 13, 2003 (gmt 0)

Hi Bobby,

Anecdotally, contiguous search words in the <title> seem to harm my SERPs, whilst search terms well separated in the title return better results. But once again, there are frequent exceptions to this. My sites are spidered daily, but of course even a refreshed page might not provide any information until its position is re-calculated properly, and that could take weeks. So on balance I've decided not to experiment; I don't think the answer is that simple anyway. I'm increasingly inclined to believe it is a technical problem, or an algo that has behaved in a way not predicted by Google.

Bobby
msg:148974
2:09 pm on Dec 13, 2003 (gmt 0)

Superscript,

I have repeated examples of some of my most competitive sites being filtered out for searches containing 3 words I have used in optimizing title tags.

Regardless of word order or proximity, they are nowhere to be found unless I add 2 more words to the search; then they appear as they did before Florida.
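A sketch of how to run that test systematically (the words are hypothetical stand-ins for the real optimised terms): list every ordering of the three title words, paired with the same ordering plus two extra words, and check each pair by hand.

    from itertools import permutations

    words = ("furry", "blue", "widgets")  # assumed stand-ins
    extras = "buy online"

    for combo in permutations(words):
        query = " ".join(combo)
        print(f"{query!r:30} vs {(query + ' ' + extras)!r}")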

I tend to agree with Daniel Brandt's analysis in The Great Google Filter Fiasco (sticky me and I'll send you the URL) and think there is a deliberate attempt on Google's part to discourage optimization in commercial areas. I'll leave it to the other webmasters to speculate on why. The hit list was a real eye-opener for me, and it supports the 'dictionary' theory.

If it IS a technical problem, then there is hope that pre-Florida results may return. With all due respect, I hope so... but I just don't believe Google is so incompetent technically.

I think they are throwing out the baby with the bath water, as somebody so eloquently once said, in their attempt to "beat the spammers". If Google really wants to improve its SERPs, it could start by eliminating basic spamming, not the few webmasters who diligently spend hours on designing AND optimizing good websites.
