Forum Moderators: open

Message Too Old, No Replies

Google Inconsistencies

         

nzmatt

3:35 am on Mar 17, 2004 (gmt 0)

10+ Year Member



Is there significant corruption in Google's ranking results, or are some result rankings just random?

Why do some sites that adhere to no SEO technique achieve (enjoy) top rankings for a very competitive word/phrase? Does anyone know?

I don't expect an answer from anyone involved in these sites, as they are either involved in corruption or, judging by their SEO, would never visit WebmasterWorld!

allanp73

4:32 am on Mar 18, 2004 (gmt 0)

10+ Year Member



I don't know if it is corruption or just a change in the way Google is doing business.
I want Googleguy to answer the following questions:
For some time I have seen "city widget" terms dominated by directories and generally irrelevant sites. My questions are:
1) Is Google purposefully removing commercial or content sites in order to encourage these companies to advertise?
2) Does Google believe the serps are good for "city widget" terms? When I say good I mean better than pre-Florida serps.
3) Google's new algo benefits directories; content seems less important. Is content still king? Or does Google believe that sites linking to good content are better (the new king)?
4) How has the post-Florida change affected revenue and Adwords clickthru rates? Are more people giving up on the natural serps out of frustration (because it is difficult to find commercial sites) and therefore clicking the Adwords listings instead?
5) How does Google feel about Yahoo's jest about the Hotel search?

In my case I have sites that offer the best (possibly the only) content for a city/town related subject, but because this information uses a "city widget" term, these sites' pages are nowhere to be found in the top 1000 results, even when no other site has any information or relevance to the term. I really believe Google has gone too far. The filter is beatable, but I personally would prefer content-rich sites being rewarded. Once enough people discover how to beat the filter, the spam will increase. The new system is great for spammers because it doesn't require actual content, just proper linking strategies.

DJFlite

6:14 am on Mar 18, 2004 (gmt 0)

10+ Year Member



Back on Subject,

I run a site for users of widgets. Let's call them "widgeteers". So my site has hundreds of reviews of "widgeteer equipment". Now there has been a site in the top ten for the following search terms for over a year:

widgeteer equipment
widgeteer equipment reviews

This site also has not been updated since 2002 (I know because the first text on the site says "under construction - back in sept 2002"). Now this site also has the words widgeteer, equipment, and review only in the title. Backlinks are nothing spectacular. Oh, and by the way, the links on the menu bar just send you to a DUPLICATE PAGE with a different URL!

Now tell me, do you really think Google can't figure out this page contains no relevant information for the searches "widgeteer equipment" and "widgeteer equipment reviews"? It seems pretty obvious to me. Why is my page at 50 and theirs in the top ten? Hmmm, either this page did some crazy things to trick the Google ranking procedure or Google is up to something...

By the way, "widgeteer equipment" is almost exclusively sold through the internet, since not many specialty retail stores sell it. In other words, it's a very lucrative position they have.

experienced

7:08 am on Mar 18, 2004 (gmt 0)

10+ Year Member



Hi,

I think Google doesn't want to filter all the sites like this, but they have some technical problem with their machine, or with whatever they have changed in the algo. The result of the mistake is that sites are dropping out of the search pages, and out of the Google index, because this new filter is something that has hit everybody badly.

But the good thing is, even after the new OOP and filter, site owners are not paying Google for AdWords. And I believe that after this filter Google has lost the position it once held...

Thanks
Exp...

ciml

1:24 pm on Mar 18, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



BigDave has it; Google's ethos has everything to do with providing relevant content when people search, and nothing to do with rewarding SEO techniques. As tedster points out, successful sites that don't seem to be pushing hard at SEO are well worth looking at.

I don't know if I agree with Patrick Taylor about the science being thin. There's plenty of scientific research conducted, but the methods described in his post won't get people very far. Certainly, Google have made advances over the last six months in terms of making their ranking methods more opaque.

In other words, Google SEO used to be easy, now it's hard. That could be very good or very bad for a professional SEO. :-)

nzmatt:
> so what's the long term key?

The best answer may be 26 steps to Google success [webmasterworld.com]. It was written two years ago and makes even more sense now than it did then, so I guess that's long term in this industry.

"Corruption in Google"? Nope; if GoogleGuy himself tried to start an aggressively SEOd affiliate site I'd bet on him tripping the keyphrase filter on his first few attempts. :-)

IITian

2:23 pm on Mar 18, 2004 (gmt 0)

10+ Year Member



"Corruption in Google"? Nope; if GoogleGuy himself tried to start an aggressively SEOd affiliate site I'd bet on him tripping the keyphrase filter on his first few attempts. :-)

I don't think the crux of the problem is that heavily SEOed affiliate sites are getting hit, as they rightfully should be. Main problem seems to be that genuinely relevant sites can get hit too if they are not able to maintain a delicate balance.

An analogy to the current situation: an applicant for a job is rejected because he seemed too eager, even though he was well qualified. The mistake he made was to call up the hiring staff and tell them that he was really interested in the job!

The recruitment board instead decided to offer the job to a person who hadn't applied, whose resume they found on the internet. It mentioned "I am looking for a fashion modeling job" and "If you need more data about my portfolio then please contact me and I will be glad to offer you that." Obviously, in the board's eyes that applicant was the most qualified for the Portfolio Data Modeler's job. :-)

ciml

3:37 pm on Mar 18, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Heavily SEOed affiliate sites are often genuinely relevant, otherwise the owner wouldn't convert well (one of the paradoxes of search engine quality IMO).

I understand the idea of rejecting on the basis of eagerness, but there must be an incentive for Google to reduce the power of search engine marketers to influence their results. Google say that [google.com] "Many SEOs provide useful services for website owners [..]", but also that "a few unethical SEOs [...] unfairly manipulate search engine results".

Are they only concerned about fairness, or is the real danger that if a bunch of SEOs have the power to "place" pages in Google for phrases of their choosing, there are potential problems?

We all come across sites that rank for many phrases, where some of the 'pages' do not make, and never have made, any useful contribution in terms of content or the products and services of the owner.

> Portfolio Data Modeler's job

OK IITian, you have to get credit for that. As well as being amusing, your hypothetical example describes the post-Florida results in some sectors rather well (though things are a lot better now).

BigDave

3:55 pm on Mar 18, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I don't think the crux of the problem is that heavily SEOed affiliate sites are getting hit, as they rightfully should be. Main problem seems to be that genuinely relevant sites can get hit too if they are not able to maintain a delicate balance.

Here is a simple truth.

Google does not care if "genuinely relevant sites" are removed from some search results. They care about relevant search results.

If that seems like a contradiction, you aren't thinking about it hard enough.

Google wants good results for the searcher. They are unconcerned about *which* relevant site they send the searcher to, as long as they find what they need *somewhere*.

Just because a site moves from #1 to #1000 doesn't matter as long as the new #1 meets the needs of many of the searchers.

Another simple truth is that no search engine will ever have perfect results for every searcher on every search.

While google does care about your particular pet search, they have to be concerned about making the engine serve up the best possible results across the entire spectrum of searches. And if fixing a problem in 10 searches damages one other then that is a worthwhile tradeoff for now.

The vast majority of searchers are still happy with their current search engine. Whether it is Google, Yahoo, AOL, MSN, AV or whatever. If they don't get what they want with the first search, they will search on something else.

They will not switch search engines just because they get a few bad results in some particular area. And it is even less likely now than it was back in the days of AV when the average web user was more sophisticated.

An analogy to the current situation could be that an applicant for a job is rejected because he seemed too eager, even though he was well qualified. The mistake he made was to call up the hiring staff and tell them that he was really interested in that job!

Yeah, that is a really big mistake. At least if you want to get paid well for a job.

A company should not hire you because you want the job, the company should hire you because they need you. So you are right, it is a great analogy.

The only job that I ever got where the company did not come looking for me was paperboy.

IITian

5:16 pm on Mar 18, 2004 (gmt 0)

10+ Year Member



I don't think the crux of the problem is that heavily SEOed affiliate sites are getting hit, as they rightfully should be. Main problem seems to be that genuinely relevant sites can get hit too if they are not able to maintain a delicate balance.

Let me elaborate on this. Assume NASA's webmaster has heard about SEO and wants to apply these techniques to NASA's website to consolidate its hold on the #1 position. She inserts the keyword NASA at a few strategic places on the site - in the title, the h1 tag, etc. - and manages to bring the keyword density to, say, 10%.

Googlebot visits the NASA site, does not like what it sees, and the algo throws the site down to position #457 for a search on NASA.

Google is in the business of finding relevant pages, not penalizing webmasters who get a little carried away. If a site is relevant, and as long as SEO techniques don't decrease the quality of the user experience, it should not be harmed (or helped), in my view.

In my experience, this "OOP" phenomenon is hurting creativity. I had a few pages in the top 10. One of them ranked for a heavily searched term; it was #10. I changed the title and a few words here and there, and now it is on the third page. Traffic has come to a halt. What do I do now? (It could have been because of the algo change, but it could also have been because of the changes I made.) The other top 10 pages, where I wanted to add content and modify the design, are on hold. "Don't fix what ain't broken" for fear Google might penalize you impedes creativity and progress.
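For what it's worth, the keyword density figure being talked about here is easy to compute yourself. The sketch below is purely illustrative: the page text is made up, and nobody outside Google knows what thresholds, if any, are actually applied.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the words in `text` that are exactly `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)

# Hypothetical over-optimized page text: 5 of its 14 words are the keyword.
page = ("NASA NASA news. NASA missions, NASA launches, "
        "and other space agency updates from NASA.")
print(f"{keyword_density(page, 'NASA'):.0%}")
```

Run against the hypothetical page above, this reports a density of roughly 36% - well past the 10% in the example, which is the kind of figure that, the theory goes, could trip a filter.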

seofreak

5:30 pm on Mar 18, 2004 (gmt 0)

10+ Year Member



Google is in the business of finding relevant pages, not penalizing webmasters who get a little carried away.

That for me is the quote of the day. I believe in that as much as most people here believe in OOP.

Robino

5:51 pm on Mar 18, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Read these testimonials. You'll see what Google thinks their search functions should be.

[google.com...]

AdWords and Froogle for commercial product listings and 'search' for research/info queries.

guddu

6:02 am on Mar 19, 2004 (gmt 0)

10+ Year Member



allanp73, I agree with you that Google is mixing some irrelevant results in with the actually relevant sites.

I noticed it too, and sometimes, if you search for certain keywords, it shows you 13 results on a page instead of 10, even though it says "Results 1 - 10 of about ....."

BallochBD

7:48 am on Mar 19, 2004 (gmt 0)

10+ Year Member



I am one of those whose site has been destroyed by a Google quirk, and naturally I am very bitter about this. More so when they will not respond to my emails in any helpful way. I must however say that BigDave hit the nail on the head with this:

Google wants good results for the searcher. They are unconcerned about *which* relevant site they send the searcher to, as long as they find what they need *somewhere*.

Many of us get tied up with the fact that our own sites, which may be the most relevant in the world, are not being found. If they are not being found, then surfers don't know about them, so their relevance is immaterial.

Having said that, I am still seeing sites that use banned techniques, like same-colour text, appearing at the top of the rankings. This to me suggests major Google inconsistencies. If they cannot eliminate this, what chance do we have?

This is a war that will not end. As soon as Google slaps a ban on any technique, someone somewhere will be trying to find a way to circumvent it - and they will! The fact that they still cannot detect same-colour text after all this time is a good illustration of this.

planit

10:18 am on Mar 19, 2004 (gmt 0)

10+ Year Member



If someone searches for 'widgets' on a search engine they might want different things.
To help everyone out it would be sensible to put:
2 sites... Selling Widgets
2 sites... Repairing Widgets
2 sites... Selling Widget Spares
2 sites... Widget Encyclopedias
2 sites... No widget connection (in case the wrong search term was used ;)

The problem is that to provide this it might look as though the results were corrupted in some way.
But the user is going to be happy.

To achieve this, all sites or pages would have to be categorised.
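The suggestion above amounts to simple result interleaving: take a couple of top results from each category bucket and mix them into one list. A toy sketch, with invented category names and URLs, might look like this:

```python
from itertools import islice

def interleave(results_by_category: dict[str, list[str]],
               per_cat: int = 2) -> list[str]:
    """Take up to `per_cat` top results from each category, in turn."""
    mixed: list[str] = []
    for urls in results_by_category.values():
        mixed.extend(islice(urls, per_cat))
    return mixed

serp = interleave({
    "selling widgets":   ["shop-a.example", "shop-b.example", "shop-c.example"],
    "repairing widgets": ["fixit.example"],
    "widget reference":  ["widgetpedia.example", "widget-faq.example"],
})
print(serp)
```

With these made-up inputs the mixed page contains two sellers, one repairer and two reference sites. The hard part, of course, is the categorisation itself, which is exactly BallochBD's objection below.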

BallochBD

10:35 am on Mar 19, 2004 (gmt 0)

10+ Year Member



Hmmmm ... I don't think so!

The algorithm has enough problems without trying to create five categories for the information it finds. And anyway, I don't think the users would be happy with this. All you would be doing is guaranteeing that 80% of the results would be wrong.

planit

10:44 am on Mar 19, 2004 (gmt 0)

10+ Year Member



I didn't mean it as a strict rule, just that they are probably trying to give rounded results.

Patrick Taylor

11:09 am on Mar 19, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



In a specific (niche) area I'm involved in at present, there's no question: the top SERPs contain a high proportion of useless, undeserving rubbish, whose only effect will be to discourage people from using the net to find accurate information, perhaps as an alternative to the local library or a bookshop. I would say to BigDave that if one is looking for something general and there are lots of sites that qualify, then maybe who cares if a few worthy contenders are missing. But let's not dumb down the www too much... there are lots of nichey sites with excellent content that only they provide, and it's no use if they're buried for no apparent reason - by which I mean they're sensibly designed and yet there's some illogical process at work within Google somewhere.

BigDave

4:41 pm on Mar 19, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Patrick,

did you read my entire post? That part where I said (emphasis added)

Another simple truth is that no search engine will ever have perfect results for every searcher on every search.

While google does care about your particular pet search, they have to be concerned about making the engine serve up the best possible results across the entire spectrum of searches. And if fixing a problem in 10 searches damages one other then that is a worthwhile tradeoff for now.

In other words, they know that there are (and always have been) areas where their algo falls short. But if screwing up your pet keyword improved 100 other searches, or even 1 other one that they consider more important, then it was a worthwhile change.

And believe it or not, in a war on bad results, there can be a tactical advantage to serving up even worse results for a month or two in some search categories.

BallochBD

4:59 pm on Mar 19, 2004 (gmt 0)

10+ Year Member



And believe it or not, in a war on bad results, there can be a tactical advantage to serving up even worse results for a month or two in some search categories.

I agree with most of what you say, Dave, but I am afraid that I just don't get the above. This afternoon I was using Google for research on "web based widgets". The results I got were absolute CR@P! All I got was lots of directories that had none of the information I sought.

Generally Google is OK as a search engine, and I say that as someone whose legitimate site is currently dropped from the SERPS. But, depending on the search, I often now find myself ploughing through these pages of absolutely hopeless directories that offer NO INFORMATION AT ALL about the subject. My problem is that I tend to get lost in my search activity and forget that Google could be at fault. As a result I plough on and on until the penny drops and I take a reality check.

Quite often now the solution is to be found at other search engines. I don't really think that anyone wants to see these useless directories in the results. Do they?

kaled

5:08 pm on Mar 19, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I don't really think that anyone wants to see these useless directories in the results. Do they?

Well, I don't. If that's all I get, I change my search terms once or twice, and if that doesn't fix it, I use another search engine. I try not to waste my time ploughing through rubbish.

Kaled.

metrostang

5:42 pm on Mar 19, 2004 (gmt 0)

10+ Year Member



It would be interesting to see Google's web traffic trends over the past couple of months. Many of my pages are still at the top, but traffic and sales from G are down substantially. It appears the public is having the same experience using G as we are, and is going elsewhere.

Inconsistent results, with too many Florida-type sites ranking high, are turning off searchers. Google is not bulletproof, and when webmasters give up and start optimizing for Yahoo and other SEs, Google's search results will become worse over time. I realize it's possible to rank high with both, but the one you concentrate on will produce the best results.

Google has been my choice of SEs in the past, but now I'm finding much better results in Yahoo and MSN. Let's hope Google gets its act together soon.

BigDave

5:52 pm on Mar 19, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm not saying that this is happening, but it is a reasonable possibility:

Let us suppose there is a certain category of spam that Google decides to target. Say, keyword spamming.

If that keyword is above a certain percentage in the text, headings, title, domain and anchor text, then ding that page.

Two things are going to happen, it will hurt those that game the system, and it will hurt some innocent sites.

Those gaming the system will move on to the next "trick", never knowing what google will go after next. But in the mean time, they are down for a while. It will also make it so keyword spamming will go out of style.

Then there are the innocent sites where they simply can't help but use that keyword all over the place. In fact there are whole industries like that. Take the Spatula City commercial from the movie UHF. Every other word is spatula, because that is all they are about. Well, if they don't have enough other factors in their favor, they just might go down the tubes for a couple of months with the spammers.

Then after a couple of months they can start easing off on this part of the algo, because the problem has abated somewhat. The clueless innocent sites that have been wondering where their traffic went will start popping back up to the top.

The Google engineers would obviously look for additional keys as to whether it is a legit site or not, to try and limit the collateral damage, but there will always be casualties.

So there is how it can be to their tactical advantage to have bad results for a couple of months in some areas.
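The threshold idea above can be sketched in a few lines. Everything here is guesswork for the sake of illustration - the field names, the cutoff and the "every field over the line" rule are invented, not Google's actual signals:

```python
FIELDS = ("title", "headings", "body", "anchor_text")
CUTOFF = 0.08  # assumed per-field keyword density limit (made up)

def looks_like_keyword_spam(densities: dict[str, float]) -> bool:
    """Ding the page only when *every* field exceeds the assumed cutoff."""
    return all(densities.get(field, 0.0) > CUTOFF for field in FIELDS)

# A deliberately stuffed page trips the check in every field.
stuffed = {"title": 0.50, "headings": 0.30, "body": 0.12, "anchor_text": 0.20}
# A "Spatula City" style site: keyword-heavy title and headings,
# but natural body text keeps it under the body-field cutoff.
spatula_city = {"title": 0.40, "headings": 0.35, "body": 0.06, "anchor_text": 0.25}

print(looks_like_keyword_spam(stuffed))       # the spammer gets dinged
print(looks_like_keyword_spam(spatula_city))  # the innocent site squeaks by
```

The collateral-damage problem is visible right in the sketch: nudge that cutoff down a little and the innocent spatula site goes down the tubes with the spammers, which is exactly the trade-off being described.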

Sometime last year someone started a wonderful thread about building a robust site that could weather these sorts of changes. If you build a site where everything is in your favor, then no matter how google twists those hundred knobs, your site will always be near the top.

And one of the best ways to do that is to make your site the best that you can while you forget that search engines even exist. Some people seem to forget that the site is for the user first and foremost. Then you optimize it for the search engines by making small tweaks here and there that hopefully enhance the experience for both the user and the search engine.

Those sites that never move from the top 10 are there because they are robust in the areas that count.

BallochBD

6:11 pm on Mar 19, 2004 (gmt 0)

10+ Year Member



Dave, I consider my website to be the best on the web for providing information on my chosen subject. This is because I specialise in something that others only have as part of their portfolio. I am not bragging about this but there is just no other site to touch mine for this sort of information.

As a reward for this my site has been dropped completely and I have had NO Google traffic for weeks now. The site is not getting crawled and I have lost all my title descriptions, etc.

How do you explain that?

Robino

6:17 pm on Mar 19, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Keyword stuffing can still get you into hot water right?

BigDave

6:42 pm on Mar 19, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



BallochBD,

I can't explain that, as neither of us knows the details. But I did just explain how innocent sites like yours could possibly be collateral damage.

Yes, it sucks (for you, for the user, and even for google). But that is how it goes.

Just keep working on your site.

elgrande

6:47 pm on Mar 19, 2004 (gmt 0)

10+ Year Member



Then after a couple of months they can start easing off on this part of the algo, because the problem has abated somewhat.

BigDave, if this were the case then I might support such a system. However, when I look back from Florida to the present algo, I see just the reverse happening - the filter is expanding and swallowing more good sites.

It appears that Google is becoming more zealous with their filters, and unless their algos improve we will continue to see more collateral damage with each update.

If the Google index gets to the point where it excludes enough quality sites, surfers will eventually notice the difference and switch to other search engines. However, I do not believe that Google will "ease up" on its filters until it has lost significant market share, which could take quite a long time.

Robert123

7:34 pm on Mar 19, 2004 (gmt 0)

10+ Year Member



I really think you all should stop thinking "penalty this" or "filter that". GoogleGuy has posted that it is more logical (paraphrased) to think about new changes as "what was once important is now weighted less".

Scapegoat

8:24 pm on Mar 19, 2004 (gmt 0)



Glad to discover I'm not the only one who's losing sleep/hair/sanity as a result of being Googled. Little condolence, I know, but sharing misfortunes does take the pressure off personal blaming somewhat. I sympathise with all your tales of woe, and thought you might glean something from a snippet of my own.
After ranking consistently #1 for my main keywords, I too have been dropped into hyperspace, along with all my local competitors, who provide comprehensive, professional services with well designed, authoritative sites, and who shared first page results with me on any location-relevant search. The first 3 pages of most key searches are now entirely dominated by monstrous global sites which offer no relevant content, other than the occasional ad promoting a single property.
The implications of Google's ECT treatment would seem to be:
Size does matter more than quality of content.
Sites paying for AdWords are being ranked above those refusing to pay for protection.
Another intriguing phenomenon is that, although not consistently listed on the first page, a national site that is heavily invested in AdWords, and has numerous affiliate and mirror sites, keeps being placed at #1 for a day or so before being dropped. This has happened several times in the past few weeks, not just for the index page, but also for totally irrelevant pages with no instances of the keywords visible.
Most disturbing is the fact that even my domain name search, which is very specific and relevant to my niche, no longer produces a result in the top ten pages, with scores of totally unrelated sites being listed...
I hope that Google will stick a finger in the dyke before all is lost, and am fairly confident that the tide will turn soon.

idoc

1:57 am on Mar 20, 2004 (gmt 0)

10+ Year Member



"Sites paying for adwords are being ranked above those refusing to pay "

I don't think that is happening where you are seeing it for the reason you probably think. The large sites with massive ad budgets also advertise everywhere, are in all the industry portals, buy news releases and newsfeeds, buy directory listings, and essentially buy PR through link purchases. Between all of this, and having a whole team of people maintaining their sites instead of a single webmaster, there are several legitimate reasons they are harder to compete with. The only way to compete is good original content (most of theirs is syndicated from elsewhere) and hard, clean SEO for the long term.

BallochBD

12:19 pm on Mar 20, 2004 (gmt 0)

10+ Year Member



I really think you all should stop thinking "penalty this" or "filter that". GoogleGuy has posted that it is more logical (paraphrased) to think about new changes as "what was once important is now weighted less".

Robert, this thread is about "Google Inconsistencies" and it is obvious that there are quite a few of these. We have had thousands, or perhaps millions, of perfectly good sites dropped from the index, while sites using stuff like same-colour text are still featuring at the top of the rankings. Many of the results are polluted with useless and very obviously spammy directories. Dave says that we should concentrate on content; well, that's all I have ever done. My own site is a CMMS and maintenance engineering information resource, and it was created as such. It has been dropped, so yes, "what was once important is now weighted less".

The problem is that, with the virtual monopoly they currently enjoy, Google do not have to worry about this, and obviously they have no concern for the little guys. They don't have to respond to complaints or pleas for help, and seldom do.

I just wish that they would start charging for inclusion so that we would have some means of redress. I have a dream!

Google can clean up the Internet and make quadrillions of bucks doing so. Announce that a one-off charge is to be imposed for inclusion in the index, say $200. For this, all sites would be screened by a human being before being included. Sites that are already in the index get dropped in three or six months' time if they don't pay, unless they are in DMOZ, which becomes the only way of getting in free.

To register you have to provide your name and verifiable contact details. Result? All criminal, paedophile, terrorist and other less savoury sites are quickly removed. If all search engines were regulated in this way we would soon see a better, cleaner Internet, and people like us who get dropped for no reason would have some redress. Idealistic? I don't think so.

Patrick Taylor

12:54 pm on Mar 20, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The whole thing is still in its infancy. The idea that somebody searches 5 billion pages (and rising fast) for phrases they don't even bother to enclose in quotation marks, then relies on a list of ten to serve up exactly what they're looking for, is nonsense (especially when there are so many ways a webmaster can artificially boost the importance of a page - SEO). It will be mega nonsense in five years' time. Personally I think Google is struggling as badly (in its own way) as some webmasters are, and we will be seeing this same kind of discussion for some time to come. It may well be that Yahoo has found some better way to deliver consistently good results, and maybe MSN will do even better, but I believe the basic format of one-click search delivery will have to change before consistency and quality become the norm.
This 74 message thread spans 3 pages.