Forum Moderators: open
Why do some sites that adhere to no SEO technique achieve (enjoy) top rankings for a very competitive word or phrase? Does anyone know?
I don't expect an answer from anyone involved in these sites, as they are either involved in corruption or, judging by their SEO, would never visit WebmasterWorld!
In my case I have sites that offer the best content (possibly the only content) for a city/town related subject, but because this information uses a "city widget" term, these sites' pages are nowhere to be found in the top 1000 results, even when no other site has any information or relevance to the term. I really believe Google has gone too far. The filter is beatable, but I personally would prefer to see content-rich sites rewarded. Once enough people discover how to beat the filter, the spam will increase. The new system is great for spammers because it doesn't require actual content, just proper linking strategies.
I run a site for users of widgets. Let's call them "widgeteers". So my site has hundreds of reviews of "widgeteer equipment". Now, there is a site which has been in the top ten for the following search terms for over a year:
widgeteer equipment
widgeteer equipment reviews
This site also has not been updated since 2002 (I know because the first text on the site says "under construction - back in Sept 2002"). This site has the words widgeteer, equipment, and review only in the title. Backlinks are nothing spectacular. Oh, and by the way, the links on the menu bar just send you to a DUPLICATE PAGE with a different URL!
Now tell me, do you really think Google can't figure out that this page contains no relevant information for the searches "widgeteer equipment" and "widgeteer equipment reviews"? It seems pretty obvious to me. Why is my page at 50 and theirs in the top ten? Hmmm, either this page did some crazy things to trick the Google ranking procedure, or Google is up to something...
By the way, "widgeteer equipment" is almost exclusively sold through the internet, since few specialty retail stores that sell it exist. In other words, it's a very lucrative position they have.
I think Google doesn't want to filter all the sites like this, but they have some technical problem with their machines, or with whatever they have changed in the algo. The result of the mistake is that sites are dropping out of the search pages, and out of the Google index, because I think this new filter has hit everybody badly.
But the good thing is, even after the new OOP and filter, site owners are not paying Google for AdWords. And I believe that after this filter Google has lost the position they once held...
Thanks
Exp...
I don't know if I agree with Patrick Taylor about the science being thin. There's plenty of scientific research conducted, but the methods described in his post won't get people very far. Certainly, Google have made advances over the last six months in terms of making their ranking methods more opaque.
In other words, Google SEO used to be easy, now it's hard. That could be very good or very bad for a professional SEO. :-)
nzmatt:
> so what's the long term key?
The best answer may be 26 steps to Google success [webmasterworld.com]. It was written two years ago and makes even more sense now than it did then, so I guess that's long term in this industry.
"Corruption in Google"? Nope; if GoogleGuy himself tried to start an aggressively SEOd affiliate site I'd bet on him tripping the keyphrase filter on his first few attempts. :-)
"Corruption in Google"? Nope; if GoogleGuy himself tried to start an aggressively SEOd affiliate site I'd bet on him tripping the keyphrase filter on his first few attempts. :-)
I don't think the crux of the problem is that heavily SEOed affiliate sites are getting hit, as they rightfully should be. Main problem seems to be that genuinely relevant sites can get hit too if they are not able to maintain a delicate balance.
An analogy to the current situation could be that an applicant for a job is rejected because he seemed too eager, even though he was well qualified. The mistake he made was to call up the hiring personnel and tell them that he was really interested in the job!
The recruitment board, instead, decided to offer the job to a person who never applied for it, whose resume they found on the internet, and which mentioned "I am looking for a fashion modeling job" and "If you need more data about my portfolio then please contact me and I will be glad to offer you that." Obviously, in the board's eyes that applicant was the most qualified for the Portfolio Data Modeler's job. :-)
I understand the idea of rejecting on the basis of eagerness, but there must be an incentive for Google to reduce the power of search engine marketers to influence their results. Google say that [google.com] "Many SEOs provide useful services for website owners [..]", but also that "a few unethical SEOs [...] unfairly manipulate search engine results".
Are they only concerned about fairness, or is the real danger that if a bunch of SEOs have the power to "place" pages in Google for phrases of their choosing, there are potential problems?
We all come across sites that rank for many phrases, where some of the 'pages' make no useful contribution, and never have, in terms of content or the products and services of the owner.
> Portfolio Data Modeler's job
OK IITian, you have to get credit for that. As well as being amusing, your hypothetical example does well describe the post-Florida results in some sectors (though things are a lot better now).
I don't think the crux of the problem is that heavily SEOed affiliate sites are getting hit, as they rightfully should be. Main problem seems to be that genuinely relevant sites can get hit too if they are not able to maintain a delicate balance.
Here is a simple truth.
Google does not care if "genuinely relevant sites" are removed from some search results. They care about relevant search results.
If that seems like a contradiction, you aren't thinking about it hard enough.
Google wants good results for the searcher. They are unconcerned about *which* relevant site they send the searcher to, as long as they find what they need *somewhere*.
Just because a site moves from #1 to #1000 doesn't matter as long as the new #1 meets the needs of many of the searchers.
Another simple truth is that no search engine will ever have perfect results for every searcher on every search.
While google does care about your particular pet search, they have to be concerned about making the engine serve up the best possible results across the entire spectrum of searches. And if fixing a problem in 10 searches damages one other then that is a worthwhile tradeoff for now.
The vast majority of searchers are still happy with their current search engine. Whether it is Google, Yahoo, AOL, MSN, AV or whatever. If they don't get what they want with the first search, they will search on something else.
They will not switch search engines just because they get a few bad results in some particular area. And it is even less likely now than it was back in the days of AV when the average web user was more sophisticated.
An analogy to the current situation could be that an applicant for a job is rejected because he seemed too eager, even though he was well qualified. The mistake he made was to call up the hiring personnel and tell them that he was really interested in the job!
Yeah, that is a really big mistake. At least if you want to get paid well for a job.
A company should not hire you because you want the job, the company should hire you because they need you. So you are right, it is a great analogy.
The only job that I ever got where the company did not come looking for me was paperboy.
I don't think the crux of the problem is that heavily SEOed affiliate sites are getting hit, as they rightfully should be. Main problem seems to be that genuinely relevant sites can get hit too if they are not able to maintain a delicate balance.
Let me elaborate on this. Assume NASA's webmaster has heard about SEO and wants to apply these techniques to NASA's website to consolidate its hold on the #1 position. She inserts the keyword NASA in a few strategic places on the site - in the title, the h1 tag, etc. - and manages to bring the keyword density up to, say, 10%.
Googlebot visits the NASA site, does not like what it sees, and the algo throws the site down to position #457 for a search on NASA.
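To make the hypothetical concrete, here is a minimal sketch of what a keyword density calculation looks like. The 10% cutoff is simply the figure from the example above, not anything Google has published, and the sample page text is invented.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` equal to `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

# Hypothetical over-optimisation cutoff, taken from the example above.
OOP_THRESHOLD = 0.10

page = "NASA news. NASA missions, NASA launches and other space agency updates."
density = keyword_density(page, "nasa")
print(f"density = {density:.2f}, flagged = {density > OOP_THRESHOLD}")
```

On this sample the keyword makes up roughly 27% of the words, so the hypothetical filter would flag it, well-intentioned or not.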
Google is in the business of finding relevant pages, not penalizing webmasters who get a little carried away. If a site is relevant, and as long as SEO techniques don't decrease the quality of the user experience, it should not be harmed (or helped), in my view.
In my experience, this "OOP" phenomenon is hurting creativity. I had a few pages in the top 10. One of them was in a heavily searched SERP, at #10. I changed the title and a few words here and there, and now it is on the third page. Traffic has come to a halt. What do I do now? (It could have been because of the algo change, but it could have been because of the changes I made.) The other top 10 pages, to which I wanted to add content and modify the design, are on hold. "Don't fix what ain't broken" for fear that Google might penalize you impedes creativity and progress.
Read these testimonials. You'll see what Google thinks their search functions should be.
[google.com...]
AdWords and Froogle for commercial product listings and 'search' for research/info queries.
Google wants good results for the searcher. They are unconcerned about *which* relevant site they send the searcher to, as long as they find what they need *somewhere*.
Many of us get tied up with the fact that our own sites, which may be the most relevant in the world, are not being found. If they are not being found, then surfers don't know about them, so their relevance is immaterial.
Having said that, I am still seeing sites that use banned techniques, like same colour text, appearing at the top of the rankings. This to me suggests major Google inconsistencies. If they cannot eliminate this, what chance do we have?
This is a war that will not end. As soon as Google slaps a ban on any technique, someone somewhere will be trying to find a way to circumvent it - and they will! The fact that they still cannot find same colour text after all this time is a good illustration of this.
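For illustration, a crude check for the naive 1990s version of the same colour text trick might look like the sketch below. It only catches a `<font>` colour that exactly matches the `<body>` bgcolor attribute; real detection would need full CSS parsing and rendering, which is presumably part of why the trick still slips through.

```python
import re

# Matches an explicit colour on a <font> tag, e.g. <font color="#ffffff">
HIDDEN = re.compile(r'<font[^>]*color="(#[0-9a-fA-F]{6})"', re.I)

def has_hidden_text(html: str) -> bool:
    """Flag pages where a <font> colour equals the <body> bgcolor."""
    body = re.search(r'<body[^>]*bgcolor="(#[0-9a-fA-F]{6})"', html, re.I)
    if not body:
        return False
    bg = body.group(1).lower()
    return any(m.lower() == bg for m in HIDDEN.findall(html))

spammy = '<body bgcolor="#ffffff"><font color="#ffffff">widgeteer equipment reviews</font></body>'
clean = '<body bgcolor="#ffffff"><font color="#000000">real review text</font></body>'
print(has_hidden_text(spammy), has_hidden_text(clean))  # True False
```

Anyone using an external stylesheet, a background image, or a near-match colour sails straight past a check like this, which is the point: each countermeasure only raises the bar a little.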
The problem is that to provide this it might look as though the results were corrupted in some way.
But the user is going to be happy.
To achieve this, all sites or pages would have to be categorised.
Did you read my entire post? The part where I said (emphasis added):
Another simple truth is that no search engine will ever have perfect results for every searcher on every search. While google does care about your particular pet search, they have to be concerned about making the engine serve up the best possible results across the entire spectrum of searches. And if fixing a problem in 10 searches damages one other then that is a worthwhile tradeoff for now.
In other words, they know that there are (and always have been) areas where their algo falls short. But if screwing up your pet keyword improved 100 other searches, or even 1 other one that they consider more important, then it was a worthwhile change.
And believe it or not, in a war on bad results, there can be a tactical advantage to serving up even worse results for a month or two in some search categories.
And believe it or not, in a war on bad results, there can be a tactical advantage to serving up even worse results for a month or two in some search categories.
I agree with most of what you say Dave but I am afraid that I just don't get the above. This afternoon I was using Google for research on "web based widgets". The results I got were absolute CR@P! All I got was lots of directories that had none of the information that I sought.
Generally Google is OK as a search engine, and I say that as someone whose legitimate site is currently dropped from the SERPS. But, depending on the search, I often now find myself ploughing through these pages of absolutely hopeless directories that offer NO INFORMATION AT ALL about the subject. My problem is that I tend to get lost in my search activity and forget that Google could be at fault. As a result I plough on and on until the penny drops and I take a reality check.
Quite often now the solution is to be found at other search engines. I don't really think that anyone wants to see these useless directories in the results. Do they?
Inconsistent results, with too many Florida-type sites ranking high, are turning off searchers. Google is not bulletproof, and when webmasters give up and start optimizing for Yahoo and other SEs, Google's search results will become worse over time. I realize it's possible to rank high with both, but the one you concentrate on will produce the best results.
Google has been my choice of SE in the past, but now I'm finding much better results in Yahoo and MSN. Let's hope Google gets its act together soon.
Let us suppose there is a certain category of spam that Google decides to target. Say, keyword spamming.
If that keyword is above a certain percentage in the text, headings, title, domain and anchor text, then ding that page.
Two things are going to happen, it will hurt those that game the system, and it will hurt some innocent sites.
Those gaming the system will move on to the next "trick", never knowing what Google will go after next. But in the meantime, they are down for a while. It will also make keyword spamming go out of style.
Then there are the innocent sites where they simply can't help but use that keyword all over the place. In fact there are whole industries like that. Take the Spatula City commercial from the movie UHF. Every other word is spatula, because that is all they are about. Well, if they don't have enough other factors in their favor, they just might go down the tubes for a couple of months with the spammers.
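A toy sketch of the kind of multi-zone threshold filter described above shows how an innocent but keyword-saturated "Spatula City" site trips it. The zones, the 30% cutoff, and the sample page data are all invented for illustration; nobody outside Google knows the real rules.

```python
def zones_over_threshold(page: dict, keyword: str, cutoff: float = 0.30) -> int:
    """Count how many zones (title, h1, body, domain, anchors) the keyword
    dominates, i.e. makes up more than `cutoff` of the words there."""
    over = 0
    for zone in ("title", "h1", "body", "domain", "anchors"):
        words = page.get(zone, "").lower().split()
        if words and words.count(keyword) / len(words) > cutoff:
            over += 1
    return over

# An innocent site that simply cannot avoid its own keyword.
spatula_city = {
    "title": "spatula city spatula superstore",
    "h1": "spatula city",
    "body": "spatula spatula spatula buy a spatula today",
    "domain": "spatula city com",
    "anchors": "spatula deals spatula specials",
}
print(zones_over_threshold(spatula_city, "spatula"))  # 5 - every zone trips
```

All five zones exceed the cutoff, so a filter like this cannot distinguish the legitimate spatula seller from a keyword spammer; that is the collateral damage being described.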
Then after a couple of months they can start easing off on this part of the algo, because the problem has abated somewhat. The clueless innocent sites that have been wondering where their traffic went will start popping back up to the top.
The Google engineers would obviously look for additional keys as to whether it is a legit site or not, to try and limit the collateral damage, but there will always be casualties.
So there is how it can be to their tactical advantage to have bad results for a couple of months in some areas.
Sometime last year someone started a wonderful thread about building a robust site that could weather these sorts of changes. If you build a site where everything is in your favor, then no matter how google twists those hundred knobs, your site will always be near the top.
And one of the best ways to do that is to make your site the best that you can while forgetting that search engines even exist. Some people seem to forget that the site is for the user first and foremost. Then you optimize it for the search engines by making small tweaks here and there that hopefully enhance the experience for both the user and the search engine.
Those sites that never move from the top 10 are there because they are robust in the areas that count.
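The "hundred knobs" idea can be sketched as a weighted sum of ranking signals. The signal names, scores, and weight profiles below are invented, but they show why a site that is strong across the board stays ahead of a one-trick site no matter how the weights get twisted between updates.

```python
def score(signals: dict, weights: dict) -> float:
    """Toy ranking model: a weighted sum of per-signal strengths (0 to 1)."""
    return sum(weights[k] * signals.get(k, 0.0) for k in weights)

robust = {"content": 0.9, "links": 0.8, "age": 0.7, "usability": 0.9}
one_trick = {"content": 0.1, "links": 1.0, "age": 0.2, "usability": 0.1}

# Three hypothetical algo updates, each re-weighting the knobs differently.
updates = [
    {"content": 0.4, "links": 0.3, "age": 0.2, "usability": 0.1},
    {"content": 0.1, "links": 0.6, "age": 0.1, "usability": 0.2},
    {"content": 0.25, "links": 0.25, "age": 0.25, "usability": 0.25},
]
for i, w in enumerate(updates):
    print(i, score(robust, w) > score(one_trick, w))  # True every time
```

Even the second update, which weights links heavily, still favours the robust site; only a near-total collapse onto the one signal the one-trick site has would flip the order.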
As a reward for this my site has been dropped completely and I have had NO Google traffic for weeks now. The site is not getting crawled and I have lost all my title descriptions, etc.
How do you explain that?
Then after a couple of months they can start easing off on this part of the algo, because the problem has abated somewhat.
BigDave, if this were the case then I might support such a system. However, when I look back at Florida to the present algo, I see just the reverse happening - that the filter is expanding and swallowing more good sites.
It appears that Google is becoming more zealous with their filters, and unless their algos improve we will continue to see more collateral damage with each update.
If the Google index gets to the point where it excludes enough quality sites, surfers will eventually notice the difference and switch to other search engines. However, I do not believe that Google will "ease up" on their filters until they have lost significant market share, which could take quite a long time.
I don't think that is happening, where you are seeing it, for the reason you probably think. The large sites with massive ad budgets also advertise everywhere, are in all the industry portals, buy news releases and newsfeeds, buy directory listings, and essentially buy PR through link purchases. Between all of this, and having a whole team of people maintaining their sites instead of a single webmaster, there are several legitimate reasons they are harder to compete with. The only way to compete is good original content, since most of theirs is syndicated from elsewhere, and hard, clean SEO for the long term.
I really think you all should stop thinking in terms of "penalty this" or "filter that". GoogleGuy has posted that it is more logical (paraphrased) to think about new changes as "what was once important is now weighted less".
Robert, this thread is about "Google Inconsistencies", and it is obvious that there are quite a few of these. We have had thousands, or perhaps millions, of perfectly good sites dropped from the index, while sites using stuff like same colour text are still featuring at the top of the rankings. Many of the results are polluted with useless and very obviously spammy directories. Dave says that we should concentrate on content; well, that's all I have ever done. My own site is a CMMS and maintenance engineering information resource and it was created as such. It has been dropped, so yes, "what was once important is now weighted less".
The problem is that with the virtual monopoly that they currently enjoy Google do not have to worry about this and obviously they have no concern for the little guys. They don't have to respond to complaints or pleas for help and seldom do.
I just wish that they would start charging for inclusion so that we would have some means of redress. I have a dream!
Google can clean up the Internet and make quadrillions of bucks doing so. Announce that a one-off charge is to be imposed for inclusion in the index, say $200. For this, all sites would be screened by a human being before being included. Sites that are already in the index get dropped in three or six months' time if they don't pay, unless they are in DMOZ, which becomes the only means of getting in free.
To register you have to provide your name and verifiable contact details. Result? All criminal, paedophile, terrorist and other less savoury sites are quickly removed. If all search engines were regulated in this way, we would soon see a better, cleaner Internet, and people like us who get dropped for no reason would have some redress. Idealistic? I don't think so.