I have seen garbage URLs, I've seen loads of text with no grammar whatsoever, and I've seen link farms. My personal experience is that wherever I've seen them there are no "real" competitors, and that in general they are easy to displace in the SERPs.
Of course, I could be totally wrong, I may be just looking in the wrong places.
I see posts about sites with 100 domains creating artificial PR and being unbeatable, or sites dominating just by virtue of multi-hyphenated domain names. I see sites like that, but I never see any that can't be knocked out of their positions very easily.
So I guess my question is in relation to competition, not really search quality; I'm just curious to see some of these sites people think are unfairly topping the charts.
Morgan, I'm still a little confused as to your question.
But if you want to see sites that unfairly top the charts, search for God or discount brokers. Are these spammers? Well, as heini said, there is no such thing as spam!
Are they really damaging the economy of other sites? I don't think so; they are simply wasting users' time and making Google look a little foolish.
You have that backwards. It isn't SEs that use spam as an excuse; it's a term used by webmasters for sites coming up higher than theirs. To me the only true spam is when the same webmaster dominates a SERP with multiple domains. If another site about widget information using better SEO comes up higher than mine, even if I think mine is much better, it ain't spam, because it is relevant. Spam could also include things like a porn site using SEO trickery to come up #1 for "growing petunias", but as a practical matter this is minimal, because webmasters tend to SEO for relevant terms that will convert, not irrelevant ones.
Well, I'm not sure that the former will bring many qualified prospects, seeing there are so many bodies claiming to represent versions of her, and the latter is far too vague for anybody intelligent to use (discount brokers? In what industry, for what products, in what country?)
I've yet to see major spam/irrelevant results dominating any first-page SERPs for any two-or-more-word query that intelligent USERS (or prospects) would use. Maybe terms that SEOs use, but not users so much...
What I was really asking was just if people think the tricky methods of SEO on Google actually work well. I don't think they do, I think it's usually just an excuse. That's all. It seems like there are a lot more complaints of this kind of thing lately, and I wanted to see some examples where I couldn't see any possible legitimate reason why one site was beating another.
Go to Overture.com and look at the bidding prices for "discount brokers"....$2.20 per click for a top 4 slot!
Try wordtracker and check the popularity of that term.
Chiyo, in the US these terms are popular (intelligent or not, people type them). Add Stock to Discount Brokers and you still get a similar nonsense result at the top.
Should Google be wasting the user's time by showing this stuff? It doesn't really damage anyone else, but it may damage Google because users see it as irrelevant.
For the record, I have nothing to do with discount brokers or God-related sites :) These are just two examples of how Google is doing less than its best for common search terms (however dumb people may think they are, they are common and used often).
For longer, more accurate search phrases Google seems to be doing a decent job... all we need to do is educate the population of the world to use search engines correctly :)
[edited by: percentages at 10:54 am (utc) on May 29, 2003]
The worst results I have ever seen on a search: the first ten pages of Google with about twenty listings for other sites (a couple OK, the rest dubious). This PR + link thing is being emphasised FAR too heavily. It's OK up to a point, but get a load of affiliate sites interlinking with as many other sites as they can, and then every commercial-type search on the planet starts to go crazy.
The first listing is relevant (just barely, and poor-quality, near-zero content) and the rest are totally unrelated widget colours not in the search phrase.
There are some serious serious problems currently with Google's algo and/or spam filtering, make no mistake.
When you have zero-content pages turning up by virtue of the links pointing to them alone, you have to rethink the ranking weightings away from the links.
SEO tip of the month: go find the best deal you can on a thousand domains, copy-paste pages till the cows come home, and interlink them all in a string, but mess with Google by not doing it totally obviously: separate IPs, and don't link all sites to all other sites, but rather mix and match. Expect to see your sites #1 for highly competitive two-word searches and totally obliterating more focused three- and four-word searches.
In my opinion Google is failing quality-content webmasters badly at the moment. I hope they sort this mess out next update. It's depressing; I'll be abandoning the rules and playing the spam game myself if it continues. If that's what it takes, then we can all join in wrecking Google's results even more, I guess.
That's precisely what I meant. I want to see if I think it would be difficult to compete against, or if I can objectively understand why they do well.
In general, I tend to doubt that heavy interlinking somehow messes up Google. But I could be wrong, so I wanted to look.
[edited by: Marcia at 12:11 pm (utc) on May 29, 2003]
Random crosslinking in itself probably isn't going to be as wildly successful as one might hope, because the PR (and possibly relevancy) has to come from somewhere.
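That "the PR has to come from somewhere" point can be seen in a toy PageRank calculation. This is a minimal sketch of the published random-surfer model, not Google's actual implementation, and all page names are made up: however densely a closed farm interlinks internally, its total rank is capped by what flows in from the random-jump baseline.

```python
def pagerank(links, d=0.85, iters=200):
    """Toy PageRank by power iteration.
    links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}
        for p, outs in links.items():
            targets = outs or pages  # dangling pages spread rank evenly
            share = d * rank[p] / len(targets)
            for q in targets:
                new[q] += share
        rank = new
    return rank

# Two 3-page "farms" with no external inlinks, wired differently inside,
# next to the same small outside site.
outside = {"o1": ["o2"], "o2": ["o1"]}
ring   = {"f1": ["f2"], "f2": ["f3"], "f3": ["f1"], **outside}
clique = {"f1": ["f2", "f3"], "f2": ["f1", "f3"], "f3": ["f1", "f2"], **outside}

farm_total = lambda r: sum(r[p] for p in ("f1", "f2", "f3"))
print(farm_total(pagerank(ring)), farm_total(pagerank(clique)))
```

Both topologies give the farm exactly the same total rank: internal crosslinking only redistributes whatever flows in, it can't manufacture more.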
It took me a few minutes to appreciate what you were trying to say.
First... what is search quality? The quality of a SE result is a subjective thing. Different people are looking for different things, even when using the same search phrase. It might take a group of impartial, intelligent humans to create a good subjective score for each of the possible results for a given search phrase.
Search Engines don't want to use humans to grade quality. There are too many possible search phrases, and too many possible pages to score. They want to automate the process. This is inherently flawed. You simply can not grade the quality of the content on a page based on keyword density, the number of backlinks, the words in the title or meta description, etc.
I would say that there is SPAM. SPAM is the obvious and deliberate reverse-engineering of the Search Engine algorithm, in order to improve the result of a particular page for particular search phrases. It exists on a massive scale.
In my opinion, Google and other search engines will have a very difficult time combating SPAM, unless human factors are used to pluck the weeds. First, the punishment for obvious SPAM (verified by humans) should be more lengthy and severe. Second, Google can use actual click-through data and toolbar data to see what results are high quality, and what results are not.
For instance: a search is performed, and the user sees 10 results on the first page. The user scans the brief titles, descriptions, and urls and chooses result 2. The user views the page for 8 seconds, then hits the back button on the browser. This user chooses result 5, goes to this page, and browses 18 more pages on this site, following links from the first page. It should be obvious to Google that result 5 has higher relevance for this search phrase than result 2 in the SERPs... for this user. This is not artificial intelligence... this is real intelligence. Webmasters would find it much more difficult to SPAM this kind of ranking method... they have to produce higher quality content, or they drop in the rankings. This is especially true if there are millions of toolbar users, and each toolbar user gets one vote only.
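That bounce-vs-browse idea could be sketched roughly like this. To be clear, this is a toy model, not anything Google is known to do; the function name and the 10-second "stuck" threshold are invented for illustration.

```python
from collections import defaultdict

def rerank(results, click_log, min_dwell=10):
    """Reorder results by the fraction of clicks that 'stuck':
    the user stayed at least min_dwell seconds instead of bouncing back.
    click_log: list of (url, dwell_seconds) pairs, one vote per user."""
    clicks = defaultdict(int)
    sticks = defaultdict(int)
    for url, dwell in click_log:
        clicks[url] += 1
        if dwell >= min_dwell:
            sticks[url] += 1

    def score(url):
        return sticks[url] / clicks[url] if clicks[url] else 0.0

    return sorted(results, key=score, reverse=True)

serp = ["result2", "result5"]
log = [("result2", 8), ("result5", 120), ("result2", 5), ("result5", 90)]
print(rerank(serp, log))  # result5's long visits outrank result2's bounces
```

With the 8- and 5-second visits counting as bounces, result 5 moves above result 2, mirroring the scenario described above.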
Good point Heini!
Google's algo is way too slow to deal with spammers getting top ranks.
In the event, the 'nice guys' just have to wait for spammers to be kicked out before their own rankings improve. Problem is, more spammers enter the index and dominate before justice is done.
Hmmm... debatable situation. Let's take a user poll on this. How many of you guys want to go by either of the following statements -
1. I made $200,000 last year from my site but exploited the weakness of the Google algo - not a legal crime, but I did it.
2. I made $2000 last year from my site and followed every rule in the Google Book.
If spamming makes money, people will spam. Many non-spammers will just continue to be Google suckers who thought that listening to GoogleGuy and reading the 'thou shalt nots' on the Google webmaster pages would get them good rankings if they worked hard and diligently for the Googlegod, only to find their sites sacrificed at the SEO altar at the next update!
DVDBurning, your suggestion is good, but it will just fill the SEs' ranks with portals that will try to become authorities for subjects.
I guess whatever solution (algo, filter) is found, the commerce opportunities the net offers through SEs will give people incentives and reasons to reverse-engineer and find new methods to exploit.
Google needs to stop playing the high-and-mighty, more-SE-pious-than-thou card if it wants webmasters to seriously not want to spam!
All Google has to do is give less weight to e-commerce pages. In other words, knock product order pages, hotel booking pages, etc. down 50 or 100 places in the SERPs for general search strings like "widgetco wc-1 digital camera" or "pastaville italy." Users would immediately see less duplication of search results, and businesses would have a strong incentive to buy advertising as they do in every other medium except the Web.
I'm not saying that Google should do this, so no flames, please. I'm simply commenting on what Google could easily do if it wanted to improve the quality of its search results, reduce the incentive for spamming the index, and encourage purchases of AdWords in one fell swoop.
In English-language SERPs, I usually don't have a problem finding stuff. But when I'm searching in German for almost arbitrary household items (e.g. furniture), 2 out of 3 listings tend to be e-bay/amazon/etc. affiliate sites with zero content (I know how to search on e-bay directly, thank you very much).
I think this is mainly because the German-speaking web is still too small for Google to weed out the cheap link farms, and there are not enough valid topical sites and directories to base a relevancy matrix on. Similarly for English-language searches: if the number of throwaway sites in an industry is too large relative to the actual information/business sites, then it gets really difficult for any algo to separate the wheat from the chaff.