[google.com...] - "Google prefers developing scalable and automated solutions to problems, so we attempt to minimize hand-to-hand spam fighting. The spam reports we receive are used to create scalable algorithms that recognize and block future spam attempts."
I suppose that makes sense, as spammers can generate pages a million times faster than Google can delete them.
From that, I guess their recent announcement that they do not have a filters list also makes sense.
Banning all cloaked sites would remove millions of sites that cloak 'legitimately'. Removing sites for the recently popular 'over-optimization' theories is also a rather silly idea: the majority of sites in the Google index will never have passed through the hands of an SEO specialist, so keyword density, excess header tags, too many alt tags, etc., which people currently blame for their 'penalties', would not make sense as filters because all of this happens in regular page building. Banning sites for doing anything that also has a legitimate use is not something I would imagine a couple of PhDs would come up with; they've got the brains and know-how to work around that.
I think that's why Google is looking 'outside the box' for ranking factors these days. Arguments over whether too much bold text has got you banned are about as useful as a waterproof towel these days. Look to stemming, latent semantic indexing, and latent semantic 'anchor text' (I've not worked out what the right name for that would be!).
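To make the latent semantic indexing idea concrete, here is a toy sketch. The term-document counts are invented for illustration; the point is that a truncated SVD projects terms into a low-rank "concept" space where terms that share contexts (like "car" and "auto") end up close together, even if they never co-occur on the same page.

```python
import numpy as np

# Toy term-document matrix: rows = terms, columns = documents.
# The counts are made up purely for illustration.
terms = ["car", "auto", "engine", "flower", "petal"]
docs = np.array([
    [2, 1, 0],   # car
    [1, 2, 0],   # auto
    [1, 1, 0],   # engine
    [0, 0, 2],   # flower
    [0, 0, 1],   # petal
], dtype=float)

# LSI: truncated SVD maps terms into a k-dimensional concept space.
U, s, Vt = np.linalg.svd(docs, full_matrices=False)
k = 2
term_vecs = U[:, :k] * s[:k]   # term coordinates in concept space

def similarity(a, b):
    """Cosine similarity between two terms in the concept space."""
    va, vb = term_vecs[terms.index(a)], term_vecs[terms.index(b)]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

# "car" relates to "auto" through shared contexts, not exact matching.
print(similarity("car", "auto") > similarity("car", "flower"))
```

This is why keyword-exact tricks lose value: an engine using something like this can rank a page for "auto" that only ever says "car".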
Google is looking 'outside the box'

I've been trying to "think like an SE" for a few years. It seems to me that in order to produce good SERPs, SEs need to determine a site's legitimacy/quality/sincerity/motivation. In other words:
Using quality directory listings as a starting point for SEs is a good idea and it's been used for years. I have a few other ideas but I'd like to see what you guys think.
Excluding traditional on page factors, what could an SE use to determine a site's quality? (That implies we have a pretty clear definition of what quality means to an SE and that it's very broad and inclusive. Maybe we need to define what "quality" means to an SE first...)
What do y'all think?
Does a site exist because of...
Totally agreed, motivation and information accuracy are two key elements.
I would imagine the possibility, with some serious effort involved, to categorize and mark who stands behind results, such as:
• Manufacturer, trademark / brand owner
• Affiliate / advertiser
• Information provider
• Reseller / distributor
• Hobbyist / fan club
The G directory categorizes information accuracy but does not indicate (rate) the motivation behind the site. It is OK to create *me too* affiliate directories, but they need proper indication on the SERP.
Small icons or letters beside results could immediately indicate to a customer what to expect by clicking on a link.
To draw a parallel, the results would be equivalent to postings in newsgroups, moderated versus un-moderated.
The problem I see with this is that the SE would be moving into a different business model and would not be able to keep up with the changes; simply, there would be too many. Site owners could easily switch motivations, and if not monitored closely the results could become misleading.
To effectively track and tag changes, directory groups would need to be small and closely moderated, or may need more moderators per group (in a similar fashion to WW). This may be how the G directory operates today; I don't know.
To summarize, I would prefer it if results indicated the owner (manufacturer vs. affiliate vs. hobbyist). This implies that an advanced search option could filter for "resellers only", etc.
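A minimal sketch of what that "resellers only" filter might look like, assuming a hypothetical owner tag on each result (the `Owner` categories, the `SearchResult` shape, and the example URLs are all invented for illustration):

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical owner categories, mirroring the list above.
class Owner(Enum):
    MANUFACTURER = "manufacturer / brand owner"
    AFFILIATE = "affiliate / advertiser"
    INFO_PROVIDER = "information provider"
    RESELLER = "reseller / distributor"
    HOBBYIST = "hobbyist / fan club"

@dataclass
class SearchResult:
    url: str
    title: str
    owner: Owner   # tag assigned by moderators or a classifier

def filter_by_owner(results, wanted):
    """Advanced-search filter: keep only results whose owner tag
    is in the wanted set (e.g. 'resellers only')."""
    return [r for r in results if r.owner in wanted]

results = [
    SearchResult("example-brand.com", "Acme widgets", Owner.MANUFACTURER),
    SearchResult("widget-deals.example", "Best widget deals!", Owner.AFFILIATE),
    SearchResult("widgetshop.example", "Widget shop", Owner.RESELLER),
]

resellers_only = filter_by_owner(results, {Owner.RESELLER})
print([r.url for r in resellers_only])  # ['widgetshop.example']
```

The hard part, as noted above, is not the filtering itself but keeping the tags honest: owners can switch motivations, so the tags would need ongoing moderation.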