-- Google SEO News and Discussion
---- prediction: Florida-like Update Before End of Year
jmccormac - 1:31 am on Sep 9, 2011 (gmt 0) Thread source: http://www.webmasterworld.com/google/4359046.htm
Web directories hit a scalability wall very quickly because they rely on user submissions to gain content. The alternative is to have a website detection/acquisition setup and this is beyond the expertise of most people who build web directories as it effectively is the precursor to a search engine.
|I'd say the biggest reason directories fell off was that the model cannot scale with the growth of the web. |
Actually I don't think that the problem is scale. This is based on building search engine indexes and doing monthly website usage surveys of about a million websites. The problem is getting fresh content and identifying derelict websites that have not been touched in years. There is also a sub-problem of distinguishing actively updated content from spam sites. Scale is easy - it is a technological problem. Quality, timeliness and relevance are far harder problems to solve.
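The derelict-site problem can be sketched roughly like this (the two-year cutoff and the reliance on the Last-Modified header are my own assumptions for illustration; plenty of servers don't send that header at all, so a real survey needs more signals):

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

# Illustrative cutoff: treat a site as derelict if it has not been
# touched in roughly two years. The threshold is an assumption.
STALE_AFTER_DAYS = 730

def is_derelict(last_modified_header, now=None):
    """Return True if the Last-Modified date is older than the cutoff."""
    now = now or datetime.now(timezone.utc)
    modified = parsedate_to_datetime(last_modified_header)
    age_days = (now - modified).days
    return age_days > STALE_AFTER_DAYS

# A page untouched since early 2008 looks derelict as of this thread's date.
print(is_derelict("Tue, 15 Jan 2008 10:00:00 GMT",
                  now=datetime(2011, 9, 9, tzinfo=timezone.utc)))  # True
```

That is the easy half; telling an actively updated site from a spam site that regenerates its pages daily is where the hard work is.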
|And the current search challenge is still scale. That's what makes truly competitive entry into the search market so problematic. |
I think it is more a question of quality. I've seen a lot of easily identifiable holding pages in Google, so it is a possibility that Google's quality control sucks. It seems that Google relies on its algorithms to keep these sites down in the SERPs, but it really is an example of poor programming that they should even make it into the Google index in the first place. (This may upset the Google fanboys and fangirls, but it is a simple truth of building search engine indices that it is far easier to stop junk going into the index than to remove junk from a live index.)
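The "stop junk at the door" idea amounts to an ingest-time filter. A crude sketch (the marker phrases here are my own assumptions, not any real parked-page signature list, and a production filter would use many more signals):

```python
# Illustrative ingest-time filter: reject an obvious holding page
# before it enters the index rather than purging it from a live index.
# The marker phrases are assumed examples, not a real signature list.
PARKED_MARKERS = ("this domain is for sale", "parked free", "coming soon")

def accept_for_index(page_text):
    """Return False for pages that look like holding/parked pages."""
    text = page_text.lower()
    return not any(marker in text for marker in PARKED_MARKERS)

# Hypothetical crawl results: only the real site makes it into the index.
index = [url for url, body in [
    ("http://realsite.example/", "Daily news and articles."),
    ("http://parked.example/", "This domain is for sale. Parked free."),
] if accept_for_index(body)]
print(index)  # ['http://realsite.example/']
```

Cheap checks like this at crawl time are exactly why letting holding pages into a live index reads as sloppy.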
|I hope Brett is right - and I wait for the change with a degree of expectation, in fact. Google holds maybe 8 to 10 times more data in their active index than Bing does, from what I see. And that difference in scale seems to be another part of what Google tends to choke on. |
The other thing that has really screwed up Google is that the fear of penalties from linking is making people think twice about putting links on sites or even asking for links. This is a headshot for Google because, without links, it is stuck with gTLD websites. Most ccTLD registries do not provide zone file access in the same way as the gTLD registries do. This means that Google is completely vulnerable in ccTLD markets because it cannot detect new ccTLD websites until they have inbound links.
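To see why zone file access matters: with a daily zone file, finding brand-new domains is just a diff of today's zone against yesterday's. A simplified sketch (real zone files carry NS records and need proper parsing; one domain per line is my simplification):

```python
# Hypothetical sketch of new-site detection via gTLD zone file diffs.
# Without zone file access (as in most ccTLDs), a crawler only finds
# a new domain once some existing page links to it.
def new_domains(old_zone, new_zone):
    """Domains present in today's zone that were absent yesterday."""
    return new_zone - old_zone

yesterday = {"example.com", "oldsite.com"}
today = {"example.com", "oldsite.com", "freshlaunch.com"}
print(sorted(new_domains(yesterday, today)))  # ['freshlaunch.com']
```

No zone file means no such diff, which is why link-shy webmasters leave Google blind in ccTLD markets.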
Panda may yet turn out to be Google's Altavista moment and if they don't do something quickly, they face losing more traffic to Bing.
Brought to you by WebmasterWorld: http://www.webmasterworld.com