jmccormac - 12:35 pm on Jan 16, 2012 (gmt 0)
@incrediBILL I think that Google is going to do a lot more of this because of the growth of ccTLDs. I was reading that stuff that Amit Singhal wrote about Panda ( [googlewebmastercentral.blogspot.com...] ) and it is so utterly clueless it is a sick joke. If this is the best that Google can do, it really needs to start fishing in the deep end of the clue pool.
With gTLDs, the zone files are readily available and it is easy to detect new websites. However, with the braindead blind crawling method that Google and the other search engines use, detection is dependent on there being an inbound link to the new site. The reality is that most new sites don't have inbound links and therefore they are invisible to blind crawlers like Google's.
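To make the zone-file point concrete, here's a rough Python sketch of what detection could look like: diff two daily snapshots of a gTLD zone file and the set difference is the newly delegated domains. The file names, and the assumption that delegations appear as standard master-file NS lines, are mine for illustration; real zone files obtained through registry access programs vary in layout.

```python
# Sketch: find newly delegated domains by diffing two zone file snapshots.
# Assumes RFC 1035 master-file style lines, e.g.:
#   example.com. 86400 IN NS ns1.example-dns.net.
# Real gTLD zone files may omit the TTL or class, so a production
# parser would need to be more forgiving than this.

def domains_in_zone(path):
    """Collect unique delegated domain names from a zone file's NS records."""
    names = set()
    with open(path) as fh:
        for line in fh:
            parts = line.split()
            # owner  TTL  class  type  rdata  -> type is the 4th field here
            if len(parts) >= 5 and parts[3].upper() == "NS":
                names.add(parts[0].rstrip(".").lower())
    return names

def new_domains(yesterday_path, today_path):
    """Domains present in today's snapshot but absent yesterday."""
    return domains_in_zone(today_path) - domains_in_zone(yesterday_path)
```

No crawling and no inbound links needed: every newly registered, delegated domain shows up in the next snapshot, which is exactly the visibility a blind crawler lacks.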
The big problem for Google is that its credibility is rapidly diminishing. It is no longer a cheeky startup with a bunch of hardworking people. It is a mature corporation. No matter how hard Google's PR flunkies and fanboys (and girls) try, there's the problem that Google has become "them".

For directory owners who have had their sites plundered, there's going to be a more aggressive attitude taken. Google may well find itself being blocked by robots.txt from deep content as directory owners decide to limit the damage to their business model. Google Analytics will be ripped out of sites in increasing numbers - after all, the thinking would go, why give these people intelligence?

This will possibly lead to the growth of authority-type directories in various countries that will compete with Google for traffic. It may also provide the basis for new country-level search engines that concentrate on Google's serious vulnerability (ccTLD coverage). It could simply be Google's K-T event.