rico_suarez - 11:38 am on Apr 20, 2011 (gmt 0)
You're right, but there's a simple technical catch. Imagine hundreds of millions of posts, images, sounds, etc. being published every day. Can you imagine an algorithm that could compare all those sitemaps with each other and then decide who was first for what? I think many people overstate the power of Google's algorithm. Because of the amount of data it has to take into consideration, it must take shortcuts, and it certainly does. It's simply not viable to take an article, compare it against a billion pages, and then repeat that for every new article on the Internet just to establish the original author. If Google could do that, the DMCA complaint wouldn't exist; Google would already know who was stealing what. That's why they created PR and other parameters to give weight to sites. And sometimes scraper sites outrank the original sites. That's sad but true.
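For what it's worth, the shortcut doesn't have to be PR alone. The classic way around "compare every article to a billion pages" is to fingerprint documents with shingle hashes and keep an inverted index, so checking a new article costs roughly its own length, not the size of the corpus. Below is a minimal illustrative sketch of that idea, not Google's actual pipeline; the shingle size, hash choice, and overlap threshold are arbitrary assumptions for the example.

```python
import hashlib

def shingle_fingerprints(text, k=5):
    """Hash every k-word shingle of a document to a 64-bit fingerprint."""
    words = text.lower().split()
    return {
        int.from_bytes(
            hashlib.sha1(" ".join(words[i:i + k]).encode()).digest()[:8], "big"
        )
        for i in range(max(1, len(words) - k + 1))
    }

class DuplicateIndex:
    """Inverted index: fingerprint -> set of doc ids containing that shingle."""
    def __init__(self):
        self.index = {}

    def add(self, doc_id, text):
        for fp in shingle_fingerprints(text):
            self.index.setdefault(fp, set()).add(doc_id)

    def likely_copies(self, text, overlap=0.5):
        """Return doc ids sharing at least `overlap` fraction of shingles.
        Lookup cost scales with this document's size, not the corpus size."""
        fps = shingle_fingerprints(text)
        hits = {}
        for fp in fps:
            for doc_id in self.index.get(fp, ()):
                hits[doc_id] = hits.get(doc_id, 0) + 1
        return [d for d, n in hits.items() if n / len(fps) >= overlap]
```

Even with a trick like this, the index only tells you two pages overlap, not which one came first, so the "who is the original author" problem still stands.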