

Is Yahoo! Site Explorer's Inbound Link Data Appropriate For Google?

11:14 am on May 12, 2010 (gmt 0)

5+ Year Member



Do folks believe that using Yahoo! Site Explorer's inbound link data is in any way counterproductive when the goal is to mine that data for insights that lead to better results within Google?
4:43 pm on May 12, 2010 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I can't see how it would be counterproductive. As with any source of data, you want to balance it and take it with a grain of salt and common sense.

For example, one thing no data source for backlinks will tell you is the position on the page where the link appears -- and that makes a BIG difference in how valuable a backlink is.
6:38 am on May 13, 2010 (gmt 0)

5+ Year Member



Thanks. Yes, that is where we come in, actually - we are tasked with manually giving a 1-5 rating for the (subjective) strength of each link. I just find it potentially accuracy-blurring that there is such a huge divide between the total number of web pages Google and Yahoo! claim to have crawled - a billion pages here or there would constitute a quantitatively, if not qualitatively, disparate data source...?
8:09 am on May 13, 2010 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Good luck with that 1-to-5 rating project. You'll need a relatively complex formula to even come close to weighting backlinks the way Google seems to. I'm still regularly boggled by it. I can see some patterns that seem to hold true, but then I run into those mysterious exceptions.

One thing Google has told us is that on their back end, they can zero out the effect of links on a site-wide basis, a page-wide basis, or even at the level of an individual link. And if you're analyzing a huge pile of backlinks, it will pretty much be a wild guess as to whether one of those three possibilities is happening.
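
For illustration, here's a rough sketch of what that zeroing could look like mechanically - the set-based penalty lists and the weighting function are purely my own assumption, not anything Google has described:

# Hedged sketch: zeroing a link's effect at three granularities.
# The penalty sets and the function below are illustrative assumptions only.

penalized_sites = {"spammy.example"}                  # site-wide
penalized_pages = {"https://ok.example/links.html"}   # page-wide
penalized_links = {("https://ok.example/post", "https://me.example/")}

def effective_weight(source_site, source_page, target_url, raw_weight):
    """Return the weight a link actually passes after penalties."""
    if (source_site in penalized_sites
            or source_page in penalized_pages
            or (source_page, target_url) in penalized_links):
        return 0.0   # the link still exists, but it passes nothing
    return raw_weight

From the outside, a zeroed link and a weak link look the same - which is exactly why it's a wild guess which of the three is happening.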

Even if you have your own independent link graph of a big chunk of the web, I still doubt you could reverse engineer everything that Google does to modify link strength. For example, there are historical factors - and they vary by the kind of site and keyword area.

Some links GROW in strength when they've been in place for a longer time. Some links WEAKEN in strength when they've been in place for a longer time. If a site has a high trust value with Google, then its links may carry more clout - especially if they use keyword anchor text.
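
To see how quickly even a crude version of that formula gets complicated, here's a sketch of a 1-5 rating that combines page position, link age, site trust, and anchor text. Every signal name, weight, and cutoff below is invented for illustration - none of it is a published Google factor:

# Hedged sketch of a subjective 1-5 link rating. All weights and
# thresholds are invented for illustration, not Google's real factors.

def rate_link(position_score,    # 0-1: in-content high, footer low
              age_years,         # how long the link has been in place
              site_trust,        # 0-1: your own trust estimate
              keyword_anchor):   # True if anchor text uses the keyword
    score = 0.4 * position_score + 0.4 * site_trust
    # Age can help or hurt; cap its contribution either way (assumption).
    score += max(-0.1, min(0.2, 0.05 * (age_years - 1)))
    if keyword_anchor and site_trust > 0.5:
        score += 0.1             # trusted sites' anchor text counts extra
    return max(1, min(5, round(score * 5)))  # map 0-1 onto the 1-5 scale

print(rate_link(0.9, 3, 0.8, True))     # old in-content link, trusted site
print(rate_link(0.1, 0.2, 0.3, False))  # fresh footer link, weak site

And even this toy version misses the historical and per-niche variation described above.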

And it seems to me that even pure PageRank calculations are now modified by where on the page a link appears. Check out this newly granted patent [seobythesea.com] which was filed back in 2004. The original idea of a random surfer is replaced by a "reasonable" surfer. In other words, a link that a page visitor is less likely to click on (for any number of reasons) will vote less power to its target website.
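
To make that concrete, here's a toy comparison of classic random-surfer PageRank against a "reasonable surfer" variant where every outlink carries a click-likelihood weight. The three-page graph, damping factor, and click weights are all made up for illustration:

# Hedged sketch: random-surfer PageRank vs. a "reasonable surfer"
# variant. Graph, damping, and click weights are assumptions only.

DAMPING = 0.85
ITERATIONS = 50

# links[page] = list of (target, click_weight); weights are hypothetical
links = {
    "A": [("B", 0.9), ("C", 0.1)],   # prominent link vs. footer link
    "B": [("C", 1.0)],
    "C": [("A", 1.0)],
}
pages = list(links)

def pagerank(weighted):
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(ITERATIONS):
        new = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, outlinks in links.items():
            total = sum(w for _, w in outlinks) if weighted else len(outlinks)
            for target, w in outlinks:
                share = (w if weighted else 1.0) / total
                new[target] += DAMPING * rank[page] * share
        rank = new
    return rank

print("random surfer:    ", pagerank(weighted=False))
print("reasonable surfer:", pagerank(weighted=True))

Under the weighted model, the rarely clicked footer link from A passes far less of A's rank to C - the same link, voting less power.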
8:16 am on May 13, 2010 (gmt 0)

5+ Year Member



So as their techniques and inclusion/exclusion criteria become more sophisticated, and we move ever closer toward a form of legitimate artificial intelligence, is there even scope for human-implemented search engine optimization beyond perhaps only the most basic guidelines regarding best practices in page, link and content layout?