| 11:06 pm on Sep 17, 2009 (gmt 0)|
Totally agree - it's been a long time coming, but related sites are now much more related in topic, rather than a haphazard collection of loosely interlinked sites in an odd cluster that gives the user no value.
| 12:54 pm on Sep 18, 2009 (gmt 0)|
1) how Google improved this part of the algo (maybe a deeper analysis of searchers' and surfers' behaviors, rather than a simple link graph analysis);
2) how this improvement may affect rankings (e.g., the ranking power passed by a link could be proportional to how closely related the linked pages are; a toy sketch of that idea is below).
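Purely to illustrate point 2 (speculation on my part, not anything Google has confirmed): you could imagine the value a link passes being scaled by how topically similar the source and target pages are. A toy sketch, with made-up page text and a crude term-frequency similarity:

```python
from collections import Counter
import math

def topic_vector(text):
    """Crude topic representation: term-frequency counts of a page's text."""
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def weighted_link_value(source_text, target_text, base_link_value=1.0):
    """Scale the value a link passes by how topically related the two pages are."""
    similarity = cosine_similarity(topic_vector(source_text), topic_vector(target_text))
    return base_link_value * similarity

# A link between two pages on the same topic passes more value
# than a link between unrelated pages.
print(weighted_link_value("red widgets and widget reviews", "buy red widgets online"))
print(weighted_link_value("red widgets and widget reviews", "celebrity gossip news"))
```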
| 5:14 pm on Sep 18, 2009 (gmt 0)|
Has Google also gone and reduced the number of links shown on the 'link:' page?
| 8:06 pm on Sep 18, 2009 (gmt 0)|
Wow, I may actually start using that feature instead of just laughing about it :)
| 8:13 pm on Sep 18, 2009 (gmt 0)|
I have to say, I'm impressed by this improvement.
While they're at it, they ought to fix "link:" next, because that's long returned pathetically incomplete results.
| 8:23 pm on Sep 18, 2009 (gmt 0)|
Google's not going to "fix" the link: operator. They give much fuller link data in Webmaster Tools, but that's for the verified site owner - and that's intentional. They are just not going to surface large amounts of link data for people to research any website at all.
| 9:32 am on Sep 19, 2009 (gmt 0)|
|tedster: They give much fuller link data in Webmaster Tools, but that's for the verified site owner - and that's intentional. They are just not going to surface large amounts of link data for people to research any website at all.|
Even what is shown in Webmaster Tools is only 30% of the links in my case.
What would be interesting is if we could work out on what basis the links shown by link: and in Webmaster Tools are chosen. Are those selections indicative of something?
| 10:38 am on Sep 19, 2009 (gmt 0)|
In my Webmaster Tools, the lists of external links used to show about 60% of them, but the number has been steadily dropping in recent months and is now down to about 20%. There doesn't appear to be any logic as to which links are selected to be shown, but it isn't a random sample, because typically half of the links to a given page might come from a single blogroll.
I suspect that Google is intentionally restricting this information even to the site owners themselves, to prevent them from using it to evaluate the success of their various artificial link-building efforts.
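For what it's worth, here's a rough way to put a number on that coverage (this is just my own back-of-the-envelope approach, and the file names are made up): compare whatever fuller backlink list you keep yourself, e.g. from server logs or other tools, against the links Webmaster Tools actually shows.

```python
# Hypothetical sketch: estimate what percentage of known backlinks
# Webmaster Tools is surfacing. File names and formats are invented
# for illustration - one linking URL per line in each file.

def load_links(path):
    """Read one linking URL per line into a set."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

known_links = load_links("all_known_backlinks.txt")    # your own fuller list
wmt_links = load_links("webmaster_tools_export.txt")   # what Google shows you

shown = known_links & wmt_links
coverage = 100.0 * len(shown) / len(known_links) if known_links else 0.0
print(f"Webmaster Tools shows {coverage:.0f}% of the backlinks I know about")
```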
| 7:38 am on Sep 21, 2009 (gmt 0)|
I've always analysed my top 10 competitors' backlinks and on-page profiles. I'm now wondering if I should be doing the same for my top 10 related sites.
If the #1 result deserves its place, as I think it does in my case ;-), then IMO the related: search actually produces a better top 10 than the standard SERPs return. It could be the basis of a new search technique: first find the best site/page for a search, then find others that are similar to it. Someone from Google recently let slip that they would like to move towards providing one correct answer to a search. This would be better: one correct answer and 30 similar ones, take your pick.
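Purely to illustrate that idea (my own toy sketch, nothing to do with how Google actually builds related: results): take the top result from a normal ranking, then order the remaining candidates by how similar their text is to that one page. The page data and the similarity measure below are made up for the example.

```python
from collections import Counter
import math

def vec(text):
    """Crude term-frequency vector for a page's text."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def related_results(ranked_pages, top_n=30):
    """Take the best result from a standard ranking, then order the rest
    by how similar they are to that one page."""
    best, rest = ranked_pages[0], ranked_pages[1:]
    best_vec = vec(best["text"])
    rest.sort(key=lambda p: cosine(vec(p["text"]), best_vec), reverse=True)
    return [best] + rest[:top_n]

# Toy usage: in reality the candidates would come from a normal search.
pages = [
    {"url": "example.com/widgets", "text": "red widgets reviews and prices"},
    {"url": "example.org/gossip", "text": "celebrity gossip and photos"},
    {"url": "example.net/widget-shop", "text": "buy red widgets online cheap"},
]
for p in related_results(pages):
    print(p["url"])
```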