shadowdaddy - 12:55 pm on Apr 9, 2013 (gmt 0)
So I've put this very question to one of my agencies: is a robots.txt-level Disallow sufficient as a method of removing links from the link profile? Their response has been very different from this line of thinking:
Because when you block their access, they have no way of knowing whether anything has been removed; they can't see the page at all.
The agency has suggested that if the content of a page is no longer cached, thanks to a robots.txt disallow, then that content essentially doesn't exist anymore, and therefore neither do the links on it. But the more I read, the less convinced I am by this argument.
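For what it's worth, you can see the distinction the quote is making with Python's standard-library robots.txt parser. This is just a sketch with a made-up robots.txt and URL: a Disallow rule tells compliant crawlers not to fetch a path, but it says nothing about whether the page, or the links on it, still exist.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking a page the agency wants "removed"
robots_txt = """\
User-agent: *
Disallow: /old-page
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A compliant crawler may no longer fetch the blocked path...
print(rp.can_fetch("Googlebot", "https://example.com/old-page"))  # False

# ...but other paths are still crawlable, and the blocked URL itself
# still resolves for users and for any site linking to it. Disallow
# controls crawling, not whether the content or its links exist.
print(rp.can_fetch("Googlebot", "https://example.com/"))  # True
```

So the crawler is blind to the page rather than informed that it's gone, which is exactly why it can't confirm any links were removed.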