if that is true whoaaa.. and it is welcomed.. not because of comment spam, but because it shows a first, small but hopefully important step to consolidating the industry :)
[added] posted before links.. didn't mean to sound like I doubted you ;))[/added]
Though the implementation is different, I suggested that Google implement this ages ago. However, I was not thinking of blogs and spam, I was mostly thinking of links to dynamic pages.
(confirms that it was Google's idea and that MSN Spaces will be using it)
Google announcement (not live at the time of posting):
(remove space between "google" and "blog" to get the link to work)
SuzyUK: I agree that that is the real story - the giants in search and blogs working (somewhat) together. It was still leaked a little early (still leaking now w/more info?), but it seems they can work together somewhat...
Fix blogger.com, blogspot and whatever else - there are still _plenty_ of blogs left in the world, and plenty more being created, so this solution will have little effect on the problem. Kudos to MSN and Google for taking steps forward to eliminate spam. I think the nofollow tag is amusing, but at least it shows some motivation on their part. BTW, Google tried this before with the redirects through google.com, and we all saw how well that worked.
I do like Jake's insightful analysis of the situation. I'm not expert enough to think of good ways to abuse the attribute, but as Brett says, the biggest trouble is that Google and the others are admitting that their algos are broken and they need such an attribute.
Clearly, this system is going live by default in many major blogging applications: in future versions and updates of downloadable products such as Movable Type or WordPress, and inserted automatically into hosted solutions such as MSN Spaces and TypePad. But after that, it may very well spread into general CMSs, wikis, forums...
What will the SEs do then? Great swathes of nofollow links produced by default by a thousand CMSs. If they truly respect it and don't follow the link at all, then spidering will be severely hampered. If they follow the link but diminish its value in terms of PR or other such considerations, then PR is no longer a pure mathematical calculation, but rather something which can be controlled (and therefore manipulated) by webmasters.
This attribute breaks the web, because it breaks the value of links.
<added>From the Google Blog announcement (now live):
|From now on, when Google sees the attribute (rel="nofollow") on hyperlinks, those links won't get any credit when we rank websites in our search results. |
|Q: Is this a blog-only change? |
|A: No. We think any piece of software that allows others to add links to an author's site (including guestbooks, visitor stats, or referrer lists) can use this attribute. |</added>
I like the way livejournal is implementing it. A comment from someone on your friends list will not have the nofollow, but if they are not on your friends list it will have the nofollow.
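The friends-list policy described above could be sketched like this. This is just an illustrative sketch, not LiveJournal's actual code; the function and variable names are hypothetical:

```python
# Sketch of the LiveJournal-style policy: a comment link from someone on
# your friends list stays plain, anyone else's link gets rel="nofollow".
# All names here are hypothetical, for illustration only.

def render_comment_link(url, commenter, friends):
    """Return the anchor tag for a commenter's homepage link."""
    if commenter in friends:
        return '<a href="%s">%s</a>' % (url, commenter)
    return '<a href="%s" rel="nofollow">%s</a>' % (url, commenter)

friends = {"alice", "bob"}
print(render_comment_link("http://example.com/", "alice", friends))
# -> <a href="http://example.com/">alice</a>
print(render_comment_link("http://example.com/", "mallory", friends))
# -> <a href="http://example.com/" rel="nofollow">mallory</a>
```

The nice property of this design is that trusted users still pass link value, so legitimate conversation isn't devalued along with the spam.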
I really don't buy the "it's admitting they can't fix it" argument, though you could certainly take it that way. Why can't this be another tool for them to use in addition to trying to deal with it algorithmically?
There sure are problems with just ignoring comments in a blanket fashion, though that really was the only option up till now. This might allow them to follow links that they would have otherwise had to ignore. They can still ignore comments in blogs that do not implement this.
It seems that the only real effects will be that some site owners will feel like they can do something to fight back, and that a small percentage of the comment spammers will avoid those sites that implement this kludge.
I don't see it helping the search engines all that much.
i like the transparency - finally letting us know that they are devaluing it instead of us just guessing that's what they're doing.
also, i'd already thought about how to add it or not (depending on user status) to my blog hybrid software. ;) i don't think i'm gonna allow anonymous comments for a while, though, if ever. just because they don't count, i'm sure there are clueless scriptkiddie SEOs out there with software running on their grandmother's computer. ;)
we'll see, though. i do like the transparency, though, or at least the attempt at it...
encyclo, you missed a few. Besides Google, Yahoo, and Microsoft, the blog software makers signing on include
Brad Fitzpatrick - LiveJournal
Dave Winer - Scripting News
Anil Dash - Six Apart <-- This is TypePad and Movable Type and LiveJournal
Steve Jenson - Blogger
Matt Mullenweg - WordPress
Stewart Butterfield - Flickr
Anthony Batt - Buzznet
David Czarnecki - blojsom
Rael Dornfest - Blosxom
I believe we're adding MSN Spaces to that list ASAP as well. I've been amazed at how many people are cooperating. This is gaining momentum very quickly.
Full post at [google.com...] blog/2005/01/preventing-comment-spam.html
(remove the space between google and blog to get the url)
Need a business opportunity? Start a company who brokers text links in the main body of blogs, the way Hollywood sells product placement in movies.
It's not admitting they can't fix it to recognize the Internet is not stuck in the 20th century. In the future there may be all sorts of ways to interactively insert text into webpages. How the engines view such text needn't and shouldn't be governed by or viewed in the context of archaic principles (meaning principles from more than a couple years ago).
Something new came along. Some new way to handle it came along. The reality is that user-inserted links on a page are different than owner-inserted links. Most folks already knew that. The engine's algorithms were intended to be built around valuing owner inserted links, so this change really makes no statement on the algorithms. It merely makes a statement that the engines were staggeringly slow in implementing an obvious step.
graywolf: i think that's already happening. remember last year (a popular weird-widget-link site) fessed up after an advertiser went public with the info they were selling links (in content) without disclosing that it was paid for. it was pretty big news at the time for some people. (sticky me for the site if you don't know...)
i'm sure others are doing it
|GoogleGuy: How on earth did you guys keep it under wraps for so long with so many bloggers knowing about it? ;) |
So are there any appropriate uses for us regular webmasters, not running blogs or worried about comment spam on our own sites? For example, I have a stats counter link at the bottom of each of my pages. There's no reason for it to be followed by a spider - nothing secret, it's just irrelevant. If I put rel=nofollow on that link am I contributing to cleaning up the databases? Would be glad to do so, if so.
This will be another tool shady webmasters can use after they setup a reciprocal link exchange.
rjohara: from what i can tell, it will let the search engines know not to give the link any credit, although you might still get scriptkiddie SEOs spamming your site for backlinks...
i'm seriously thinking about compiling a master list of all the domains from these clowns, but it's so easy for them to get-a-new-domain.com and start all over again.
Blogs are the fastest growing segment of the web. Doing this doesn't solve all the spamming problems now but helps prevent larger problems in the future.
If you trade many reciprocal links, though, this gives your link trading partners a new way to cheat you, and adds a fairly complex additional step to the process of checking whether any given link partner has left your link up in honest fashion.
Of course, I wouldn't expect the search engines to lose sleep over this "unintended" consequence.
This looks really interesting. I think this tag could also be put to good use in forum signatures. I hope someone comes up with a mod for forum software soon to insert this tag into links posted on forums. :)
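A mod along those lines might boil down to a one-line rewrite of signature markup. This is a hypothetical sketch only: a real mod would hook into the forum's own HTML pipeline, and this regex handles only simple double-quoted <a href="..."> tags:

```python
import re

# Hypothetical sketch of such a forum mod: rewrite every plain anchor
# tag in a signature so it carries rel="nofollow". Only simple
# <a href="..."> tags are handled here.
def nofollow_signature(html):
    return re.sub(r'<a\s+href="([^"]*)"\s*>',
                  r'<a href="\1" rel="nofollow">', html)

sig = 'Visit <a href="http://example.com/">my site</a>!'
print(nofollow_signature(sig))
# -> Visit <a href="http://example.com/" rel="nofollow">my site</a>!
```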
I hadn't thought until encyclo's post how this will affect other link sources (particularly message boards -- and I mean for legitimate links that people are sharing with each other, not just spammy links).
Another effect, of course, will be that publishers (i.e., website owners and operators) will have much, much more power in deciding what is and what is not valuable on the web (in terms of search results).
GoogleGuy's link is broken in his post - maybe someone can fix it?
<added>don't forget to add the rel=nofollow tag</added>
Now just to get the attribute added to w3c html/xhtml recommendations. You would think with all 3 engines behind it, they would be able to pull the strings to get it done.
huh? it seems to me it would be easier for webmasters (of various levels) to view source on the page and apple-F for rel=nofollow...
the really bad guys already (imho) use worse methods to try to trick googlebot...
While I understand the bandwagon effect here - desperate blog owners and software providers wanting to stem the tide of blogspam - it seems to me that with very little thought, a multitude of new opportunities present themselves for manipulating SE rankings.
In order to implement this with a sense of achievement and optimism, G and the other SE's would have needed already to have thought between three and thirteen steps ahead.
Black hat site operators must be drooling at this. Either that or I must be missing something.
I take it back. You need not even be black hat to see the opportunities.
|brotherhood of LAN|
Shocking that MSN refer to the attribute as a "tag" ;)
Blogging software will surely have to generate HTML *exactly* the same way as an SE parses/sees it to ensure that the links are treated the same?
Does this mean that when we do reciprocal linking we are going to have to view the source code to ensure that tag is not added?
This is good stuff, not just for blogspam, but for internal use also. We have several internal links on several sites that do not need to be followed.
If this spec works as intended, it allows webmasters to control precisely which onsite links are followed.
well I see this as a bad day.
Search engines are supposed to be tools to help us find content. I would expect them to change as the web does; I wouldn't expect search engines to want to reshape the way the web is forming to suit their own goals.
For example, with Google a link from one site to another is a vote. True, but only if it comes from the sort of site that Google approves of. The rest of the web doesn't matter?
Also, as shri points out, this tag can be used to control internal linkage, and external linkage too. So now it's very easy to change your reciprocal links into one-way links?
I see a whole new can of worms about to be opened.
|You need not even be black hat to see the opportunities. |