
Group of SE's Introduce New Tag Support

<a rel="nofollow">


encyclo

12:20 am on Jan 19, 2005 (gmt 0)

WebmasterWorld Senior Member encyclo is a WebmasterWorld Top Contributor of All Time 10+ Year Member



OMG, the details are coming through. The guilty parties in this initiative are:

  • Google (Search and Blogger)
  • Yahoo (Yahoo Search)
  • Microsoft (MSN Search and presumably MSN Spaces)
  • Six Apart (for Movable Type and Livejournal)
  • WordPress

    Not just Google, but all the major players in the search and blogging spheres. However, this is a Google initiative.

    The specific method described is correct, and the rel attribute is designed to mark "untrusted links".
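The markup itself is minimal - placeholder URL and anchor text here, purely for illustration:

    <!-- a normal link: counted for ranking as usual -->
    <a href="http://www.example.com/">widget reviews</a>

    <!-- the same link marked as untrusted: the engines say they
         will not count it when calculating link popularity -->
    <a href="http://www.example.com/" rel="nofollow">widget reviews</a>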

    blog announcements:
    [blogs.msdn.com...]
    [sixapart.com...]
    [ysearchblog.com...]

PhraSEOlogy

    6:30 am on Jan 19, 2005 (gmt 0)

    10+ Year Member



True, but only if it comes from the sort of site that Google approves of.

I don't think it's meant to be that black and white - not all links will or should have the nofollow tag attribute. Obviously, it will be up to individual blog owners to decide which links out are valid and which are just spammy links. Google will index those links denoted to be valid (not spam). So the shaping of the web falls upon the blog owners.

    I think that this (tag) attribute could be abused by people wanting to set up fake reciprocal links etc. But, to me, the whole web is "caveat emptor".

When you set up reciprocal links, it's wise to monitor them automatically. The new tag attribute just means an additional step in that verification process.
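For example, the automated check now has to look at the anchor's rel value, not just whether your URL is present (hypothetical partner page source):

    <!-- passes verification: a normal reciprocal link -->
    <a href="http://www.your-site-example.com/">Your Site</a>

    <!-- fails verification: the link is there, but carries no weight -->
    <a href="http://www.your-site-example.com/" rel="nofollow">Your Site</a>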

    <edit>change tag to tag attribute</edit>

    BroadProspect

    6:59 am on Jan 19, 2005 (gmt 0)

    10+ Year Member



As Shari wisely said, it will allow better control of internal (and external) linkage structure, which IMHO is wrong. It is just like another way of cloaking: you present one thing to the user (a link, i.e. a recommendation of a destination page) and another thing to the SE crawler ("listen crawler, let me tell you, this link is worthless...").

And in regards to BAD use of this, let me share a few steps of forward thinking with you. The SEs will be able to detect blog spammers by seeing how many backlinks with that TAG they have. Sounds OK, doesn't it? WRONG! By doing that, spammers will be able to spam using a legitimate site's URL, make the SE "THINK" it belongs to a spammer, and get it taken off the SERPs.

    /BP

    caveman

    7:08 am on Jan 19, 2005 (gmt 0)

    WebmasterWorld Senior Member caveman is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I don't think it's meant to be that black and white

    Doesn't matter what it's meant to be; only what it will be if this idea is really unleashed.

    such as

    1) Manipulating recip links
As already thought of and noted by posters. This is no small issue in a world that's getting increasingly sneaky, as noted in this thread [webmasterworld.com] started by martinibuster.

2) Artificial link manipulation
In essence, giving an SE a different picture of a web site by closing off pages you care less about for ranking reasons. This coin obviously has two sides, since there are legit reasons. But those were easily handled with noindex instructions. Now we can subdivide individual links and reroute PR in increasingly artificial ways. Just more power to the SEO, which is not really what G wants, and certainly puts the innocent webmasters at huge disadvantages. And don't think about single sites here. Think about the profound effect it would/will have on the entire Web.

What if a news site decides some lesser pages are not worth finding, in favor of bigger pages that draw more ad dollars? Result: poorer quality.

    Of course, gee, now that I think about it...my affiliate links no longer need be java redirected. Hmmm.

    3) Nevermind, I've got work to do. :/
    OMG, this could radically and negatively change the face of the entire web, since the web no longer has a face except thru the eyes of SE's, for the most part.

    With only a few minutes to consider this, seems that PR manipulation just became a sport for the novice.

    Oh, and BTW, does anyone think this will stop the majority of blogspam? Puhleese. Something might. This won't.

    <added>I need to take a deep breath. I'm sure that I'm just not getting it somehow. So many smart SE's couldn't be wrong.</added>

    BroadProspect

    7:23 am on Jan 19, 2005 (gmt 0)

    10+ Year Member



It will change the nature of the web. Consider a news site like CNN which decides that all external links in its articles will carry this tag: this will create a situation where you will never find the websites all the press talked about; the only thing you will find is news articles which contain links to them.
This is not what the world wants...
    /BP

    nuevojefe

    7:24 am on Jan 19, 2005 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member



    So many smart SE's couldn't be wrong.

    You'd think. I guess it must not be wrong for everyone...

    shri

    7:28 am on Jan 19, 2005 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member



    >> this could radically and negatively change the face of the entire web

    Nah. It changes the SEO game a little bit.

How many people really look at tag attributes when they use CMS-driven sites? How many people use HTML authoring packages like FrontPage?

One of the key lessons I learnt from the senior members here is to not look at the web from your own myopic view. We tend to live on the bleeding edge and have a very skewed view.

For those whose world revolves around blog links, Six Apart could roll out a plug-in which allows a visitor to donate a configurable amount via PayPal. If said donation is made, the links for that user would not be marked nofollow. ;)

    shri

    7:32 am on Jan 19, 2005 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member



>> a news site like CNN which decides that all external links in its articles will carry this tag

Engines can still follow the tag and simply remove the link from their ranking algorithms. From the little I know, crawl/index and rank algorithms should use different sets of criteria.

    PhraSEOlogy

    7:32 am on Jan 19, 2005 (gmt 0)

    10+ Year Member



    shri,

    Like PPB - (pay per blog).

    Nice idea!

    caveman

    8:19 am on Jan 19, 2005 (gmt 0)

    WebmasterWorld Senior Member caveman is a WebmasterWorld Top Contributor of All Time 10+ Year Member



    >Nah. It changes the SEO game a little bit.

    Possibly more than a little...but I'll grant that "radically" was perhaps too strong a word. :-)

I'm no tech guy, but I'm wondering if this can be confined to use in just blogs, forums, etc.? I doubt it. If not, this seems to offer a tool for manipulating entire websites on a micro level (one link at a time). Smaller innocents can't / don't have the knowledge / time / patience to do this or learn the tools that can help them do this. It puts another significant tool in the hands of the SEO at the expense of the little guy.

As one recent example, Florida was a blow to the little guy, albeit from an entirely different angle. This is quite possibly another, if it's real and can be manipulated as easily as it appears at first glance. The little guy just doesn't know it yet. Florida changed the accessible web. This could too. Little guys get pushed further and further and further down in the SERP's. Anyone seen a trend towards smaller sites doing better in the SERP's lately? Ummm, not.

This new tag seems to have the potential to be good for larger companies/sites with resources...and bad for the little guy. If it comes to pass, that's a substantial shift, and not a good one IMHO. OTOH, nothing new as far as the trend goes. :-/

    I can already see the day when G will need to decide whether or not to punish non-blog sites that use this tag for PR manipulation. Good grief.

I'm not ready to make *any* changes just yet. We'll wait and see just exactly how this is implemented, and how the SE's bend its use contextually, if at all. But it strikes me that this has the possibility of forcing us - for the first time since Florida - to look hard at the way our sites and links are structured...or lose the advantage to those who do it while we sit idly by.

    BigDave

    8:40 am on Jan 19, 2005 (gmt 0)

    WebmasterWorld Senior Member bigdave is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Does this mean that when we do reciprocal linking we are going to have to view the source code to ensure that tag is not added?

Uh, you mean that you don't currently check the source of the Google cache of your page?

I don't do link exchanges, but if I did, I would want to check what Google sees, not what looks like a link to me. This new attribute has nothing to do with that; I've been cloaking or using JavaScript to hide links from the search engines for years.

    sem4u

    8:44 am on Jan 19, 2005 (gmt 0)

    WebmasterWorld Senior Member sem4u is a WebmasterWorld Top Contributor of All Time 10+ Year Member



    Good news for bloggers and I am pleased for them. Will it clean up blog spam though? Only time will tell I guess.

    The new tag is likely to be abused by some webmasters on their link exchange pages. Another thing to look out for then...

    Nikke

    9:01 am on Jan 19, 2005 (gmt 0)

    10+ Year Member



This will be another tool shady webmasters can use after they set up a reciprocal link exchange.

As if they aren't doing it already, only using other techniques. Maybe, just maybe, this will put real links - put up out of interest and the will to inform others - back on the web.

    iThink

    9:05 am on Jan 19, 2005 (gmt 0)

    10+ Year Member



I can't understand the need for a new tag. Why couldn't they have modified the popular blog scripts like MT, WordPress etc. to convert all regular links in blog comments to JavaScript links?

Also, this new tag will help only those blogs where the blog is updated by its owner. There are a huge number of blogs, message boards and guest books out there that are abandoned. Their owners have simply not updated those sites in many months or even years. Such blogs are the target of comment spammers and they will continue to remain so for a long time to come.

    GerBot

    9:05 am on Jan 19, 2005 (gmt 0)

    10+ Year Member



I don't know about anyone else, but I see this as a backward step.
If the search engines had to come up with this solution, it means their algorithms could not handle blog links.

When search engines start making manual adjustments and specific algo adjustments to fix one problem, rather than tackling the issues within the basic premise of their algo, the battle has been lost and the door is open for a new contender.

    stever

    9:16 am on Jan 19, 2005 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member



Surely it wouldn't be beyond the realms of intelligence of the blogs and search engines (especially if they are sitting down together) to record identifying marks for each blog (e.g. "blueline.gif") in order to detect when this is happening for blog comments and when not.

    It's slightly bemusing that this is being seen as an attack on comment spam, when IMO it's clearly a PR exercise for search engines and blog software owners.

    As others have mentioned, it does nothing to solve the problems of large-scale comment abuse - but it does allow the two sides to address the negative PR that they were receiving on this issue from "important" blog users. (And it has allowed some critical bloggers to increase their self-importance to such a stage that their heads might well explode...)

    victor

    9:18 am on Jan 19, 2005 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member



It seems every time I see a thread like this, I go back and look at the original paper that introduced PageRank.

    And, every time, the change is in line with the original description of the purpose.....which was not, ever, simply counting backlinks. Backlinks are a way of arriving at pagerank, but not the exclusive way.

The paper says:

    Academic citation literature has been applied to the web, largely by counting citations or backlinks to a given page. This gives some approximation of a page's importance or quality. PageRank extends this idea by not counting links from all pages equally, and by normalizing by the number of links on a page.

    Academic citation indexes have long discounted self-citation and circular citations.

    The addition of rel="nofollow" simply allows a site owner to state unambiguously if the citation is a recommendation endorsed by the site owner or not.

    Links were never meant to be counted equally (see above quote), so adding distinguishing marks is a smart idea.
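For the record, that normalization is explicit in the PageRank formula from the paper, where d is the damping factor and C(T) is the number of outbound links on page T:

    PR(A) = (1 - d) + d\left(\frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)}\right)

Presumably a nofollow'd link simply drops out of that sum for the page it points to.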

    DGBrown

    9:27 am on Jan 19, 2005 (gmt 0)

    10+ Year Member



    I say: "It's about time! :)"

    It makes me wonder if I should dust off my old guestbook and give it another go. Maybe after seeing if this does anything to help.

Of the major blogs, I did notice Xanga missing from that list (unless I missed something).

    GerBot

    9:45 am on Jan 19, 2005 (gmt 0)

    10+ Year Member



    Links were never meant to be counted equally (see above quote), so adding distinguishing marks is a smart idea.

    But the engines are simply moving the onus onto the blogs/site owners to implement this tag because they could not do it from their remote location.

Surely the purist view is that most sites are not built with Google PageRank in mind.

    landmark

    9:45 am on Jan 19, 2005 (gmt 0)

    10+ Year Member



So what if all webmasters start using this tag for all their external links? Then the web ceases to be a set of interconnected pages and becomes an isolated collection of islands.

    You could ask, why would webmasters start using this tag as the default? The answer is, why wouldn't they? They can use this tag to block PageRank transfer and stop giving credit to other sites.

If use of the tag on regular websites starts to become widespread then it will snowball and there will be hardly any "real" links left. What happens then?

    Hester

    9:46 am on Jan 19, 2005 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member



I can't understand the need for a new tag. Why couldn't they have modified the popular blog scripts like MT, WordPress etc. to convert all regular links in blog comments to JavaScript links?

    Not everyone has JavaScript turned on. A pure HTML approach is always best.

If the search engines had to come up with this solution, it means their algorithms could not handle blog links.

No it doesn't. (Unless you meant that they couldn't filter blog links into good and bad, which I imagine would be very difficult.) It means that they have found a way to improve links. The web is always changing, with new code and software coming along. It doesn't mean the old code didn't work, but there are always ways to improve on it. This is just one improvement.

    It's slightly bemusing that this is being seen as an attack on comment spam, when IMO it's clearly a PR exercise for search engines and blog software owners.

    If you seriously believe that they're doing this just to make the people involved look good, I think you are mistaken. Any change of this kind can only be successful with every major player working together.

    The bottom line is: anything we can do to halt spam should be welcomed.

    victor

    9:59 am on Jan 19, 2005 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member



Then the web ceases to be a set of interconnected pages and becomes an isolated collection of islands

    Not at all.

    That would degrade the value of search engines. But it wouldn't change the web at all. Every single link would still work.

    But the engines are simply moving the onus onto the blogs/site owners to implement this tag because they could not do it from their remote location.

    And it would have been even better if they'd added an endorsement factor:

  • rel="endorse=10" -- I really do endorse this other site
  • rel="endorse=-10" -- I really don't approve of this

Then site owners could make it clear what the strength of a link is -- i.e. what they intended when they made the link.

    The "all links are equally important endorsements" view is clearly so 20th Century in its naivity that it is time for a change.

zeus

    10:29 am on Jan 19, 2005 (gmt 0)

    WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Great news, because this would have become a big problem as time went by.

    fjpapaleo

    10:30 am on Jan 19, 2005 (gmt 0)

    10+ Year Member



Of course, gee, now that I think about it...my affiliate links no longer need be java redirected. Hmmm

    My first thoughts exactly. I think a lot of merchants just woke up with a major headache. Not to mention the endless opportunities this opens up for manipulation.

    I see a lot of busy webmasters over the next few weeks.

    emomilk

    10:34 am on Jan 19, 2005 (gmt 0)

    10+ Year Member



I wonder why they're not demonstrating the new tag attribute on GoogleBlog, instead of the meta-refreshes they've placed on all of their external links.

    whoisgregg

    10:43 am on Jan 19, 2005 (gmt 0)

    WebmasterWorld Senior Member whoisgregg is a WebmasterWorld Top Contributor of All Time 10+ Year Member



This tag attribute can be used by any website -- it's not just about blogs. It will change how knowledgeable webmasters arrange their internal and external links to show (manipulate) the intent of the links as they relate to link popularity.

News flash! Knowledgeable webmasters are already implementing this same feature in a hundred different ways and the search engines see it in our code. They're letting us accomplish the same thing our JavaScript links accomplish, in a standards-based and simpler way.

    If you think this means that "suddenly search engines are deciding how the web works" then you've missed the last five years.

If you think this is a new way for reciprocal linkers to lie to you, then you have no idea how badly you're already being lied to. This isn't shady, it's in the HTML source of the page.

    "But it will be manipulated!" Yes, it will. Once a site starts to manipulate it, then each search engine will likely turn off that attribute's effect on a per site basis. Sites which use the tag for it's intended purpose will have the attribute taken into account. Just like semantic markup, it only matters if you use it.

    uncle_bob

    10:55 am on Jan 19, 2005 (gmt 0)

    10+ Year Member



While I can see the benefit to search engines, I doubt it will stop comment spam (have spam filters, RBL lists etc. stopped email spam?), which makes me wonder what the collateral damage to the blogging community will be.

If the new attribute was only used for unmoderated comments, then that would seem OK, but tarring all comments with the same brush seems to degrade the community-building aspect of blogging. I've always felt blogging is different from ordinary websites, as it encourages trackbacks and commenting. This would seem to be a step backwards for blogging, to solve a problem that is better solved by moderating comments first.

    amznVibe

    12:06 pm on Jan 19, 2005 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member



    Why not also allow this attribute on the body tag to affect the entire page and in the meta headers?

Oh wait, it is ALREADY available in the headers - bloggers are just not using it.
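That is, the page-level form has been there all along in the robots meta tag; the new attribute just pushes the same switch down to the individual link:

    <!-- page-level: asks engines not to follow ANY link on the page -->
    <meta name="robots" content="nofollow">

    <!-- link-level: the new, finer-grained form -->
    <a href="http://www.example.com/" rel="nofollow">example</a>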

I also foresee a new kind of cloaking - humans get regular links, engines get rel="nofollow" added to all anchors.
    Link exchange? No problem, because it won't weigh the page down.

    docrahul

    12:13 pm on Jan 19, 2005 (gmt 0)

    10+ Year Member



    How about using this new tag to rationalise our own site's internal linking?

    Would it not help if we put this tag in our links to pages like disclaimer & privacy policy, etc?
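Something like this, with illustrative paths, so PR isn't spread across boilerplate pages:

    <a href="/privacy-policy.html" rel="nofollow">Privacy Policy</a>
    <a href="/disclaimer.html" rel="nofollow">Disclaimer</a>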

    Brett_Tabke

    12:27 pm on Jan 19, 2005 (gmt 0)

    WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



    > 3 engines

Ya, that is a whole different story than if just Google alone were doing it. I am really surprised G let the story sit out there the way it was without correction - (they had ample opportunity to correct it too).

    > humans get regular links, engines
    > get rel="nofollow" added to all anchors

    Bot content for bots - human content for humans. Been doing that for years.

    > Way to block links.

    Exactly. Now you can link out without having to hide it from the bot.

How about short-changing your recip linking partners with cloaking? Is this the rebirth or death of organic linking?

    the_nerd

    12:32 pm on Jan 19, 2005 (gmt 0)

    WebmasterWorld Senior Member 10+ Year Member



    Exactly. Now you can link out without having to hide it from the bot.

    How about Affiliate Links? Would you trust the SEs and use the new attribute on a plain link instead of some fancy redirect?
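In other words, would you swap the first form for the second (hypothetical affiliate URLs)?

    <!-- today: the affiliate link hidden behind a local redirect -->
    <a href="/go.php?id=123">Buy the widget</a>

    <!-- tomorrow: a plain, openly tagged affiliate link -->
    <a href="http://www.merchant-example.com/?aff=123" rel="nofollow">Buy the widget</a>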
