
Link Development Forum

    
Are Links Discovery Time Sensitive?
jk3210

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 4480783 posted 2:32 am on Aug 1, 2012 (gmt 0)

Example...you publish an article that includes no outbound links, and Google spiders the page. Then two months later you place a few outbound links in the article.

Is there any evidence or opinions as to whether or not those links pull the same weight as if the links were originally in place when Google initially spidered the page?

 

ZydoSEO

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 4480783 posted 3:11 am on Aug 1, 2012 (gmt 0)

One might ask the same question of content. What if you change or update the content on a page two months after it's first spidered? Does that content become less valuable once the engines notice it has changed? I seriously doubt any such devaluing occurs.

And links (whether navigational or contextual) are part of the content of a page. I have seen zero evidence that they get devalued somehow because they were not there when the page was initially crawled.

If you think about what you're asking, adding links to content that has already been crawled is absolutely natural... in fact... it's an absolute necessity.

Think about how you build a new site. As you add new pages you HAVE to go back and link older pages to them somehow in order for the new pages to get crawled (unless you plan to have a site full of orphaned, non-interconnected pages and rely on a sitemap.xml alone to help the engines discover them... but then the "web" would not be a "web". I wouldn't recommend it! LOL)

Additionally, very few pages on the web remain indefinitely as originally published. In general, pages tend to change over time. Even for articles/posts whose body text stays the same, the templates around them (sidebar widget contents, navigation, etc.) change.

If such devaluing existed, I would think pretty much every site on the web would be guilty of it. Very few sites (or pages, for that matter) get published and never change. Wikipedia is an extreme case where this very thing happens all the time, on a massive scale. They don't seem to be hurting from it.
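
To make the crawl-to-crawl comparison concrete, here is a minimal Python sketch of how an engine could diff a page's outbound links between two snapshots. The HTML snippets are invented for illustration; a real crawler would obviously fetch and store them.

# Minimal sketch: diff the outbound links between two crawl snapshots
# of the same page. The HTML below is hypothetical.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.add(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

# Snapshot at first crawl (no outbound links) vs. two months later.
first_crawl = "<p>An article with no outbound links yet.</p>"
recrawl = '<p>Same article, now with <a href="https://example.com/">a link</a>.</p>'

print("links added since first crawl:", extract_links(recrawl) - extract_links(first_crawl))
print("links removed since first crawl:", extract_links(first_crawl) - extract_links(recrawl))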

jk3210

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 4480783 posted 4:43 am on Aug 1, 2012 (gmt 0)

All good points. Thanks.

martinibuster

WebmasterWorld Administrator. martinibuster is a WebmasterWorld Top Contributor of All Time, 10+ Year Member, Top Contributors of the Month



 
Msg#: 4480783 posted 5:28 am on Aug 1, 2012 (gmt 0)

...adding links to content that has already been crawled is absolutely natural...


Under what context?

When an article is published in a WordPress CMS, how often do you think the author changes their mind and returns to add links to the article/post?

When a newsletter is published, how often are links naturally added after the fact?

When a forum post is created, how often are links naturally added to the post?

This is an important question that needs to be answered: In what context would a link naturally be added to an article after it has been published?

martinibuster

WebmasterWorld Administrator. martinibuster is a WebmasterWorld Top Contributor of All Time, 10+ Year Member, Top Contributors of the Month



 
Msg#: 4480783 posted 2:16 pm on Aug 1, 2012 (gmt 0)

To answer my own question, one time links may pop up naturally is when an entire site is converted from non-advertising to advertising-driven (for link building purposes).

But the above scenario is a rare one. I think links added to existing content that has existed without links may look unnatural.

Thoughts?

Marketing Guy

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 4480783 posted 2:33 pm on Aug 1, 2012 (gmt 0)

Agree in principle, but I think there are too many exceptions to the rule for search engines to factor it in.

Popular posts widget that updates regularly = constantly changing links on a page.

Any recent items news feed = constantly changing links.

Job listings being removed / expiring (or events listings) = constantly changing links.

Article, post or whatever being edited after the fact (more info, a change in circumstances, etc. happens regularly) = changing links.

Also, forum posts, blog posts etc are all pieces of content that are naturally extended over time with comments and replies - most of which contain new links (profile URLs, sigs, etc).

How many new *pages* discovered on WebmasterWorld will have new links due to replies the next time they are crawled?

ZydoSEO

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 4480783 posted 2:43 pm on Aug 1, 2012 (gmt 0)

When an article is published on a WP blog, I think authors frequently go back and add links to older articles/posts as they add more and more related content. Perhaps an original post about topic X mentioned topic Y in passing but didn't discuss topic Y in detail. The author might later write another post covering topic Y in more detail, then go back to the original post on topic X and add a link to the new post, so that readers of the old post can get more detail on topic Y if they want it. What is unnatural about that? It's how a web site's content naturally evolves over time.

Posts on forums often get indexed in a matter of minutes. So if I post a question, it gets indexed, and others then respond with a great discussion that includes links to very relevant supporting pages, what is more valuable? Just the original question? Or the subsequent discussion, including the related links within it that support what is being talked about?

What if a forum thread first gets indexed after 2 not-so-great comments... but then many more comments follow with great content and links? Are those 2 comments, and any links contained in them, somehow more valuable to users of search engines than the 100 great comments and associated links that followed, simply because they were there at the random time the engine crawled the content?

The very nature of growing a web site by adding pages over time absolutely demands that you add links to older, previously crawled pages. Sometimes these get added to navigational structures that make up part of the architecture of the site. But oftentimes not all pages on a site can be included in navigation, sub-navigation, etc. The ONLY way to make those pages crawlable is to add contextual links to them on pages previously crawled. Does it seem logical or fair that every page added to a site after the site's initial crawl should have its internal inbound links devalued?

While I agree that there "might" be certain mediums like newsletters, press releases, etc. where it may be less natural to add links later, these are likely corner cases and do not apply to most of the web. And any rule designed for corner cases (i.e., a very small percentage of pages on the web) but applied broadly to an entire index of the web is just asking for trouble.

I could also see certain situations where an engine might want to implement such a rule, say when a particular blog or forum is known for being exploited by spammers to get followed links and the webmaster refuses to do anything about it. But in those cases it seems it would just be easier to devalue ALL outbound links from such a site rather than trying to determine, based on a page's initial crawl time, which links to count and which to devalue.

Such a crawl-time-relative rule would essentially devalue all UGC and links within UGC (blog comments, forum discussions, reviews, etc.) added by users after an initial crawl of the URL. It would promote keeping old, static, out-of-date content and site designs on the web, because changes to the content, contextual links, and/or internal linking structures would certainly hurt rankings. Essentially it would say: don't publish a new site or page/post/article until it is absolutely perfect, and never change it, because changing it later will screw you.
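
To put that hypothetical rule in concrete terms, here is a toy Python sketch of a crawl-time-relative discount. The 0.5 discount factor and the dates are pure assumptions for illustration, not a confirmed engine rule; notice that any link first seen after the initial crawl, which includes every UGC link, gets discounted.

# Toy model of the crawl-time-relative rule being debated: a link that
# appears after the page's first crawl gets a reduced weight. The 0.5
# discount and the timestamps are invented purely for illustration.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Link:
    href: str
    first_seen: datetime  # crawl at which the link first appeared

def link_weight(link: Link, page_first_crawl: datetime,
                late_discount: float = 0.5) -> float:
    """Full weight if the link was present at first crawl, discounted otherwise."""
    return 1.0 if link.first_seen <= page_first_crawl else late_discount

page_first_crawl = datetime(2012, 6, 1)
links = [
    Link("https://example.com/a", datetime(2012, 6, 1)),  # in the original article
    Link("https://example.com/b", datetime(2012, 8, 1)),  # added two months later
]
for link in links:
    print(link.href, "->", link_weight(link, page_first_crawl))  # 1.0, then 0.5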

I don't see any such rule as doing anything to "give users better search results," which should be the goal of any algo rule like this. And since very few pages on the web (I would guess some minuscule fraction of 1%) remain static forever after being initially crawled, such a rule would do far more harm than good to search results.

But hey... I could be wrong... I have seen engines implement stupid stuff before.

CainIV

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 4480783 posted 6:00 am on Aug 16, 2012 (gmt 0)

Studies we have done have shown that links added to existing content, on pages where the length and makeup of the content stayed the same, had less effect on rankings than adding fresh content containing the same anchor text to the page.

We tried to measure and create controls so that we could eliminate any impact from social media and from other links pointing to the content while testing. The results were significantly different.

wolfadeus

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 4480783 posted 8:03 am on Aug 16, 2012 (gmt 0)

CainIV, was the effect stronger in the positive or negative direction? I.e., is it more likely to harm rankings than benefit them?

CainIV

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 4480783 posted 6:12 am on Aug 17, 2012 (gmt 0)

We found that links added to fresh content impacted website rankings positively in a statistically significant way.

Links added to "old" pages, on the other hand, often appeared to impact rankings very little.

That isn't to say that the process isn't query and content type-dependent, along with other factors.

jk3210

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 4480783 posted 3:05 am on Aug 19, 2012 (gmt 0)

To be clear, are you saying that (for example) adding a new paragraph that includes a new link to an existing post/page/body of text showed a greater positive effect on ranking?

Or are you referring to a complete re-write of the text on an existing post/page/body of text?

martinibuster

WebmasterWorld Administrator. martinibuster is a WebmasterWorld Top Contributor of All Time, 10+ Year Member, Top Contributors of the Month



 
Msg#: 4480783 posted 5:43 pm on Aug 19, 2012 (gmt 0)

My question about whether the activity is natural is important. If it's not something that happens naturally, then you have reason to look over your shoulder. Take a look at this important article by Bill Slawski, The Google Rank-Modifying Spammers Patent? [seobythesea.com], about a patent granted to Google.

The patent is related to signals passed to Google that indicate a web page is being altered in order to influence rankings:

A Google patent granted this week describes a few ways in which the search engine might respond when it believes there’s a possibility that such practices might be taking place on a page...

Those practices, referred to in the patent as “rank-modifying spamming techniques,” may involve techniques such as:

• Keyword stuffing,
• Invisible text,
• Tiny text,
• Page redirects,
• Meta tags stuffing, and
• Link-based manipulation.


This is what I was getting at when I posted:

I think links added to existing content that has existed without links may look unnatural.


Now it is confirmed that Google makes a determination of what is statistically within the range of normal. How correct that determination is is beside the point. The point is that Google is comparing what is normal against what is not.

So when you make a big change to your web page, be sure it's a normal change. I try to make my pages good the first time out, then don't change them except to remove dead links, including affiliate links.

If most sites within a certain category type don't change their content, then a site within that category type that does change its content in ways associated with influencing rankings will stand out. This is what the patent is about, and that's what I was referring to. How often does the addition of links to a published web page happen naturally? If not often, then you stand out as trying to spam.

What do I mean by a certain category type? The type of site it is, like a forum, a directory, a WP site, an ecommerce site... For example, in a forum, the addition of a link can be considered to happen naturally as people come along and add to a discussion. As we all know, Google understands when a site is a forum and treats it differently in the SERPs. That's a smoking gun example of how Google understands the context of a site and treats it differently. But what of other contexts? The above patent describes what I was talking about, that Google is measuring what is normal behavior and taking into account what is abnormal.

Does this cover links added to a site as the site expands and grows? Absolutely not. That is normal behavior. Would this cover links added as anchor text to an existing document? I think yes, that would be unnatural.

Getting back to the original post, the answer could depend on how often links get added to existing web pages within a certain category type, across the entire web. As I asked above: in what context would a link naturally be added to an article after it has been published?
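
One way to read "statistically within the range of normal" is as a simple per-category outlier test. Here is a minimal Python sketch under that assumption; the links-added-per-month rates and the z-score threshold are all invented for illustration, since the patent publishes no actual formulas.

# Sketch of "normal vs. abnormal" change: compare a site's link-addition
# rate to the norm for its category type (forum, static article, etc.)
# and flag outliers with a z-score. All numbers here are made up.
from statistics import mean, stdev

def is_abnormal(site_rate: float, category_rates: list[float],
                z_threshold: float = 2.0) -> bool:
    """Flag a site whose link-addition rate is far from its category norm."""
    mu, sigma = mean(category_rates), stdev(category_rates)
    if sigma == 0:
        return site_rate != mu
    return abs(site_rate - mu) / sigma > z_threshold

# Hypothetical links-added-per-month rates for pages in two category types.
forum_rates = [12.0, 15.0, 9.0, 14.0, 11.0]  # forums: links added constantly
article_rates = [0.1, 0.0, 0.2, 0.1, 0.0]    # static articles: links rarely added

print(is_abnormal(13.0, forum_rates))    # False: normal for a forum
print(is_abnormal(13.0, article_rates))  # True: stands out for a static article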
