Google SEO News and Discussion Forum

How to avoid Google's misunderstanding of unpaid links?
artefaqs
msg:4601644
10:30 pm on Aug 13, 2013 (gmt 0)

One of my web sites is a moderately successful local news site. Enough to support a small team of part-time writers and myself.

In addition to the articles written by the staff I occasionally (maybe every six months) publish an article by a good friend. His company has a web site that is in an industry related to the topic of the site in question, so the content is relevant, and I'm happy to help out a friend with a little free publicity for his business.

Necessarily, there will be a keyword-rich link or two in the content of his article that links to his site, amid the other links. I'm happy to have some of the Google juice flow his way if it helps him.

Here's the thing -- How is it possible to signal to Google that these outbound links aren't paid? I could nofollow them, but I don't want to. I'm happy to help his site. No money is changing hands, because I'm just helping out a friend, so I'm within Google's TOS.
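
For reference, the only difference is a single attribute on the link - these URLs are placeholders, not his actual site:

<!-- a normal, followed editorial link -->
<a href="http://example.com/">widget repair specialists</a>

<!-- the same link with rel="nofollow": search engines are asked
     not to pass ranking credit through it -->
<a href="http://example.com/" rel="nofollow">widget repair specialists</a>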

Any thoughts? I'd hate to get some kind of Google penalty for not disclosing paid links when they're not paid.

 

Robert Charlton
msg:4602838
6:41 pm on Aug 17, 2013 (gmt 0)

I could be way off base, but after glancing over that article (will read closely later), as long as his friend isn't doing too much other link building with keyword-rich anchor text, he should be okay.

The issue isn't only what his friend has been doing... it's perhaps also patterns on his site.

He mentioned in his first post (my emphasis added)...
In addition to the articles written by the staff I occasionally (maybe every six months) publish an article by a good friend.

How long have they been friends? ;)

The sincerity of the friendship and the quality of the friend's articles have nothing to do with this. It's got more to do with the intent to manipulate, and with the pattern, which will become more visible over time. Depending on various factors, that could be more noticeable on the source site than on the destination site. The length of time is obviously a factor wrt the size of the footprint.

I haven't yet read the pdf that martinibuster links to, but I'm sure it's valuable. I'd also dig out a copy of Google's Historical Data Patent and study it, keeping in mind that not everything in it has been implemented, and that information technology has evolved.

With regard to this current discussion, the patent's comments about coordinated behavior are well worth studying. Here's one of several threads that will lead you to the patent and to earlier discussions about it....

Google's Historical Data Patent Gets an Update
Sep 18, 2008
http://www.webmasterworld.com/google/3747164.htm [webmasterworld.com]

bsterz
msg:4602865
9:19 pm on Aug 17, 2013 (gmt 0)

Very good points Robert. Thanks for getting me thinking.

1script
msg:4602866
9:42 pm on Aug 17, 2013 (gmt 0)

Abnormal Outbound Linking patterns

Stop the press! The Google FUD campaign is working!

Are you guys serious? ONE link in 6 months is a cause for concern? How much noise do you think will accumulate in those 6 months? Enough to completely drown out any kind of signal Google might have been able to compile. They would have trouble identifying any signal that is not substantiated by an occurrence count of at least (my conjecture) three digits. Otherwise, what kind of ungodly computational power / storage would they need?

<RANT_MODE>

We (myself included) all seem to fall into this trap of assigning sentient powers to inanimate objects like search engines - it's a known paradox in robotics. I hear a lot of speculation about what Google would have done if they really wanted good search results. Problem is: a lot of that talk sounds more like what a human would have done, and in some cases no less than what an all-knowing, omnipresent God would do. Unless Google starts talking to me in Morgan Freeman's voice, I will not believe it actually does what you all say it SHOULD be doing.

A bunch of fallible humans (who don't even communicate all that cohesively with each other in huge companies like Google) have to program all these functions, and then have to figure out how the different parts interact (or counteract - just as likely) with each other, with the existing code, etc. Then MC's team throws a curveball by manually overriding things. So, clearly, Google does not do what's logical (to a human being) - it does what the messy, 15-years-in-the-making software produces. Besides, 90% of what goes into Google is automatically generated garbage. What do you think comes out of the other end, gold?


</RANT_MODE>


Anyway, sorry for the side rant. I just thought that obsessing over one link in 6 months is silly. He is your friend, an actual warm-bodied human being whose friendship you value - go ahead, link to his site ONCE IN 6 MONTHS. Google can go 123k themselves.

Planet13
msg:4602869
9:54 pm on Aug 17, 2013 (gmt 0)

I think a few people have already pointed out that they would not be concerned about one link in 6 months.

The concern is with the series of patterns that the linking site - and the target site - exhibit, which would be analyzed by Google.

Again, the pdf file that martinibuster linked to from his blog post is an excellent example of HOW search engines look at web sites.

martinibuster
msg:4602905
3:35 am on Aug 18, 2013 (gmt 0)

It doesn't sound like he is making much of a pattern that could be detected (IPs, URLs, Auto Content, Poor Content).


The metrics you cited are the OLD way of doing things. That's pretty much done, finished. The algo is years beyond that now.

I agree with you, though. One link every six months probably won't get flagged. But really, is helping a friend manipulate Google's algo worth possibly drawing scrutiny? I've had businesses come to me for help dealing with unnatural links warnings, and I am impressed with how granular Google has gotten at catching manipulation - not just link buying, but manipulation in general.

The focus is on finding manipulation. What the OP is doing is algo manipulation. A violation of guidelines. Always has been. It's just getting caught more often (not absolute, more often).

The OLD SCHOOL way of thinking is that no money changed hands, so it's not link buying and it's all good. The view that reflects reality is that this is an attempt to manipulate the algo.

Good luck.

Wilburforce
msg:4602936
8:03 am on Aug 18, 2013 (gmt 0)

is helping a friend manipulate Google's algo really worth possibly drawing scrutiny?


The problem isn't scrutiny, it is penalty.

Applying negative weighting to off-site content may on the face of it look like a way to discourage spammers, but in practice it is a spammer's charter: by generating artificial and unnatural content I can kill off my competitors. Instead of spamming my own site, I can spam a hundred others with no risk to myself.

What the OP is doing is algo manipulation. A violation of guidelines.


On-page violations, however, have always been risky. Even so, crediting the author with a link to their site shouldn't (almost certainly wouldn't) of itself cause a problem: it is the way anchor text is used that might be an issue, and it probably wouldn't have any effect on either site unless it was part of a pattern (i.e. one of a number of links web-wide pointing to the recipient with the same or similar anchor text).

It looks to me as if a large number of links with the same or similar anchor text can foul Google's algorithm whether they are manipulative or not. I have a couple of pages with wholly organic backlinks that have fallen 40+ pages from page 1 in Google SERPs for Pagesubject.
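
Nobody outside Google knows the actual thresholds, but the kind of concentration that might trip a filter is easy to measure yourself. A minimal Python sketch (the backlink data and the 60% cutoff are invented for illustration):

from collections import Counter

def anchor_concentration(backlinks):
    """Share of backlinks that use the single most common anchor text.
    `backlinks` is a list of (source_url, anchor_text) pairs."""
    anchors = Counter(anchor.strip().lower() for _, anchor in backlinks)
    top_anchor, top_count = anchors.most_common(1)[0]
    return top_anchor, top_count / len(backlinks)

# Toy profile: most links share one keyword-rich money phrase.
backlinks = [
    ("http://site-a.example/", "best widget repair"),
    ("http://site-b.example/", "best widget repair"),
    ("http://site-c.example/", "Best Widget Repair"),
    ("http://site-d.example/", "best widget repair"),
    ("http://site-e.example/", "Bob's Widget Shop"),
    ("http://site-f.example/", "www.widgets.example"),
]

anchor, share = anchor_concentration(backlinks)
print(f"{share:.0%} of backlinks use the anchor {anchor!r}")

# What share is "too high" is pure guesswork - 60% is an invented cutoff.
if share > 0.6:
    print("anchor profile looks unnaturally concentrated")

A wholly organic profile tends to be dominated by brand names and bare URLs rather than one repeated money phrase, which is what a check like this would surface.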

ColourOfSpring
msg:4602946
8:38 am on Aug 18, 2013 (gmt 0)

Instead of spamming my own site, I can spam a hundred others with no risk to myself.


This happens all the time. I've seen so much churn online over the last 18 months. I run a number of niche directories, and I ping tens of thousands of sites to see whether they're still online and should still be listed. I have a general three-strikes-and-you're-out policy: 3 pings over 6 days with no sign of life, and the site is automatically removed from the directory. The number of sites being removed has increased significantly over the last 18 months - perhaps 3 to 4 times what it normally was. That tells me a lot of sites are being hit with penalties AND they were too reliant on Google.

Were ALL of them thin content / building spammy links? I doubt it. We're talking about 18 months after Penguin 1.0, and a lot of people know not to build such links. Yet their sites seemingly attract such links, so who is building them? Why, someone with commonly-available software that costs a few dollars, who needs only a few seconds of their time to blast out thousands of links to any given domain name. Multiply that "someone" by 100,000 and there's your problem - spammy links are free and easy to build, thousands in seconds. If it's easy to do, and there's monetary gain to be made, it will be done.
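
A three-strikes liveness check like that is simple to automate. A rough Python sketch of the idea (the URLs are placeholders, and a real directory would persist the strike counts in a database between runs):

import urllib.request

STRIKES_TO_REMOVE = 3   # "3 pings over 6 days" - one strike per failed check

def is_alive(url, timeout=10):
    """One 'ping': a successful (2xx/3xx) response counts as a sign of life."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        urllib.request.urlopen(req, timeout=timeout)
        return True
    except Exception:
        return False

def check_directory(listings, strikes):
    """One pass over the listings; returns the URLs to delist."""
    delist = []
    for url in listings:
        if is_alive(url):
            strikes[url] = 0                  # any response resets the count
        else:
            strikes[url] = strikes.get(url, 0) + 1
            if strikes[url] >= STRIKES_TO_REMOVE:
                delist.append(url)
    return delist

# Run once every two days; in a real directory `strikes` lives in the DB.
listings = ["http://example.com/", "http://long-gone-site.invalid/"]
strikes = {}
print(check_directory(listings, strikes))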

martinibuster
msg:4602967
1:45 pm on Aug 18, 2013 (gmt 0)

it is the way anchor-text is used that might be an issue


Right. That point needs to be stressed to avoid misunderstanding creeping in.

atlrus
msg:4602982
2:38 pm on Aug 18, 2013 (gmt 0)


The focus is on finding manipulation. What the OP is doing is algo manipulation. A violation of guidelines. Always has been. It's just getting caught more often (not absolute, more often).


Lol, have you seen the search results lately?!? I don't know where you got the info that "it's getting caught more often", but in my niches it's the exact opposite - there is not a single website in the top 20 that has natural links, not one. I was FORCED by Google for the past couple of years to buy links, even though I did not want to (not for moral reasons - it just costs money). Google has simply been unable to catch smart link buyers, so you either keep up or keep out.

And the guy links to a website and you got him running around scared about "patterns" and penalties?!? Is this real life?

What the OP is doing is NOT manipulation, but the way the web actually works - people like a website and link to it. Sure, it's an arbitrary vote, but that's the way it's supposed to be. If you are willing to part with some of your traffic without being paid - that's a true vote, no matter what anchor you use.

Geez, if you read this forum for a while you would think the end of the world is near. Luckily this attitude towards links <snip> is not what actually happens out in the real world, where people are still linking out freely.

[edited by: aakk9999 at 3:28 pm (utc) on Aug 18, 2013]
[edit reason] ToS [/edit]

jmccormac
msg:4603114
12:30 am on Aug 19, 2013 (gmt 0)

Here's a link to the Microsoft PDF paper [webdb2004.cs.columbia.edu] that I summarized. It describes a method for identifying linking patterns, which easily spots attempts to manipulate the search algorithm, something the OP is doing.

It is an old paper but a good one, Martinibuster. If Google is still relying on such methodology, then it is obvious why they are having a lot of problems. The paper predates Made For Adsense sites and PPC parking to a great extent. It looks like these researchers identified some PPC parking content but didn't recognise it.

The main problem that Google seems to be having at the moment (amongst a lot of others) is distinguishing between a junk link (warez/pron/drugs) and a valid link at an index+1 level (i.e. at levels deeper than the index page).

@bsterz
I am earnestly asking: how easily do you think they could identify the OP's pattern? It seems to me that it would have a really small footprint. In my opinion, if he never said "I'm happy to have some of the Google juice flow his way if it helps him," Google would be okay with a dofollow link.


I don't know how Google does it, but I don't have a very high opinion of their abilities in this respect based on the SERPs (they can't tell compromised Wordpress/Joomla sites from clean ones). From a search engine developer's point of view, it is very easy. I wrote a few posts here about Link Velocity (the rate at which a site gains new inbound links) in the past. There is also a Link Velocity rate for outbound links. Normally the graph is spiky over a very long period (a year or two). When a site is hacked and dodgy links are injected, the number of outbound links increases, and it continues to increase and change if the hole that allowed the injection attack is not patched - like the many Wordpress blogs hit by that compromise in March 2013.
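
jmcc doesn't show an implementation, so the Python sketch below is just one reading of the idea: snapshot a site's outbound links on each crawl, treat the count of previously unseen links as the velocity, and flag a sustained climb rather than the normal spike-and-settle. The snapshot data and both thresholds are invented for illustration:

def outbound_velocity(snapshots):
    """`snapshots` is a list of (date, set_of_outbound_urls) in crawl
    order. Velocity per interval = links not in the previous snapshot."""
    return [(d2, len(cur - prev))
            for (_, prev), (d2, cur) in zip(snapshots, snapshots[1:])]

def looks_injected(velocities, window=3, threshold=20):
    """Crude flag: `window` consecutive intervals each adding more than
    `threshold` new outbound links. A healthy site spikes and settles;
    an unpatched hack keeps climbing. Both numbers are guesses."""
    run = 0
    for _, v in velocities:
        run = run + 1 if v > threshold else 0
        if run >= window:
            return True
    return False

# Toy data: normal churn, then a sustained climb after a "hack".
snapshots = [
    ("2013-01", {"a", "b"}),
    ("2013-02", {"a", "b", "c"}),
    ("2013-03", {"a", "b", "c"} | {f"spam{i}" for i in range(30)}),
    ("2013-04", {"a", "b", "c"} | {f"spam{i}" for i in range(80)}),
    ("2013-05", {"a", "b", "c"} | {f"spam{i}" for i in range(150)}),
]

velocities = outbound_velocity(snapshots)
print(velocities)                     # [('2013-02', 1), ('2013-03', 30), ...]
print("flagged:", looks_injected(velocities))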

Now if you take the basic link types (internal, subdomain, inbound, outbound and reciprocal), some patterns will appear. Social media links tend to appear in most SEOed sites' link graphs, so they can be excluded. Sites linking to each other will have either a page-specific link (deeper than index) or an index-level reciprocal link. Periodic links to the same site will show up on the timeline. The tricky part is the anchor text of the links. A precise and unusual combination of words would probably be safer than a category killer or a non-specific, highly generic phrase. Detecting that would require testing the phrase against a dictionary/frequency table - not impossible on a small or even medium scale, and where a site has been flagged for manual review, it is not a difficult problem to solve. But that's just my opinion - Google might do things differently.
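
As a rough Python sketch of the first step, here is one way to bucket a page's links by hostname (the subdomain test is naive - real crawlers use the Public Suffix List - and inbound/reciprocal classification would need the other site's link data too):

from urllib.parse import urlparse

def classify_link(source_page, href):
    """Bucket a link relative to the page it appears on: internal,
    subdomain, or outbound."""
    src = urlparse(source_page).netloc.lower()
    dst = urlparse(href).netloc.lower()
    if not dst or dst == src:
        return "internal"
    # Naive registrable-domain check: compare the last two labels.
    # A real crawler would consult the Public Suffix List here.
    if dst.split(".")[-2:] == src.split(".")[-2:]:
        return "subdomain"
    return "outbound"

page = "http://www.example.com/blog/post"
for href in ("/about",
             "http://www.example.com/contact",
             "http://forum.example.com/t/1",
             "http://friend-site.example/"):
    print(f"{href} -> {classify_link(page, href)}")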

Regards...jmcc
