| 1:43 am on Jun 8, 2003 (gmt 0)|
Does it need to be more severe? Maybe because you only have one site you might think so. But for webmasters with multiple sites it would be good to have some clear guidelines about what constitutes 'cross-linking' if Google is going to treat it like a capital offence.
BTW, internal-link spam is becoming as much of a problem as cross-linking now with one site I saw having 200,000 dynamically created internal links putting it at number 1 (there was a thread about this here but it was deleted so as not to give any of you naughty webmasters any ideas).
| 2:08 am on Jun 8, 2003 (gmt 0)|
What makes you think I have just one domain?
I also cross link domains, but I try to keep it sensible. When it is no longer useful for the user I think it's time to call it a day.
| 2:21 am on Jun 8, 2003 (gmt 0)|
Yes you have pinpointed the major problem with the cross-linking penalty. Our instructions are to 'try to keep it sensible' and 'useful for the user', both inherently nebulous concepts.
| 12:03 pm on Jun 8, 2003 (gmt 0)|
There is a heavily cross-linked family of sites in my part of the world, who find it necessary to also create the links in white text on a snow.gif background. Gee, I wonder why?
They are still carving up the SERPs many months after having been dobbed in... so my guess is that the cross-linking penalty has not kicked in, nor has the more problematic of the hidden-stuff filters.
On the subject of penalties and their deterrent factor.. a mirror merchant who also dabbled in some crude hidden stuff on all pages recently did 30 days in the sinbin. His get-out-of-jail solution was to do up one squeaky clean new home page and get it spidered, then resume the links to all the original internal pages that were still exactly as they were at the time of penalty.
The original inbound links were still in place so he has more or less resumed exactly where he left off, all for the cost of a new home page.
I'm trying to make up my mind if this is a very stupid operator or an ineffectual review system for releasing sites back into circulation that obviously have no intention of cleaning up their act.
| 12:18 pm on Jun 8, 2003 (gmt 0)|
>> I have a feeling Google is tuning their cross-link spam filter and they haven't quite decided where to set it yet <<
You'd think they would be better off concentrating on bringing the index up to date. If they are focusing on spam traps instead, they really have lost their way.
>> months after having been dobbed... <<
Here we go again.... the number of people who think the way to make progress is to file 'teachers pet' reports is frankly depressing.
| 12:21 pm on Jun 8, 2003 (gmt 0)|
We are seeing the exact same thing in our industry sector. Approx 2 days ago, sites which were heavily cross-linking with their other sites ranked considerably worse than they do now. These sites had only approx 1-5% non-company-related external links.
Right now they are back ranking well in the SERPs.
| 12:31 pm on Jun 8, 2003 (gmt 0)|
|Here we go again.... the number of people who think the way to make progress is to file 'teachers pet' reports is frankly depressing. |
Just keep your sites clean .. no need for depression ;)
| 5:30 am on Jun 8, 2003 (gmt 0)|
If you have 10 + 1 sites, and you place links on those 10 sites pointing only to the "1" site, is that risky? Is this cross-linking?
I know that interlinking all 11 sites, where every site has 10 links to the other 10 sites, is a not-done idea..
| 11:22 am on Jun 8, 2003 (gmt 0)|
In a nutshell, if your link structure is of any use to the visitors, that's fine. But if you are going to link across unrelated sites simply for the purpose of playing around with PR, better not.
| 11:44 am on Jun 8, 2003 (gmt 0)|
But to answer: that's not cross-linking. As the word suggests, the linked sites should be interlinked to be called cross-linked.
| 11:51 pm on Jun 8, 2003 (gmt 0)|
eh - Anyone care to sticky me with an approx idea of how to dynamically create 200,000 internal links?
Purely for research purposes I hasten to add....
| 11:58 pm on Jun 8, 2003 (gmt 0)|
>>sticky me with an approx idea of how to dynamically create 200,000 internal links?
>>Purely for research purposes
| 12:13 am on Jun 9, 2003 (gmt 0)|
>>You'd think they would be better off concentrating on bringing the index up to date. If they are are focusing on spam traps instead, they really have lost their way.
You're talking like there are just 3 guys working there. I doubt everyone in the office could work on the same problem together ;)
>>If you have 10 + 1 sites, and you place links on those 10 sites pointing only to the "1" site, is that risky? Is this crosslinking?
I know that interlinking all 11 sites, where every site has 10 links to the other 10 sites, is a not-done idea..
Istvan - yes, that's what I was talking about when I discussed heavy cross-linking. This is obviously not for the benefit of the user!
I have my personal view of what is ok and what's not. But there are bounds to what's reasonable, and I think if 95% of people say something is spam, then it most likely is. The SEO world is too abstract for Google to set quantitative rules.
| 12:50 am on Jun 9, 2003 (gmt 0)|
SlyOldDog>>Istvan - yes, that's what I was talking about when I discussed heavy cross-linking. This is obviously not for the benefit of the user!
But this is the "hub and spoke" model discussed multiple times here on WebmasterWorld; it has viable uses.
Take a web design company as an example.
Each site they designed has a link back to their corporate site and on their corporate site they have a link back to the sites they designed as examples of their work.
This is of benefit to the user on both counts, is it not?
| 1:02 am on Jun 9, 2003 (gmt 0)|
I think I've just been penalised for exactly that, depthcharge. Loads of links from every page of my clients' sites & I link to them as part of our "portfolio".
A well respected UK seo I met last week recounted a similar story.
| 1:03 am on Jun 9, 2003 (gmt 0)|
This is not a big issue. The issue is when there are, say, 10 domains, and each domain carries links to all other domains from multiple pages within each site. Let's choose an example:
10 Sites, 100 pages each.
Each site links to all the others.
That's 10 sites × 9 external links per page × 100 pages = 9,000 links of questionable benefit to the user.
And that would be a small network. The one that I have an eye on has dozens of domains with hundreds of pages each, and each of those with around a hundred external links per page. You do the maths...
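The arithmetic in the example above can be sketched as a small helper. This is purely illustrative; the function name and data model are my own, not anything from Google:

```python
def total_cross_links(num_sites: int, pages_per_site: int) -> int:
    """Count cross-site links in a fully interlinked network where
    every page of every site links once to each of the other sites."""
    links_per_page = num_sites - 1  # one link to each other site
    return num_sites * pages_per_site * links_per_page

# The example above: 10 sites of 100 pages each -> 9000 cross-links
print(total_cross_links(10, 100))
```

Note how the total grows roughly with the square of the number of sites, which is why even a "small" fully interlinked network produces thousands of links.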
| 3:48 am on Jun 9, 2003 (gmt 0)|
In this instance I would agree, SlyOldDog. There's no way that's beneficial to the user and it is obviously spam.
Question is, how is Google gonna tell automatically, via their algorithm, what is acceptable and what's not? When does a cross-link between sites become spam?
| 7:31 am on Jun 9, 2003 (gmt 0)|
I guess that's why it's taken so long for Google to implement an automatic filter ;)
One way is to target domains that link to the same external pages over and over from within their site. Obviously the example of the web page construction company receiving a link from the bottom of every page would be caught by this filter, but it could be tuned so it only kicks in when the spam site is linking to several sites in this way. In any case, I'm not sure why web page construction companies should have immunity. One link from a site's index page is enough for branding.
I'm not saying the spam site should receive a penalty - just that the links should be ignored.
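The filter idea in the post above could be sketched roughly as follows. The function name, data model, and threshold are all hypothetical assumptions of mine, just to make the concept concrete:

```python
from collections import Counter

def sitewide_link_targets(pages_outlinks, threshold=0.9):
    """Flag external URLs that appear on nearly every page of one site.

    pages_outlinks: a list of sets, one per crawled page, each holding
    the external URLs linked from that page. (Illustrative data model,
    not anything Google has published.)
    """
    counts = Counter(url for page in pages_outlinks for url in page)
    n = len(pages_outlinks)
    return {url for url, c in counts.items() if c >= threshold * n}
```

A designer credit repeated in every footer would show up in the flagged set; per the suggestion above, such links would simply be discounted rather than penalised.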
| 7:40 am on Jun 9, 2003 (gmt 0)|
Some helpful information on cross linking can be found here
Read through to the end.
| 10:15 pm on Jun 9, 2003 (gmt 0)|
A little off-topic, but can anyone explain to me why cross-linking domains is more beneficial than having the same number of links internally in my site?
So far as I remember, the value of a link is proportional to the PageRank of the PAGE it comes from, irrespective of the domain. The one benefit I see of multiple domains is that each domain can get independent incoming links more easily because each domain is on a different topic, but even then, I have several ODP links to my one site, because it covers several topics.
| 12:22 am on Jun 10, 2003 (gmt 0)|
An SEO that has a competitor of mine as a client has a very simple, yet effective, linking strategy that has paid off in a big way in Google SERPs: adding inconspicuous links to other clients' domains (and in some cases the SEO's site) on the bottom of every page of each client's site.
This has collectively added up to several thousand backlinks (measured by ATW) and solid PR7's for the sites that have been clients for a few months. The SEO site itself is now a PR8 with over 50,000 ATW backlinks.
In my mind this is something that has no value to the user, and is only designed to boost SERPs, and therefore should be considered spam. It seems to have evaded any crosslinking filter thus far, in part I think because he's working with a large number of sites and is able to crosslink them very "loosely" -- i.e., not every site links to every other site, though each individual site links to a select set of sites. So some links are one-way, others are reciprocated, but speaking in web-graph terms, there are too many nodes and too few closed cycles to reliably detect the crosslinking with an algorithm.
I'm sure everybody's thinking "this is what the spam report was made for" and believe me, I intend to report this extensively, but I also feel it bears mentioning here as a cautionary tale to webmasters/SEOs and as a notice to Google.
The sad truth is that artificial off-page SEO spam is working and Google's spam filters are not. I'm sure that's part of what this update is about, and I'm confident that this will never be a good long-term strategy thanks to Google's diligence, but it's very frustrating to see it paying off in the meantime. Sometimes I wonder if the real lesson is "if you can't beat 'em, join 'em" and I'm often tempted to buy a bunch of domains and do something similar, but I just can't bring myself to knowingly deceive the user or put so much effort into anything that will later be cast aside by a few lines of spam-detection code. The problem is you'll always have some competitors who have no such morals, and some SEOs just take the cake in that category.
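The "closed cycles" point in the post above can be made concrete with a minimal sketch. The simplest closed cycle is a reciprocated pair of domain-level links; a naive cross-link filter keyed on reciprocation would look for these. Function name and input format are illustrative assumptions, and real engines obviously use far richer graph features:

```python
def reciprocated_pairs(edges):
    """Given directed domain-level links as (src, dst) tuples, return
    the set of domain pairs that link to each other both ways -- the
    length-two closed cycles a naive cross-link filter might count."""
    edge_set = set(edges)
    return {tuple(sorted((a, b))) for (a, b) in edge_set
            if (b, a) in edge_set and a != b}
```

A loosely linked network like the one described, with mostly one-way links, yields few such pairs, which is exactly why it slips past a filter keyed on reciprocation.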
| 1:48 am on Jun 10, 2003 (gmt 0)|
I don't know if I have understood you correctly:
That guy puts links back to the sites he optimizes, or even to his own site (maybe as a part of the terms and conditions of the service he offers to his clients), at the bottom of them?
I'm sorry (I'm new at this), but if that is what he does, I don't know where the problem is. It's his own work... a lot of sites do it, www.terra.com for example.
| 1:59 am on Jun 10, 2003 (gmt 0)|
Yes, being new at something often gives one a clearer view, free of prejudice and jealousy.
So long as the links are visible there is nothing wrong with it.
[edited by: DavidT at 2:02 am (utc) on June 10, 2003]
| 2:01 am on Jun 10, 2003 (gmt 0)|
It usually tends to work at least for a while but IMO it also makes the client sites look funny with a bunch of irrelevant links on the pages.
Spam works, at least for a while.
| 2:02 am on Jun 10, 2003 (gmt 0)|
It seems like some sites get away with it. Others get banned.
| 2:16 am on Jun 10, 2003 (gmt 0)|
>"this is what the spam report was made for" and believe me, I intend to report this extensively
Oh my good God! I'm so glad I'm not the offender. I need my sleep at night.
| 2:21 am on Jun 10, 2003 (gmt 0)|
>>there is nothing wrong with it.
That's just it, there is nothing wrong with it. Jeezus, how did we go from links are good to links equal spam?
| 2:34 am on Jun 10, 2003 (gmt 0)|
|how did we go from links are good to links equal spam? |
when we decided to live by the ubiquitous ambiguous rules of G.
you know... in this case, the rule that says if the link makes sense for the user then you're probably OK.
G's algo is having considerable trouble addressing cross-linking. Just imagine what would happen if G applied this rule: what would the tens of thousands of web designers do with global link footers? How in the world would G's algo know if you placed the link on the third-party site, or the webmaster did by choice? On and on. There are certain things that just can't be policed effectively.
what can be effective is reporting this tactic to G. It is kind of ironic that at the end of the day, G will always need a human to deal with above said *rule* contradictions, the pace of change within the spamming community, and the innocent ignorance of the marketplace it serves relative to these rules.
| 2:48 am on Jun 10, 2003 (gmt 0)|
Doesn't look like spam to me. Looks like good SEO. Correction....all of those sites are PR7? Great SEO!
They should be allowed to link however they wish. You said yourself the cross-linking is not excessive. So the flaw is in Google.(This is the type of problem you occasionally run against when you're dealing with a search engine that values off-site factors more-so than on-site factors.)
Of course, if that person was my competitor, I would report them as well. :-)
| This 77 message thread spans 3 pages |