| 2:16 am on Jun 8, 2007 (gmt 0)|
--- the key to future success in Google ---
why so much determination?
There is an old Russian song-joke that goes something like this:
"We put our stakes up high, and wanted to throw a fight,
And then we changed our minds and started .... everybody"
| 2:49 am on Jun 8, 2007 (gmt 0)|
I've read this entire thread. I find it amazing what is being said.
I have a lot of direct experience with reciprocation, in all kinds of situations. Enough to know how it really works, again and again.
We do it all with relevance and only with other sites that offer to reciprocate. It's all up front, nothing hidden, and only with sites that want to do it. We ignore PR.
And, we do it via directory-to-directory reciprocation. That's the nature of it. That's the established protocol that was around even before Google existed. I've been doing it that long. If you want the link, then reciprocate.
Reciprocation is often the only way to get a link from some very good, well-ranking, relevant sites that are most certainly "authorities" in their field. A lot of the sites that we link with offer no other means of getting a link. They don't have blogs. It's gracious reciprocation alone that gets the link. That's a very small price to pay for some very good links.
Here are some actual facts:
There is virtually no real evidence that legitimate reciprocation has been nullified whatsoever. The new sites that we work with respond to it exactly as sites did several years ago. We've changed virtually nothing about our approach. By and large, sites that begin to reciprocate gradually climb the rankings, then get there, and stay there. For proof, just look at real serps. Don't take my word for it. The evidence is pervasive.
In some very competitive cases, reciprocation alone will not get a site there. Reciprocation is not a magic key. However, we are intimately aware of many, many sites that rely on reciprocation as their only method of building links, and they are dominant in the serps for some very competitive terms. Some for years. Others are relatively new.
People can take advice about this from people who don't do it, don't understand it, imagine all kinds of things about it, and spend a lot of time disparaging it, or they can listen to people who actually do it right, and understand it, and keep doing it because it works.
It's a very simple choice.
[edited by: DomainDrivers at 2:50 am (utc) on June 8, 2007]
| 3:00 am on Jun 8, 2007 (gmt 0)|
From my point of view, I've used that kind of reciprocal link with great success. In the past. But not now. The day of the reciprocal link is long gone, at least for Google. You may ask how I know this - I know this because I used to set a fixed budget per site for reciprocal link building, using just the methods described by DomainDrivers - and so I have as close to an A - B test as you're likely to get, with repeats.
It's no surprise that Google now see fit to underline the death of the reciprocal link. Discounting those links wasn't really a controversial move, and Google has made their undesirability clear for so long now that it's hardly controversial that they now penalise sites for extensive reciprocal linking instead of merely discounting the links.
SEO was always about good semantics, authority reference, clearly defined alignment of a page to appropriate key terms and the more general development and desirability of the site. It was so before reciprocal linking boomed, and it is so now. It is safe to regard reciprocal linking as a dirty smudge on the lens or a blip on the radar which is now gone.
Google can now tell me such an amazing array of information about my site and my visitors. Does anyone really believe that they are unable to recognise a reciprocal linking pattern with their eyes shut, whilst deep in sleep?
[edited by: tedster at 3:16 am (utc) on June 8, 2007]
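To illustrate that last point: once you have a crawled link graph, spotting reciprocal pairs is close to trivial. Here is a toy sketch only (the graph and function are invented for illustration; Google's real link analysis is obviously far more involved and operates at a vastly larger scale):

```python
# Toy sketch: given an outbound-link graph from a crawl, reciprocal
# pairs fall out of a simple membership check. Illustrative only.

def reciprocal_pairs(outlinks):
    """outlinks maps each site to the set of sites it links to.
    Returns the set of unordered pairs that link to each other."""
    pairs = set()
    for site, targets in outlinks.items():
        for target in targets:
            # A pair is reciprocal when each site links to the other.
            if site in outlinks.get(target, set()):
                pairs.add(frozenset((site, target)))
    return pairs

graph = {
    "a.com": {"b.com", "c.com"},
    "b.com": {"a.com"},
    "c.com": {"b.com"},
}
print(reciprocal_pairs(graph))  # only the a.com <-> b.com pair
```

The hard part was never detection; it's deciding which reciprocal links are legitimate editorial choices and which are manipulation.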
| 3:06 am on Jun 8, 2007 (gmt 0)|
|It's a very simple choice. |
The funny part is that it works on the average webmaster who wants to be there, whether he/she is working on self-improvement skills or on a fly-by project. The whole "I should" versus "here is why I should not" mentality gets stretched so far, and reading the hype alongside the "here's why not" pronouncements straight from the horse's mouth just leads to confusion.
and then there is TBPR thingy...
| 3:50 am on Jun 8, 2007 (gmt 0)|
Out of curiosity, what percentage of site owners know about the Google Webmaster Guidelines?
And, for those who are aware of them, what percentage of those site owners really understand them?
| 3:55 am on Jun 8, 2007 (gmt 0)|
pageoneresults... the follow on question must be:
"What percentage of webmasters who've acquired the knowledge required to flout the guidelines with the aim of improving ranking are still unaware of those guidelines?"
I believe that for everyone who's been told to use hidden text, reciprocal links, doorway pages, etc. there is someone in their knowledge chain who is aware of the guidelines - whether it is the person who gave the shoddy advice or the webmaster himself who decided to ignore the guidelines.
| 4:05 am on Jun 8, 2007 (gmt 0)|
One situation I have compassion for is the "not-very technical" business owner who has an employee or two who violate the guidelines -- but they don't let the boss know what they're doing. And sometimes the boss might know, but also might not understand the full impact.
I know of one case with a strong PR8 site and a business with over a thousand employees. One IT guy picked up just a little bit of SEO knowledge and did some really poor user-agent cloaking for googlebot. In another case I know of, a good sized company had an employee who was running a paid link scheme to fatten his own wallet. Both these situations saw their search traffic fade away at some point. That was the wake-up call.
The CEO who can't even use his own email needs to be a dying breed today. The reality is that some technical savvy is required if you're doing business on the web, just like some real estate savvy is required to build or maintain a shop on the street.
| 4:17 am on Jun 8, 2007 (gmt 0)|
|If the publication of Page & Brin's academic paper about PageRank resulted in an increase in reciprocal or bought links, that isn't the fault of Page & Brin or of Google; it's the fault of greedy businesspeople and SEOs. |
Ever hear of an Attractive Nuisance [en.wikipedia.org]?
If I put out a bowl of milk every night, I should not be surprised to find lots of cats on my front porch.
| 4:20 am on Jun 8, 2007 (gmt 0)|
|Let's face it. Google writes the rules for the web. Don't spend your money on buying links elsewhere, bring all your pennies to AdWords. Don't like the idea? Well, sorry, you've been warned. Face the consequences: you're gonna be wiped out of big G's serps. You're 0wned. |
And I predict that someday, G will face antitrust challenges related to these 'policies' - you can't be as big as G, enact 'policies' that have the effect of discouraging people from using competing services, and encourage people to use yours instead. Note that I'm not saying that there would be merit in the argument, just that someone's gonna claim 'UNFAIR' loud and clear, in US courts, if G isn't very very careful.
| 4:57 am on Jun 8, 2007 (gmt 0)|
|So, most sites are forced to do something "unnatural" like offering to reciprocate, or purchasing a few ads, or sending out emails, etc. |
Before the internet, if you wanted to start a new business, you had to pay for advertising. There was no free advertising. The implicit expectation that you should be able to start a business online and get free advertising really isn't reasonable.
If you want to do business online, and can't afford AdWords, don't start a business online. There are still too many unreasonable expectations, but there are no more free rides; it's 2007, not 1993.
| 6:07 am on Jun 8, 2007 (gmt 0)|
|No matter what they continue to say, i still see considerable evidence of link exchanges working and working well, i see evidence of buying links working and working well, until these methods become pointless then SEO's are going to continue as before. |
If you assume
1. that G will never detect all link exchanges by algorithm
2. Many people will stop exchanging links out of fear
You might conclude that the relative worth of (undetected) recips should rise, not fall.
| 10:18 am on Jun 8, 2007 (gmt 0)|
The part about exchanged links, I don't believe that one bit. ALL sites at the top in ANY category have a lot of exchanged links, and I also see that those who get more are crawling up the serps. I have not done any for a year now, and I'm slowly, step by step, going down the serps.
| 11:54 am on Jun 8, 2007 (gmt 0)|
Yep, Zeus, noticed this too. You can't rest on your laurels, gotta keep getting links for that site. Hard to do when you have so many sites and work to do, the link building kinda goes by the wayside.
[edited by: Pico_Train at 11:56 am (utc) on June 8, 2007]
| 12:44 pm on Jun 8, 2007 (gmt 0)|
ditto. i see sites with every kind of paid links, link exchanges, networked links doing very well indeed. it appears that links, links, and more links is still the way to go for google.
| 1:07 pm on Jun 8, 2007 (gmt 0)|
"Out of curiosity, what percentage of site owners know about the Google Webmaster Guidelines? "
I would be willing to place money that number is in single digits.
At the same time though, how many people that do not know about the guidelines really have a grasp of proper linking?
That's where the gray comes in. Sure, we might all know a thing or two, but the vast majority of people who own web sites do not. That could be a problem for a lot of people down the road: if the majority of web sites are not maintained by people "in the know", then innocent mistakes could get sites devalued, and the problem could be widespread. I understand working in probability, but when the percentage of people who actually know about the policy is so low, it seems a little foolish to devalue sites that have no idea what they are doing.
I agree that there is a problem with the way links are gamed, but I also think that in the grand scheme of things the majority does not even know about it. Sure, we see it here and we all know about it, but most do not imho.
| 4:16 pm on Jun 8, 2007 (gmt 0)|
|Out of curiosity, what percentage of site owners know about the Google Webmaster Guidelines? |
|And, for those who are aware of them, what percentage of those site owners really understand them? |
| 10:25 pm on Jun 8, 2007 (gmt 0)|
|Forget what they say. look at what they do. |
This is classic “let’s try and scare webmasters into believing that we’re onto them” Google bull.
The web is built on links: reciprocal, one way, bought, rented, free, permanent, temporary, authoritative or otherwise.
Only in their wildest dreams could Google classify all the links on the web and then put a value on them accordingly.
| 12:16 pm on Jun 9, 2007 (gmt 0)|
a very valid point, but at the moment I don't think these reciprocal exchanges are getting hit, or if they are, the penalty isn't zero value to all links, it might just be reduced value.
As mentioned here, you just have to surf for a few hours and virtually all the big players in some very key markets have link exchanges.
If google want to stop link manipulation, they're going to have to come up with a way of ranking sites that doesn't weight links as highly as currently.
I expect traffic analysis thanks to toolbar data, analytics data and so on will be the driving factors going forward.
| 1:56 pm on Jun 9, 2007 (gmt 0)|
|If google want to stop link manipulation, they're going to have to come up with a way of ranking sites that doesn't weight links as highly as currently. |
lol! As soon as they do, one of us will figure out a way to reverse engineer those ranking methods. ;)
| 2:16 pm on Jun 9, 2007 (gmt 0)|
|...one of us will figure out a way to reverse engineer those ranking methods |
That's probably true.
But, reverse engineering will only be profitable for ordinary webmasters (as opposed to competing search engines) as long as Google's algorithms continue to rely on data (like links) that is being used as a proxy for quality.
One of these days, someone will figure out how to directly measure quality.
Once that happens, the incentives change dramatically.
Webmasters will no longer have an incentive to game the system by creating false, or exaggerated, "signals of quality" -- the incentive will instead be to create actual quality (the real deal, not just a false signal).
| 2:33 pm on Jun 9, 2007 (gmt 0)|
i have a feeling that future ranking will involve traffic levels (it may already), and thus those new sites wanting to achieve the traffic to rank will be forced to advertise in certain places..... It'll certainly keep the shareholders happy.
| 4:12 pm on Jun 9, 2007 (gmt 0)|
Traffic may already be a useful "signal of quality" but it certainly isn't adequate.
At most, traffic is a proxy for, or rough indicator of, popularity. Popularity may be correlated with quality in some cases, but not always -- and it can be faked (e.g. advertise heavily, in order to obtain enough traffic to appear to be popular).
| 4:27 pm on Jun 9, 2007 (gmt 0)|
<If google want to stop link manipulation, they're going to have to come up with a way of ranking sites that doesn't weight links as highly as currently.>
I think Google should create a copyrights website that we could submit new material to, like a registry for new articles. Then we could take a minute, before we upload and publicize an article, to register it as an original document with them. They could run a test once a month or whatever, cross-reference against the articles in the database, and sort out who is the original author and who is duplicating copy, so as not to pluto me, the author, as the duplicator. It would also guide them to new fresh content, and they could set it up to auto-crawl a suggested page when someone submits an article.
Is this a good idea or am I missing something?
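The core of the registry idea can be sketched in a few lines: fingerprint each submitted article and credit the earliest registrant as the original author. This is purely hypothetical (there is no such Google service; the function and data names are invented for illustration):

```python
# Hypothetical sketch of the proposed registry: first registrant of a
# fingerprint is treated as the original author; later submitters of
# the same text are flagged as duplicators. Names are made up.
import hashlib

registry = {}  # fingerprint -> (author, submission_order)

def fingerprint(text):
    # Normalize whitespace so trivial reformatting doesn't dodge the check.
    normalized = " ".join(text.split())
    return hashlib.sha256(normalized.encode()).hexdigest()

def register(author, text, order):
    fp = fingerprint(text)
    if fp in registry:
        return registry[fp]  # already registered -- earlier author wins
    registry[fp] = (author, order)
    return registry[fp]

register("site_a", "My original widget article.", 1)
owner = register("site_b", "My  original widget article.", 2)
print(owner)  # ('site_a', 1) -- site_b is flagged as the duplicator
```

Exact-hash matching is of course easy to defeat with light rewording, which is one of the complications the follow-up posts point at.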
| 6:59 pm on Jun 9, 2007 (gmt 0)|
And a bloodsucker who realises site B doesn't know about this, takes advantage of it and submits other people's content as his own and gains 10 million pages a year of original content.
Strike! That was the 1-0 pitch. Throw me another curve, wack it out of the park!
| 11:34 pm on Jun 9, 2007 (gmt 0)|
<And a bloodsucker who realises site B doesn't know about this, takes advantage of it and submits other people's content as his own and gains 10 million pages a year of original content.>
Hehe, yeah, very good point. No doubt there would be some complications to work out, but for that type of thing there could be a registration involved, like they do with Analytics or Webmaster Tools, to prevent certain abuse. For the example you described, they would eventually know who the offending webmaster was when the originator complained, and could remove his sites and possibly any other sites owned by him.
They could also limit it to a few articles a day so the abuse can't get too out of hand, while larger sites with tons of their own articles might be able to submit the whole site in one shot.
| 9:36 pm on Jun 12, 2007 (gmt 0)|
Just wish to mention that Google has added a Paid Link Reporting option within Google Webmaster Tools.
You can find it under Tools
- Download data for all sites
- Report spam in our index
- Submit a reconsideration request
- Report paid links
| 9:47 pm on Jun 12, 2007 (gmt 0)|
Seems they are taking this paid links thing seriously. :) I'd bet they are attacking the organized ones first (TLA, DP coop, etc.). Easier to find algorithmically.
Have to wonder where these paid links penalties will/do fall in line with -30, -950, and complete bannation.
| 12:44 am on Jun 13, 2007 (gmt 0)|
Google said that the "penalty" the seller may face is not being able to pass PageRank. Nothing extreme about their own rankings at all.
| 1:43 am on Jun 13, 2007 (gmt 0)|
True tedster but they don't say it there. Instead, 'selling' is only mentioned once on the form field and they seem to focus on buying:
|Unfortunately, not all websites have users' best interests at heart. Some site owners attempt to "buy PageRank™" in the form of paid links to their sites. Buying links to improve PageRank violates our quality guidelines. |
I agree, probably nothing extreme, but they put this link where it would garner the most webmaster eyes possible and they must know that most of the reports will be on competitors buying. Interesting they didn't put up a link to report recips (yet).
| 3:01 am on Jun 13, 2007 (gmt 0)|
- Tedster -
|Google said that the "penalty" the seller may face is not being able to pass PageRank. Nothing extreme about their own rankings at all. |
What is your source for this info?
You say not passing PR. Do you mean, therefore by default, that the link text will be passed over to the receiving site?
...and if so, i wonder at what strength the link text will go over.
| 5:20 am on Jun 13, 2007 (gmt 0)|
It's more than PR that may not pass -- all kinds of backlink influence may be stopped. Matt Cutts has clarified this several times in recent weeks, and been rather consistent on it since 2005. Here's one reference, with a link to a second:
|Q: I’m worried that someone will buy links to my site and then report that. |
A: We’ve always tried very hard to prevent site A from hurting site B. That’s why these reports aren’t being fed directly into algorithms, and are being used as the starting point rather than being used directly. You might also want to review the policy mentioned in my 2005 post [mattcutts.com] (individual links can be discounted and sellers can lose their ability to pass on PageRank/anchortext/etc., which doesn’t allow site A to hurt site B).
| This 62 message thread spans 3 pages |