
Google News Archive Forum

    
Penalizing and Massive Links
webdude
msg:159187
6:50 pm on Jun 23, 2004 (gmt 0)

So how long does it really take to get relevant links back to your site? Case in point...

I keep reading posts about how someone added 3000 incoming links to their site in a short period of time, and lo and behold, they think they got sandboxed. 3000 links? Come on!

I have a new site that I made live in March. I spent the last 3 months garnering RELEVANT links back to my site. I searched, contacted, phoned, cajoled, etc. The way I see it, I can maybe contact and email 50 to 100 webmasters per day. Of those, I get maybe 20 responses. After exchanging info, that gives me about 20 links I can possibly get per day.

Now granted, I did not work at this full time. I have 40 other web sites out there. But the sum total of all the RELEVANT links I have garnered is about 200. That is in the last 2 months, and I haven't even scratched the tip of the iceberg yet.
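To put rough numbers on that funnel, here is a back-of-the-envelope sketch in Python. Every figure is just the estimate from this post, not measured data:

```python
# Back-of-the-envelope model of the outreach funnel described above.
# All numbers are the rough estimates from this post, not measured data.
contacts_per_day = 75              # "50 to 100 webmasters per day" -> midpoint
responses_per_day = 20             # "of those, I get maybe 20 responses"
links_per_day = responses_per_day  # assume each exchange yields one link

months = 2
working_days = months * 30
full_time_total = links_per_day * working_days
print(f"Full-time pace: ~{full_time_total} links in {months} months")  # ~1200

actual_total = 200                 # what the post reports, working part time
print(f"Implied effort: ~{actual_total / full_time_total:.0%} of full time")
```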

So what am I saying? If I were the G and suddenly a brand new site popped up with 3000 incoming links, I think I would do much more than sandbox it. I would probably ban it. The only way you could physically get that many links coming in would be from some pretty devious methods, i.e. link farms, non-relevant links, heavy cross-linking within subnet blocks, etc.

So do you really want to allow sites that suddenly have 3000, 4000 or 10,000 brand new links coming in to top the SERPs? I think not. Sites designed and implemented like this exist for only one reason, and that is to top the SERPs. Shame on the webmasters who do this. Haven't you got it yet? The number of links coming in doesn't matter; it's the relevance of those links and the relevance of your site.

Another case in point...

I have 4 sites currently in the #1 spot for many very competitive key phrases. Of those 4 sites, only 1 has more than 200 incoming links, which I busted my you-know-what to get. The other 3 have maybe 200 between them. So why do my sites do so well? You had better start looking at unique and relevant content, relevant incoming links, and something new to offer your users. That is the only way. It takes a lot of hard work and a lot of time. It doesn't happen overnight.

So get to it...

WebDude

 

rivi2k
msg:159188
7:00 pm on Jun 23, 2004 (gmt 0)

I agree totally; you bring up some very good points. I am also a firm believer in getting quality, not quantity, and it has had some great results for many of my sites.

webdude
msg:159189
7:55 pm on Jun 23, 2004 (gmt 0)

Sorry to rant, but I had to get that off my chest :-)

archnizzle
msg:159190
7:58 pm on Jun 23, 2004 (gmt 0)

Two points:

1) I don't think getting 1000s of inbound links all of a sudden is necessarily something that sites should be "banned" for. There are certainly a lot of legitimate, traditional non-SEO marketing partnerships that could produce those numbers of backlinks (where both parties are oblivious to their effect on Google).

2) What about fads, breaking news, or the latest email fwd...? These types of sites probably go from a few links to hundreds (if not thousands) in a short period of time. Should Google ban those sites? Obviously not. Should Google rank them highly -- probably.

It sounds like the best way to avoid penalties is to mimic organic linking structures. Thousands of links from one or two domains overnight might draw a flag from G, but shouldn't get a site banned.

webdude
msg:159191
8:10 pm on Jun 23, 2004 (gmt 0)

1) I don't think getting 1000s of inbound links all of a sudden is necessarily something that sites should be "banned" for. There are certainly a lot of legitimate, traditional non-SEO marketing partnerships that could produce those numbers of backlinks (where both parties are oblivious to their effect on Google).

So give me an example of this. And if the parties are oblivious... who cares?

2) What about fads, breaking news, or the latest email fwd...? These types of sites probably go from a few links to hundreds (if not thousands) in a short period of time. Should Google ban those sites? Obviously not. Should Google rank them highly -- probably.

Okay, I will give you this one. But news sites will never get sandboxed because the latest news story is just a drop in the bucket compared to the total amount of backlinks.

Fads? Maybe. Not sure what you mean by email fwd...

It sounds like the best way to avoid penalties is to mimic organic linking structures. Thousands of links from one or two domains overnight might draw a flag from G, but shouldn't get a site banned.

As I understand it, these sites are not banned, they are sandboxed. Sort of like a temporary penalty. The flag that is being drawn is the sandboxing itself.

mfishy
msg:159192
8:11 pm on Jun 23, 2004 (gmt 0)

It's funny how you see contacting webmasters for links and reciprocal linking for the purpose of ranking as fine, but gathering "lots" of links as somehow bad.

On more than one occasion, a few years ago, we launched freeware software products that gained thousands of links within a few weeks - NONE of them solicited in any way.

As far as "sandboxing" goes, the amount or type of links does not seem to matter. When this first became apparent, I noted a charity site that had around 50 links from very nice sources, including a PR9 with no other links, that got sandboxed.

None of this matters, as you will still rank in time. If you need quick traffic, use pages from existing domains.

webdude
msg:159193
8:15 pm on Jun 23, 2004 (gmt 0)

On more than one occasion, a few years ago, we launched freeware software products that gained thousands of links within a few weeks - NONE of them solicited in any way.

So tell me, how can you get links unsolicited? I would think that the polite thing to do would be to ask for the link. In fact, you would pretty much have to.

trillianjedi
msg:159194
8:19 pm on Jun 23, 2004 (gmt 0)

So tell me, how can you get links unsolicited?

If you have decent content, you'll get lots of unsolicited links, it just takes a little time.

People naturally link to good resources.

TJ

webdude
msg:159195
8:32 pm on Jun 23, 2004 (gmt 0)

I am talking about unsolicited links from a software program. Granted, I understand that if you have good content, the links come unsolicited naturally, but using software to place links on thousands of sites unsolicited is hardly what I would call linking to good content. This is one of the reasons that FFA link pages fail so miserably.

If it was this easy, anyone could do it.

Look at the sites that have PR9 and 10. They did not get there by blasting their link to thousands of sites. It was a natural progression of other sites finding useful content and linking to them.

And a lot of these sites did not garner their backlinks overnight. It was a natural progression that took time. Granted, there are sites that were just always there, the granddaddies of the web world. But I am talking about sites that joined the game later and earned their right to the top.

Teshka
msg:159196
8:34 pm on Jun 23, 2004 (gmt 0)

I keep reading posts about how someone added 3000 incoming links to their site in a short period of time, and lo and behold, they think they got sandboxed. 3000 links? Come on!

I would be suspicious of 3000 links from different domains, but just 3000 links... If I add someone to the recommended sites side menu on my blog, they're guaranteed 1000 links right there since that shows up on every entry. I assume Google recognizes that, though, and diminishes the weight. Or does it?
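To make the distinction concrete, here is a minimal Python sketch that counts unique linking domains rather than raw links. The backlink list is made up purely for illustration:

```python
# Minimal sketch: raw backlink counts vs. unique linking domains.
# The backlink list is hypothetical; a sitewide blogroll link shows up
# as ~1000 raw links but only one linking domain.
from urllib.parse import urlparse
from collections import Counter

backlinks = (
    [f"http://someblog.example/entry/{i}" for i in range(1000)]  # sitewide menu
    + ["http://other-site.example/resources.html"]               # one-off link
)

domains = Counter(urlparse(url).netloc for url in backlinks)
print(f"Raw links: {len(backlinks)}")              # 1001
print(f"Unique linking domains: {len(domains)}")   # 2
```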

pleeker
msg:159197
8:37 pm on Jun 23, 2004 (gmt 0)

using software to place links on thousands of sites unsolicited is hardly what I would call linking to good content

webdude, I think you're misunderstanding what mfishy said.

The statement was NOT that they use software to acquire links. It's that they offer a piece of freeware on their web site and as soon as it's available, many other sites link to the download page so people can get that software.

Is that right, mfishy?

(It's like people linking to Apple's site when the new version of iTunes comes out, for example. Should Apple be penalized for getting thousands of links in a matter of hours or days?)

webdude
msg:159198
8:40 pm on Jun 23, 2004 (gmt 0)

I would be suspicious of 3000 links from different domains, but just 3000 links... If I add someone to the recommended sites side menu on my blog, they're guaranteed 1000 links right there since that shows up on every entry. I assume Google recognizes that, though, and diminishes the weight. Or does it?

I have mixed feelings about blogs. In some cases it seems to work, in others it does not. I have never studied blogs nor am I inclined to try it.

And I think you are right: G recognizes this and diminishes the weight. And if G doesn't now, I think it will in the near future.

trillianjedi
msg:159199
8:41 pm on Jun 23, 2004 (gmt 0)

Webdude, I think you misunderstand mfishy.

He launched a freeware software product which attracted unsolicited links to the site it was launched from.

Actually, WinZip is a great example of this kind of linking.

TJ

<Edit>Blimey, pleeker's fast on the draw tonight ;-)</Edit>

[edited by: trillianjedi at 8:42 pm (utc) on June 23, 2004]

webdude
msg:159200
8:42 pm on Jun 23, 2004 (gmt 0)

webdude, I think you're misunderstanding what mfishy said.

The statement was NOT that they use software to acquire links. It's that they offer a piece of freeware on their web site and as soon as it's available, many other sites link to the download page so people can get that software.

OOPS!

I reread the post and you are right. I apologize. mfishy makes a good point.

TomJ
msg:159201
8:57 pm on Jun 23, 2004 (gmt 0)

I'm a newbie webmaster, and when you have not yet reached the critical mass needed to start driving some real traffic, this sodding sandbox really gives you a kick in the nuts!

So I have a plan..... I'm just gonna make my new sites and cross-link them (not too heavily) and wait.....

If I'm sandboxed... so be it... I'll rank eventually.

I've got different IPs, whois records, page designs, website themes, etc.

I'm determined to get to the top of the SERPs in my fields. Nothing will stop me, only delay me.

;-)

webdude
msg:159202
9:02 pm on Jun 23, 2004 (gmt 0)

Well,

That's one way to look at it.

my3cents
msg:159203
9:06 pm on Jun 23, 2004 (gmt 0)

It has been said by GoogleGuy many times that your website cannot be hurt by an incoming link. If it could be, then you could take your competitors down by getting them thousands of crappy backlinks.

I can clearly see that my problems are with the new Googlebot: in its attempt to spider dynamic content better, it has confused tracking URLs and dynamic URLs generated by other search engines with the real URLs.

I suggest that Google sees these as duplicate content rather than incorrect URLs.
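One way to sidestep that, assuming the duplicates come from tracking parameters, is to canonicalize URLs by stripping those parameters so the variants collapse to one form. A minimal sketch; the parameter names here are assumptions for illustration:

```python
# Minimal sketch of URL canonicalization: strip session/tracking query
# parameters so variant URLs collapse to one canonical form. The list of
# parameter names is an assumption, purely for illustration.
from urllib.parse import urlparse, urlencode, parse_qsl, urlunparse

TRACKING_PARAMS = {"sessionid", "sid", "ref", "source", "partner"}

def canonicalize(url: str) -> str:
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in TRACKING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonicalize("http://example.com/page?id=7&sessionid=abc123&ref=se"))
# -> http://example.com/page?id=7
```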

TomJ
msg:159204
9:49 pm on Jun 23, 2004 (gmt 0)

"It has been said by googleguy many times that your website cannot be hurt by an incoming link. If it could be, then you could take your competitors down by getting them thousands of crappy backlinks. "

yeah but what about networking my sites to the top of the listings.....?

nuevojefe
msg:159205
10:14 pm on Jun 23, 2004 (gmt 0)

I really haven't seen any first-hand evidence of massive links hurting a site. I've simply seen the links not having the intended effect right away. And usually, after some time, they do have the intended effect.

What has been getting a lot of sites nuked (IMO) is obvious PR buying combined with obvious PR selling, i.e. buy links to build PR, then start selling links.

t2dman
msg:159206
10:40 pm on Jun 23, 2004 (gmt 0)

Google has a very good duplicate content filter and can see even duplicate sentences and paragraphs, both within sites and between sites. So if you have an outbound link in the same spot on every page, Google can see that. Do a "keyword site:domain.com" search for that link and you will see the Google message "repeat the search with the omitted results included".

To really do a sitewide link well, you need some sort of random text generator that varies the words in front of and behind the link so that no two instances are the same.
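A toy illustration of that idea in Python. The phrase lists, and the assumption that the filter fingerprints the sentence around the link, are mine, purely for illustration:

```python
# Toy version of the "random text generator" idea: vary the words in front
# of and behind a sitewide link so no two link contexts are identical.
# Assumes (for illustration) that a duplicate filter fingerprints the
# sentence surrounding the link.
import hashlib
import random

LEADS = ["Our friends at", "Check out", "We recommend", "Also worth a visit:"]
TAILS = ["for great widgets.", "if you need widgets.", "- a solid resource."]

def link_context(anchor: str) -> str:
    return f"{random.choice(LEADS)} {anchor} {random.choice(TAILS)}"

contexts = [link_context("Example Widgets") for _ in range(1000)]
fingerprints = {hashlib.md5(c.encode()).hexdigest() for c in contexts}
print(f"{len(contexts)} contexts, {len(fingerprints)} distinct fingerprints")
# With 4 leads x 3 tails there are only 12 possible contexts; a real
# attempt at this would need a far larger pool of variations.
```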

Links in signatures of forums can similarly be duplicate, since they always reside in the same sentence despite appearing on many different pages of the same site. If the link is at the front or back of the sig, or is particularly short, the surrounding words can be different enough that the filter is not applied.

If Google can see dup content that we find out about with the "repeat the search..." filter, then it makes sense that it also takes such content into account with its ranking of pages.

Too many links between similar sites also set off the "same site" penalty, where Google shows only one of those sites.
