
Google SEO News and Discussion Forum

    
Link Growth Rate, the Sandbox, & the Flaw
If Google look for excessive growth to detect spam, then...
Chico_Loco · msg:770479 · 1:04 am on Jul 9, 2005 (gmt 0)

As we know, Google uses some kind of a sandbox for new sites. Yahoo & MSN don't seem to do that. There are also reports that Google are looking for abnormal link growth rates in order to detect spam. Sites with excessive inbound link growth seem to get banned.

There are so many sites out there using the SERPs of Google, Yahoo & MSN (the latter two having no sandbox) to get content that inbound links are growing at an unnatural rate for almost everybody. The bigger a site, the quicker these spammy inbound links grow. Before a site even gets out of the sandbox it already has all of these spammy inbound links.

I've spent months trying to figure out why my site got kicked. After taking a look at many other sites that got kicked, it seems they all have the same thing in common: many spammy sites are scraping content from the search engines (and from the websites directly) and providing links to the site. Many of these spammy sites have many pages with the same listing, and they almost always use the same link text, which adds to the problem.

I know Google have said that no other webmaster can hurt your ranking, but after studying this I truly believe these spammy sites are getting legit sites kicked in Google, between the abnormal link growth, duplicate content and identical link text.

 

Chico_Loco · msg:770480 · 7:59 am on Jul 11, 2005 (gmt 0)

So I'm guessing others don't really see this as a problem?

A site I launched a few weeks back already has a lot of these scraper links - and if this abnormal growth penalty is indeed an issue, then I'll probably suffer. Funny thing is, these other pages that link to me rank better for my terms than even I do!

aris1970 · msg:770481 · 8:56 am on Jul 11, 2005 (gmt 0)

A site I launched a few weeks back already has a lot of these scraper links - and if this abnormal growth penalty is indeed an issue, then I'll probably suffer. Funny thing is, these other pages that link to me rank better for my terms than even I do!

The other pages may rank better for a while if your site is indeed in the sandbox. Our sites have thousands of inbound links from scraper pages, but we NEVER had any problems at all. Do your work with your site, get quality backlinks and don't worry about what scrapers are doing :)

Best wishes!

Chico_Loco · msg:770482 · 2:23 am on Jul 12, 2005 (gmt 0)

Do your work with your site, get quality backlinks and don't worry about what scrapers are doing

I always do; it just seems to me that if there's a buzz about excessive link growth penalties, then there should be more buzz about this. I mean, they nearly always use the title tag of your site as the link text, so it looks like manipulation.

Hmm - perhaps changing the title tag every so often will help!

crobb305 · msg:770483 · 2:30 am on Jul 12, 2005 (gmt 0)

Also, stay away from press releases, I guess. If you "announce" your website in any press release, regardless of the value of the content contained within, many, many sites may suddenly link to you.

walkman · msg:770484 · 4:16 am on Jul 12, 2005 (gmt 0)

This is the only reason I haven't launched the RSS feeds yet. I just can't risk it, because I know they'll be displayed on dozens of sites right away, many of them sitewide.

A clarification from GoogleGuy would be great. It could put a lot of theories to rest and our minds at ease.

Personally, I suspect that the algo picks out the sites with abnormal growth, and that they (Google) check them by hand to make sure the links are legit. How else would they know if it's a new legitimate site or just a spammer?
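
To make that concrete, here's the kind of crude flag I'm imagining - purely hypothetical, with made-up thresholds, not anything Google has confirmed:

    # Purely hypothetical "abnormal link growth" flag; the thresholds are invented.
    def flag_for_review(site_age_days, new_links_last_30_days, total_links):
        """True if recent link growth looks odd enough to queue for a human check."""
        if total_links == 0:
            return False
        # Share of the whole link profile that arrived in the last month.
        recent_share = new_links_last_30_days / total_links
        # A brand-new site naturally gets most of its links recently, so it needs
        # a more extreme burst before it's flagged; an established site is flagged sooner.
        threshold = 0.9 if site_age_days < 180 else 0.5
        return recent_share > threshold

    # Example: 3-month-old site, 950 of its 1000 links arrived this month.
    print(flag_for_review(90, 950, 1000))   # True -> queued for manual review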

robotsdobetter · msg:770485 · 4:22 am on Jul 12, 2005 (gmt 0)

Personally, I suspect that the algo picks out the sites with abnormal growth, and that they (Google) check them by hand to make sure the links are legit. How else would they know if it's a new legitimate site or just a spammer?

I can't see Google checking the links; there are just too many websites to look at and not enough time.
walkman · msg:770486 · 4:25 am on Jul 12, 2005 (gmt 0)

>> I can't see Google checking the links; there are just too many websites to look at and not enough time.

That's what I thought... until the news about students testing Google's updates and "ranking" pages came out.

robotsdobetter · msg:770487 · 4:38 am on Jul 12, 2005 (gmt 0)

That's what I thought... until the news about students testing Google's updates and "ranking" pages came out.

Yeah, but that's supposed to be just for testing, and even with that I still can't see them having enough people to do it. They would need to check every little detail; sure, some would be easy to spot, but there are just as many that would be hard to spot.
Chico_Loco · msg:770488 · 7:30 am on Jul 12, 2005 (gmt 0)

This is the only reason I haven't launched the RSS feeds yet. I just can't risk it...

You know, this brings the whole "are search engines perverting web development" topic back into the light.

Funny how things have turned so quickly - two years ago I bet you'd have jumped at putting out those RSS feeds to obtain the link benefits!?

ADDED:
In regard to the manual removals - they have stated many times that they just don't do that and that they prefer to use algorithms to find spammy stuff, so I am assuming it's all automatic - which is exactly the problem. In legit cases even white hats can fall afoul of "algorithmic discrimination".

At any rate, I am worried about these links, and from what I can see there has been some damage done already (this all being based on the theoretical existence of such a "link overgrowth" filter). So, if it can't be stopped dead in its tracks, what might one do to at least limit the damage?

Are there any ways in which search engines could limit what content is scraped from them? I suppose one could argue that if they are using our content then they should at least protect it from being ripped off?

ballygobackwards · msg:770489 · 12:37 pm on Jul 12, 2005 (gmt 0)

I've seen exactly the same behaviour with a number of my sites: I launch them and they get spidered, they briefly appear for a couple of days, and then they get banned as the scraper sites start linking to them en masse.

However, I have seen that after about three months or so the sites do come back. It's as if Google says "OK, you got lots of links quickly, which is abnormal, so you are now banned; however, it's been three months now and you are still around, you've waited long enough, you can come back in".

walkman · msg:770490 · 3:38 pm on Jul 12, 2005 (gmt 0)

>> I bet you'd have jumped at putting out those RSS feeds to obtain the link benefits

True, but now I suppose I'm losing some money by not having them on other people's sites. I wish I could do it with rel=nofollow.
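
If I ever do launch the feeds, I might just rewrite the links in my own item HTML before publishing. A rough sketch - a naive regex that assumes my own simple templated markup with no existing rel attributes, and a placeholder URL:

    import re

    def nofollow_links(html):
        # Naively tag every <a ...> with rel="nofollow"; fine for my own
        # templated feed descriptions, not for arbitrary HTML.
        return re.sub(r'<a\s', '<a rel="nofollow" ', html)

    item = '<p>More at <a href="http://www.example.com/widgets">example.com</a></p>'
    print(nofollow_links(item))
    # <p>More at <a rel="nofollow" href="http://www.example.com/widgets">example.com</a></p>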

Chico_Loco · msg:770491 · 7:51 pm on Jul 12, 2005 (gmt 0)

However, I have seen that after about three months or so the sites do come back. It's as if Google says "OK, you got lots of links quickly, which is abnormal, so you are now banned; however, it's been three months now and you are still around, you've waited long enough, you can come back in".

Anyone else been experiencing comebacks after penalties that may have been related to links or link growth?

As for the rel=nofollow, that might be a solution, however impossible to do right now? Would it be possible to break someone's HTML tag by inserting a quote (") followed by rel=nofollow, leaving their closing quote to close the rel attribute?

Also - do you think it would help in any way if Google were to send an HTTP_REFERER header? We'd know more about which sites are linking in then.
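
If they did send it, even a quick pass over the raw access log would surface the linking hosts. A rough sketch, assuming the standard combined log format (the log path and hostname below are just placeholders):

    from collections import Counter
    from urllib.parse import urlparse

    MY_HOST = "www.example.com"   # placeholder for your own hostname
    LOG_FILE = "access.log"       # placeholder path; combined log format assumed

    referrers = Counter()
    with open(LOG_FILE) as log:
        for line in log:
            parts = line.split('"')
            if len(parts) < 4:
                continue
            referer = parts[3]    # the quoted Referer field
            host = urlparse(referer).netloc
            if host and host != MY_HOST:
                referrers[host] += 1

    for host, hits in referrers.most_common(20):
        print(hits, host)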

ownerrim · msg:770492 · 8:00 pm on Jul 12, 2005 (gmt 0)

Somewhat off-topic, but what's the consensus on buying high-PageRank text links? Worth it, not worth it, risky, not risky... or is PageRank not enough of an issue anymore?

walkman · msg:770493 · 8:08 pm on Jul 12, 2005 (gmt 0)

>> what's the consensus on buying high-PageRank text links

The consensus is that there's no consensus ;). It may work till you get caught. Is it worth risking everything? Depends on the site, I guess. If you have my-brand-i've-been-building-for-8years.com, I wouldn't do it.

ownerrim · msg:770494 · 8:30 pm on Jul 12, 2005 (gmt 0)

"If you have my-brand-i've-been-building-for-8years.com I wouldn't do it."

Generally, my feeling as well. I also don't interlink my sites, except for two sites that have a single link going from site a to site b, not reciprocated.

Still, you see all these auctions for high pr links from newspaper sites. Absolutely valid sites. And who's to say that there's anything wrong with trying to get a site some exposure by simply advertising with newspaper-run-of-site text links. The grey area of concern seems to be whether or not google will see it as simply advertising, or an attempt to influence the serps. I would think, to be on the safe side, they wouldn't penalize someone trying to get exposure for a site this way. The way to counteract the serp-influencing aspect would simply be to attentuate the influence of redundant links coming from a single site. And perhaps they do this, making the whole thing not risky at all, and also not as beneficial as one might hope. I'm sure many in here have done this. What did you get out of it? Anything? A bit smack on the arse from G?
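
Something as simple as log-damping the weight per linking domain would do it. A toy illustration only - pure guesswork about how an engine might score it, with made-up domains:

    import math

    def domain_link_weight(links_from_domain):
        # 1 link -> 1.0, 10 links -> 2.0, 1000 run-of-site links -> 4.0
        return 1.0 + math.log10(links_from_domain)

    inbound = {"bignewspaper.example": 1000,   # run-of-site text ad
               "smallblog.example": 1}
    print(sum(domain_link_weight(n) for n in inbound.values()))   # 5.0, not 1001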

Luxuryhousingtrends · msg:770495 · 9:11 pm on Jul 12, 2005 (gmt 0)

it seems they all have the same thing in common: many spammy sites are scraping content from the search engines (and from the websites directly) and providing links to the site. Many of these spammy sites have many pages with the same listing, and they almost always use the same link text, which adds to the problem.

I care less that they're linking to me (and my blogs get a lot of scraper hits) and more that they get so much more traffic than I do (judging by the number of referrals I get). Why Google sends so much traffic to these sites in the first place mystifies me. You'd think their bot would be smart enough to tell when a site is doing nothing but listing their own search engine results.
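
Even a crude check would catch most of them - say, flagging pages where most of the visible text sits inside links. A toy sketch with an invented threshold, nothing like whatever Google actually does:

    from html.parser import HTMLParser

    class LinkTextRatio(HTMLParser):
        """Measure how much of a page's visible text is anchor text."""
        def __init__(self):
            super().__init__()
            self.in_a = 0
            self.link_chars = 0
            self.total_chars = 0

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.in_a += 1

        def handle_endtag(self, tag):
            if tag == "a" and self.in_a:
                self.in_a -= 1

        def handle_data(self, data):
            text = data.strip()
            self.total_chars += len(text)
            if self.in_a:
                self.link_chars += len(text)

    def looks_like_scraper(html, threshold=0.6):
        parser = LinkTextRatio()
        parser.feed(html)
        return parser.total_chars > 0 and parser.link_chars / parser.total_chars > threshold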

Chico_Loco · msg:770496 · 9:15 pm on Jul 12, 2005 (gmt 0)

You'd think their bot would be smart enough to tell when a site is doing nothing but listing their own search engine results.

Well, I think most scraper sites used to use Google SERPs for content, but Google caught on to that, and now they use a variety of engines, which makes it virtually impossible to track.

chelseaareback · msg:770497 · 9:24 pm on Jul 12, 2005 (gmt 0)

First post for me on this site, though I look regularly, if only because members seem to notice G changes first.

Call me an idiot, but everybody seems obsessed with links.

I have a site with masses of Google traffic and have never, ever thought or worried about who links to me or who I link to.

I just produce pages that visitors will find, and find useful, in the areas that my site majors in.

And getting on with developing my own site seems a much better use of my time...

Or have I missed the point?

robotsdobetter · msg:770498 · 11:34 pm on Jul 12, 2005 (gmt 0)

I think we have only seen the beginning of the scraper site problem, unless Google does something, but I can't see them doing anything soon. I keep seeing scraper sites popping up everywhere and getting great rankings; until the search engines can beat this technique, we are only going to see more people creating them and the problem getting bigger.

Or have I missed the point?

Not all of us have the time to build lots of USEFUL content; it's faster and easier to get backlinks, and most webmasters seem to like the easy way.

ownerrim · msg:770499 · 11:52 pm on Jul 12, 2005 (gmt 0)

"I keep on seeing scraper sites popping up everywhere and getting great ranking, until the search engines can beat this technique"

Really? I see tons of scrapers (linking to one of my sites), but I'd amazed if any one of them could rank for anything but the odd phrase here and there. They have no pagerank and no quality IBL links from sites on other isps.

"Not all of us have the time to build lots of USEFUL content, it's faster and easier to get backlinks and most webmasters seem to like the easy way."

This is why sites that find a "need" and then write the heck out of it are like gold mines. It's weird but true: most webmasters don't seem to care for writing. But with google's emphasis on content tailored to specific page titles, that what people should be doing to ad nauseum.
