Forum Moderators: open
1. cross-linking
symptoms:
length of penalty:
2. buying links (sitewide or single page)
symptoms:
length of penalty:
3. Duplicate content
symptoms:
length of penalty:
4. cloaking
This one I've heard is a lifetime ban, but WhenU tried it and got back in a month later, so I don't know.
***
When do you just ditch the domain name and start fresh? Assume you fixed whatever caused the problem within the same month.
Any ideas?
2- none - they cannot touch someone for selling ads. If they do, it is all because of hand checks.
3- permanent, as long as the other dupe content is out there. Recovery time: 3-6 months.
4- not at all. I could name 100 Flash sites that are totally cloaked. With a little digging, I think you could name several thousand. Google has no problem with quality on-topic cloaking. It is only a problem when the content of the cloaked page is radically different from the content of the browser page.
Does the penalty apply to the original site that was put up or just to the site that duplicated the original content?
We have this problem, but it's a third party who just put up the dupe content ... and they say they are trying to be helpful.
I guess that would be 3 to 6 months as well?
Google has no problem with quality on-topic cloaking. It is only a problem when the content of the cloaked page is radically different from the content of the browser page.
I find this comment intensely interesting, especially since it comes from such an authority poster. I thought cloaking was the easiest shortcut to a ban, even if it was a "quality on-topic cloak."
Not that I agree or disagree with this policy regarding cloaking, I just always thought that (near) zero tolerance was the only solution. Zero tolerance seems to be G's approach to most other black/grey hat techniques.
Interesting indeed.
[edited for grammar]
[edited by: instinct at 10:43 pm (utc) on Nov. 24, 2004]
2. buying links (sitewide or single page)
2- none - they cannot touch someone for selling ads. If they do, it is all because of hand checks.
Today I spoke at length with someone who signed a $6000 contract with a very prominent, respectable e-commerce firm. This firm even sponsors the top search engine watch site on the web. They sell link-only listings that show up in a little box table on the left side of the page, about 12 links per box. If you refresh the page, the links rotate and you see new ones.
Her backlinks went from 400 to 17,000 almost overnight. Almost all of the new ones are totally unrelated to her site, and come from these boxes. She signed up for the program hoping to increase her Google presence. To the contractor's credit, they did not represent this program as anything other than something that will "increase your traffic." She is aware of SEO issues with Google and even reads WebmasterWorld, but thought that the sterling reputation of this company would make it okay. She signed up for a limited-time exposure (one or two months, I think).
At the same time that her backlinks increased, all her keyword positions disappeared from the top 1000 in Google. She's still okay in Yahoo, MSN, and MSN beta. Yes, she was hoping to increase her Google traffic, and we both agreed that this might have worked as recently as a year ago. No, she gets almost no referrals from the ad links, because they're so fleeting and off-topic. Yes, she was aware that link farming can get you penalized. No, she didn't think this was the case with a major company that even sponsors search engine conferences where Google representatives can be expected to attend.
Her PageRank is the same, and the site: command still lists the numbers she is used to seeing. It's just that her keywords are suddenly not useful for bringing up her pages.
Questions:
1) Is Google detecting dramatic increases in backlinks and penalizing that site? Remember, toolbar PageRank and the site: command are unaffected.
2) Should Google tell us if they're doing this?
3) Should the contractor, if they nicely ask Google whether they should put the links in a frame or use JavaScript to avoid the Googlebot, expect to get a straight answer from Google?
"Should Google tell us if they're doing this?" - obviously they haven't said so far (except for Matt mentioning this could theoretically be a red flag). Whether they should has been the basis for arguments in some previous WebmasterWorld threads.
"Should the contractor, if they nicely ask Google whether they should put the links in a frame or use JavaScript to avoid the Googlebot, expect to get a straight answer from Google?" - I don't think the contractor will ask (English is not my native tongue - by "contractor" you mean the other side, not the firm you talked with, right?), and I'm quite certain Google won't answer.
do you know how they found you out?
How much duplicate content did the sites share?
I think G can remove our sites if they think it is duplicate content and/or other people are providing better/more/different information than we are. However, that is not happening. The people that took over when we dropped have the same content (without any variation), and they even have less content and fewer features.
So if G does not believe our site(s) deserve to be ranked, then I think the other sites currently listed should not be ranked either (they would have to go down to the 20th place or so to get any sites with unique concepts/features).
In G's defense - they did allow us to keep one of our sites at the top of the SERPs.
I am not complaining here, just stating some facts and opinions. But I would like to see Google take consistent action and report why they take action (if it is done manually). That would help define guidelines so people know what is acceptable.

It has been mentioned many times that you never know what changes Google will make, and you could drop out of the SERPs overnight. That is EXACTLY why we created all of these sites. We do not need 6 of the top 8 sites, but we made the others and made sure they were different - some were hubs, some were heavy with content, etc. - so if one of our sites dropped, we could get another one up in the SERPs quickly. Luckily they all ranked well, but that turned out to be our downfall. Even so, I am not sure the guidelines ever specifically state that you cannot own more than one site. That is exactly why I would like to have things more clearly defined.

We should have done things more sneakily from the beginning, but we didn't think we were doing anything wrong. So we have learned a lesson: I will have to be "sneakier" with how we do things. Unfortunately that wasn't previously in my nature - if it were, we would have been OK.
The SEs act like they want to work WITH the SEOs. Unfortunately, both sides are being secretive about how they do things.
Attending conferences only to deflect questions does not seem to help - neither does seeing the same presentations (if I see the Ask Jeeves bubble-gum-machine one more time...). I will say that some of the SE reps are starting to make progress, but that only happens in the hallways or pubs.
I personally know that for a fact and have seen many other reports! I'm not sure whether they do it via the algo or have a G employee check the flagged sites first, but a sitewide link will rain on your parade, big time. My link is from a very well-known newspaper, on the left side of the page, and just links to mydomain.com. I get enough traffic that each click works out to less than 1 cent, given the money I paid. Now I don't know what to do. This week I changed the anchor to just say "Mydomain.com"; hopefully it will adjust itself. I'm losing too much $$ with this, and I'm not sure whether there's a timed penalty or what.
[edited by: walkman at 4:59 am (utc) on Nov. 25, 2004]
Actually G seems pretty tolerant to me, and most linkage based penalties seem due to the cumulative effect of more than one linkage "scheme".
For example, G may well accept some crosslinking OR run of site links, but may well baulk at crosslinking AND run of site links.
It basically involves a reply from you-know-who warning about buying or selling links to increase a site's PageRank value, pointing people to the Webmaster Guidelines.
What's interesting for me is the apparent confirmation that buying/selling links can now be seen as a "links scheme", with all that entails.
For a long time there's been a very narrow view taken of what might constitute a "links scheme," and this answer seems to blow it wide open.
There naturally was great suspicion about the genuineness of the reply, but it seems Mods/senior members got to view it and were happy enough with it.
If Google is indeed doing this, then all I have to say is, "Google, please be sure you don't exempt all those sitewide blogrolls!"
Trimming back the blog noise would help the SERPs, in my opinion. On another forum, there's a short thread about apparent blog devaluation already, although so far no one has related it to the sitewide linking situation.
When you think about it, the emergence of blogging software and blogrolls really shifted the nature of backlinks on the web. Now this same tendency is observable on some big e-commerce networks. PHP makes it easy to do, and all the hype about a resurgence in web advertising means that everyone wants to get a piece of the pie.
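A rotating sitewide link box of the kind described earlier in the thread takes only a few lines of server-side code - the poster mentions PHP, but a minimal Python sketch shows the same idea (all link data here is invented for illustration):

```python
import random

def render_link_box(paid_links, per_box=12):
    """Draw a fresh random subset of paid links on each page load,
    so every refresh shows a different box of ~12 links."""
    chosen = random.sample(paid_links, min(per_box, len(paid_links)))
    return "\n".join(f'<a href="{url}">{text}</a>' for url, text in chosen)

# Hypothetical inventory of advertisers:
pool = [(f"http://example{i}.com/", f"Site {i}") for i in range(50)]
box_html = render_link_box(pool)
```

With a pool of a few thousand advertisers rotated across a whole network, this is exactly how one buyer can pick up thousands of backlinks "almost overnight" while any single visitor rarely sees the same link twice.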
From the perspective of Google's original definition of a backlink as a "vote" by one site for another site, these recent tendencies have made any sort of pure PageRank calculation almost meaningless, unless you impose a lot of filtering before you do the calculation.
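The "vote" arithmetic is easy to sketch. Below is a minimal unfiltered power-iteration PageRank in Python (a toy graph, purely illustrative) showing why a burst of bought inlinks inflates a page's score unless links are filtered before the calculation:

```python
def pagerank(graph, damping=0.85, iters=50):
    """Plain power-iteration PageRank: every inlink counts as a 'vote',
    with no filtering of paid, off-topic, or sitewide links."""
    nodes = list(graph)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iters):
        new = {node: (1.0 - damping) / n for node in nodes}
        for node, outlinks in graph.items():
            if outlinks:
                share = damping * rank[node] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:  # dangling page: spread its rank evenly
                for m in nodes:
                    new[m] += damping * rank[node] / n
        rank = new
    return rank

# Three ordinary pages linking in a cycle - all rank equally...
honest = {"a": ["b"], "b": ["c"], "c": ["a"]}
# ...until 100 bought pages all point at "a":
spam = dict(honest)
for i in range(100):
    spam[f"spam{i}"] = ["a"]
```

Running both graphs through the function, "a" jumps well ahead of its peers in the spam version even though nothing about its own content changed - which is why a raw calculation is meaningless without heavy pre-filtering.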
Stands to reason: going from 400 links to 17,000 sticks out like a sore thumb, methinks. Certainly not natural.
>>2) Should Google tell us if they're doing this?
No, they can do what they want. :)
3)..... expect to get a straight answer from Google?
No way, Jose. Misinformation is what keeps people from gaming Google. Granted, not all of them - but a lot of them. I would take it a step further and play with rankings to keep us all completely foxed.
[webmasterworld.com...]
You would have a lot less fun if I were running the show, lol, and I would have shedloads :)
Any ideas, please?
I can't help a whole lot with how much Google will devalue sites due to duplicate content, and I don't know how Google identifies the original creator of the content. I do know that, due to concerns about hijacking, we are now taking action against every site that is displaying our code in an attempt to hijack our listings, as well as sites that have taken text off our pages. We recently filed our sites with the Copyright Office (it only costs $30 for all the sites owned by a company) and will be doing so on a regular basis going forward (probably every 4-6 months). It'll take 6 months to receive confirmation, but we're looking forward to finally hearing back that it's confirmed. We should have been doing that ages ago (what if a site that had stolen some of OUR content had filed a copyright claim six months ago - it would be VERY difficult to disprove). As I was doing my research, I came across Google's page describing how they handle DMCA copyright complaints:
[google.com...]
(I’m assuming Yahoo and MSN search have similar DMCA methods)
It's quick and easy - and then Google removes the offending page (or, what would be really cool, the entire site). Anyway, the drawback to Google's methodology is that they will automatically reinstate the content if the other party "protests" (all they have to do is claim they own the content, and they are put right back in). If the other party protests, you have 14 days to file a suit or everything is left the way it originally was.
I'm not sure we're going to want to hassle with suits - but we are definitely going after the hijack attempts BEFORE one of them succeeds...
Hope that helps,
Chris
There is so much cloaking out there now that it is hard to find a top-1000 site that doesn't have some kind of cloaking going on. You can't tell the difference between a geo cloak, a language cloak, a session cloak, or an SEO cloak. The motto for the last 3 years has been: cloak or die.
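Mechanically, those cloaks really are indistinguishable: each one is the same server-side branch on a request attribute, and only the intent differs. A toy Python sketch (filenames and the naive user-agent check are invented for illustration; real bot detection also verifies crawler IPs, which this does not do):

```python
def choose_page(user_agent, country="US"):
    """Illustrative cloak dispatcher: the mechanism is identical whether
    the switch is bot-vs-browser (SEO cloak) or geography (geo cloak)."""
    if "Googlebot" in user_agent:      # crude spider check, illustration only
        return "crawler.html"          # text-heavy version served to the spider
    if country != "US":
        return "intl.html"             # geo-cloak branch for foreign visitors
    return "flash.html"                # Flash version for regular visitors
```

From the outside, a Flash site serving spiders a text version looks exactly like a site serving translations by region - which is why a blanket zero-tolerance rule is hard to enforce, and why the radical-difference test described above matters more than the mechanism.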
> refers to all the code on a page and is for templated sites with very few differences on the page.
No, templates can be radically different and Google will still ID the duplicated guts very easily. Templates are cake to strip off a site.
> Google seems unable to tell which site came first.
I give G their due when they deserve it, and their dupe-content catcher is the best in the biz.
> Actually G seems pretty tolerant to me
Ditto Glengara. If you have a quality site that focuses primarily on content first, you can push it to the limit.