| 6:19 am on Nov 24, 2004 (gmt 0)|
1- None as long as it is appropriate and on topic.
2- none - they cannot touch someone for selling ads. If they do, it is all because of hand checks.
3- perm as long as the other dupe content is out there. Recovery time: 3-6 months.
4- not at all. I could name 100 flash sites that are totally cloaked. With a little digging, I think you could name several thousand. Google has no problem with quality on-topic cloaking. It is only a problem when the content of the cloaked page is radically different from the content of the browser page.
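For what it's worth, that "radically different" test can be pictured as a similarity check between the crawler-served copy and the browser-served copy of a page. Here's a toy sketch in Python - nobody outside Google knows the real heuristic, and the 0.3 threshold is entirely made up:

```python
def jaccard(text_a, text_b):
    """Token-level Jaccard overlap between two page texts."""
    a, b = set(text_a.lower().split()), set(text_b.lower().split())
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def looks_like_seo_cloak(crawler_text, browser_text, threshold=0.3):
    """Flag only when the two served versions barely overlap.
    An on-topic cloak (e.g. HTML text mirroring a Flash movie)
    shares most of its vocabulary and passes; a bait-and-switch
    cloak does not. Threshold is an invented example value."""
    return jaccard(crawler_text, browser_text) < threshold

flash_site = "blue widget store prices catalog contact"
html_cloak = "blue widget store catalog prices and contact page"
spam_cloak = "cheap pills casino loans insurance mortgage"
print(looks_like_seo_cloak(flash_site, html_cloak))  # on-topic: not flagged
print(looks_like_seo_cloak(flash_site, spam_cloak))  # off-topic: flagged
```

Under a check like this, a Flash site serving an equivalent HTML version sails through, while a page that shows the browser something unrelated trips the filter.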
| 9:05 am on Nov 24, 2004 (gmt 0)|
[quote]3- perm as long as the other dupe content is out there. Recovery time: 3-6 months.[/quote]
Does the penalty apply to the original site that was put up or just to the site that duplicated the original content?
We have this problem, but it's a third party who just put up the dupe content ... and they say they are trying to be helpful.
| 9:40 am on Nov 24, 2004 (gmt 0)|
A few others to mention in the list are:
3) Blog Comment Spam
| 10:29 am on Nov 24, 2004 (gmt 0)|
I'm with Millie on this one. Google seems unable to tell which site came first.
| 11:12 am on Nov 24, 2004 (gmt 0)|
With regard to duplicate content, how about internal duplicate content - mass-produced pages with a few keywords changed here and there? This seems to have incurred a penalty for many sites... "repeat the search with the omitted results included" will bring up very different results.
I guess that would be 3 to 6 months as well?
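Swapping a few keywords in and out of a template leaves most word sequences intact, which is presumably how these pages get collapsed into the omitted results. A rough illustration of the shingling idea often discussed for near-duplicate detection (the 4-word shingle size is an assumption; Google's actual filter is unpublished):

```python
def shingles(text, k=4):
    """All k-word sequences (shingles) in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def resemblance(text_a, text_b, k=4):
    """Jaccard overlap of the two pages' shingle sets."""
    a, b = shingles(text_a, k), shingles(text_b, k)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Two "mass-produced" pages differing only in one swapped keyword
# still share most of their shingles.
page1 = "cheap hotels in london with great views and easy parking near the station"
page2 = "cheap hotels in paris with great views and easy parking near the station"
print(round(resemblance(page1, page2), 2))
```

Identical pages score 1.0; pages with only a keyword swapped still score far above unrelated pages, so a cutoff somewhere in between catches the whole batch.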
| 3:42 pm on Nov 24, 2004 (gmt 0)|
We got penalized for having multiple sites in the top of the SERPS (6 out of top 8). This was done manually since the sites were not interlinking and all WHOIS info is different.
This was speculation on our part for a while, but was eventually confirmed by someone from google.
| 4:43 pm on Nov 24, 2004 (gmt 0)|
Onebaldguy... Wouldn't that be unfair restraint of trade?
| 5:09 pm on Nov 24, 2004 (gmt 0)|
onebaldguy, do you know how they found you out? What I'm wondering is whether you were reported (which is out of your control), or whether some sort of red flag related to your sites kicked one or more of them onto a list for manual review.
| 8:03 pm on Nov 24, 2004 (gmt 0)|
onebaldguy, you had six of the first 8 serps. How much duplicate content did the sites share?
| 9:54 pm on Nov 24, 2004 (gmt 0)|
onebaldguy .. seems like a whole bunch of people paddle the same canoe as you.
| 10:13 pm on Nov 24, 2004 (gmt 0)|
|Google has no problem with quality ontopic cloaking. It is only when the content of the cloaked page is radically different from the content of the browser page. |
I find this comment intensely interesting, especially since it comes from such an authority poster. I thought cloaking was the easiest short-cut to a ban, even if it was a 'quality ontopic cloak'.
Not that I agree or disagree with this policy regarding cloaking, I just always thought that (near) zero tolerance was the only solution. Zero tolerance seems to be G's approach to most other black/grey hat techniques.
[edited for grammar]
[edited by: instinct at 10:43 pm (utc) on Nov. 24, 2004]
| 10:35 pm on Nov 24, 2004 (gmt 0)|
|2. buying links (sitewide or single page) |
2- none - they can not touch someone for selling ads. If they do, it is all because of hand checks.
Today I spoke at length with someone who signed a $6000 contract with a very prominent, respectable e-commerce firm. This firm even sponsors the top search engine watch site on the web. They sell link-only listings that show up in a little box table on the left side of the page, about 12 links per box. If you refresh the page, the links rotate and you see new ones.
Her backlinks went from 400 to 17,000 almost overnight. Almost all of the new ones are totally unrelated to her site, and come from these boxes. She signed up for the program hoping to increase her Google presence. To the contractor's credit, they did not represent this program as anything other than something that will "increase your traffic." She is aware of SEO issues with Google and even reads WebmasterWorld, but thought that the sterling reputation of this company would make it okay. She signed up for a limited-time exposure (one or two months, I think).
At the same time that her backlinks increased, all her keyword positions disappeared from the top 1000 in Google. She's still okay in Yahoo, MSN, and MSN beta. Yes, she was hoping to increase her Google traffic, and we both agreed that this might have worked as recently as a year ago. No, she gets almost no referrals from the ad links, because they're so fleeting and off-topic. Yes, she was aware that link farming can get you penalized. No, she didn't think this was the case with a major company that even sponsors search engine conferences where Google representatives can be expected to attend.
Her PageRank is the same, and the site: command still lists the numbers she is used to seeing. It's just that her keywords are suddenly not useful for bringing up her pages.
1) Is Google detecting dramatic increases in backlinks and penalizing that site? Remember, toolbar PageRank and the site: command are unaffected.
2) Should Google tell us if they're doing this?
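On question 1, a jump like 400 to 17,000 backlinks in a month would be trivially detectable. Purely as speculation, the check could be as simple as a month-over-month ratio test (the 5x threshold here is invented):

```python
def backlink_spike(history, max_ratio=5.0):
    """Return True if any month-over-month jump in backlink
    counts exceeds max_ratio. `history` is a list of monthly
    backlink counts, oldest first. The ratio is an assumed
    example value, not a known Google parameter."""
    for prev, cur in zip(history, history[1:]):
        if prev > 0 and cur / prev > max_ratio:
            return True
    return False

print(backlink_spike([380, 390, 400, 17000]))  # the case described above
print(backlink_spike([380, 390, 400, 520]))    # steady organic growth
```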
| 10:36 pm on Nov 24, 2004 (gmt 0)|
instinct, I also found the cloaking comments interesting. It seems like a spammers' charter: just sprinkle a few relevant terms in to fool any semantic analysis that might take place and, in the absence of a hand check, Bob's your uncle. Or is there something I'm missing? I too understood that cloaking spelt death.
| 11:00 pm on Nov 24, 2004 (gmt 0)|
Scarecrow, if you call the infamous Google sandbox/lag phenomenon (that some say doesn't exist :)) a penalty then yes, her site has been penalized, since my guess is that the sudden increase in backlinks put her site deep into the "sandbox" faster than she could blink...
| 11:12 pm on Nov 24, 2004 (gmt 0)|
Personally I don't consider the Google Lag (in which I obviously believe :)) to be a penalty, just a side effect of the present algorithm. Neither do I consider the duplicate content filtering being done by Google a penalty.
Scarecrow, to answer your questions:
"Is Google detecting dramatic increases in backlinks and penalizing that site?" - I believe a dramatic increase in backlinks is one way to get hit by the Google Lag effect.
"Should Google tell us if they're doing this?" - obviously they haven't told so far (except for Matt mentioning this could be theoretically a red flag). Whether they should has been a basis for arguments in some previous WebmasterWorld threads.
| 1:06 am on Nov 25, 2004 (gmt 0)|
|do you know how they found you out? |
I am not sure, but I am sure we had plenty of our competitors submitting spam reports. One of them spends over $150,000 a month in Adwords. Not sure if their spam reports get more attention or not. Regardless of who it was, I know people were submitting reports against us.
|How much duplicate content did the sites share? |
We had ZERO duplicate content. We spent a lot of money developing content for the sites and most of the sites are based on different concepts or have different features (although there is obviously some overlap since they are all in the same industry). We are even selling different products (but again with some overlap).
I think G can remove our sites if they think it is duplicate content and/or other people are providing better/more/different information than we are. However, that is not happening. The people that took over when we dropped have the same content (without any variation), and they even have less content and fewer features.
So if G does not believe our site(s) deserve to be ranked, then I think the other sites currently listed should not be ranked either (they would have to go down to the 20th place or so to get any sites with unique concepts/features).
In G's defense - they did allow us to keep one of our sites at the top of the SERPs.
I am not complaining here, just stating some facts and opinions. But I would like to see Google take consistent action and report why they take action (if it is done manually). This would help define guidelines so people know what is acceptable.
It has been mentioned many times that you never know what changes Google will make and you could drop out of the SERPs overnight. That is EXACTLY why we created all of these sites. We do not need 6 of the top 8 sites. But we made the others and made sure they were different - some were hubs, some were heavy with content, etc. So if one of our sites dropped, we could get another one of the sites up in the SERPs quickly. Luckily they all ranked well, but that turned out to be our downfall.
Even so, I am not sure the guidelines ever specifically state you cannot own more than one site. That is exactly why I would like to have things more clearly defined. We should have been sneakier from the beginning, but we didn't think we were doing anything wrong. So we have learned a lesson. I will have to be more "sneaky" with how we do things. Unfortunately that wasn't previously in my nature - if it was, we would have been ok.
The SE's act like they want to work WITH the SEO's. Unfortunately both sides are being secretive with how they do things.
Attending conferences only to deflect questions does not seem to help - neither does seeing the same presentations (if I see the Ask Jeeves bubble gum machine one more time...). I will say that some of the SE reps are starting to make progress, but that only seems to happen in the hallways or pubs.
| 1:24 am on Nov 25, 2004 (gmt 0)|
1. when you say your sites were not interlinking, do you mean there was not a single link between the sites, or that there was no crosslinking?
2. were the sites on the same server? Were they same class C IP?
| 4:42 am on Nov 25, 2004 (gmt 0)|
Sitewide links will hurt your rankings... reducing them almost to zero. Not sure if it happens on 100% of sites, but if you don't believe it, there's a person with a 100K-page site (on another SEO board) willing to try it with your site.
I personally know that for a fact and have seen many other reports! Not sure if they do it via the algo, or have a G employee check the flagged sites first, but a sitewide link will rain on your parade, big time. My link is from a very well-known newspaper, on the left side of the page, and just has my domain.com linked. I get enough traffic that each click works out to less than 1 cent, given the money I paid. Now I don't know what to do. This week I changed the anchor to just say "Mydomain.com"; hopefully it will adjust itself. I'm losing too much $$ with this and I'm not sure if there's a timed penalty or what.
[edited by: walkman at 4:59 am (utc) on Nov. 25, 2004]
| 4:50 am on Nov 25, 2004 (gmt 0)|
|I thought cloaking was the easiest short-cut to a ban |
Even G "cloaks", serving an optimized search page for Pocket PCs. Some highly ranked sites let you choose different skins and modules, and likewise serve different code depending on language, browser, and crawler, and G has no problem with it.
| 10:18 am on Nov 25, 2004 (gmt 0)|
Can anyone say how long the penalty is for using invisible text?
| 10:49 am on Nov 25, 2004 (gmt 0)|
*Zero tolerance seems to be G's approach to..*
Actually G seems pretty tolerant to me, and most linkage based penalties seem due to the cumulative effect of more than one linkage "scheme".
For example, G may well accept some crosslinking OR run of site links, but may well baulk at crosslinking AND run of site links.
| 1:11 pm on Nov 25, 2004 (gmt 0)|
I suspect onebaldguy's problem was the similarity between the backlinks to each of his sites. They may not have been interlinked, but they all shared common backlinks.
| 3:21 pm on Nov 25, 2004 (gmt 0)|
On buying/selling links, I've just read an interesting thread from last June on another forum.
It basically involves a reply from you know who which warned about buying or selling links to increase a site's PageRank value, pointing them to their Webmasters guidelines.
What's interesting for me is the apparent confirmation that buying/selling links can now be seen as a "links scheme", with all that entails.
For a long time there's been a very narrow view taken on what might constitute a "links scheme", and this answer seems to blow it wide open.
There naturally was great suspicion about the genuineness of the reply, but it seems Mods/senior members got to view it and were happy enough with it.
| 6:08 pm on Nov 25, 2004 (gmt 0)|
Where does the threshold lie with respect to dup content? What percentage would be considered too high now?
| 6:16 pm on Nov 25, 2004 (gmt 0)|
I've just been reading about sitewide paid links on other forums, and there does seem to be evidence of devaluation of such linking, and in some cases even drastic devaluation, to the point of "penalty." Most of the posting on this topic started less than two months ago, so Google might be doing something new.
If Google is indeed doing this, then all I have to say is, "Google, please be sure you don't exempt all those sitewide blogrolls!"
Trimming back the blog noise would help the SERPs, in my opinion. On another forum, there's a short thread about apparent blog devaluation already, although so far no one has related it to the sitewide linking situation.
When you think about it, the emergence of blogging software and blogrolls really shifted the nature of backlinks on the web. Now this same tendency is observable on some big e-commerce networks. PHP makes it easy to do, and all the hype about a resurgence in web advertising means that everyone wants to get a piece of the pie.
From the perspective of Google's original definition of a backlink as a "vote" by one site for another site, these recent tendencies have made any sort of pure PageRank calculation almost meaningless, unless you impose a lot of filtering before you do the calculation.
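To make the "vote" metaphor concrete: a plain PageRank iteration weights every link equally, so a thousand blogroll or paid sitewide links swamp a handful of editorial ones unless they are discounted before the calculation. A minimal sketch - the per-link weights and the idea of pre-filtering are my own illustration, not Google's published method:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Weighted power-iteration PageRank sketch.
    links: {page: [(target, weight), ...]}; a weight below 1.0
    discounts a suspect (paid/sitewide) link. Discounted mass is
    simply dropped rather than redistributed - fine for a sketch."""
    pages = set(links)
    for outs in links.values():
        pages.update(t for t, _ in outs)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outs in links.items():
            for target, w in outs:
                new[target] += damping * rank[page] * w / len(outs)
        rank = new
    return rank

# "b" gets an editorial link from "hub" plus a paid blogroll link
# discounted to weight 0.1; "a" gets only the editorial link.
ranks = pagerank({
    "hub": [("a", 1.0), ("b", 1.0)],
    "blogroll": [("b", 0.1)],
})
print(sorted(ranks, key=ranks.get, reverse=True))
```

The point of the sketch: with the paid link discounted, "b" barely outranks "a"; with it at full weight, the paid link dominates - which is why filtering has to happen before the math, not after.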
| 10:07 pm on Nov 25, 2004 (gmt 0)|
>>1) Is Google detecting dramatic increases in backlinks
Stands to reason - going from 100 links to 10,000 sticks out like a sore thumb, methinks. Certainly not natural.
>>2) Should Google tell us if they're doing this?
No, they can do what they want. :)
3)..... expect to get a straight answer from Google?
No way, Jose. Misinformation is what keeps people from gaming Google. Granted, not all of them, but a lot of them. I would take it a step further and play with rankings to keep us all completely foxed.
You would have a lot less fun if I was running the show lol, and I would have shed loads :)
| 10:25 pm on Nov 25, 2004 (gmt 0)|
Hi people, sorry to go back a bit, but on the dupe content thing: I know this refers to all the code on a page and is for templated sites with very few differences on the page. Our site has a guide section with all unique content that started around '97 and has been built up, and we're still building.
To the point: I've had a bit of a purge lately on sites that have totally ripped our site off. Some real bad ones as well, going about 3-4 links deep, 250+ pages, with the only difference in the site being the name. On top of this, I'm now seeing a good number of other sites (some big players) that have taken big chunks of our content as well.
Does anyone have any theories on how much this damages us as a site, with the dupe content filter? We believe there is a new one since Aug 24 and that it's the reason for all the fluctuation - when, incidentally, we dropped like a stone. Admittedly our own fault; we had a lot of bad pages, borderline spammy.
Any ideas, please?
| 5:19 am on Nov 26, 2004 (gmt 0)|
I can't help a whole lot with how much Google will devalue sites due to duplicate content - and I don't know how Google identifies the original creator of the content... I do know that, due to concerns about hijacking, we are now taking action against every site that is displaying our code in an attempt to hijack our listings, as well as sites that have taken text off our pages... We recently filed our sites with the Copyright Office (it only costs $30 for all the sites owned by a company) and will be doing so on a regular basis going forward (probably every 4-6 months). It'll take 6 months to receive confirmation, but we're looking forward to finally hearing that it's official. We should have been doing that ages ago (what if a site that has stolen some of OUR content were to have filed a copyright claim six months ago – it would be VERY difficult to disprove). As I was doing my research I came across Google's page which describes how they handle DMCA copyright complaints:
(I’m assuming Yahoo and MSN search have similar DMCA methods)
It's quick and easy - and then Google removes the offending page (or, what would be really cool, the entire site). Anyway, the drawback to the Google methodology is that they will automatically reinstate the content if the other person 'protests' (all they have to do is claim they own the content and they are put right back in). If the other person protests, you have 14 days to file a suit or everything is left the way it originally was.
I'm not sure we're going to want to hassle with suits - but we are definitely going after the hijack attempts BEFORE one of them succeeds...
Hope that helps,
| 6:20 am on Nov 26, 2004 (gmt 0)|
> I thought cloaking was the easiest short-cut to a ban
There is so much cloaking out there now that it is hard to find a top-1000 site that doesn't have some kind of cloaking going on. You can't tell the difference between a geo cloak, a language cloak, a session cloak, or an SEO cloak. The motto for the last 3 years has been: cloak or die.
>refers to all the code on a page and is
> for templated sites with very few differences on the page.
No, templates can be radically different and Google will still ID the dupe guts very easily. Templates are cake to strip off a site.
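Stripping the template is easy because navigation, headers, and footers repeat on every page of a site, so anything that occurs across most pages can be discarded before the guts are compared. A crude sketch of that idea (the 80% occurrence cutoff is an assumption, not a known value):

```python
from collections import Counter

def strip_template(pages, cutoff=0.8):
    """pages: list of page texts, one per URL on a site.
    Returns each page with boilerplate lines removed, where
    boilerplate = any line appearing on >= cutoff of the pages."""
    counts = Counter()
    for page in pages:
        counts.update(set(page.splitlines()))
    limit = cutoff * len(pages)
    return [
        "\n".join(l for l in page.splitlines() if counts[l] < limit)
        for page in pages
    ]

# Two pages sharing nav and footer; only the middle line is content.
site = [
    "Acme Widgets | Home | About | Contact\nAll about blue widgets.\nCopyright 2004",
    "Acme Widgets | Home | About | Contact\nAll about red widgets.\nCopyright 2004",
]
print(strip_template(site))
```

Once the shared lines are gone, only the unique "guts" remain, and those can be compared across sites no matter how different the surrounding templates look.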
> Google seems unable to tell which site came first.
I give G their due when they deserve it, and their dupe content catcher is the best in the biz.
> Actually G seems pretty tolerant to me
Ditto Glengara. If you have a quality site that focuses primarily on content first, you can push it to the limit.