| 10:28 am on Aug 9, 2005 (gmt 0)|
I think that Google is likely to penalise this kind of content.
| 12:53 pm on Aug 9, 2005 (gmt 0)|
Never copy content from a website that Google has already indexed. It is sure to penalize you, reducing both your PageRank and your position in the SERPs. If you want good rankings in a short span of time, write unique content, which Google loves.
Once you have put in the unique content, watch the magic of your Google rankings. And yes, I am getting extremely good rankings for many competitive keywords just because of good, unique content.
| 1:46 pm on Aug 9, 2005 (gmt 0)|
What if it is only a paragraph or two of editorial content? Do you think Google would filter/penalise you for that?
I use copyscape.com now to check for duplicate content on the web, but are you saying that Copyscape should find zero pages that are a likeness to your site?
| 2:00 pm on Aug 9, 2005 (gmt 0)|
|are you saying that copyscape should find zero pages that are a likeness to your site? |
I don't think anyone knows for sure how Google checks for duplicate content or what percentage is acceptable. But if you find nothing using that tool you are pretty safe.
| 9:54 pm on Aug 9, 2005 (gmt 0)|
What is anyone's opinion about taking the content that an affiliate has provided and changing the words, the structure of the sentences, or making other minor changes just so it is not exactly the same as the content provided? I know it is not defined anywhere, but an educated opinion would be appreciated: is doing what I mentioned good enough to avoid the Google duplication penalty?
| 7:31 am on Aug 10, 2005 (gmt 0)|
One instance could be:
The copy everyone uses:
Green widgets are preferred over red widgets.
Your rewrite:
Green widgets have stood the test of time and endurance, making green widgets the widget of choice for many people.
In other words, make it your own and it is not duplicated in any manner but spirit :-)
| 7:36 am on Aug 10, 2005 (gmt 0)|
|Never try to infringe the content of the website which google has indexed. |
If the merchant has made the content available to their affiliates - and they DO provide the text - then it isn't infringement, it's at the merchant's invitation.
|Its sure it will penalize and also it will reduce your Page rank and also the position in SERPs. If you want to get the good ranking in the short span try to have unique content which google loves. |
Position and PR aren't the same but yes, make the content unique by rewriting it.
| 9:21 am on Aug 10, 2005 (gmt 0)|
Thanks for the tip on that Copyscape program, although I'm beginning to wish I had not tried it.
I found over fifty sites copying my content.
All scraper trash, loaded with adsense.
So far none of that trash has shown up on my SERPs radar.
It is scary though, is it just a matter of time before I am beaten by my own content on another site?
Should I be worried?
| 11:08 am on Aug 10, 2005 (gmt 0)|
|All scraper trash, loaded with adsense. |
Google is its own worst enemy. Their extreme greed (AdSense), especially since going public, has spawned countless millions of **** pages of nonsense that they are ranking just to get the AdSense click. They can't sustain these junk results. Every page I publish is original content - only to be copied a week later on a scraper/AdSense site.
| 12:15 pm on Aug 10, 2005 (gmt 0)|
I am considering putting Adsense on one of my sites.
Will it help or hinder the site in terms of ranking on Google and other search engines?
| 12:18 pm on Aug 10, 2005 (gmt 0)|
What's interesting, though, is that if I check my main competitors, who use the same casino content as we do (given out by the casinos themselves),
and run it through Copyscape, there are over 10 matches found every time.
But they are still #1 for competitive keywords.
So I'm still undecided as to whether this makes a difference. I'm leaning towards the idea that a Google filter is only applied if the page is 80-100% similar to a page on YOUR website.
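Nobody outside Google knows how they score similarity or where any threshold sits, but you can get a rough feel for "how similar is my page to theirs" yourself. A minimal sketch using Python's standard-library diff matcher (the character-level ratio is purely my own stand-in, not Google's method, and the sample strings are made up):

```python
# Rough page-similarity score between two blocks of text.
# SequenceMatcher.ratio() returns 0.0 (nothing shared) to 1.0 (identical).
from difflib import SequenceMatcher

def similarity(page_a: str, page_b: str) -> float:
    """Return a 0.0-1.0 similarity ratio between two page texts."""
    return SequenceMatcher(None, page_a, page_b).ratio()

original = "Green widgets are preferred over red widgets."
rewrite = "Green widgets have stood the test of time and endurance."

score = similarity(original, rewrite)
print(round(score, 2))
```

If a rewrite of stock affiliate copy still scores very high against the original, it is probably not "unique" in any sense that matters.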
| 12:21 pm on Aug 10, 2005 (gmt 0)|
It does seem to suck if you are adding relevant content to your site for your customers, albeit content given out by the casinos etc., only for Google to slam a duplicate penalty on your site.
| 2:15 pm on Aug 10, 2005 (gmt 0)|
What sucks (for site owners, Google, and users) is when your original write-up is hoovered up, then vomited into a template with AdSense skyscrapers plastered all over it...
...then you see hundreds of backlinks from other sites in the same neighbourhood helping this crap rise to the top.
That is what sucks.
What does not suck (at least as far as I am concerned so far) is that Google seems to be dealing with these sites as well as can be expected at the moment, so at least
in my area, travel (specific country), they are not a problem yet.
| 2:34 pm on Aug 10, 2005 (gmt 0)|
What about news content from sites like Reuters or AP? Thousands of sites use content from these sources, like myway.com, military.com, etc., but those sites seem to be stable.
| 2:40 pm on Aug 10, 2005 (gmt 0)|
I'd avoid stock content like this, or if you use it, make sure the pages contain a good amount of original content. I doubt Google will penalize you per se, but it's unlikely the pages will rank well. As far as AdSense goes, it doesn't help or hinder your SERPs; however, pay close attention to the content of the ads. If the ads are significantly off-topic for a long time, you'll need to re-optimize the on-page content.
| 2:42 pm on Aug 10, 2005 (gmt 0)|
Thanks folks, I've taken your suggestions and am currently rewriting my own content based on the content that has been served up by affiliate programs.
| 6:38 pm on Aug 10, 2005 (gmt 0)|
In the support section of our site, we would like to include a handful of RFC documents to help our users understand certain issues. Do you think we would be OK if we reproduced a few of these documents, mentioning where they came from?
The originals reside here [ietf.org...]
| 12:39 am on Aug 11, 2005 (gmt 0)|
Skunker ... "Thanks folks, I've taken your suggestions and am currently rewriting my own content based on the content that has been served up by affiliate programs"
You have now become a spammer in the "not duplicate, but differing / not different information about the same product under a different guise" scenario ... imo Google should have allowed affiliate links to progress.
Producing "unique" information about the same product is just a sham and will just increase spam pages to no honest end.
Google has shot itself in its own foot.
Just my 2p
| 2:56 am on Aug 11, 2005 (gmt 0)|
Actually, my info is more accurate than the affiliate content that was provided because I am a local and I am using their content and adding my own two cents into the mix.
| 3:43 am on Aug 11, 2005 (gmt 0)|
Skunker: You have a real advantage being local.
In your shoes, I would absorb what others have about a given topic, biz, hotel or whatever,
add to that what YOU know about same, look up further info locally,
and do a completely new writeup based on all the above, removing duplication / errors etc.
Keep using copyscape, and do your own copy search with unique short phrases of your own.
There are numerous threads here about defense against scraping and its consequences. - Larry
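Larry's "copy search with unique short phrases" can be automated: pull a few distinctive phrases out of your own copy and paste them, in quotes, into a search engine to hunt for scrapers. A minimal sketch (the five-word phrase length, the three-phrase limit, and the sample copy are all just assumptions):

```python
# Extract candidate "unique short phrases" from your own page copy,
# suitable for quoted-phrase searches against scrapers.
def unique_phrases(text: str, words_per_phrase: int = 5, max_phrases: int = 3):
    """Return up to max_phrases non-overlapping word runs from the text."""
    words = text.split()
    phrases = []
    for i in range(0, len(words) - words_per_phrase + 1, words_per_phrase):
        phrases.append(" ".join(words[i:i + words_per_phrase]))
        if len(phrases) == max_phrases:
            break
    return phrases

copy = ("Green widgets have stood the test of time and endurance, "
        "making green widgets the widget of choice for many people.")
for p in unique_phrases(copy):
    print(f'"{p}"')
```

The more unusual the phrasing, the fewer false matches you'll wade through; generic five-word runs like "click here for more information" are useless for this.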
| 6:09 pm on Aug 11, 2005 (gmt 0)|
I have a big shopping portal with over 1 million pages indexed in Google.
The normal Googlebot fetches approx. 30,000 URLs per day.
But for a few days the Mozilla Googlebot has been very active (up to 100,000 visits per day) and is deleting the visited URLs from its index. The normal bot is very inactive (approx. 500 pages per day).
I use neither cloaking nor any other spam techniques.
The only problem has been duplicate content caused by faults in URL rewriting, which left the site under a big duplicate content filter for many months.
I removed all these URLs with the Google removal tool, and the duplicate content filter went away for 6 weeks.
For the past couple of days all these URLs have been appearing again - and the deletion problem with the Mozilla bot started in parallel.
I don't know whether all this is due to the reappearance of the duplicate content.
Does anybody have an idea why URLs are deleted by the Mozilla Bot?
| 4:28 pm on Aug 12, 2005 (gmt 0)|
Armi: instead of removing the duplicate content pages with Google's tool, try banning the duplicate URLs from robots via robots.txt.
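For what it's worth, this only works if the duplicate URLs share some recognizable pattern that a Disallow rule can match. A quick sketch of testing such a rule with Python's standard-library parser (the /dup/ path prefix and example.com are hypothetical):

```python
# Check whether a robots.txt Disallow rule actually blocks the
# duplicate URLs before deploying it. The /dup/ prefix is made up.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /dup/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "http://example.com/dup/page1.html"))  # blocked
print(rp.can_fetch("Googlebot", "http://example.com/page1.html"))      # allowed
```

If the duplicates don't share a common prefix, a blanket rule like this can't isolate them, which is the objection raised in the reply below.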
| 4:34 pm on Aug 12, 2005 (gmt 0)|
The main question I have is: if I have content on my site - informational articles on the subject of our product, ranging from usage to safety - and I allow other sites (authority sites and article sites) to copy and display my content with a link back to me and my info box displayed, and I have a copyright on my pages, will I get a duplicate content penalty? Especially since I am not ranked highly?
| 8:41 am on Aug 13, 2005 (gmt 0)|
This isn't possible, because there is no fixed string at the beginning of the URLs.
But even if there were: if Googlebot can't reach these URLs, it doesn't delete them either.
You have to use the removal tool too...