Forum Moderators: Robert Charlton & goodroi
However, I am concerned about one thing: Content Duplication.
I am importing hotel and condo data from hotel affiliate websites that make the content available for people like me. No doubt thousands of other people have been copying and pasting it just as I am. Will Google penalize my site for copying and pasting this content that the hotel affiliates make available? Does anyone know whether I should rewrite this content in my own words rather than just pasting the info from the affiliates? There are hundreds of hotel and condo descriptions that I'd have to rewrite.
Any thoughts? Tips? Opinions? Thanks.
Never copy content from a website that Google has already indexed. It will almost certainly be penalized, hurting both your PageRank and your position in the SERPs. If you want good rankings in a short span of time, write unique content, which Google loves.
Once you have unique content in place, watch the magic happen in Google's rankings. And yes, I am getting extremely good rankings for many competitive keywords just because of good, unique content.
Cheers
Pradeep SV
The copy everyone uses:
Green widgets are preferred over red widgets.
Your rewrite:
Green widgets have stood the test of time and endurance, making green widgets the widget of choice for many people.
In other words, make it your own and it is not duplicated in any manner but spirit :-)
Ann
I found over fifty sites copying my content.
All scraper trash, loaded with adsense.
So far none of that trash has shown up on my SERPs radar.
It is scary though, is it just a matter of time before I am beaten by my own content on another site?
Should I be worried?
All scraper trash, loaded with adsense.
Google is its own worst enemy. Their extreme greed (AdSense), especially post going public, has spawned millions upon millions of **** pages of nonsense that they rank just to get the AdSense click. They can't sustain these junk results. Every page I publish is original content, only to be copied a week later on a scraper/AdSense site.
Run this through Copyscape and there are over 10 matches found every time.
But they are still #1 for competitive keywords.
So I'm still undecided as to whether this makes a difference. I'm leaning towards the idea that a Google filter is applied only if the page is 80-100% similar to a page on YOUR website.
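One rough way to put a number on how "similar" two pages are, in the spirit of the 80-100% figure above, is word-shingle overlap (Jaccard similarity). This is a minimal sketch, not whatever Google actually uses; the shingle length of 5 words is an arbitrary assumption:

```python
def shingles(text, k=5):
    """Break text into overlapping k-word shingles (lowercased)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=5):
    """Jaccard similarity of two texts' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "Green widgets are preferred over red widgets by most buyers today"
copy = "Green widgets are preferred over red widgets by most buyers today"
rewrite = "Green widgets have stood the test of time making them a popular choice"

print(similarity(original, copy))     # identical text scores 1.0
print(similarity(original, rewrite))  # a genuine rewrite scores near 0.0
```

A pasted affiliate description would score at or near 1.0 against every other site using the same feed, while a rewrite in your own words drops the score close to zero.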
What sucks (for site owners, Google, and users) is when your original write-up is hoovered up, then vomited into a template with AdSense skyscrapers plastered all over it...
...then you see hundreds of backlinks from other sites in the same neighbourhood helping this crap rise to the top.
That is what sucks.
What does not suck (at least as far as I am concerned so far) is that Google seems to be dealing with these sites as well as can be expected at the moment, so that at least in my area, travel (specific country), they are not a problem yet.
The originals reside here [ietf.org...]
You have now become a spammer in the "not duplicate, but differing information about the same product under a different guise" scenario... IMO Google should have allowed affiliate links to progress.
Producing "unique" information about the same product is just a sham and will only increase spam pages to no honest end.
Google has shot itself in its own foot.
Just my 2p
Keep using Copyscape, and do your own copy search with unique short phrases of your own.
There are numerous threads here about defense against scraping and its consequences. - Larry
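Picking those "unique short phrases" by hand gets tedious across a large site. A minimal sketch of automating it: grab a few evenly spaced word runs from an article to paste into a search engine in quotes (the 8-word phrase length and the spacing are arbitrary choices, not a rule from this thread):

```python
def search_phrases(text, length=8, count=3):
    """Pick a few evenly spaced word runs to paste into a search engine in quotes."""
    words = text.split()
    if len(words) <= length:
        return [" ".join(words)]
    step = max(1, (len(words) - length) // count)
    phrases = [" ".join(words[i:i + length])
               for i in range(0, len(words) - length + 1, step)]
    return phrases[:count]

article = ("Green widgets have stood the test of time and endurance, "
           "making green widgets the widget of choice for many people "
           "who value durability over flashy colours in their tools")
for phrase in search_phrases(article):
    print(f'"{phrase}"')
```

Mid-article runs work better than opening sentences, which tend to be generic and shared across many pages.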
The normal Googlebot takes approx. 30,000 URLs per day.
But for a few days now the Mozilla Googlebot has been very active (up to 100,000 visits per day) and is deleting the visited URLs from its index. The normal bot is very inactive (approx. 500 pages per day).
I use neither cloaking nor any other spam techniques.
The only problem is duplicate content caused by faults in the URL rewriting, which left the site under a big duplicate content filter for many months.
I removed all these URLs with the Google removal tool, and the duplicate content filter went away for six weeks.
For the past couple of days all these URLs have been appearing again, and the deletion problem with the Mozilla bot started in parallel.
I don't know whether all of this is due to the reappearance of the duplicate content.
Does anybody have an idea why URLs are deleted by the Mozilla Bot?
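"Faults in the URL rewriting" usually mean several URLs serving the same page: mixed-case hosts, default index documents, session parameters. A hedged sketch of collapsing such variants into one canonical form, so the duplicates can be 301-redirected to it (the specific rules and parameter names below are illustrative assumptions, not the poster's actual configuration):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {"sessionid", "sid", "ref"}  # assumed junk parameters

def canonical(url):
    """Collapse common accidental URL variants into one canonical form."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    path = parts.path or "/"
    # Strip default index documents produced by sloppy internal links.
    for index in ("index.html", "index.php"):
        if path.endswith("/" + index):
            path = path[: -len(index)]
    # Drop session/tracking parameters that multiply URLs for one page.
    query = urlencode([(k, v) for k, v in parse_qsl(parts.query)
                       if k.lower() not in TRACKING_PARAMS])
    return urlunsplit((parts.scheme.lower(), host, path, query, ""))

variants = [
    "http://Example.com/hotels/index.html?sid=42",
    "http://example.com/hotels/",
]
print({canonical(u) for u in variants})  # both collapse to one URL
```

Redirecting every non-canonical variant to its canonical form (and then requesting removal of the duplicates, as the poster did) keeps the duplicates from reappearing the next time a bot crawls the old links.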