
Forum Moderators: Robert Charlton & aakk9999 & andy langton & goodroi


Content copying by others - duplicated copy

     
9:25 pm on Nov 14, 2013 (gmt 0)

New User

joined:June 27, 2013
posts: 27
votes: 0


As the internet expands, so does the cost of maintaining an edge. I am finding that some webmasters are copying my web content.

Has anyone had a similar experience? If so, what solutions or remedies do you suggest?
10:36 pm on Nov 14, 2013 (gmt 0)

Junior Member

joined:Oct 26, 2011
posts: 107
votes: 4


We have that problem, too. If there is contact info available for sites that copied content, we send a note and tell them to take down our content by a specific date and then we go look to see if they did so. If they didn't or if there is no contact info available, we report the offending site to Google to have them take down the offending page.
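The workflow above (send a notice with a deadline, check back, escalate if ignored) can be sketched as a tiny tracker. This is just an illustration; the URLs, field names, and deadlines are hypothetical, not from this thread.

```python
from datetime import date

# Hypothetical takedown-request log: the offending URL, the deadline we
# gave them, and whether the copied content is still up when we re-check.
requests = [
    {"url": "http://example.com/copy1", "deadline": date(2013, 11, 20), "still_up": True},
    {"url": "http://example.com/copy2", "deadline": date(2013, 11, 25), "still_up": False},
]

def overdue(requests, today):
    """Sites past their deadline that still host the content -> escalate
    (e.g. file a removal report with Google)."""
    return [r["url"] for r in requests if r["still_up"] and today > r["deadline"]]
```

Anything `overdue()` returns is what you would then report, per the process described above.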
1:28 am on Nov 15, 2013 (gmt 0)

New User

joined:June 27, 2013
posts: 27
votes: 0


Thank you. What tool do you recommend? Our site has over 3,000 pages.

Much appreciated
1:39 am on Nov 15, 2013 (gmt 0)

Moderator This Forum from GB 

WebmasterWorld Administrator 5+ Year Member Top Contributors Of The Month

joined:Apr 30, 2008
posts:2543
votes: 154


All of your 3000 pages are copied/scraped?
1:48 am on Nov 15, 2013 (gmt 0)

New User

joined:June 27, 2013
posts: 27
votes: 0


No, it's all fresh copy that we have published. My concern is how you monitor others copying your content.
5:01 am on Nov 15, 2013 (gmt 0)

Preferred Member

joined:June 10, 2011
posts: 521
votes: 0


Has anyone had a similar experience? If so, what solutions or remedies do you suggest?


I had, and am still having, a similar experience.
My content has been scraped all over the internet in many ways.
I'd suggest you do... nothing. Google, in most cases, can recognize the original source, and your site may even benefit from the situation.
You can see it like link building, since copying content is an indication that it is valuable to readers.
5:29 am on Nov 15, 2013 (gmt 0)

New User

joined:June 27, 2013
posts: 27
votes: 0


thanks
1:46 pm on Nov 15, 2013 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Oct 14, 2013
posts:2305
votes: 185


I'd suggest you do... nothing. Google, in most cases, can recognize the original source, and your site may even benefit from the situation.


You've got to be joking; nothing could be further from reality. Google has no clue as to the originator.

If you've been copied/scraped outside of US jurisdiction, you will find it very difficult to get anything removed; not completely impossible, just extremely difficult.
2:56 pm on Nov 15, 2013 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Mar 30, 2005
posts:12739
votes: 159


There are services you can pay for that monitor your content for copying and scraping; you can also put some unique phraseology in your pages and run Google Alerts on it (that's 90% of how I find people scraping my stuff).
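The unique-phraseology idea generalizes to comparing word "shingles" (overlapping runs of words) between your page and a suspect page. A minimal sketch, assuming you already have both texts in hand (the function names and the 8-word window are illustrative choices, not from this thread):

```python
import re

def fingerprint_shingles(text, n=8):
    """Break text into overlapping n-word shingles, lowercased and
    stripped of punctuation, so reflowed copies still match."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def copied_fraction(original, suspect, n=8):
    """Fraction of the original's shingles that also appear in the
    suspect text: 1.0 is a verbatim copy, 0.0 is no overlap."""
    orig = fingerprint_shingles(original, n)
    if not orig:
        return 0.0
    return len(orig & fingerprint_shingles(suspect, n)) / len(orig)
```

A high `copied_fraction` on a crawled page is the same signal a unique-phrase alert gives you, just measured over the whole page instead of one planted sentence.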

It's something that can suck up a little of your time, a lot of your time, or all of your time, so you want to make sure you keep on top of how much, if at all, your scrapers are really injuring you. In my case, they pretty much never outrank me, and if I go after them it's just on basic principles (and to maybe make an example out of them and deter others). If they were actually costing me real money, then I'd be a lot more aggressive about it - but it would come at a cost, that cost being the time I can spend on generating new stuff.

Bottom line is, it's not a Google problem, it's a human problem, and one that will never never never be completely solved with the internet we've got. You do what you can do, and realize that in part, it comes with the territory.
5:06 pm on Nov 15, 2013 (gmt 0)

Preferred Member

joined:June 10, 2011
posts: 521
votes: 0


Google has no clue as to the originator.


Yes, they (Google) have a clue.

The only rare case where I see a problem with scraping one's content is when a big authority site does it. Google treats them differently. But authority sites don't scrape.
5:52 pm on Nov 15, 2013 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Mar 30, 2005
posts:12739
votes: 159


Sure they do. That's who most of my scrapers are - television or newspaper sites. Big ones, in some cases.
9:18 pm on Nov 15, 2013 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:12990
votes: 287


There are services you can pay for that monitor your content

This is where things get recursive. All those services involve robots crawling other sites, right? They're not staffed by humans at home visiting the sites via ordinary consumer ISPs based in English-speaking countries. So the site owners watch the server-farm lists, as in our own SSID subforum, and block everyone in sight. Do you poke holes for everyone who provides a service to someone, somewhere? I sure don't.