Home / Forums Index / Search Engines / Sitemaps, Meta Data, and robots.txt
Forum Library, Charter, Moderators: goodroi

Sitemaps, Meta Data, and robots.txt Forum

    
Submitting A Sitemap To Protect Against Scrapers
Can scrapers take credit for your non-indexed pages?
maximillianos
msg:3897427
12:55 am on Apr 22, 2009 (gmt 0)

The question of whether scraper sites could take credit for your non-indexed content came to mind a few days ago while I was reviewing my indexed pages in Google's Webmaster Tools.

My site has about 100,000 content pages, but Google is only indexing about 75% of them. That got me wondering: can a scraper come along, take the other 25%, get it indexed, and become the original source?

Just because G doesn't index 25% of my pages, does that imply they are not aware of those pages?

Seems like a potentially big problem for sites that are not indexed well and get scraped or copied a lot.

Previously I had not been using any of the search engines' sitemap tools, but I decided to go ahead and submit a sitemap for exactly this reason... so they at least have some way of knowing which content is mine...
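For anyone who hasn't built one before, the sitemap itself is just XML in the sitemaps.org format. Here's a minimal sketch in Python that generates one; the example.com URLs are placeholders, and a real sitemap can also carry optional fields like lastmod and changefreq.

```python
# Minimal sitemap.xml generator -- a sketch of the sitemaps.org format,
# not tied to any particular CMS. URLs below are placeholders.
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap.xml string for the given list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page  # <loc> is the only required field
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    print(build_sitemap([
        "http://www.example.com/page1",
        "http://www.example.com/page2",
    ]))
```

You'd write the output to sitemap.xml at the site root and then submit that URL through each engine's webmaster tools.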

Not sure if this is an obvious topic, just thought I would throw it out there...

 

goodroi
msg:3897775
2:42 pm on Apr 22, 2009 (gmt 0)

In general, when there is a question of scraped content, search engines tend to look at which site has more links and trust. It also really helps to be the site where the content was first discovered.

There are some common reasons for non-indexed content, and each carries a different level of risk that a scraper becomes the first discovered source.

Too soon - Search engines are fast, but they still need a few days (or weeks, depending on the site) to crawl & index content. If this is the case, I would not worry too much. Search engines are usually much faster at discovering your content than scrapers are. A sitemap might help but is generally not necessary. What normally helps most to speed up indexing is boosting your link popularity.

Unreachable - The content is hidden behind forms, blocked with robots.txt, or has no links pointing to it. This carries a high risk of a scraping nightmare. You should definitely use a sitemap to expose this content to search engines (not to mention remove the roadblocks on your site).

Low value - Search engines generally do not want to index blank, duplicate, or other pages with very little value to users. If your page is not indexed, that does not mean it was not visited/crawled by the search engines. It does not matter how many sitemaps you submit; search engines will not index pages they deem to be of low value. The best way to fix this situation is to increase the amount of unique text on each page; boosting your link popularity wouldn't hurt either.
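On the "Unreachable" point: it's easy to accidentally block your own content with robots.txt. You can sanity-check a rule set before deploying it using Python's standard urllib.robotparser; the rules and example.com URLs here are purely illustrative.

```python
# Sanity-check robots.txt rules so you don't block content you want indexed.
# The rules and example.com URLs below are illustrative placeholders.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# /private/ pages are blocked for all crawlers; everything else is allowed.
print(rp.can_fetch("*", "http://www.example.com/private/page"))  # False
print(rp.can_fetch("*", "http://www.example.com/content/page"))  # True
```

If a page you expected to rank comes back False here, you've found one of the roadblocks goodroi describes.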

In general, I have found most concerns about scrapers are blown out of proportion. I don't like scrapers and I try to stop them. But I worry much more about my competition.

Even if a scraper steals all of my content they will not outrank me. That is because I work hard to ensure my content is easily crawlable, most pages have significant & unique text, and my link popularity will blow away scrapers. This simple recipe ensures that my pages rank high and the search engines filter out the scrapers.
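As a rough illustration of the "significant & unique text" point, you can approximate how thin a page is by counting the words that aren't shared site-wide boilerplate. This is only a back-of-the-envelope heuristic (nothing like what the engines actually do), and the sample text and word set are made up.

```python
# Back-of-the-envelope thin-page check: count words that aren't shared
# site-wide boilerplate. Sample text and boilerplate set are made up.
def unique_word_count(page_text, boilerplate_words):
    """Count words on a page that are not in the site-wide boilerplate set."""
    words = page_text.lower().split()
    return sum(1 for w in words if w not in boilerplate_words)

boilerplate = {"home", "about", "contact", "copyright"}
thin = "Home About Contact"
rich = "Home A long unique product review with plenty of original detail"

print(unique_word_count(thin, boilerplate))  # 0  -> likely too thin to index
print(unique_word_count(rich, boilerplate))  # 10 -> has some unique text
```

A page scoring near zero on a check like this is exactly the "low value" case above: no sitemap will get it indexed until it has more unique text.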

Occasionally a scraper site will slip through and rank in the SERPs. That is when I pull out the DMCA requests and start emailing hosting companies.


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved