
Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

    
Removing URLs from Google Search Results
techcloud7
msg:4603000
3:32 pm on Aug 18, 2013 (gmt 0)

Hello guys, I need help with removing URLs from Google search. I run a blog at www.example.com, and in Google's search results I am seeing some duplicate results like "site:www.example.com/tag" <snip>. I have tried all the methods I know of, like a Google removal request and robots.txt, but I am still facing the same issue. Please help me with this.

[edited by: aakk9999 at 4:17 pm (utc) on Aug 18, 2013]
[edit reason] ToS, examplified domain name [/edit]

 

aakk9999
msg:4603017
4:31 pm on Aug 18, 2013 (gmt 0)

Welcome to WebmasterWorld!

From reading your post I presume you have added:

Disallow: /tag

to your robots.txt, and then gone to Google Webmaster Tools and removed the folder /tag using the Remove URLs feature.
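As an aside, a Disallow rule only takes effect inside a User-agent group, so a minimal robots.txt applying the rule to all crawlers would look something like this (a sketch; adjust the path to match your actual tag URLs):

```
# Applies to all crawlers
User-agent: *
# Blocks crawling of any URL whose path begins with /tag
Disallow: /tag
```

Note that a path prefix like /tag also matches URLs such as /tagline, so be careful if you have other paths starting with the same characters.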

How long ago was this done? I ask because it can take some time for Google to process your request. Also note that your removal request will stay active for only 90 days, and if you want to keep these URLs out of Google's index, you will need to repeat the request periodically.

The alternative to the above would be to add meta robots noindex on the /tag pages. In this case you need to ALLOW Google to crawl /tag URLs so that Google can see the noindex directive (i.e. if you have disallowed crawling of /tag URLs, you will need to remove Disallow: /tag from your robots.txt)

In this case too it may take some time for Google to process the change and remove these URLs from its index, because Google will need to request each individual /tag URL in order to see the noindex directive.
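A minimal example of the meta robots noindex directive described above, placed in the head of each /tag page (a sketch; where this template lives depends on your blog platform):

```html
<head>
  <!-- Tells compliant crawlers not to index this page.
       Google must be ALLOWED to crawl the URL to see this tag. -->
  <meta name="robots" content="noindex">
</head>
```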

JD_Toims
msg:4603048
6:58 pm on Aug 18, 2013 (gmt 0)

The alternative to the above would be to add meta robots noindex on the /tag pages. In this case you need to ALLOW Google to crawl /tag URLs so that Google can see the noindex directive (i.e. if you have disallowed crawling of /tag URLs, you will need to remove Disallow: /tag from your robots.txt)

This is the best way I know of, because if a URL is disallowed in robots.txt and there are links to the URL, Google is known to index the location based on the information in the links and the surrounding text.

Another way of accomplishing noindex for URLs, if it's impractical to put a noindex on the page itself, is to set an X-Robots-Tag header with noindex as the content via .htaccess or httpd.conf.
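A sketch of that X-Robots-Tag approach for Apache, assuming mod_setenvif and mod_headers are enabled (the environment variable name TAG_URL is arbitrary; adjust the path pattern to your setup):

```apache
<IfModule mod_headers.c>
    # Flag any request whose URI begins with /tag
    SetEnvIf Request_URI "^/tag" TAG_URL
    # Send the noindex header only for flagged requests
    Header set X-Robots-Tag "noindex" env=TAG_URL
</IfModule>
```

As with the meta tag, Google must be allowed to crawl the URLs to see this header.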

phranque
msg:4603108
12:12 am on Aug 19, 2013 (gmt 0)

welcome to WebmasterWorld, techcloud7!


the noindex solution is the best method, as it's a long-term fix and works across all search engines.
in order for this to work you must allow crawling of the /tag urls.

one thing that isn't clear from your post is whether your urls literally contain "/tag/", like www.example.com/tag/tag-name, or if you mean generic tag urls like www.example.com/tag-name.
how many tags are there and do you want all tags noindexed or just some of them?

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved