
Forum Moderators: goodroi


How to stop Google from indexing a page?


sathishvaiha

9:33 am on Jul 8, 2014 (gmt 0)



I need to block just the following page from Google's index, using robots.txt:
http://stores.example.com/BlueWidgets

I don't want Google to stop crawling these pages:

http://stores.example.com/BlueWidgets/Mobile-Phones/Model1
http://stores.example.com/BlueWidgets/Mobile-Phones/Model2


Any suggestions? Thanks in advance!

[edited by: engine at 9:48 am (utc) on Jul 8, 2014]
[edit reason] obfuscated [/edit]

phranque

12:47 pm on Jul 8, 2014 (gmt 0)




welcome to WebmasterWorld, sathishvaiha!


If you want to "stop Google from indexing a page", you should use the robots meta noindex element.
Using robots.txt will exclude a compliant bot from crawling, but it is not an indexing directive.
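
A minimal sketch of that meta element, assuming the page is served as an ordinary HTML document (it goes in the page's <head>):

    <!-- tells compliant crawlers such as Googlebot not to add this page to their index -->
    <meta name="robots" content="noindex">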

lucy24

6:39 pm on Jul 8, 2014 (gmt 0)




Technical answer: Googlebot does recognize the "Allow" directive, so what you want to do is theoretically possible. You Disallow the directory, and then Allow its individual pages.
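
A sketch of that pattern, assuming the URL layout from the original post (remember this controls crawling, not indexing; for Googlebot the longer, more specific rule wins):

    User-agent: Googlebot
    # block the directory URL itself...
    Disallow: /BlueWidgets
    # ...but keep everything beneath it crawlable
    Allow: /BlueWidgets/

Googlebot also supports $ to anchor a rule to the end of a URL, so Disallow: /BlueWidgets$ would block only that exact address.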

But a "noindex" meta is probably a better way to go-- especially when you're talking about rewritten URLs where the directories don't physically exist.

sathishvaiha

4:19 am on Jul 9, 2014 (gmt 0)



Awesome replies! Thanks @phranque & @lucy24.

tangor

2:55 am on Sep 12, 2014 (gmt 0)




Sadly, you will have to let the G see the page for it to see the directives. The only way to keep G from seeing a page is to password protect it.

Aside: don't rely on noindex to keep G from reading the page; it has to read the page to see the noindex. Catch-22.
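
A sketch of that catch-22, using the URL from the original post: with a rule like the one below in robots.txt, Googlebot never fetches the page, so any noindex tag on it is never seen, and the bare URL can still surface in results if other sites link to it.

    User-agent: Googlebot
    # Googlebot will not request this URL at all, so a noindex
    # meta tag on the page can never take effect
    Disallow: /BlueWidgets$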

Have to ask: Why do you want to disallow G for that page? (General response, do not be too specific)

rockymax945

4:55 am on Oct 8, 2014 (gmt 0)



You can stop the Google crawler from indexing your website's pages by using the robots.txt file. You need to place this file in the root of your website.

lucy24

7:23 am on Oct 9, 2014 (gmt 0)




You can stop the Google crawler from indexing your website's pages by using the robots.txt file

Nooo ....

tangor

10:01 pm on Oct 9, 2014 (gmt 0)




Clarification re: robots.txt:

Only good bots will honor it, and even then your doc might be visible from other links/locations. If you don't want Google, or anyone else, to see that page, password protect it, i.e. don't make it available to the net at large.

PS: if you don't want G to see it, don't put it up in the first place. They will find it via other links, references, email, etc. You can't "hide" anything from the Gorg.
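
A minimal sketch of that kind of password protection, assuming an Apache server using basic authentication with a password file already created via htpasswd (the /path/to/.htpasswd location is only a placeholder):

    # in the .htaccess file (or server config block) covering the protected page
    AuthType Basic
    AuthName "Private area"
    AuthUserFile /path/to/.htpasswd
    Require valid-user

Crawlers, and anyone else without credentials, get a 401 response instead of the page content.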
 
