Forum Moderators: open


Using robots.txt for a site excluded from Google


wellnesscafe

11:03 pm on Jan 17, 2003 (gmt 0)

10+ Year Member



A question about using robots.txt.
My website was listed on the top 10 search results pages for years. Recently it got banned from google. Google wrote to me: "Your page has been blocked from our index because it does not meet the quality standards necessary to assign accurate PageRank. We cannot comment on the individual reasons your page was removed. However, certain actions such as cloaking, writing text in such a way that it can be seen by search engines but not by users, or setting up pages/links with the sole purpose of fooling search engines may result in permanent removal from our index."

I am affiliated with several merchants, and I have been cloaking some of these merchants' links to protect my interests. I think these cloaked links offended Google, and that is why my site is now banned.

I have just created a robots.txt file containing the following (/a/ is the directory where I have all the cloaked files):

User-agent: *
Disallow: /a/

I need to know if "User-agent: * / Disallow: /a/" is the correct rule. I'm an inexperienced webmaster running my own site with some help from friends, and prone to making mistakes. I recently learned that I could have framed these links instead of cloaking them to achieve the same purpose. Should I put a * at the end of /a/ to make sure that none of my cloaked files get spidered by Google?

Is there something else I should be doing? Should I re-submit my site to Google, or wait until Google re-indexes next month? What are my chances of having my site admitted back into Google?

Any feedback would be greatly appreciated.

jomaxx

12:25 am on Jan 18, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You don't need or want an asterisk at the end of the line. Disallow values are matched as path prefixes, so /a/ by itself already covers everything in that directory.
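For reference, here's what the complete file would look like, placed at the top level of your site (the file must be named robots.txt and live at the root to be honored; "yoursite.com" is just a placeholder):

```
# http://www.yoursite.com/robots.txt
# Applies to all crawlers, including Googlebot
User-agent: *
Disallow: /a/
```

Each directive goes on its own line, and because Disallow is a prefix match, /a/ blocks /a/anything.html as well as any subdirectories under /a/.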

FYI, I don't want to confuse the issue further, but I think what you are describing as "cloaking" is not the same as the search engine cloaking people discuss here. SE cloaking is basically presenting Googlebot with a different page than the one other users will see.

wellnesscafe

12:58 am on Jan 18, 2003 (gmt 0)

10+ Year Member



Thanks for the reply jomaxx. You're right, I am not presenting Googlebot with a different page than other users will see. All I'm doing is cloaking the urls of the merchants I'm affiliated with so that when people return to these merchants via their favorites/bookmarks to complete their purchases, I can still get a commission. So, if I understood your reply correctly, the cloaked urls on my site do not violate google's policies?

In the event that they don't comply with Google's policies, can I be assured that the robots.txt rule (User-agent: * / Disallow: /a/) will satisfy Google?

I've also cleaned up some hidden dots that I was using to position my images and text. What are the chances that Google will re-index a banned site?

jomaxx

3:17 am on Jan 18, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



-> It's not "cloaking", but what you are basically doing is copying a page from another site (i.e. the merchant you're affiliated with) and changing all the links. IMO it's proper that you exclude those pages via the robots.txt file.
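If you want to double-check the rule yourself before Googlebot comes back, here's a quick sketch using Python's standard-library robots.txt parser (the example.com URLs are placeholders for your own site):

```python
from urllib.robotparser import RobotFileParser

# Parse the same rules as the robots.txt file under discussion,
# without fetching anything over the network.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /a/",
])

# Everything under /a/ is off-limits to any crawler, Googlebot included...
print(rp.can_fetch("Googlebot", "http://example.com/a/cloaked.html"))  # False

# ...while the rest of the site stays crawlable.
print(rp.can_fetch("Googlebot", "http://example.com/index.html"))      # True
```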

-> See this page for details on the robots.txt file, straight from the horse's mouth:
[google.com...]

-> As far as getting unbanned, I would wait until the next indexing. If you don't recover then, as others have already told you, it's probably going to be a long wait.

Nobody can say for sure, but my personal opinion from having taken a close look at your site several times now is that you weren't penalized for the "dots" or the affiliate links, but for using white-on-white text. There's really no grey area to this rule; the only reason to stick invisible keywords at the bottom of your page is to manipulate search engine rankings. If I'm right, you will probably suffer a long penalty.