Forum Moderators: Robert Charlton & goodroi


Blocking Access to Directory of Member Site - Effect?

directory leads to 95%+ of all indexed pages on site

         

alexanac

8:22 am on Jul 16, 2007 (gmt 0)

10+ Year Member



Anyone have any experience with suddenly dropping access to the vast majority of internal links on a site?

Members on our site have their own custom page. Our business model dictates that about 50% of our users are searching for a specific member page (and are not members themselves). We have both a search and a directory listing. The directory listing has one page per letter and lists all matching members (i.e. ".../directory/a/" lists all members with the last name beginning with "A" and has links to their custom page). Pretty straightforward.

We had received advice early on to leave the directory pages open to the search engine webcrawlers (indeed, our robots.txt lists no restrictions of any kind on our site). The thought was to juice internal links automatically as more and more members joined and created their own content. The member content is all unique, but the pages are all structured the same.

Several members have not liked the idea of their page being included in search engines due to privacy concerns. We seem to get more and more requests to not have member pages listed in search engines as time goes on so we're trying to put together a plan to address this. One option on the table is to simply block crawler access to the directory altogether. The directory is the only place static links to the member pages exist.
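Concretely, blocking the directory for all crawlers would mean something like this in robots.txt (using the /directory/ path structure from the example above):

```txt
# Block all compliant crawlers from the member directory listings
User-agent: *
Disallow: /directory/
```

Note that this only stops crawlers from fetching the directory pages themselves; member pages that are linked from anywhere else (or already indexed) could still appear in results.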

We don't get much direct SE traffic to the pages that will be blocked so I'm not worried about losing traffic that comes in that way. I'm also inclined to believe we're not being helped by these internal links in regard to page ranking, though I can't verify it.

My current thinking is that Google may actually be viewing the member created pages as duplicate content and be penalizing us. We are on the first page for our primary keyword on most SEs except for Google - where we have drifted between 30 and 130 for the past year. We're older and have far more quality backlinks (from major news sources - CNN, MSNBC, etc.) than 95% of the people ahead of us (and also have a far higher PR, though I realize that's not everything). *Something* is holding us back and since 80% of our SE traffic comes from Google anyway (almost all through ads now) I'm willing to shake things up to see if it can help. We don't have much to lose at this point.

Blocking the directory links - help? hurt? why not give it a shot? Any advice is appreciated.

Thanks in advance!

Tony

Robert Charlton

5:05 pm on Jul 17, 2007 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Tony - I'm bumping this up, hoping you can get some answers. Your description is complicated, though; you might take a shot at clarifying the picture with a shorter summary.

Several members have not liked the idea of their page being included in search engines due to privacy concerns.

Is it possible to provide an option for these members which would result in a robots meta noindex tag being placed on their individual pages? I don't know how you'd code this, but the robots meta is the best way to control indexing of individual pages.
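For reference, the tag in question looks like the snippet below. How you generate it is up to your templates - a server-side check against a hypothetical per-member privacy flag is one way to emit it only for members who have opted out:

```html
<!-- in the <head> of an opted-out member's page -->
<meta name="robots" content="noindex, follow">
```

Using "noindex, follow" (rather than "noindex, nofollow") asks engines to keep the page out of their index while still following its links, so the rest of the site isn't cut off.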

errorsamac

5:48 pm on Jul 17, 2007 (gmt 0)

10+ Year Member



I did this before using robots.txt. I basically prevented googlebot from hitting my /widgets/ pages (which were 90% of the website). I did not see any negative effects from this. The domain did not rank well to begin with, but its position did not change at all (up or down).
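A Googlebot-specific version of that block (as opposed to blocking all crawlers) would look something like this, using the /widgets/ path described above:

```txt
# Block only Googlebot from the /widgets/ section;
# other crawlers remain unrestricted
User-agent: Googlebot
Disallow: /widgets/
```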