Forum Moderators: open
Thread Rules:
-q's about Google operation. (not gg himself)
-one question only please.
-no noise or clarifying questions.
-no specifics about your site please.
-off topic stuff will be zapped (nffc is on duty!).
-Keep it short. Don't drone on for more than 2 sentences.
Both Netscape and SGI had some great pranks and experiments online. You just can't have that much brain-power in one place without some really creative stupid human tricks going on.
The PageRank system works well for information sites, but how many inbound links occur naturally for Viagra or online pharmacy sites? If PageRank were gone, all the unnatural linking methods would vanish, saving Google a big headache. Any plans at all to change things in this area? -- you said in Boston that you agreed this was a problem.
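For readers unfamiliar with the mechanism the question refers to: the published PageRank formula gives each page a share of rank from the pages that link to it, which is exactly why "natural" inbound links matter. A minimal sketch, using an invented three-page graph (Google's production system is of course far more elaborate than this):

```python
def pagerank(links, d=0.85, iterations=50):
    """Power-iteration PageRank over a dict {page: [pages it links to]}.

    Implements the published formula:
    PR(p) = (1 - d)/N + d * sum(PR(q) / outdegree(q) for q linking to p)
    """
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        new = {}
        for p in pages:
            inbound = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - d) / n + d * inbound
        pr = new
    return pr

# Hypothetical graph: a page nobody links to ("pharma") ends up with
# only the baseline (1 - d)/N rank, however many links it sends out.
ranks = pagerank({
    "info":   ["hub"],
    "hub":    ["info"],
    "pharma": ["hub"],   # links out, but has no inbound links
})
```

The point of the sketch: under this formula a site with no natural inbound links cannot accumulate rank on its own, which is what drives the unnatural link-building the question complains about.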
1. With the freshbot, instead of dropping new listings after a few days, keeping the pages listed until the deepcrawl picks them up for good.
2. In your webmaster section, adding a page about HTTP_IF_MODIFIED_SINCE that tells us how to get it working.
3. Adding a form to the webmaster section where we can enter a URL and see its status: whether it has been banned and what its PR is, for example.
If this is a penalty, can you provide any information on what type of penalty it would be? Sites in this situation do not seem to be doing anything against the rules (except possibly linking to a PR0 site).
Thanks.
[edited by: pixel_juice at 6:49 pm (utc) on May 27, 2003]