
Google News Archive Forum

This 144-message thread spans 5 pages; this is page 2.
How long does PR0 last?

 7:17 pm on Jun 6, 2002 (gmt 0)

Is anyone else feeling the way I do about Google's PR0 (or PR=low) permanent penalty... as if I am being treated badly, unfairly, capriciously, and without understanding?

The way I see things now: six months ago I got "caught" linking in a legitimate business fashion. I have 8 domains, most of them high-content multipage sites, but also one "blue-widget" site whose main page I linked to from almost all the pages of my other domains. The main reason for these links was that I WANTED TO SELL BLUE WIDGETS to those visitors! Why Google could not just stop counting links (from a particular domain, beyond one or ten or whatever number Google might select) is beyond me. Instead, Google seems to have opted for being judge, jury, and executioner and given my "blue-widget" site a permanent ban.

Why is the ban for linking patterns I might have had six months ago permanent? Such permanency feels arbitrary and capricious given that other bans have been only temporary. Yes, I know that Google needs to protect their results; but when Google installs filters that effectively (and permanently) kick sites out of the database for reasons such as the above, Google would seem to create a reservoir of ill will that no company can long afford. And it is so unnecessary when a temporary ban would likely serve as well.


The Contractor

 7:34 pm on Jun 7, 2002 (gmt 0)

I would also like to know the answer to what WG asked, especially:

<<Why is it that the same cross-linking penalties never seemed to be applied to the big network of sites like internet.com?>>

I have seen a certain site that has held the top position for certain keywords and is the largest spamming, crosslinking "farm" that I have found to date. I noticed this site more than two years ago in Google, and it is so blatant throughout its mirroring and crosslinking, yet nothing has been done. Is it because it probably has 100K backlinks that Google and others do not punish it? I really ignore the site, and I'm sure that most users do, since it has very little useful content.
There is also another site that is a single page of exactly "101 links". It has a PR8. How can this be - one page for the whole domain with 101 links, and it gets PR8?

I agree that Google needs to address why certain sites can get away with this and others might as well pack up the farm after making a stupid mistake.

Although Google has been my favorite SE for a few years, I must admit I wonder all the time how stable its algo and filters are when seeing the above :(


 1:53 am on Jun 8, 2002 (gmt 0)

Good questions. Some of these touch on scoring/indexing, so give me a day to mull them over and I'll follow up later this weekend.

The Contractor

 1:59 am on Jun 8, 2002 (gmt 0)

Thanks GoogleGuy ;)

I just think that by answering a few of these questions we could better understand what is set in stone and what is not. Is there hope for recovery, or should they move on?


 2:42 am on Jun 8, 2002 (gmt 0)

The amount of "gratuitous" cross-linking a group of pages can indulge in depends, I think, on how well they are connected externally. So cross-linking between pages that have high extrinsic PR won't get them into trouble, whereas identical cross-linking between low-PR pages will. I can't see how an algorithm could distinguish between appropriate "related page" links (which lots of people put at the bottom of their pages) and "gratuitous" link-anchor "farming" - which is why I've scare-quoted "gratuitous".

What I don't understand is how a big site (lots of pages) that's just starting up (and may only have a few external links to the top page) avoids looking like a link-farm.


The Contractor

 2:46 am on Jun 8, 2002 (gmt 0)


<<What I don't understand is how a big site (lots of pages) that's just starting up (and may only have a few external links to the top page) avoids looking like a link-farm.>>

Google's algo can define the difference or every "directory" style site would be considered a link farm ;)


 8:19 am on Jun 8, 2002 (gmt 0)

Google has never really done any active PR (Public Relations).

Everything came through quality and word of mouth.

Now they are facing normal large company issues because of their importance and they are falling behind.

Well, any company would have trouble answering 20,000 emails a week (or whatever the number is). However, whenever my company gets a Nigeria-spam mail through Yahoo's or Hotmail's service, we report back to them, and we always get an (automated) answer within 24 hours.

Google should start answering every email, whatever the content, automatically, immediately. This requires no special resources.

In this email they should lay out the number of emails they get a week and how difficult it is to answer them all. People do not mind bad news; they hate no news.

They should also state in this email whether a personalised answer will be returned, and within what time frame.

In this email they should offer links to helpful pages on their own site.

On these "helpful pages" they should give more examples of penalties. Is there a normal PR0, or is PR0 only for penalised pages? If they both exist, why not differentiate or explain?

PR0-penalised webmasters now have nowhere to go but to these types of forums.
And the problem with those PR0 threads is that few of the penalised explain explicitly what they have been fiddling with (if they know, or show their site here). If Google showed more examples of wrongdoing on their pages, WMW could also be more helpful to the repentant wrongdoers. A simple "do not participate in link exchanges" on the Google page is, e.g., less explicit than telling people not to link to bad neighborhoods.


 10:14 am on Jun 8, 2002 (gmt 0)

I think that Google has a minuscule "ill will" problem; it's just that most of it ends up here :)
A clear roadmap of what gets penalized would only be good information for those who would spam the index - not that difficult to see. As vitaplease points out, we could easily assemble that map here if everybody owned up to what got them penalized, but that is not likely to happen. Obviously a few innocents get caught also, and how to repair a PR0 is a valid question; some answer, even if it is "start over", would help. If I were them I would not feel obliged to respond to every "why did you drop my site" letter with more than a nice form response. Obviously looking at these complaints does help them adjust things, so reviewing some of them makes sense.
My guess - correct me if I am wrong - is that many PR0 problems are related to producing big sites that are shy on good content, particularly dynamically generated ones. I like the fact that I can get a local or regional business to rank well for an appropriate term, rather than having those pages filled with affiliate sites, or pages that have pasted in every town and state in the country in combination with every possible widget.
Of course, if I wake up one day to a penalty, I reserve the right to alter my opinion.


 11:49 am on Jun 8, 2002 (gmt 0)

Great thread, and one covering some very fundamental issues regarding PR0 and index protection.

The two biggest for me are summarized very well by WG:

>>How long should one expect to ride the pine if they have been penalized for bad linking? <<

>>Why doesn't Google just ignore internal/cross links? <<

For the latter, this is an issue I have raised before. Google has EVERY right to protect the quality of its returns. In doing so, however, it had a choice... a choice of whether to act in line with its long-standing 'ethical' and 'middle way' philosophy, or whether to take punitive INK-type measures.

The former is one of the qualities that has won it so many friends and helped its popularity crank up to current levels. The latter has always proven counterproductive in the long term to every SE that has employed it.

In this case the two choices, with respect to links it views as 'suspect', were simply to ignore the links for PR purposes, or to zap the sites totally.

The former route, just to ignore any link it views as potentially problematic, would have been universally accepted. The route it chose, zapping whole sites, has caused havoc and quite a bit of resentment.

In my opinion it was quite simply the wrong choice. Maybe it was a quick fix... if so, it really is about time it was addressed.

It is not just an issue of throwing totally innocent sites out of the index, but also one of dishing out punishment totally out of proportion to the offense. In many cases, possibly most cases, the bogus linkage would have simply been clumsiness... remember, people have always been told that linking is a good thing. Sticking links in right, left and centre is quite often just misguided - not a serious attempt to distort Google returns.

To only have one level of punishment here seems quite wrong. In my view, attacking the actual problem, the links themselves, would have been a FAR better option.

Why could Google not just ignore the presence of any link(s) it didn't like? In that scenario, the more dodgy links and dependency upon them, the more the impact of the Google filter. However, the terrible sight of very good sites missing just because of bad linking strategy would not occur.

The end result for some has been disaster, and I really do believe that this is a big issue for Google. It has two constituents: the people who search and the people who create and maintain the sites. BOTH must be treated with respect, and the balance between the two really does have to be carefully managed.

Google doesn't make many mistakes, but the PR0 issue is certainly one of them. On past record, Google will probably address this... but it is certainly not rushing.


 8:57 pm on Jun 8, 2002 (gmt 0)

>>Why is it that the same cross-linking penalties never seemed to be applied to the big network of sites..<<

>>Why doesn't Google just ignore internal/cross links? <<

I would guess Google already does a lot more in the way of ignoring cross-links than expected. I just think they add an extra penalty for joining link set-ups. This penalty could be a form of negative pagerank inheritance until the links are removed. If this is not the case, Google definitely should do something along these lines.

First example: small set-up

You start four sites all extensively linking to each other with just the occasional extra incoming links from elsewhere.
Google ignores the links and adds a small penalty - you end up with nearly no links so you are PR0.

Second example: bigger set-up with many more external incoming links.

Your sites are still linking extensively towards each other, but they are also each earning their own incoming links (pagerank). Google also ignores these links towards each other, even puts in a penalty, but this is overcome by all the other incoming links, so you still show pagerank.
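The two examples above can be sketched numerically with the published PageRank formula (simple power iteration). To be clear, this is only an illustration: how Google actually discounts cross-links or applies penalties is not public, so "ignore the intra-group links" below is an assumption, not a description of their algorithm.

```python
# Toy power-iteration PageRank (the published formula only; how Google
# really treats cross-links or penalties is not public knowledge).

def pagerank(links, d=0.85, iters=60):
    """links: node -> set of link targets. Dangling nodes (no outlinks)
    are treated as linking to every page, a common fix."""
    nodes = list(links)
    n = len(nodes)
    pr = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        nxt = {}
        for v in nodes:
            share = sum(
                pr[u] / len(links[u] or nodes)
                for u in nodes
                if v in (links[u] or nodes)
            )
            nxt[v] = (1 - d) / n + d * share
        pr = nxt
    return pr

sites = ["a", "b", "c", "d"]

# Small set-up: four sites linking only to each other, one outside inlink.
cross = {s: {t for t in sites if t != s} for s in sites}
cross["ext"] = {"a"}

# The same graph with the intra-group links discounted entirely.
ignored = {s: set() for s in sites}
ignored["ext"] = {"a"}

print(pagerank(cross)["b"])    # noticeably higher...
print(pagerank(ignored)["b"])  # ...than when the cross-links are discounted
```

Running both cases shows the effect described: once the intra-group links stop counting, each site in the small set-up falls back to little more than the baseline rank, while a group with plenty of independent inbound links would barely notice.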


 9:14 pm on Jun 8, 2002 (gmt 0)

Here's an example of Google behavior that sort of ticks me off. I've got these four sites, and some of the files are duplicated on the three semi-mirror sites. By now I expect that a given file will show up in the index under only one of these four sites, and Google will zap (remove entirely from the index) that file if it exists elsewhere. So far, so good.

I've been running things this way since before Google got into the game. None of the sites has any advertising; it's all nonprofit. There are reasons for doing things this way, and the only problem with it today comes from the fact that Google doesn't like it. Two of these three sites are disallowed to Google (or supposed to be), and on the third it doesn't matter if a couple of files get removed by Google. Other spiders don't seem to mind in any case.

I've come to expect that Google will zap the duplicate pages from the site (or page) that has the lowest PageRank. But lately, I'm not so sure.

One of these sites has a low PageRank and duplicate pages, simply because I have this grandfathered account from the old home.sprintmail.com that gives me 10 megs of free Web space. The account itself is a dollar per hour with no monthly minimum, and I only use it for backup access. The whole thing runs me maybe a dollar a month.

This website has a "home.sprintmail.com/~user/" address -- sort of like the geocities situation. I have a dickens of a time using robots.txt because I have zero access to what most bots see as the proper place for a robots.txt, which would be home.sprintmail.com. I also have no access to server logs, which means it's difficult to monitor what's happening.
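The root of the trouble is that the robots exclusion convention defines exactly one location for the file: the root of the host. A minimal sketch with Python's standard library (the sprintmail URL is just the example from this post):

```python
from urllib.parse import urlsplit, urlunsplit

def robots_txt_location(page_url: str) -> str:
    """Return the only place the robots.txt convention tells a crawler
    to look: the root of the host, never inside a ~user directory."""
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

# The ~user page described in this post:
print(robots_txt_location("http://home.sprintmail.com/~user/index.html"))
# -> http://home.sprintmail.com/robots.txt
```

So a robots.txt uploaded to `/~user/robots.txt` is simply never consulted by a convention-following bot, which is exactly why hosted accounts like this (or geocities-style addresses) are stuck with per-page meta tags instead.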

Three months ago, I did an urgent Google removal of the entire site. According to their documentation, this was an "inside directory" robots.txt, which would expire in 90 days. I left the robots.txt in place that allowed Google to properly process the removal, and forgot about it.

Guess what? While my back was turned, during that 90-day period, Google came in and crawled all that duplicate content all over again. This was a PR2 site, as I recall. I just noticed this yesterday.

Big deal, you say? Well yes, because 88 important doorway pages on my PR6 site got completely removed from Google during that 90-day period. Google had to decide whether to zap the sprintmail pages at PR2, or zap the main site at PR6. They chose to zap my PR6. It wasn't my fault that Google crawled the PR2 site that had been disallowed to the best of my ability, and had been specifically removed by Google at my request last March.

It makes a big difference when your doorways are on a PR2 site as opposed to a PR6 site, even when the links themselves on those doorway pages point to just the main site. The inherited PR of the target pages seems to have been affected by the lower PR of the doorway page.

So yesterday I put in another urgent removal and got the PR2 site kicked out of Google again. In anticipation of further difficulty on Google's part when it comes to finding and interpreting the robots.txt at home.sprintmail.com/~user/robots.txt, I also put in META NAME="GOOGLEBOT" CONTENT="NOINDEX, NOFOLLOW" on the home page and the other 88 pages. But I may have been too late for the next update.

Why can't Google get its act together with the robots.txt on /~user/ sites? Why have they apparently changed their policy, so that now it isn't always the lowest PR site that gets the duplicate pages removed?

Here's an even better question: Why can't Google explicitly explain their policy, and their behavior, with respect to duplicate pages, and set up some sort of system that will remedy problems that arise? Something like the "urgent removal" system?

Someone in another thread expressed horror at the possibility that a competitor could set up nearly an entire site with your hijacked pages, and end up getting you removed. When it was always the lowest PR page that got removed, the possibility of a disaster such as this was minimized. But lately things seem sufficiently arbitrary at Google so that now I think this could happen.

What recourse would the owner of the pages have? By the time your lawyer sends out a cease and desist to the hijacker, Google has your site removed for 60 days. The bad guy says, "Oops, sorry, I didn't know you'd object," and takes down the pages. GoogleGuy, if you can catch his ear, says, "It should be okay after the next update, or the one after that."

I don't have to tell anyone what 60 days of Google downtime would do to one's future on the Web. We noticed a decline in traffic, but these 88 pages weren't the only basket our eggs were in, so we'll survive. But I can imagine that a clever attack (as opposed to a clumsy accident) could be devastating to any business that depends on the Web.


 9:24 pm on Jun 8, 2002 (gmt 0)

According to statistics, most people in the world have less than two legs...


 11:33 pm on Jun 8, 2002 (gmt 0)

Doofus, 3 semi-mirror sites, 88 doorway pages, seems like you have answered your own questions. I suspect you would be better off with one good site, no doorway pages. It is fine to gamble, just don't complain when you lose :)


 12:51 am on Jun 9, 2002 (gmt 0)

The thing is that Google and this whole PR0 thing have made webmasters anxious and afraid. Think about all the rumours and half truths that must be flying around by email, IM and on lesser forums.

These may be people who have not been affected at all and don't understand, but they hear things that make them worry about being banned by Google.

People who are made afraid will either leave or start lashing out. The problem is that, increasingly, there are just no alternatives to a listing in Google, so you can't even leave.

Having to abandon a company domain forever and start over is a terrible price for some small business to pay.

That will be when the calls about anti-trust in Internet search will start. If you don't think it will happen look at Microsoft.


 1:28 am on Jun 9, 2002 (gmt 0)

Brad, a lot of us have not been directly affected, but that doesn't mean we're not affected indirectly because of caring about others who have been. That's why every effort is made by everyone here to lend support and try to share research and information as much as possible.

Thank goodness discussions here don't get into hassles like legal talk; that would be counter-productive and defeat the purpose of the resource in the long run. Every search engine is perfectly justified in setting its own criteria and tuning its algo to provide the most relevant results it can. The best will win by reason of public usage, because people find what they are looking for.

Whatever Google's done, it seems to be working for both them and the searching public. In spite of a percentage of ill will among those who have been hurt, it doesn't seem possible to argue with the fact that they've managed to accrue a phenomenal amount of good-will; the evidence is that their search is voluntarily being used by a vast multitude of searchers who, after all, have free choice. The votes are in, showing in logs and stats.

[edited by: Marcia at 7:31 am (utc) on June 9, 2002]


 2:11 am on Jun 9, 2002 (gmt 0)

>Is Google creating a reservoir of ill will?
PR0 penalty creating ill will


I am directly addressing the topic, by explaining the human nature of the situation much in the same way that NFFC has previously in a different way.
>nothing to lose

I don't advocate legalistic or political resolutions to these things at all. I am trying to point out that people might start calling for that sort of thing if they get worried.

I think that would be bad for Google and other search engines, and bad for us all too.

>Every search engine is perfectly justified in setting their own criteria and tuning their algo to provide the most relevant results they can.

Sure they can PR0 if they want. But I thought the question was if in doing so they would create ill will.

I have not been hit by any PR0. Despite reading countless threads here about it I'm still not sure what actually triggers it. I worry, and others must worry about stumbling into it by accident.


 5:51 am on Jun 9, 2002 (gmt 0)

I must have read this thread 10 times now, and in the voice of "Data" when linking with the "Borg Collective": FASCINATING!

Brad (and womensmedia1), your PR0 problem could be really simple, like linking to a site that "is spam", or one that is linked to "spam", so that by association you are considered "spam".

A good example (and I hope I'm not breaching WMW policy, I don't think so)
dmoz*org/Computers/Internet/Web_Design_and_Development/Promotion/Link_Popularity/ <added> OOPS! - (don't want the give this page a PR0 now do I!)</added>

I think we would all agree that DMOZ is not in the practice of spamming but a PR0 category they do have.

Generally, this means that you need to do a lot of homework in advance.

In my spare time I search the web for pages and sites that suit my clients' sites for linking purposes. When I find one that is the right fit (but not indexed in Google), I help them out and submit their URL.

In no way, shape or form would I "ever" link to a site that Google doesn't know about.

This saves a lot of time (and explaining) when all their traffic seems to be going somewhere else and you don't know why!

On a personal note: I would rather have a listing on the first page with "nine sites that are spam" than nine other competitors that got there through good design.

Spam is good for the trickle-down effect, but I will never ever link to it!

Just another deep thought from fathom!


 2:17 pm on Jun 9, 2002 (gmt 0)

How can Google ultimately be generating anything but ill will with their thousands of "clever" emails telling folks "it's automatic", when the only automatic part is getting on their bad-guy list - never getting off it, if you are on it for links? When people perceive they have been led on in such a fashion, how can they not feel ill will towards such a "clever" Google email practice?

And what about the ethics of sending poor Googleguy here? A few months ago he responded with a number of "you should get off the PR0 soon" to various folks here. But what about the sites he looked at to which he did not speak? His honest words might have been "sorry chump, you hit our one-strike-and-you're-out filter. You are banned forever no matter what you do. We won't tell you about that and let you spend hundreds of hours trying to fix your site and give you words of encouragement to 'build your content.'" The implications of his selective helpfulness and the "everything is automatic" emails have caused thousands of webmasters to spend many thousands of hours of wasted effort.

I know that my searching behavior has changed.. it used to be that I "knew" everything important was in Google.. now I know it is not and my searches frequently do not go via Google.

The total lack of negative response to Google's decision to just not count guestbook links ought to give reasonable people at Google the idea that JUST NOT COUNTING LINKS is the way to go. I think the important question now is whether Google is so fixated on the penalty idea that they are unable to change themselves.


 2:30 pm on Jun 9, 2002 (gmt 0)

<<A few months ago he responded with a number of "you should get off the PR0 soon" to various folks here. But what about the sites he looked at to which he did not speak?>>

That sounds like welcome evidence.

In Googleguy's defence, this forum at that time was being plagued by repetitious PR0 messages on every possible message/thread subject. As much as I sympathise with people trying to do their best to make good, the posters of these messages did not always try to be very clear on what they had done and what they had not. Also many, at least in the beginning, did not put their penalised pages in their profile listing. This also creates unclear speculation and rumour.

Googleguy offered to check some sites if they were put in the profile listings. He also gave a plain, if not detailed, explanation of what had happened within Google. I see nothing wrong with that. I do see wrong in Google, as a follow-up, not trying to resolve these penalty questions and answers in a more general way.


 2:44 pm on Jun 9, 2002 (gmt 0)

Googleguy offered to check some sites if they were put in the profile listings.

This is true. There are many people who have, obviously, a lot to hide and be protective of. Many times I wonder if people are being honest to themselves as to why they got the 0 pagerank in the first place. I know I messed up big time in Jan with my crosslinking and I'm still paying the price now - but at least I admit it was my fault...

But yes, it's also true that Google SHOULD reply to the countless e-mails on the 0-rank problem -- and I think they will, one day. I send an e-mail every few months but never get any further than the standard automated reply.


 5:03 pm on Jun 9, 2002 (gmt 0)

Our web sites are our business.

Our marketing is our business.

Our advertising is our business.

In any other business venture, we research, plan, and strategize before acting, to see if our business decision-making practices are sound.

This is no different. I don't believe that before PR0 happened, people were complaining about their own "business decisions". It is human nature to want to blame someone else when a catastrophic incident occurs, but in reality Google's "good will" was worth the risk!

So accepting that risk after the fact shouldn't be any different.

Google's visitors are more important to me than any other search engine. They are also the most profitable.

Research and planning, and a number of test sites to prove beyond a reasonable doubt that I don't overstep Google's definitions, is well worth the investment.


 5:15 pm on Jun 9, 2002 (gmt 0)

Just had a thought. Isn't the "ban" resident on the IP rather than the domain name?

1. Remaining on the same host IP with a new domain "is a new web site".

But isn't

2. changing hosts with the same domain the same thing?


 5:57 pm on Jun 9, 2002 (gmt 0)

This thread is carrying a lot of different conversations, so let me go back and tackle some of WebGuerilla's original questions.

"How long should one expect to ride the pine if they have been penalized for bad linking?" There isn't one answer, because there are different types of penalties. The term PR0 seems to mean different things to different webmasters. For example, all of the people on the usenet group who were convinced that they had PR0 really had "Google changes the way it crawls all the time, and we didn't have resources to crawl your site this time", or even more often "Your ISP did something bad." Besides the usual virtual hosting and downtime problems, we've seen ISPs that started returning a 403 instead of a 404. The user didn't create a robots.txt => we got a 403 trying to fetch it => we didn't believe we could crawl the site. Site stops showing up and boom--"I've got a PR0!!" :) Just because a user didn't change anything doesn't mean that their ISP didn't.
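The 403-vs-404 distinction GoogleGuy describes is the standard conservative convention, and Python's stock `urllib.robotparser` makes the same call in its `read()` method. The sketch below factors that status-code logic into a hypothetical helper so it can run without a network fetch; it illustrates the convention, not Google's actual crawler:

```python
from urllib import robotparser

def may_crawl(status: int, robots_body: str, page: str) -> bool:
    """Decide crawlability from the result of fetching robots.txt,
    mirroring urllib.robotparser.read(): 401/403 means 'assume we may
    not crawl', any other 4xx (e.g. a plain 404) means 'no robots.txt,
    crawl freely', and a 200 means 'parse the rules'."""
    rp = robotparser.RobotFileParser()
    if status in (401, 403):
        rp.disallow_all = True   # server forbids the fetch: play safe
    elif 400 <= status < 500:
        rp.allow_all = True      # no robots.txt at all
    else:
        rp.parse(robots_body.splitlines())
    return rp.can_fetch("Googlebot", page)

page = "http://example.com/index.html"
print(may_crawl(404, "", page))  # True: a missing robots.txt is fine
print(may_crawl(403, "", page))  # False: the ISP's 403 blocks the whole site
```

So an ISP quietly switching its error pages from 404 to 403 flips every user on that host from "fully crawlable" to "not crawlable at all", with no change on the webmaster's side, exactly the failure mode described above.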

If you disregard the PR0's which are really just uncrawled pages/sites, that still leaves more than one type of penalty. Some expire with time and some are permanent.

"Why is it that the same cross-linking penalties never seemed to be applied to the big network of sites like internet.com?" Both this question and the last question assume that penalties are for cross-linking, which isn't always the case. We're certainly aware of internet.com and the massive number of different domains it has for different topics. Our algorithms try to take all those types of situations into account.

"Why doesn't Google just ignore internal/cross links?" Ignore is a strong word--it means you're setting aside information completely. If there are cross links, that can be useful information in good or bad ways, and we try to take that into account.

"What does Google have planned for the future in terms of improving their response times to email?"
That's a tough question, because I'm not too familiar with our user support, but I'll tackle it. (The short answer is that I don't know.)

The longer answer is that we do get thousands of emails each week, and we have a finite amount of resources. The people who answer email have to prioritize. If we get an email because a business left a list of credit cards lying out where any browser can see them, that's a pretty high priority. On the other hand, take an email like "I used to have pages in Google, and now I don't. What's up?" As I mentioned up above, the majority of those emails have nothing to do with Google penalties. Often the changes have nothing to do with Google, even. "I changed my site to all-Flash, and my boss won't let me use any text links at all on my site. Why don't you crawl all my pages?" ;) I believe that we get scores of emails like that. Our user support people simply can't lead people one-on-one through diagnosing why their site doesn't get fully crawled--that's a pretty non-scalable approach. A more scalable approach would be to spend resources on better crawling (e.g. crawling Flash pages), or on enriching the webmaster pages that we provide.

Finally, you get to the "Have I been penalized?" emails. Sometimes the emails are just because a domain didn't get crawled, and sometimes the email comes from a professional spammer trying to get more information. That's a tough problem, right?

Okay, WebGuerilla, that's a pass at answering broader questions. Let's go back to the beginning of this ill-will thread, and trade places with me for a minute. I'm going to mention thayer's domain from his profile--if a moderator needs to snip it that's fine. We've got
- multiple domains selling the same thing
- 8 domains with cross-links from every page
- links from zeus (search for hghcompany zeus to bring up a couple zeus sites)
- at least one other piece of data which I can't talk about--sorry
Given all that, a bot might be suspicious, right? But hghcompany.com doesn't have a PR0 penalty. It's got a PR of 2.

Now I'm not saying that hghcompany didn't have a higher ranking sometime before. But Google is always adjusting our algorithms--that doesn't mean that a penalty has been imposed. I never want to see Google generate ill-will, but sometimes SEOs/webmasters see penalties where there are just changes or variations in what we crawl.

Hope that helps clear up things a little. Again, this is all just my personal take. thayer, I will give your site a full run-down (in a good way, don't worry :) ) to see if we're doing anything wrong with respect to your site.


 6:09 pm on Jun 9, 2002 (gmt 0)

From a 2002-05-31 thread, GoogleGuy said:
This is a fun discussion, but Doofus and Mikael, you should know that we would never hand boost a result. My own mother couldn't get a boost in the rankings. And don't think she hasn't tried.

Actually, I almost believe this to be the case. They never tweak a site UP by hand, but they sure slam them DOWN, both by hand and automatically. When your site gets penalized unfairly, there's no way to get it corrected except to hope that it corrects itself with whatever algorithm happens to be flying on the next crawl.

Looking at the Google directory, at this URL, which has a PR of 7:


Categories under that level, followed by (number of sites) and (PR X)

Advertising and Banners (45) (PR 4)
Banner Exchanges (207) (PR 5)
Branding (28) (PR 5)
Ezines (22) (PR 6)
Free Classifieds (77) (PR 6)
Free For All Links (29) (PR 5)
Guides (32) (PR 0) ZERO!
Link Popularity (10) (PR 0) ZERO!
Pay For Traffic (9) (PR 0) ZERO!
Press Release Services (40) (PR 5)
Reciprocal Links (30) (PR 0) ZERO!
Search Engine Submitting and Positioning (1227) (PR 6)
Tips and Tricks (32) (PR 0) ZERO!
Traffic Exchanges (37) (PR 0) ZERO!

Does anyone detect a pattern here?

Where does it end? Let's ask Google for a PR 0 for the Republican Party's favorite issues, and a PR 7 for the Democratic Party's favorite issues!

How would that be different from what Google appears to have done in this directory category? Apparently some of the listees were flagged as bad neighborhoods. It seems to me that this is proof that your page can get damaged by simply listing a site that Google doesn't like. The only other interpretation is that Google gave some of their own directory categories a PR0 penalty because they're closet masochists.

Google has certain opinions about the sort of site they want to see on the web.

Microsoft has opinions about the dangers of open source software, or the threat to innovation when software monopolies are sanctioned, and pays "independent" think tanks to spew out bogus "objective" studies on the matter.

Where do you draw the line on monopoly behavior? Remember, it's not illegal to be a monopoly, but it is illegal for a monopoly to use its position in an anti-competitive manner.
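The "inherited penalty" worry earlier in this post is really a claim about how rank flows along links. As a rough illustration only (a toy power-iteration sketch with made-up page names - not Google's actual algorithm, and with no penalty logic at all), here is how PageRank propagates through a small link graph:

```python
# Toy PageRank power iteration over a hypothetical 4-page graph.
# This only shows how rank flows along links; PR0 penalties are
# applied outside any such computation.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a small base share of rank.
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Split this page's rank equally among its outlinks.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A directory page listing three sites; only siteA links back.
graph = {
    "directory": ["siteA", "siteB", "siteC"],
    "siteA": ["directory"],
    "siteB": [],
    "siteC": [],
}
ranks = pagerank(graph)
```

In this toy graph the directory ends up with more rank than the sites it lists, because siteA links back to it. Notice that a penalty like PR0 cannot emerge from this computation - it has to be imposed on top of it, which is exactly why its spread through a directory feels arbitrary to the posters here.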


 6:17 pm on Jun 9, 2002 (gmt 0)


<<at least one other piece of data which I can't talk about--sorry>>

As usual, thanks for your insight, GG.


 6:39 pm on Jun 9, 2002 (gmt 0)

My 2 cents worth in an interesting thread...

Firstly, GG has a pretty tough job walking a fine line here where there are some pretty savvy people looking for any clues as well as people genuinely looking for answers to their problems. As one of the predators ;) I think GG has done a pretty good job of communicating without giving too much away.

I think that it is only human nature to blame someone else if your site disappears - even if (in your heart) you have a shrewd suspicion as to why it occurred. However, what I really think causes the feeling of ill will is seeing, update after update, more and more people getting away with blatant cross-linking, hidden text, etc., etc. This can be pretty upsetting when a site owner has really tried to clean up their act and still remains knocked out with a penalty.

I appreciate that Google is trying to automate the spam filters, and that they can be tightened too much (as we saw). But a level playing field might leave webmasters feeling less offended, if they could see that the more blatant 'spam' is quickly filtered out. I sometimes think that Google is more interested in the fancy and more esoteric 'tricks' we professional SEMs (search engine manipulators) may get up to, rather than concentrating on the basic 'let's fool the engines by putting 200 words in hidden text..' type deals that can sometimes flood their index.


 6:46 pm on Jun 9, 2002 (gmt 0)

Doofus, I think you need to go past the directory and look at the individual sites to see an overall pattern. The directory links to some sites that are pretty blatant with "spam".

By association the rest are too. I would recommend that anyone in the DMOZ category who doesn't have PR0 on all their web pages remove themselves from the directory categories afflicted with PR0, and an improvement will occur.


 7:27 pm on Jun 9, 2002 (gmt 0)

Googleguy's response seems to apply to the "big business" sites - but what about us small guys? I still don't have a clue about my site: it did come back from a PR0 to a PR2, but all the internal pages are PR0, it shows no backlinks at all even though I have some links from pages of PR4 and above, and it is still essentially buried in Google. To my knowledge I don't have any spam and I don't link to bad neighborhoods. I do have the same internal links on every page, but why would that be penalized? I've really worked hard on my content and feel it is pretty good.
I'm about to bar Googlebot altogether to see if that would help, but for now I've decided to wait one more month. It just doesn't seem fair to have to go to all the work of building a new domain. I've worked for 3 years to get where I am and have a loyal customer base plus good listings on the other search engines. That is, I do until Google takes over the whole world - and I'm worried about that.
Ok, I'll stop whining - it's just so doggone frustrating!


 7:35 pm on Jun 9, 2002 (gmt 0)

Blimey GoogleGuy, good reply. Boy, would I love to see the Google linking structure screen - no doubt you can see skeletons in many Webmasters' cupboards..

Jan: you raise many good points. It could be argued this is a flaw in the Google PageRank system. Big companies (a la Immune About) get a massive rank through complex multi cross-linking (ooh!), while smaller business sites don't get big ranks, because in the cut-throat world of small business many people don't want to lose any trade via outside links (so they often refuse reciprocal linking suggestions!)

The Contractor

 8:04 pm on Jun 9, 2002 (gmt 0)


I agree with you: <<GG has a pretty tough job walking a fine line here where there are some pretty savvy people looking for any clues as well as people genuinely looking for answers to their problems. As one of the predators I think GG has done a pretty good job of communicating without giving too much away. I think that it is only human nature to blame someone else if your site disappears - even if (in your heart) you have a shrewd suspicion as to why it occurred>>

Doofus, you quoted earlier in this thread what "SmallTime" stated: <<Doofus, 3 semi-mirror sites, 88 doorway pages, seems like you have answered your own questions. I suspect you would be better off with one good site, no doorway pages. It is fine to gamble, just don't complain when you lose>>

If you are blaming Google for what you perceive as a penalty, you're barking up the wrong tree. You should know better than that, and it's your own fault.

I believe we all get nervous when we hold decent rankings and then all of a sudden the bottom falls out. I think GoogleGuy basically stated, in very short terms: sometimes there is a "burp" at our end and sometimes it's at your end - and that probably accounts for 90 percent of the cases where sites drop out in any given month. One of my sites did this earlier this year and I didn't touch it, as I was confident I was not doing anything wrong (I was still nervous though). The others that are hit with a real "penalty" should already know why ;)

My 2-cents


 8:17 pm on Jun 9, 2002 (gmt 0)

>Let's go back to the beginning of this ill-will thread

I don't think it is an ill-will thread; it's a thread that is addressing the issues and trying to prevent ill will from developing. The clearest thing to me is that the current "policy" of simply ignoring emails is not working, and that needs to change.

>at least one other piece of data which I can't talk about--sorry

I hate it when you say that :)


 8:19 pm on Jun 9, 2002 (gmt 0)

fathom says:
Doofus, I think you need to go past the directory and look at the individual sites to see an overall pattern. The directory links to some sites that are pretty blatant with "spam".

Exactly. That's why I said that "some of the listees were flagged as bad neighborhoods."

These bad neighborhoods may be evil, they may merely be bad according to Google's definition of "bad," and some of them could even be innocent. I haven't studied how many PR 0 sites are listed in each category.

But that's not my point. There's a Google directory, and it has subcategories that list sites. When one or more "bad" sites are listed in Google's subcategory page, the Google page itself inherits the penalty.

Now you can argue that Google is guilty, and certain of Google's subcategory pages deserve their PR 0 because they were negligent and listed some "bad" sites.

Okay, then why not take it one step further and argue that the directory above these zero subcategories is negligent, because listing PR 0 categories is reprehensible. Yeah, let's give that directory a PR 0 also!

Hey, why stop there?

Let's go up the ladder and give the entire Google directory a PR 0. And since it's derived from DMOZ, let's give the entire DMOZ directory a PR 0 also!

Do you see my point? The PR 0 penalty can act like a ... <scratches head> ... like a VIRUS. That's what I'm complaining about.

How do you stop a virus? I know how Google stops one. They say, "Well, it may be penalized, or it may not. Hard to say. Let's see what happens on the next crawl."


Contractor, you're seeing only what you want to see in my posts. I said that two of the three mirrors were disallowed to Google, and the third one involved only a few essays that I don't care about.

My complaint was that Google violated my disallow on one of the mirrors, and I was being punished for duplicates that should have been invisible in the first place.

I also said that I was doing this before Google existed, and there are good reasons for me to continue doing things this way.

Finally, my 88 doorway pages are for a tax-exempt nonprofit, which carries no advertising, and these doorway pages are the only way Google can get into my data. I have every indication from Google that they want my data indexed, as much as I want them to play by their own rules.
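For context, the disallow being described works through the robots exclusion convention: a hypothetical robots.txt placed at the root of a mirror host, telling all compliant crawlers to skip everything on it, would look like this (host and paths are assumptions, not Doofus's actual file):

```
# Hypothetical robots.txt at the root of a mirror host.
# Asks all crawlers, Googlebot included, to fetch nothing here.
User-agent: *
Disallow: /
```

A compliant crawler that fetches this file should request no other URL on the host - which is why the mirror's duplicates were expected to stay invisible to Google in the first place.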

[edited by: Doofus at 8:31 pm (utc) on June 9, 2002]

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
© Webmaster World 1996-2014 all rights reserved