| 11:42 pm on Oct 23, 2002 (gmt 0)|
It can't be automatic, because then WebmasterWorld would have been hit with the penalty, too.
GoogleGuy mentioned at one point that they dislike hand-penalizing sites but will do so; they'd rather develop an algorithm to handle it. Given that WebmasterWorld has kept its PageRank for a while now, if there is an 'automatic' filter, it's more intelligent than that. :)
Does that help? Personally, I wouldn't lose sleep over linking to a site with bad PageRank. You should really be linking for other reasons anyway: similar content, trading traffic, offering your visitors things that your own site doesn't provide but another does, etc.
| 11:58 pm on Oct 23, 2002 (gmt 0)|
Thanks Jeremy. It does help, but I'd like to know for certain if possible. The example I gave does exist, except that it's a forum I'm associated with, not this one. If Google automates the process, it will do harm. The web shouldn't be like that. Manual penalties are OK when applicable, but this type of automated penalty is not.
| 12:12 am on Oct 24, 2002 (gmt 0)|
I don't think they would automate it, given the 'house of cards' effect it could cause in their own db.
WebmasterWorld gets penalized
DMOZ.org gets the same....
Yahoo.com gets the PR0...
and on, and on, and on....you see where I'm going :)
They can't automate it, it's not in their own interest.
| 12:36 am on Oct 24, 2002 (gmt 0)|
I'm going there with you Jeremy. You're right - they couldn't automate it. Thanks for taking the time :)
| 12:54 am on Oct 24, 2002 (gmt 0)|
I disagree, it would be easy to automate it.
Just remember that you do not have to receive the same penalty for linking to a penalized site as the site itself has. Linking to a PR0 does not give you a PR0, but it might give you a -0.5 PR.
What will matter is how you link to it (specific link exchange code) and how many bad sites you link to (more than 10% of your links).
| 2:51 am on Oct 24, 2002 (gmt 0)|
I'm with sasquatch. Like most Google "penalties" that often apply in "very grey areas", I'm more and more confident that automated penalties are easy, but they would only kick in in very obvious cases.
We work on the assumption that a few links to PR0 sites here and there would not be a problem, but that 10% (or even more) would cause a problem. There's no way we can search through around 1,000 outgoing links on our site, created over 5 years, to check whether any are now PR0 or in a bad neighbourhood.
Same with crosslinking, I think. A bit is OK, but crosslinking from every index page (or every page) among more than 10 domains may cause an auto-penalty.
| 3:15 am on Oct 24, 2002 (gmt 0)|
I agree that auto-penalties would be easy to write into a program but that isn't the question. The question is *are* they written in for sites linking to penalised sites, or are they manual. The evidence seems to be that they are not automatic or, if they are, the penalty for linking to penalised sites is so small as to be hardly noticeable.
I have good reason to believe that crosslinking penalties are manual, but that's a different subject.
| 3:52 am on Oct 24, 2002 (gmt 0)|
|Suppose a few people in this forum have penalised (PR0ed) sites, and there are links to them in their profiles. Google spiders the profiles and will find that this forum links to some penalised sites. |
One would hope the fine folks at WebmasterWorld would send the link to the profile through their cgi script thingy and do that voodoo that they do to keep out Googlebot.
Yup, looks like that's what they do, and presto, no PR on the profile page... unless that is a penalty. ;)
| 4:28 am on Oct 24, 2002 (gmt 0)|
Google is all about automation. That's why they hire so many PhD engineers.
For instance, while Yahoo uses human editors to pick and post to their news service, Google has a completely automated news service. No humans at the wheel. None.
Google is all about intelligent software. As jeremy noted, GoogleGuy stated that Google prefers to study a spam report and figure out how to eliminate a particular type of spam, rather than a specific instance of spam.
Makes sense. You can kill a locust with a fly swatter, but in a swarm, it makes more sense to fly over them with a crop duster and spray them with a bio-engineered chemical that renders the locust (and not the butterflies or bees) impotent. Then they drop and die.
| 6:02 am on Oct 24, 2002 (gmt 0)|
Google would not automate something like that because it could be too widespread (too broad).
Go look at the Touchgraph Java Google webmapper [webmasterworld.com]. That is probably similar to what Google has to work with in-house. From that, it is easy to find domain farms that are interlinked, and from those, spam sites. That's where who you link to becomes important, and hand editing of the offending farm is in order.
| 7:16 am on Oct 24, 2002 (gmt 0)|
|Touchgraph Java Google webmapper |
Never heard of this; evidently neither has Google. What is it? Where do we take a look at it?
| 7:37 am on Oct 24, 2002 (gmt 0)|
|Touchgraph Java Google webmapper |
| 9:42 am on Oct 24, 2002 (gmt 0)|
I still have a few PR0 sites around, one of which is linked to by another site I have, which more or less recovered from PR0 a few months ago. The site that links to the PR0 site does in fact have PR, and did recover from PR0 even though that link existed at that time.
| 10:58 am on Oct 24, 2002 (gmt 0)|
The short answer to my question is that we don't know whether such penalties are automatic or not. Google's search coverage is now so great that they have responsibilities to surfers and websites as well as to themselves - morally, at least. To automatically penalise a site for linking to penalised sites would be grossly irresponsible and, in a perfect world, wouldn't be allowed.
But we don't have a perfect world and, like most other businesses, Google is unlikely to recognise its moral responsibilities if it thinks they conflict with its own interests. So I guess it's a case of suck-it-and-see. In the forum case, a manual penalty would not be applied, but an automatic one probably would. I've no intention of changing anything for the sake of Google, because that's not how the web should work. I'll wait and see what happens.
Incidentally, I could have sworn that the toolbar showed a PR value for the profile pages here. Now it shows grey. Have the links or pages been hidden from spiders in the not too distant past, or is my memory worse than I thought :(
| 11:13 am on Oct 24, 2002 (gmt 0)|
When Google introduced that PR0 penalty around the beginning of this year, it seemed like an automated profiling clampdown that swept up a lot of innocent sites too.
As I remember it, only after a great uproar did GG go in and investigate specific sites.
That makes me think it is automated.
| 12:44 pm on Oct 24, 2002 (gmt 0)|
> ...if there is an 'automatic' filter, it's more intelligent than that
I think that's the case. Sometimes a category at DMoz.org gets a penalty, but not the whole domain. It seems to happen only when it links to some proportion of penalised/banned sites.
I tend to think in terms of:
* A gets banned by hand (along with all pages on the domain)
* B gets PR0 from linking to A
* C gets the 'can have PR but not pass it on' penalty from linking to B
...but this is just a very crude model; it's much more complicated. Note that B and C are pages, not whole domains. I'm not sure about A - can domains get automatic penalties?
Like Chiyo and Brad, I believe the cross-linking penalties from the end of 2001 to be automatic, and also the guestbook penalty from this spring.
> I'd like to know for certain if possible
I think this puts us at odds with Google. It's in their interest to keep some mystery around penalties. Partly it keeps people from spamming in a penalty-avoiding manner, and partly the fear helps to convince people not to spam.
I'm pretty sure that human penalties used to be complete bans, while PR0 penalties were automatic. I wouldn't like to predict either way on what the current status is.
| 1:53 pm on Oct 24, 2002 (gmt 0)|
>I think this puts us at odds with Google. It's in their interest to keep some mystery around penalties. Partly it keeps people from spamming in a penalty-avoiding manner, and partly the fear helps to convince people not to spam.
Yes, it is in Google's interest to keep us in the dark about the way their penalties work, but it isn't in their interest to automatically penalise sites in all circumstances. E.g. if the profile pages in this forum were indexed, as they would be if steps were not taken to avoid it, and some of them contained urls that were later penalised, it wouldn't be in anyone's interest to penalise this forum or any part of it. There's a huge difference between intentional linking and the links in the profile pages. And there are many other examples where OBLs (outbound links) could link to penalised sites but shouldn't be penalised.
I've no objection to manual penalties, but automatic ones for OBLs are surely wrong.
| 4:44 pm on Oct 24, 2002 (gmt 0)|
|I think this puts us at odds with Google. It's in their interest to keep some mystery around penalties. Partly it keeps people from spamming in a penalty-avoiding manner, and partly the fear helps to convince people not to spam. |
I just found this mixed in with the letters on my keyboard.
From the new dictionary
Search Engine Optimisation - The act of spamming in a penalty-avoiding manner. ;)
| 4:54 pm on Oct 24, 2002 (gmt 0)|
> Search Engine Optimisation - The act of spamming in a penalty-avoiding manner;).
Robot Usability - The act of helping search engines to list a site under relevant phrases;).
| 5:12 pm on Oct 24, 2002 (gmt 0)|
Consider this possibility for what the penalty for linking to a bad neighborhood might be.
Site A is a member of a link-exchange bad neighborhood. It gets a PR0 penalty.
Site B is a forum that has member profiles, one of which has a link to Site A.
The member also posts his URL in his sig on the forum.
You do not automatically become part of the bad neighborhood just by having a link to it, but you do become eligible for a penalty on the *page* that links to the bad neighborhood. The penalty will only apply to that one page.
Now we will make up a penalty:
CPR = current PR of the page on your site
blinks = fraction of the page's links that point to the bad neighborhood
NPR = new PR after calculating penalties
NPR = CPR * (1 - blinks)
So the member linking to Site A will cause the PR to drop on his profile page and on any of his posts; it will only affect other pages on Site B to the extent that those penalized pages are not spreading as much internal PR around the site.
Of course if your links to bad neighborhoods pass a certain threshold value on one of your pages, there is a good possibility that you will be automatically included in a bad neighborhood.
Again, that is only one way that they *might* do it.
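That made-up penalty is simple enough to sketch in a few lines. This is a toy illustration only - the function name and the example figures (a PR5 page with 3 bad links out of 33) are invented, not anything Google is known to use. Note that for a few bad links to have the "negligible effect" described below, the new PR has to scale with the fraction of *good* links, i.e. NPR = CPR * (1 - blinks):

```python
def penalized_pr(current_pr, bad_links, total_links):
    """Hypothetical penalty: new PR = current PR * (1 - fraction of bad links).

    Purely a made-up model for discussion, not Google's real algorithm.
    """
    if total_links == 0:
        return current_pr  # no outgoing links, nothing to penalize
    bad_fraction = bad_links / total_links
    return current_pr * (1 - bad_fraction)

# A PR5 page with 3 bad links out of 33 takes only a small dent...
print(round(penalized_pr(5.0, 3, 33), 2))  # 4.55

# ...while a page whose links are ALL bad drops to PR0.
print(penalized_pr(5.0, 33, 33))  # 0.0
```

Under this model the penalty stays proportional: a handful of stray links barely matters, and only a page saturated with bad-neighborhood links gets wiped out, which matches the threshold idea above.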
| 7:21 pm on Oct 24, 2002 (gmt 0)|
That sort of calculation is a possibility - they may do something like that. But the more I think about it, the more I'm convinced that *any* automatic penalty for linking to a bad neighbourhood or penalised site is simply wrong, no matter how many links there are from a site to the bad area.
In the forum that I'm associated with, we allow urls in sig lines and in posts. If a person with a bad url makes a post, s/he is quite likely to have a discussion in the same thread, just like I am doing here. Judging by the opinions here, the number of links will build up to a point where a site-wide penalty is incurred - especially if the same person also has discussions in other threads. That scenario cannot be right.
Somebody may say that all we have to do is ban urls in sig lines and posts and we will be OK. But no site where visitors can post messages should have to take those kinds of steps. Websites should be able to carry on naturally without any thought of PageRank and penalties, and they should never be penalised for doing so. I've no doubt that Google would agree with that. So, without genuine artificial intelligence, *any* automatic penalty based on where a website links to cannot be right. If it isn't manual, they are doing it wrong - in a moral sense.
| 7:58 pm on Oct 24, 2002 (gmt 0)|
|I don't think they would automate it, given the whole 'house of cards' potential they could cause to their own db. |
WebmasterWorld gets penalized
DMOZ.org gets the same.... (BK says: This happened, see [webmasterworld.com...] and ask whether you think Google applied a PR0 to DMOZ by hand)
Yahoo.com gets the PR0...
This is a great argument that ends the debate only if you assume that the PRzero penalty is exactly that - either normal rank or no rank.
However, SearchKing has been well documented and went from PR8 to PR4. Not zero, but still a major loss of PR.
GoogleGuy may well have hinted that the PR penalty is done by hand, but I doubt it usually is. Remember when the PR penalties first became a major issue? Remember GoogleGuy apologising that they had tweaked the filter too high? How can that happen by hand?
I believe that the majority of the PR penalisation is automated. At least in terms of Googlebot flagging thousands of sites each day, which perhaps a human then has to select to penalise or ignore (not always with enough time to be as careful as we'd hope).
I believe that the big troubles with PR penalties months ago (well documented here and in every other relevant forum around) were a clear sign that the penalties were automated. I believe we all witnessed them tuning the algorithm more carefully based on that feedback.
| 9:13 pm on Oct 24, 2002 (gmt 0)|
My example was just using PR0 as the example penalty for the bad neighborhood itself. The penalty I referred to for linking to a bad site would match your example.
PhilC, how many other links are there on a forum page where someone could put their sig? I just checked the page I am entering this on, and there are 33 links on it. In the example I gave, even if 3 of those links were bad, it would have a negligible effect on the PR of this page.
The links are probably also discounted on pages where they are repeated.
Unless someone is able to fill your page with bad links, your site is likely to be safe.
You may think it's wrong to have automatic penalties, but they do not have the staff or the money to impose all manual penalties.
| 10:12 pm on Oct 24, 2002 (gmt 0)|
Phil said, "Somebody may say that all we have to do is ban urls in sig lines and posts and we will be OK. But no site where visitors can post messages should have to take those kinds of steps."
Agreed. Whether or not to allow sig lines should be at the discretion of the Forum owner, not due to perceived or real threat of PR0 ramifications.
If linking to PR0 sites causes a negative domino effect or something similar to that, does this mean we're put into the position of censoring sig lines? If I were to remove my blasted Google Toolbar, I'd never know of any potential risk. And even if I use the bar and see a page/site I want to link to, I'm going to chuck Google's prejudice and vote for the link.
Google has to have the expertise to add variables that take into account the IP, the text around the link, the URL associated with the link, and the source of the link. If/and statements can make things fair and logical, so that a sig-line link, or a link to a page in an article or newsletter, isn't going to wreak havoc on the integrity of an entire site or those that link to it. It should be easy to automate the hunt for true "offenders".
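The kind of if/and logic described could look something like this toy sketch. Every signal name here (link context, same-IP-block counts, surrounding text) is invented for illustration - this is a guess at the general shape, not Google's actual rules:

```python
# Toy rule-based check for whether an outbound link looks like part of a
# link scheme rather than ordinary editorial/user content.
# All field names and thresholds are hypothetical examples.
def looks_like_offender(link):
    # Sig-line, article, and newsletter links are user/editorial content,
    # not an endorsement of a link exchange - never flag them.
    if link.get("context") in ("signature", "article", "newsletter"):
        return False
    # Many links into the same IP block suggests a crosslinked domain farm.
    if link.get("same_ip_block_links", 0) > 10:
        return True
    # Link-exchange boilerplate around the anchor is a classic footprint.
    if "link exchange" in link.get("surrounding_text", "").lower():
        return True
    return False

print(looks_like_offender({"context": "signature"}))      # False
print(looks_like_offender({"same_ip_block_links": 25}))   # True
```

The point of such rules would be exactly what's argued above: the *context* of a link decides whether it counts against you, so a forum sig or newsletter mention gets a pass while obvious farm footprints get flagged.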
To put your mind at ease, Phil, I've decided to attend the next Google sponsored dance and flirt with a cute Google man employee until he surrenders all his knowledge to me.
| 2:37 pm on Oct 26, 2002 (gmt 0)|
From what I've learned in this, and another thread, I've concluded that the penalty for linking to 'bad' pages and sites may be automatic or it may be manual. If automatic, it may be triggered for just one link or it may be triggered when a certain number of links are counted. I think it's all pretty clear ;)
Anyway, to avoid any possible penalties, I've been forced to modify the forum so that spiders can't see most of the urls that users can post or place in their signatures. But we dislike having to do that. The freedom of websites is seriously eroded by having to fit in with a search engine's desires, and that can't be right.
I know that Google is only trying to protect the integrity of the serps, and that it can be difficult to get it right all round, but it's coming to something when a site on which users can post content has to hide genuine links from Googlebot, just in case someone posts a link to a 'bad' area.
The alternative is to do what this forum does and disallow everything, but that's much too great a constriction.
| 3:35 pm on Oct 26, 2002 (gmt 0)|
PhilC, I think you've just summed it up:).
Keep in mind that the WebmasterWorld Google News forum has a higher than average proportion of penalised people (because people come here to find out why they've been penalised).
Unless your forum is about reciprocal link networks, gambling affiliate programs or some other Google-dangerous activity, I wouldn't be especially worried.
| 3:48 pm on Oct 26, 2002 (gmt 0)|
I had a PR0 site a few months ago (who knows why), but I also linked to two other web sites, and they got a PR0 too.
| 5:50 pm on Oct 26, 2002 (gmt 0)|
Did you get your PR back? How long does it take to regain lost PR?
| 6:27 pm on Oct 26, 2002 (gmt 0)|
liana, I got my PR5 back the next month.