Welcome to WebmasterWorld
The way I see things now: Six months ago I got "caught" linking in a legitimate business fashion. I have 8 domains, most of them high-content multipage sites, but also one "blue-widget" site whose main page I linked to from almost all the pages of my other domains. The main reason for these links was that I WANTED TO SELL BLUE WIDGETS to those visitors! Why Google could not just stop counting links (from a particular domain, beyond one or ten or whatever number Google might select) is beyond me. Instead, Google seems to have opted for being judge, jury, and executioner and given my "blue-widget" site a permanent ban.
Why is the ban for linking patterns I might have had six months ago permanent? Such permanency feels arbitrary and capricious given that other bans have been only temporary. Yes, I know that Google needs to protect their results; but when Google installs filters that effectively (and permanently) kick sites out of the database for reasons such as the above, Google would seem to create a reservoir of ill will that no company can long afford. And it is so unnecessary when a temporary ban would likely serve as well.
I would also like to know the answer to what WG has asked, especially:
<<Why is it that the same cross-linking penalties never seemed to be applied to the big network of sites like internet.com?>>
I have seen a certain site that has held the top position for certain keywords and is the largest spamming, cross-linking "farm" I have found to date. I noticed this site more than two years ago in Google, and it is so blatant throughout its mirroring and cross-linking, yet nothing has been done. Is it because it probably has 100K backlinks that Google and others do not punish it? I really ignore the site, and I'm sure most users do, since it has very little useful content.
There is also another site that is a single page of exactly "101 links". It has a PR8. How can this be? One page for the whole domain, with 101 links, and it gets PR8?
I agree that Google needs to address why certain sites can get away with this and others might as well pack up the farm after making a stupid mistake.
Although Google has been my favorite SE for a few years, I must admit I wonder all the time how stable its algo and filters are when seeing the above :(
What I don't understand is how a big site (lots of pages) that's just starting up (and may only have a few external links to the top page) avoids looking like a link-farm.
<<What I don't understand is how a big site (lots of pages) that's just starting up (and may only have a few external links to the top page) avoids looking like a link-farm.>>
Google's algo can define the difference or every "directory" style site would be considered a link farm ;)
Everything came through quality and word of mouth.
Now they are facing normal large company issues because of their importance and they are falling behind.
Well, any company would have trouble answering 20,000 emails a week (or whatever the number is). However, whenever my company gets a Nigeria-spam mail through Yahoo's or Hotmail's service, we report back to them, and we always get an (automated) answer within 24 hours.
Google should start answering every email, whatever the content, automatically, immediately. This requires no special resources.
In this email they should lay out the number of emails they get a week and how difficult it is to answer them all. People do not mind bad news; they hate no news.
They should also state in this email whether a personalised answer will be returned, and within what time frame.
In this email they should offer links to helpful pages on their own site.
On these "helpful pages" they should give more examples of penalties. Is there a normal PR0? or is PR0 only for penalised pages? If they both exist, why not differentiate or explain?
PR0-penalised webmasters now have nowhere to go but to these types of forums.
And the problem with those PR0 threads is that few of the penalised explain explicitly what they have been fiddling with (if they know, or show their site here). If Google showed more examples of wrongdoing on their pages, WMW could also be more helpful to the repentant wrongdoers. A simple "do not participate in link exchanges" on the Google page is, e.g., less explicit than telling people not to link to bad neighborhoods.
The two biggest for me are summarized very well by WG:
>>How long should one expect to ride the pine if they have been penalized for bad linking? <<
>>Why doesn't Google just ignore internal/cross links? <<
For the latter, this is an issue I have raised before. Google had EVERY right to protect the quality of its returns. In doing so, however, it had a choice: whether to act in line with its long-standing 'ethical' and 'middle way' philosophy, or whether to take punitive INK-type measures.
The former is one of the qualities that has won it so many friends and helped its popularity crank up to current levels. The latter has always proven counterproductive in the long term to every SE that has employed it.
In this case the two choices, with respect to links it views as 'suspect', were simply to ignore the links for PR purposes, or to zap the sites totally.
The former route, just to ignore any link it views as potentially problematic, would have been universally accepted. The route it chose, zapping whole sites, has caused havoc and quite a bit of resentment.
In my opinion it was quite simply the wrong choice. Maybe it was a quick fix... if so, it really is about time it was addressed.
It is not just an issue of throwing totally innocent sites out of the index, but also one of dishing out punishment totally out of proportion with the offense. In many cases, possibly most cases, the bogus linkage would have simply been clumsiness... remember, people have always been told that linking is a good thing. Sticking links in right, left and centre is quite often just misguidance, not a serious attempt to distort Google returns.
To only have one level of punishment here seems quite wrong. In my view, attacking the actual problem, the links themselves, would have been a FAR better option.
Why could Google not just ignore the presence of any link(s) it didn't like? In that scenario, the more dodgy links and dependency upon them, the more the impact of the Google filter. However, the terrible sight of very good sites missing just because of bad linking strategy would not occur.
The end result for some has been disaster, and I really do believe that this is a big issue for Google. It has two constituents: the people who search and the people who create and maintain the sites. BOTH must be treated with respect, and the balance between the two really does have to be carefully managed.
Google doesn't make many mistakes, but the PR0 issue is certainly one of them. On past record, Google will probably address this... but it is certainly not rushing.
>>Why doesn't Google just ignore internal/cross links? <<
I would guess Google does a lot more in the way of ignoring cross-links than expected. I just think they add an extra penalty for joining link set-ups. This penalty could be a form of negative pagerank inheritance until the links are removed. If this is not the case, Google definitely should do something along these lines.
First example: small set-up
You start four sites all extensively linking to each other with just the occasional extra incoming links from elsewhere.
Google ignores the links and adds a small penalty - you end up with nearly no links so you are PR0.
Second example: bigger set-up with many more external incoming links.
Your sites are still linking extensively towards each other, but they are also each earning their own incoming links (pagerank). Google also ignores these links towards each other, even puts in a penalty, but this is overcome by all the other incoming links, so you still show pagerank.
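The two examples above can be sketched as a toy model. To be clear, these are purely hypothetical mechanics with made-up numbers, my reading of the speculation in this thread, not Google's actual algorithm: cross-links within the set-up contribute nothing, a flat penalty is subtracted, and only external incoming links add score.

```python
# Toy model of the speculated cross-link handling -- NOT Google's real
# algorithm, just the hypothetical mechanics described above.

def toy_score(external_links, internal_cross_links, penalty=1.0):
    """Each external link adds a point; internal cross-links add nothing,
    but any cross-linking triggers a flat penalty. Score floors at zero."""
    score = float(external_links)      # cross-links are ignored entirely
    if internal_cross_links > 0:
        score -= penalty               # the speculated extra penalty
    return max(score, 0.0)

# First example: four sites linking only to each other, one stray inbound link.
small_setup = toy_score(external_links=1, internal_cross_links=3)
print(small_setup)  # the penalty wipes out the lone external link -> 0.0

# Second example: same cross-linking, but each site earns many inbound links.
big_setup = toy_score(external_links=40, internal_cross_links=3)
print(big_setup)    # external links overcome the penalty -> 39.0
```

Under this model, the small set-up ends up looking like PR0 while the bigger one still shows pagerank, which matches the behaviour described above.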
I've been running things this way since before Google got into the game. None of the sites has any advertising; it's all nonprofit. There are reasons for doing things this way, and the only problem with it today comes from the fact that Google doesn't like it. Two of these three sites are disallowed to Google (or supposed to be), and on the third it doesn't matter if a couple of files get removed by Google. Other spiders don't seem to mind in any case.
I've come to expect that Google will zap the duplicate pages from the site (or page) that has the lowest PageRank. But lately, I'm not so sure.
One of these sites has a low PageRank and duplicate pages, simply because I have this grandfathered account from the old home.sprintmail.com that gives me 10 megs of free Web space. The account itself is a dollar per hour with no monthly minimum, and I only use it for backup access. The whole thing runs me maybe a dollar a month.
This website has a "home.sprintmail.com/~user/" address -- sort of like the geocities situation. I have a dickens of a time using robots.txt because I have zero access to what most bots see as the proper place for a robots.txt, which would be home.sprintmail.com. I also have no access to server logs, which means it's difficult to monitor what's happening.
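Part of the trouble is that the robots exclusion convention only defines one location for robots.txt: the root of the host. A crawler derives that location from the page URL alone, so a file inside a user directory never gets consulted. A minimal sketch of that derivation (the sprintmail URL is just the example from this post):

```python
from urllib.parse import urlsplit

def robots_txt_url(page_url):
    """Per the robots exclusion convention, crawlers look for robots.txt
    only at the root of the host -- never inside a user directory."""
    parts = urlsplit(page_url)
    return f"{parts.scheme}://{parts.netloc}/robots.txt"

# The file the crawler actually checks for a /~user/ page:
print(robots_txt_url("http://home.sprintmail.com/~user/index.html"))
# -> http://home.sprintmail.com/robots.txt
# A robots.txt at home.sprintmail.com/~user/robots.txt is simply ignored,
# which is why users on shared hosts like this have no standard way to
# exclude bots without the host's cooperation.
```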
Three months ago, I did an urgent Google removal of the entire site. According to their documentation, this was an "inside directory" robots.txt, which would expire in 90 days. I left the robots.txt in place that allowed Google to properly process the removal, and forgot about it.
Guess what? While my back was turned, during that 90-day period, Google came in and crawled all that duplicate content all over again. This was a PR2 site, as I recall. I just noticed this yesterday.
Big deal, you say? Well yes, because 88 important doorway pages on my PR6 site got completely removed from Google during that 90-day period. Google had to decide whether to zap the sprintmail pages at PR2, or zap the main site at PR6. They chose to zap my PR6. It wasn't my fault that Google crawled the PR2 site that had been disallowed to the best of my ability, and had been specifically removed by Google at my request last March.
It makes a big difference when your doorways are on a PR2 site as opposed to a PR6 site, even when the links themselves on those doorway pages point to just the main site. The inherited PR of the target pages seems to have been affected by the lower PR of the doorway page.
So yesterday I put in another urgent removal and got the PR2 site kicked out of Google again. In anticipation of further difficulty on Google's part when it comes to finding and interpreting the robots.txt at home.sprintmail.com/~user/robots.txt, I also put in META NAME="GOOGLEBOT" CONTENT="NOINDEX, NOFOLLOW" on the home page and the other 88 pages. But I may have been too late for the next update.
Why can't Google get its act together with the robots.txt on /~user/ sites? Why have they apparently changed their policy, so that now it isn't always the lowest PR site that gets the duplicate pages removed?
Here's an even better question: Why can't Google explicitly explain their policy, and their behavior, with respect to duplicate pages, and set up some sort of system that will remedy problems that arise? Something like the "urgent removal" system?
Someone in another thread expressed horror at the possibility that a competitor could set up nearly an entire site with your hijacked pages, and end up getting you removed. When it was always the lowest PR page that got removed, the possibility of a disaster such as this was minimized. But lately things seem sufficiently arbitrary at Google so that now I think this could happen.
What recourse would the owner of the pages have? By the time your lawyer sends out a cease and desist to the hijacker, Google has your site removed for 60 days. The bad guy says, "Oops, sorry, I didn't know you'd object," and takes down the pages. GoogleGuy, if you can catch his ear, says, "It should be okay after the next update, or the one after that."
I don't have to tell anyone what 60 days of Google downtime would do to one's future on the Web. We noticed a decline in traffic, but these 88 pages weren't the only basket our eggs were in, so we'll survive. But I can imagine that a clever attack (as opposed to a clumsy accident) could be devastating to any business that depends on the Web.
These may be people who have not been affected at all and don't fully understand, but they hear things that make them worry about being banned by Google.
People who are made to be afraid will either leave or start lashing out. The problem is that, increasingly, there are just no alternatives to a listing in Google, so you can't even leave.
Having to abandon a company domain forever and start over is a terrible price for some small business to pay.
That is when the calls about anti-trust in Internet search will start. If you don't think it will happen, look at Microsoft.
Thank goodness discussions here don't get into hassles like legal talk, that would be counter-productive and defeat the purpose of the resource in the long run. Every search engine is perfectly justified in setting their own criteria and tuning their algo to provide the most relevant results they can. The best will win by reason of public usage because people find what they are looking for.
Whatever Google's done, it seems to be working for both them and the searching public. In spite of a percentage of ill will among those who have been hurt, it doesn't seem possible to argue with the fact that they've managed to accrue a phenomenal amount of good-will; the evidence is that their search is voluntarily being used by a vast multitude of searchers who, after all, have free choice. The votes are in, showing in logs and stats.
I am directly addressing the topic, by explaining the human nature of the situation much in the same way that NFFC has previously in a different way.
>nothing to lose
I don't advocate legalistic or political resolutions to these things at all. I am trying to point out that people might start calling for that sort of thing if they get worried.
I think that would be bad for Google and other search engines, and bad for us all too.
>Every search engine is perfectly justified in setting their own criteria and tuning their algo to provide the most relevant results they can.
Sure they can PR0 if they want. But I thought the question was if in doing so they would create ill will.
I have not been hit by any PR0. Despite reading countless threads here about it I'm still not sure what actually triggers it. I worry, and others must worry about stumbling into it by accident.
Brad (and womensmedia1), your PR0 problem could be really simple, like linking to a site that "is spam", or that is linked to "spam", so that by association you are considered "spam".
A good example (and I hope I'm not breaching WMW policy, I don't think so)
dmoz*org/Computers/Internet/Web_Design_and_Development/Promotion/Link_Popularity/ <added> OOPS! - (don't want the give this page a PR0 now do I!)</added>
I think we would all agree that DMOZ is not in the practice of spamming, but a PR0 category they do have.
Generally, this means that you need to do a lot of homework in advance.
In my spare time I search the web for pages and sites that suit my clients' sites for linking purposes. When I find one that is the right fit (but not indexed in Google) I help them out and submit their URL.
In no way, shape or form would I "ever" link to a site that Google doesn't know about.
This saves a lot of time (and explaining) when all their traffic seems to be going somewhere else and you don't know why!
On a personal note: I would rather have a listing on the first page "with nine sites that are spam" than 9 other competitors that got there through good design.
Spam is good for the trickle-down effect but I will never ever link to it!
Just another deep thought from fathom!
And what about the ethics of sending poor Googleguy here? A few months ago he responded with a number of "you should get off the PR0 soon" to various folks here. But what about the sites he looked at to which he did not speak? His honest words might have been "sorry chump, you hit our one-strike-and-you're-out filter. You are banned forever no matter what you do. We won't tell you about that and let you spend hundreds of hours trying to fix your site and give you words of encouragement to 'build your content.'" The implications of his selective helpfulness and the "everything is automatic" emails have caused thousands of webmasters to spend many thousands of hours of wasted effort.
I know that my searching behavior has changed.. it used to be that I "knew" everything important was in Google.. now I know it is not and my searches frequently do not go via Google.
The total lack of negative response to Google's decision to just not count guestbook links ought to give reasonable people at Google the idea that JUST NOT COUNTING LINKS is the way to go. I think the important question now is whether Google is so fixated on the penalty idea that they are unable to change themselves.
A few months ago he responded with a number of "you should get off the PR0 soon" to various folks here. But what about the sites he looked at to which he did not speak?
Welcome evidence, (that sounds)
In Googleguy's defence, this forum at that time was being plagued by repetitious PR0 messages on every possible thread, whatever its subject. As much as I sympathise with people trying their best to make good, the posters of these messages were not always very clear about what they had and had not done. Also many, at least in the beginning, did not put their penalised pages in their profile listing. This too creates unclear speculation and rumour.
Googleguy offered to check some sites if they were put in the profile listings. He also gave a plain, if not detailed, explanation of what had happened within Google. I see nothing wrong with that. I do see wrong in Google, as a follow-up, not trying to resolve the penalty questions and answers in a more general way.
Googleguy offered to check some sites if they were put in the profile listings.
This is true. There are many people who have, obviously, a lot to hide and be protective of. Many times I wonder if people are being honest to themselves as to why they got the 0 pagerank in the first place. I know I messed up big time in Jan with my crosslinking and I'm still paying the price now - but at least I admit it was my fault...
But yes, it's also true that Google SHOULD reply to the countless e-mails on the 0 rank problem, and I think they will, one day. I send an e-mail every few months but never get any further than the standard automated reply.
Our marketing is our business.
Our advertising is our business.
In any other business venture, we research, plan, and strategize beforehand to see if our decision-making practices are sound.
This is no different. I don't believe that before PR0 happened, people were complaining about their own "business decisions". It is human nature to want to blame someone else when a catastrophic incident occurs, but in reality Google's "good will" was worth the risk!
So accepting that risk after the fact shouldn't be any different.
Google's visitors are more important to me than any other search engine. They are also the most profitable.
Research and planning and a number of test sites to prove beyond a reasonable doubt that I don't overstep Google's definitions is well worth the investment.
"How long should one expect to ride the pine if they have been penalized for bad linking?" There isn't one answer, because there are different types of penalties. The term PR0 seems to mean different things to different webmasters. For example, all of the people on the usenet group who were convinced that they had PR0 really had "Google changes the way it crawls all the time, and we didn't have resources to crawl your site this time", or even more often "Your ISP did something bad." Besides the usual virtual hosting and downtime problems, we've seen ISPs that started returning a 403 instead of a 404. The user didn't create a robots.txt => we got a 403 trying to fetch it => we didn't believe we could crawl the site. Site stops showing up and boom--"I've got a PR0!!" :) Just because a user didn't change anything doesn't mean that their ISP didn't.
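The 403-versus-404 distinction described here can be sketched as the conservative logic a polite crawler might apply when fetching robots.txt. This is my guess at the general mechanics, not Google's actual code:

```python
def may_crawl(robots_txt_status):
    """Conservative robots.txt handling, as commonly implemented by polite
    crawlers: a 404 means 'no robots.txt exists, crawl freely'; a 403 or
    other error means the crawler can't prove it is allowed, so it stays out."""
    if robots_txt_status == 200:
        return True   # parse the file and obey its rules (omitted here)
    if robots_txt_status == 404:
        return True   # no robots.txt at all -> everything is allowed
    return False      # 403, 5xx, etc. -> assume the site is off-limits

# An ISP that starts returning 403 instead of 404 for missing files
# silently blocks the whole site from a crawler with this policy:
print(may_crawl(404))  # True  -- site crawled as before
print(may_crawl(403))  # False -- site drops out: "I've got a PR0!"
```

Nothing on the site changed, but the crawler's answer flipped, which is exactly the "your ISP did something bad" scenario.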
If you disregard the PR0's which are really uncrawled pages/sites, that still includes more than one type of penalty. Some expire with time and some are permanent.
"Why is it that the same cross-linking penalties never seemed to be applied to the big network of sites like internet.com?" Both this question and the last question assume that penalties are for cross-linking, which isn't always the case. We're certainly aware of internet.com and the massive number of different domains it has for different topics. Our algorithms try to take all those types of situations into account.
"Why doesn't Google just ignore internal/cross links?" Ignore is a strong word--it means you're setting aside information completely. If there are cross links, that can be useful information in good or bad ways, and we try to take that into account.
"What does Google have planned for the future in terms of improving their response times to email?"
That's a tough question, because I'm not too familiar with our user support, but I'll tackle it. (The short answer is that I don't know.)
The longer answer is that we do get thousands of emails each week, and we have a finite amount of resources. The people who answer email have to prioritize. If we get an email because a business left a list of credit cards lying out where any browser can see them, that's a pretty high priority. On the other hand, take an email like "I used to have pages in Google, and now I don't. What's up?" As I mentioned up above, the majority of those emails have nothing to do with Google penalties. Often the changes have nothing to do with Google, even. "I changed my site to all-Flash, and my boss won't let me use any text links at all on my site. Why don't you crawl all my pages?" ;) I believe that we get scores of emails like that. Our user support people simply can't lead people one-on-one through diagnosing why their site doesn't get fully crawled--that's a pretty non-scalable approach. A more scalable approach would be to spend resources on better crawling (e.g. crawling Flash pages), or on enriching the webmaster pages that we provide.
Finally, you get to the "Have I been penalized?" emails. Sometimes the emails are just because a domain didn't get crawled, and sometimes the email comes from a professional spammer trying to get more information. That's a tough problem, right?
Okay, WebGuerilla, that's a pass at answering broader questions. Let's go back to the beginning of this ill-will thread, and trade places with me for a minute. I'm going to mention thayer's domain from his profile--if a moderator needs to snip it that's fine. We've got
- multiple domains selling the same thing
- 8 domains with cross-links from every page
- links from zeus (search for hghcompany zeus to bring up a couple zeus sites)
- at least one other piece of data which I can't talk about--sorry
Given all that, a bot might be suspicious, right? But hghcompany.com doesn't have a PR0 penalty. It's got a PR of 2.
Now I'm not saying that hghcompany didn't have a higher ranking sometime before. But Google is always adjusting our algorithms--that doesn't mean that a penalty has been imposed. I never want to see Google generate ill-will, but sometimes SEOs/webmasters see penalties where there are just changes or variations in what we crawl.
Hope that helps clear up things a little. Again, this is all just my personal take. thayer, I will give your site a full run-down (in a good way, don't worry :) ) to see if we're doing anything wrong with respect to your site.
This is a fun discussion, but Doofus and Mikael, you should know that we would never hand boost a result. My own mother couldn't get a boost in the rankings. And don't think she hasn't tried.
Actually, I almost believe this to be the case. They never tweak a site UP by hand, but they sure slam them DOWN, both by hand and automatically. When your site gets penalized unfairly, there's no way to get it corrected except to hope that it corrects itself with whatever algorithm happens to be flying on the next crawl.
Looking at the Google directory, at this URL, which has a PR of 7:
Categories under that level, followed by (number of sites) and (PR X)
Advertising and Banners (45) (PR 4)
Banner Exchanges (207) (PR 5)
Branding (28) (PR 5)
Ezines (22) (PR 6)
Free Classifieds (77) (PR 6)
Free For All Links (29) (PR 5)
Guides (32) (PR 0) ZERO!
Link Popularity (10) (PR 0) ZERO!
Pay For Traffic (9) (PR 0) ZERO!
Press Release Services (40) (PR 5)
Reciprocal Links (30) (PR 0) ZERO!
Search Engine Submitting and Positioning (1227) (PR 6)
Tips and Tricks (32) (PR 0) ZERO!
Traffic Exchanges (37) (PR 0) ZERO!
Does anyone detect a pattern here?
Where does it end? Let's ask Google for a PR 0 for the Republican Party's favorite issues, and a PR 7 for the Democratic Party's favorite issues!
How would that be different from what Google appears to have done in this directory category? Apparently some of the listees were flagged as bad neighborhoods. It seems to me that this is proof that your page can get damaged by simply listing a site that Google doesn't like. The only other interpretation is that Google gave some of their own directory categories a PR0 penalty because they're closet masochists.
Google has certain opinions about the sort of site they want to see on the web.
Microsoft has opinions about the dangers of open source software, or the threat to innovation when software monopolies are sanctioned, and pays "independent" think tanks to spew out bogus "objective" studies on the matter.
Where do you draw the line on monopoly behavior? Remember, it's not illegal to be a monopoly, but it is illegal for a monopoly to use its position in an anti-competitive manner.
Firstly, GG has a pretty tough job walking a fine line here where there are some pretty savvy people looking for any clues as well as people genuinely looking for answers to their problems. As one of the predators ;) I think GG has done a pretty good job of communicating without giving too much away.
I think that it is only human nature to blame someone else if your site disappears, even if (in your heart) you have a shrewd suspicion as to why it occurred. However, what I really think causes the feeling of ill will is that you see, update after update, more and more people getting away with blatant cross-linking, hidden text, etc. This can be pretty upsetting when a site owner has really tried to clean up their act and still remains knocked out with a penalty.
I appreciate that Google is trying to automate the spam filters, and that they can be tightened too much (as we saw). But a level playing field might lead webmasters to feel less offended if they see that the more blatant 'spam' is quickly filtered out. I sometimes think that Google is more interested in the fancy and more esoteric 'tricks' we professional SEMs (search engine manipulators) may get up to, rather than concentrating on the basic 'let's fool the engines by putting 200 words in hidden text' type deals which can sometimes flood their index.
By association, the rest are too. I would recommend that anyone in a DMOZ category who doesn't have PR0 on all their web pages remove themselves from the directory categories afflicted with PR0, and an improvement will occur.
Jan: you raise many good points. It could be argued this is a flaw in the Google PageRank system. Big companies (a la Immune About) get a massive rank through complex multi-cross-linking (ooh!), while smaller business sites don't get big ranks because, in the cut-throat world of small business, many people don't want to lose any trade via outside links (so they often refuse reciprocal linking suggestions!).
I agree with you: <<GG has a pretty tough job walking a fine line here where there are some pretty savvy people looking for any clues as well as people genuinely looking for answers to their problems. As one of the predators I think GG has done a pretty good job of communicating without giving too much away. I think that it is only human nature to blame someone else if your site disappears - even if (in your heart) you have a shrewd suspicion as to why it occurred>>
Doofus, you quoted earlier in this thread what "SmallTime" stated: <<Doofus, 3 semi-mirror sites, 88 doorway pages, seems like you have answered your own questions. I suspect you would be better off with one good site, no doorway pages. It is fine to gamble, just don't complain when you lose>>
If you are blaming Google for what you perceive as a penalty you're barking up the wrong tree. You should know better than that and it's your own fault.
I believe we all get nervous when we hold decent rankings and then all of a sudden the bottom falls out. I think GoogleGuy basically stated, in very short terms: sometimes there is a "burp" at our end and sometimes it's at your end; that is probably the reason why, in 90 percent of cases, sites drop out in any given month. One of my sites did this earlier this year and I didn't touch it, as I was confident I was not doing anything wrong (I was still nervous though). The others that are hit with a real "penalty" should already know why ;)
I don't think it is an ill will thread, it's a thread that is addressing the issues and trying to prevent ill will developing. The clearest thing to me is that the current "policy" of simply ignoring emails is not working, that needs to change.
>at least one other piece of data which I can't talk about--sorry
I hate it when you say that :)