| 8:19 pm on Jun 9, 2002 (gmt 0)|
|Doofus I think you need to go past the directory and look at the individual sites to see an overall pattern. The directory links to some sites that are pretty blatant with "spam". |
Exactly. That's why I said that "some of the listees were flagged as bad neighborhoods."
These bad neighborhoods may be evil, they may merely be bad according to Google's definition of "bad," and some of them could even be innocent. I haven't studied how many PR 0 sites are listed in each category.
But that's not my point. There's a Google directory, and it has subcategories that list sites. When one or more "bad" sites are listed in Google's subcategory page, the Google page itself inherits the penalty.
Now you can argue that Google is guilty, and certain of Google's subcategory pages deserve their PR 0 because they were negligent and listed some "bad" sites.
Okay, then why not take it one step further and argue that the directory above these zero subcategories is negligent, because listing PR 0 categories is reprehensible. Yeah, let's give that directory a PR 0 also!
Hey, why stop there?
Let's go up the ladder and give the entire Google directory a PR 0. And since it's derived from DMOZ, let's give the entire DMOZ directory a PR 0 also!
Do you see my point? The PR 0 penalty can act like a ... <scratches head> ... like a VIRUS. That's what I'm complaining about.
How do you stop a virus? I know how Google stops one. They say, "Well, it may be penalized, or it may not. Hard to say. Let's see what happens on the next crawl."
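The "viral" propagation described above can be sketched as a toy graph walk. This is purely illustrative: the pages, the links, and the rule that a page linking to any penalized page becomes penalized itself are all assumptions made for the sake of the sketch, not Google's actual algorithm.

```python
# Toy illustration of a penalty "inheriting" up a directory tree.
# The graph and the propagation rule are assumptions for
# illustration -- nothing here reflects Google's real mechanism.

def propagate_penalty(links_to, initially_penalized):
    """links_to maps page -> set of pages it links to.
    Rule (assumed): a page becomes penalized if it links to any
    penalized page. Repeat until nothing new is penalized."""
    penalized = set(initially_penalized)
    changed = True
    while changed:
        changed = False
        for page, targets in links_to.items():
            if page not in penalized and targets & penalized:
                penalized.add(page)
                changed = True
    return penalized

# Hypothetical link structure mirroring the scenario in the post:
links = {
    "dmoz_root": {"subcategory"},
    "google_directory": {"subcategory"},
    "subcategory": {"bad_site", "good_site"},
}
print(sorted(propagate_penalty(links, {"bad_site"})))
```

Under this (assumed) rule, one bad site drags down the subcategory that lists it, and then every directory level above it, which is exactly the escalation the post is complaining about.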
Contractor, you're seeing only what you want to see in my posts. I said that two of the three mirrors were disallowed to Google, and the third one involved only a few essays that I don't care about.
My complaint was that Google violated my disallow on one of the mirrors, and I was being punished for duplicates that should have been invisible in the first place.
I also said that I was doing this before Google existed, and there are good reasons for me to continue doing things this way.
Finally, my 88 doorway pages are for a tax-exempt nonprofit, which carries no advertising, and these doorway pages are the only way Google can get into my data. I have every indication from Google that they want my data indexed, as much as I want them to play by their own rules.
[edited by: Doofus at 8:31 pm (utc) on June 9, 2002]
| 8:24 pm on Jun 9, 2002 (gmt 0)|
It's hard to find the right balance. I want to help webmasters but not spammers. nutsandbolts, you're right that when I poke around I often trip over a lot of skeletons. :)
One point I would add to the point above is that Google isn't trying to harm any webmaster. Our goal is pretty simple: to get the best information to our users. When we make changes, it's not to hurt anybody--it's because we think the change will improve our search.
| 8:27 pm on Jun 9, 2002 (gmt 0)|
<<I also said that I was doing this before Google existed, and there are good reasons for me to continue doing things this way.>>
You may have good reasons and I am "not" doubting you. What I am saying is that Google may not think they are good reasons and they will "not" change their algo/filters for your site. So I guess you will have to follow under the same rules as all the other sites on the web or forget about Google.
| 8:32 pm on Jun 9, 2002 (gmt 0)|
<<It's hard to find the right balance.>>
That's the understatement of the year ;)
For every change you make that will help 95% of the users - there will probably be a few % of websites that will be hurt from it. That will never change in my opinion.
| 8:40 pm on Jun 9, 2002 (gmt 0)|
|So I guess you will have to follow under the same rules as all the other sites on the web or forget about Google. |
I believe you meant to phrase this differently, according to the facts I just presented.
How about, "I guess you will have to follow Google's rules the same as all the other sites on the web or forget about Google. And furthermore, you have no right to expect that Google itself will abide by their own rules."
Now that's a statement I can agree with.
| 8:43 pm on Jun 9, 2002 (gmt 0)|
<<How about, "I guess you will have to follow Google's rules the same as all the other sites on the web or forget about Google.>>
I can agree with that - regardless of how fair/unfair their algo/filters may seem, they are still the rules I will not risk breaking for a few spots up in the SERPS.
| 9:05 pm on Jun 9, 2002 (gmt 0)|
Doofus, I agree with certain parts of your argument. Though the current understanding of PageRank is that you vote for pages (not sites) that are of value to your visitors (ultimately this is what we and Google are trying to achieve).
If, for example, the DMOZ page votes for a spam page (however unintentional), I believe it should get blackmarked. In this instance all you've got to do is remove the link, whether that be at DMOZ's end or yours. From a DMOZ perspective PageRank isn't their concern, so the problem is addressed at your end. Does anyone really want to be associated in any way with spam? No! The visitor is the commodity, not Google.
The solution to all of this may very well be found in "paid for inclusion and review" both at Google and DMOZ, but I tend to think we all would have something new to complain about.
NFFC - I tend to think Google really doesn't ignore emails - more to the point is how less-than-perfect questions are asked here on a daily basis. The open forum allows many at WMW to address problems at a very generic level, but an email is one-on-one and very specific, or at least it should be.
I get 500 emails a day myself and I am not as heavily used as Google. How do you put resources onto a webmaster's problem like "I was in Google, now I'm not, what's wrong"? I think the greatest problem here is ignorance. Many are not willing to put sufficient research into designing, optimizing, and analysing problems when they occur. Google is "free", so I'll ask them - and then not be very happy when a "free service" is not to my satisfaction. There are many, many site owners that have legitimate concerns; I think the single biggest problem is one where the hosting service does something wrong and your association with them leads to a penalty.
janmccl this may well be your problem.
| 9:14 pm on Jun 9, 2002 (gmt 0)|
and we still have no clue about what pr2 is about, when the site obviously should have more, and did have more before. and the internal pages are 0. and the site is getting nowhere in google. and for how long?
i'm sure a lot of people are about to give up on trying to write websites for google. and google would like that, because then people aren't trying to "affect" the rankings (what a crazy concept).
but when webmasters and SEOs move on, whatever search engine it is, it's the beginning of a slippery slope down. that's almost ALWAYS been the case.
| 9:31 pm on Jun 9, 2002 (gmt 0)|
|Googleguy's response seems to apply to the "big business" sites - but what about us small guys? |
While I can't speak for Google, and don't want to get into diagnosing one site's positioning problems here, I think this is a good example of a pretty common situation so I'll approach it in general terms:
Links from pages with a fairly low (1-3, for example) PageRank aren't going to bring you much PR. Links from pages that are nothing but links aren't going to bring you much PR; neither are links from post signatures in low-PR web forums, or in guest books. Links from Zeus-generated link pages are simply bad news.
So this is an example of a common post here: "My site's not ranking well, why am I being penalized?" But low PageRank isn't necessarily a "penalty," it's often "deserved," given the PR algorithm.
To get specific just for a moment: I looked at a random 9 or 10 links, none of them were from a page with a PR higher than 3 and every one fell into one of the categories I described above.
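The point that low-PR links pass very little PR follows directly from the published PageRank formula, PR(A) = (1 - d) + d * Σ PR(Tᵢ)/C(Tᵢ), where d is the damping factor and C(T) is the number of outbound links on page T. The sketch below is just an illustration of that formula; the raw PR values are invented, and the mapping from raw PR to the 0-10 toolbar scale is not public (it is widely believed to be roughly logarithmic).

```python
# Illustration of how much PR a single inbound link contributes
# under the original published PageRank formula:
#   PR(A) = (1 - d) + d * sum(PR(T_i) / C(T_i))
# The raw PR values below are invented for illustration; real
# toolbar PR is believed to be a rough log-scale bucketing.

def link_contribution(pr_source, outlinks, d=0.85):
    """PR passed to each page that the source page links to."""
    return d * pr_source / outlinks

# A weak page with hundreds of outbound links passes almost nothing:
low = link_contribution(pr_source=0.5, outlinks=200)   # link-page style
# A strong, focused page passes far more:
high = link_contribution(pr_source=50.0, outlinks=10)
print(low, high)
```

This is why a pile of links from guest books, forum signatures, and link-farm pages barely moves the needle: the source PR is tiny and it is split across a huge number of outbound links.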
| 9:32 pm on Jun 9, 2002 (gmt 0)|
Kind of like placing an advertisement on TV without proper marketing research.
Pets.com spent $5 million to advertise at the Superbowl, only to find out that their most profitable customers, "retired widows", don't watch the Superbowl.
| 10:35 pm on Jun 9, 2002 (gmt 0)|
|So this is an example of a common post here: "My site's not ranking well, why am I being penalized?" But low PageRank isn't necessarily a "penalty," it's often "deserved," given the PR algorithm. |
I disagree. Even if you were correct, on the "little gal" jewelry site in question (the site in janmccl's profile), it doesn't explain why all the inside pages I checked were PR 0. It's almost like the PR 2 for the main site is a ruse designed to "cloak" the penalty. A PR 0 on inside pages is a kiss of death. Most Google referrals come into your inside pages, not to your home page.
Furthermore, Jan is correct that the link: command shows zilch for her site. But the link command is yet another "cloak" to divert Google's critics. Better to look for all sites using a "www.myhomepage.com" in the search box, with quotes around it. When I do this with Jan's site, I find two DMOZ listings that made it to the Google directory. One Google directory page is a PR 6 and the other is a PR 2. I also see a link from a PR 5 page, and three links from PR 4 pages. That's after five minutes of looking. Never assume that getting a "nothing found" back from a link: command means anything these days. It's a ruse. Her site got screwed by Google, and it would take a forensic scientist to figure out what caused it.
|I've worked for 3 years to get where I am and have a loyal customer base plus good listings on the other search engines. That is I do until Google takes over the whole world. I'm worried about that. |
I say, "Thank you, Jan, for speaking out. And shame on others of you who keep Google groveling."
All I know is that if Google's own subcategories in the Google directory that are currently PR 0 get tweaked up before the next crawl, I'm going to tell GoogleGuy's mother on him.
But they should be hand-tweaked up, because currently a legitimate SEO that advises and practices safe and responsible techniques is getting penalized by finding himself in one of these Google subdirectories.
You have a situation that looks like this:
bad guy(s) --> listed in Google subcategory --> PR 0 for Google subcategory --> good guy in same category gets questionable benefit from his DMOZ listing
But of course, if GoogleGuy wouldn't let his own mother get a PR boost from a hand tweak, then certainly he wouldn't follow instructions from his boss to do the same thing.
What's wrong with this picture?
1) Google's anti-spamming policies and techniques spawn collateral damage. Innocent non-combatants are getting zeroed by anti-spam bombs.
2) Instead of setting up some sort of arbitration mechanism, or fixing the algorithm, or even admitting that there's a problem and telling us that they're working on it, Google just sends out GoogleGuy to act as a flak-catcher.
That's what's wrong.
| 11:15 pm on Jun 9, 2002 (gmt 0)|
Doofus or Jan,
Could you explain why your other jewelry store site did not suffer the same fate as the jewelry store site that is mentioned here?
What was/is different? Have you made changes to your low PR site as your other jewelry site has a PR5?
I notice one thing right off the bat is you have quite a few links off your index page of your low PR site but not off your higher PR site?
All those links on your index page plus 5 full pages of links on your site is "not" helping your PR at all.
Edited: after closer inspection of just the first of your 5 pages of pure links, I saw one PR0 site and a couple more on the brink. I didn't look at the other 4 full pages of links. I would say this is where your biggest problem is.
[edited by: The_Contractor at 11:37 pm (utc) on June 9, 2002]
| 11:30 pm on Jun 9, 2002 (gmt 0)|
Ahhhh, now I get it. Doofus, I didn't recognize you at first. It clears a lot of things up to realize that you are also Everyman. I was just wondering where you were the other day. :) Doofus, I believe the domain you're referring to was in both LinkTopics and a buddy links program?
Going back to your dmoz/directory question, Google has been pretty clear that linking to enough bad neighborhoods can taint a page's trust level. It shouldn't be a surprise that a dmoz page can have its trust level affected, or that a directory.google.com page which uses dmoz data can be affected as well. Why should the algorithms show any preference to dmoz or even google.com pages, right? So I disagree with you that I should "hand tweak up" the PageRank for any page. :)
| 11:44 pm on Jun 9, 2002 (gmt 0)|
Well, I'm not a forensic scientist, but if you have that many links on the index page and 5 other pages of links, I think I understand what happened to her PR even if she is not penalized. Get rid of all those links, especially to the PR0 sites.
GG wrote: <<So I disagree with you that I should "hand tweak up" the PageRank for any page. >>
I agree, if you started that practice I could hear it now. No tweaking ALLOWED(unless it's my site). :)
It posted double messages for some reason :(
| 12:02 am on Jun 10, 2002 (gmt 0)|
|Doofus, I didn't recognize you at first. It clears a lot of things up to realize that you are also Everyman. |
Flattery will get you nowhere, GoogleGuy. Now then, who are you?
| 12:51 am on Jun 10, 2002 (gmt 0)|
>>And shame on others of you who keep Google groveling.<<
Hey doofus..I guess if you created enough "identities", you could be a booster and a groveler.
Who you are probably depends on your PR.
Quite frankly I'm a little surprised that anyone who has been reading these forums for any length of time would get a PR0, there is plenty of solid info here to keep you out of the soup.
If you have disregarded all of the other PR0 postings then I guess you have earned your nic (the doofus one).
| 1:14 am on Jun 10, 2002 (gmt 0)|
"Google's own subcategories in the Google directory..."
I have to admit that, even though we have benefited from it for some sites, I am starting to question how much relevance Google should give to "their directory", i.e. DMOZ.
When Google first started using ODP data, it was much smaller and also much more up to date and relevant. However, ODP now has over a million unreviewed sites in its backlog, and many directories have not been edited for months.
This would seem to give an unfair advantage to those that have managed to get in (or in some cases, became an editor primarily so they could get their site listed). It penalizes those that have faithfully followed the guidelines and submitted their sites just like they were supposed to - and then fell into oblivion among all the other million or so new sites still sitting in unreviewed (some since 1999).
| 1:21 am on Jun 10, 2002 (gmt 0)|
"But there are some that won't budge off that damn 0! "
This has me wondering.
Assuming that PR is assigned by percentile (that is, the top 2% will get 10, the next 5% will get 9, 20% will get 5, etc.) - does the current rage towards SEO give the big guys a large advantage over the little guys?
Since there are (assumption) only so many PR5+ site rankings to go around, are the big guys hogging them all?
I don't know how Google works in this respect, so it is all speculation.
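The percentile idea in that post can be sketched as a simple bucketing function. To be clear, this mirrors the poster's own speculation: the cutoffs come from the hypothetical numbers in the post (top 2% → 10, next 5% → 9, then 5), plus an invented catch-all bucket, and none of it reflects how Google actually assigns PR.

```python
# Sketch of the speculative percentile-bucketing scheme from the
# post: fixed shares of all pages get each toolbar PR value. The
# cutoffs are the post's hypothetical numbers plus one invented
# catch-all -- not Google's real scheme.

def toolbar_pr(rank_percentile):
    """rank_percentile: 0.0 = best-ranked page, 1.0 = worst."""
    cutoffs = [        # (upper percentile bound, toolbar PR)
        (0.02, 10),    # top 2%   -> PR 10 (from the post)
        (0.07, 9),     # next 5%  -> PR 9  (from the post)
        (0.27, 5),     # next 20% -> PR 5  (from the post)
        (1.00, 2),     # everything else   (invented filler)
    ]
    for bound, pr in cutoffs:
        if rank_percentile <= bound:
            return pr
    return 0

print(toolbar_pr(0.01), toolbar_pr(0.05), toolbar_pr(0.5))
```

If something like this were true, the high buckets really would be a fixed number of slots, which is exactly the "are the big guys hogging them all?" worry: one site climbing to PR5+ would push another site out.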
| 1:27 am on Jun 10, 2002 (gmt 0)|
|If you have disregarded all of the other PR0 postings then I guess you have earned your nic (the doofus one). |
John 3:32 says:
|He bears witness to what he has seen and heard, yet no one receives his testimony. |
Do your homework, john316. I don't have any PR0 sites. I have three PR6 domains. Used to have four, but had to kick Google out of one entirely. One of my PR6 sites occasionally reports a PR7.
There's a fifth site that used to be a PR2. Google was supposed to stay out of that one, but climbed over my robots.txt fence. This made trouble for my top site.
Unlike a lot of you, I was concerned about Google before "PR0" entered our vocabulary. I'm not selling widgets, and my opinion of Google doesn't yo-yo with my balance sheet.
| 1:27 am on Jun 10, 2002 (gmt 0)|
I believe that DMOZ is still the most up to date and relevant of the directories. I also have more respect for DMOZ since it is not pay-to-play. Yes, it has problems, but what would Google use if not DMOZ?
You're not harboring ill feelings against DMOZ since you are no longer an editor there are you?
| 1:31 am on Jun 10, 2002 (gmt 0)|
To get back to the thread: you stated that there is nothing wrong with Jan's site. Did you ask her, or bother to check all the links she has going out, before jumping to the conclusion that it was Google's fault? Did you bother to look up her other site before stating there is nothing at all wrong with her site?
[edited by: The_Contractor at 1:53 am (utc) on June 10, 2002]
| 1:32 am on Jun 10, 2002 (gmt 0)|
"Now then, who are you?"
I'm a googler who stops by on weekend and after hours to answer questions in their free time. I like to talk about spam issues with webmasters and explain Google's stance from my personal viewpoint. I think you've got a pretty good feel for what I'm like by now from my posts.
I prefer not to give out my name. Do you remember the line from The Princess Bride?
Fezzik: "Why are you wearing a mask? Were you burned by acid or something like that?"
Dread Pirate Roberts: "Oh no, it's just they're terribly comfortable. I think everyone'll be wearing them in the future."
| 1:38 am on Jun 10, 2002 (gmt 0)|
Doofus, I'm not aware of any robots.txt bugs uncovered in the last few months. Did googlebot crawl the PR2 site shortly before/during/after you changed the robots.txt? The bots recheck robots.txt pretty often, but it's possible that the timing was bad.
If you're willing to tell me the site, I'd be happy to investigate. If we did have a robots.txt bug, we'd want to know about it so we could fix it pronto.
| 2:19 am on Jun 10, 2002 (gmt 0)|
Googleguy, I'm surprised at your company's policy concerning outbound links. You've twisted the concept of linking into both a commercial asset and a liability.
I think you should set the Contractor (and other misinformed webmasters) straight about outbound links. You know as well as I do that links are a vital component of the World Wide Web and that Google would be nothing without them. Since we're not supposed to be optimizing for Google in the first place, why is it so important to check the Page Rank of a site that we're considering linking to?
And Contractor- 5 pages of links pales in comparison to the (how many?) dozens of pages of links on the site in your profile.
| 2:28 am on Jun 10, 2002 (gmt 0)|
A "directory site" such as mine is a little bit different, in my opinion, as is any SE I can think of. Otherwise DMOZ would have a PR0. I was merely stating that I think we all know that you will give a little PR away on links. The site that doofus was referring to has many links from its index page + 5 other pages of links. Why question the reason for a PR 2 when you design your site in this way?
Edited: Also the site in question here has another related site that is designed exactly the same with a different color scheme (it is NOT a mirror) and no links and has a PR5. What does that say?
[edited by: The_Contractor at 2:48 am (utc) on June 10, 2002]
| 2:46 am on Jun 10, 2002 (gmt 0)|
|If you're willing to tell me the site, I'd be happy to investigate. If we did have a robots.txt bug, we'd want to know about it so we could fix it pronto. |
I sent an email to email@example.com on June 7 from a sprintmail.com address. I appreciate your concern, and this is something that is relevant to webmasters generally. I also explained it under page 3 of this thread, in my post that starts about 60 percent deep from the top of that page, and starting in the middle of my post.
There's confusion over how you handle robots.txt on sites that don't have robots.txt in the root directory. This site of mine was one of those tilde situations, as in home.sprintmail.com/~username/
When the site owner has no access to what most bots consider the root of the site's URL, how does the owner tell Google to stay out? I've read your stuff about the 90 day disallow for robots.txt on inside directories, but Google jumped the fence after I cleaned it out in March, but before the 90 days were up. Also, I had trouble 90 days ago, as well as two days ago, getting Google to understand my robots.txt during the removal process. I got it to work (the site was gone 24 hours later, at any rate), but from the feedback during the removal process, I wasn't confident it was going to "take." Same experience three months ago.
Anyway, I think we'd all benefit from a clarification of how Google treats robots.txt on inside directories. I think there's a bit of flakiness there. I will say that with a normal robots.txt situation, where it's in the root of the domain, Google has been pretty good about not jumping the fence.
Anyway, I just stuck in a bunch of METAs for googlebot to NOINDEX everything important on that site, including the home page, so it probably won't happen again even if googlebot does jump the robots.txt again.
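The underlying issue here is that the robots exclusion convention only honors a robots.txt at the root of the host, so the owner of a tilde-style user directory on a shared host cannot publish one that crawlers will see. The per-page fallback is the robots META tag, which is what the post describes. A sketch (the hostname and `~username` path are taken from the post; the tags shown are the standard robots META conventions of the time):

```html
<!-- robots.txt is only honored at the HOST root:
       http://home.sprintmail.com/robots.txt            checked by crawlers
       http://home.sprintmail.com/~username/robots.txt  ignored by most bots
     A user on a shared host who can't edit the root file can instead
     put a per-page META tag in each page's <head>: -->
<meta name="robots" content="noindex,nofollow">

<!-- or target Google's crawler specifically, as the post describes: -->
<meta name="googlebot" content="noindex">
```

Unlike robots.txt, the META tag doesn't prevent fetching; the bot still downloads the page and then agrees not to index it, which is usually good enough when the goal is staying out of the index rather than saving bandwidth.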
| 2:48 am on Jun 10, 2002 (gmt 0)|
>>>Also the site in question here has another related site that is designed exactly the same with a different color scheme and no links and has a PR5. What does that say?
>>>I think we all know that you will give a little PR away on links.
Your example proves nothing. I have a site with a PR6 and it has thousands of outbound links. Based on your theory, if I removed all of the outbound links I should get at least a PR8 or PR9.
Also, please elaborate more about the special qualities of link directories.
| 2:49 am on Jun 10, 2002 (gmt 0)|
>policy concerning outbound links
I distinctly remember a policy inference referencing linking out to what are considered bad neighborhoods. We can look at links as an integral part of the nature and connectivity of the web, but how about considering the motivation for and value of the linking when evaluating how relevant that linking is to the concept of connectivity?
What is the motive for some links? Is it to add relevancy and value for the site visitor, who we feel will gain something from visiting the other site? Would there be an interest on the part of visitors in both products or sets of information, making it a viable business decision to exchange links and benefit all, including visitors?
Or are the links done in order to increase Page Rank, and according to most references I see all week, link popularity - with the PRIMARY express purpose of improving search engine rankings?
How do we define what is legitimate linking done in accordance with the concept of the connectivity of the web, as opposed to what's done deliberately to manipulate search engine rankings?
Granted, some people get involved innocently and/or ignorantly (not many, from what I see weekly, on a steady basis year-round), but that is what I personally think needs to be clarified before we can even suggest that Google is being unfair and declare them guilty.
This issue has not been answered or defined, and imho it has to be before we can decide who's guilty and who's innocent.
I've seen some claim unfair treatment and even with my limited checking, I've found linking relationships between them and other penalized sites, as well as associations with what even I recognize as being bad neighborhoods.
With some even I have seen what the problems could be, to some degree, but I have never said anything, simply because individual site reviews are not allowed here.
>>>We do not allow review my site posts.<<<
According to the Terms of Service doing them is a violation of TOS, and I have to abide by the rules.
[edited by: Marcia at 2:58 am (utc) on June 10, 2002]
| 2:54 am on Jun 10, 2002 (gmt 0)|
Are you telling me that you do not believe that you give away a small amount of PR with outbound links? If so that is against everything I have ever read in this forum.
Yes, I do believe that directories or search engines are recognized as such and treated differently than the average site. Take all of the DMOZ dump sites and explain to me why they carry any PR at all if they are not treated differently. If a site copied another site word for word in any other situation it would be penalized, let alone the hundreds of sites that do this with DMOZ and retain high rankings.
| 2:56 am on Jun 10, 2002 (gmt 0)|
I agree :)
| 3:13 am on Jun 10, 2002 (gmt 0)|
|How do we define what is legitimate linking done in accordance with the concept of the connectivity of the web, as opposed to what's done deliberately to manipulate search engine rankings? |
To bend your question just slightly, I can tell you why I was doing mirror sites before Google even existed. We have a lot of political content. Not just my sites, but the other nonprofits on our Class C. We've been hit with denial-of-service attacks going back to 1996, from fascists around the world. If I had a choice, I'd put duplicate sites on completely different servers. To some extent I was doing this, but it's too expensive, and besides, our server has good bandwidth now, and we now have our own router (which gives us a lot more flexibility to effectively block bad guys). So the situation evolved to where only one of my little sites, which hardly matters, is on another server now.
Then Google comes along, and my mirror sites are evil incarnate. I have to keep Google out of them now, or get penalized. Without saying that Google is right or wrong about this, I can say that I'm not a spammer.
Is Google right? Are we right? The question would never arise in pre-Google days. Perhaps Google does make more sense in this particular example. But it's also an example that shows we have to stay on top of where Google is taking us.
| This 144 message thread spans 5 pages: < < 144 ( 1 2  4 5 ) > > |