Forum Moderators: Robert Charlton & goodroi
We need to keep this thread focused on the following:
- Changes in your own site's ranking in the SERPs (lost and gained positions, or disappearance of the site).
- Changes you have noticed in the new SERPs (both google.com and your local Google site), especially with regard to the nature of the top 10 or 20 ranking sites.
- Stability of the SERPs, i.e. do you get the same SERPs when you run the same query within the same day or on 2-3 successive days (both google.com and your local Google site).
- Effective, ethical measures to deal with the above-mentioned changes.
Thanks.
Clint,
The URL removal tool is good for removing links that are 302ing to you (in case these make you nervous). The method is to add the NOINDEX tag to the page and then do the URL removal by pasting the 302 link into the URL-removal box.
Then, quickly remove the NOINDEX tag - hopefully the whole process takes no more than 1 min, reducing the chance that Googlebot will catch your page with the noindex in it.
The only problem I have found is that some of the 302 URLs are too long for the URL removal tool to accept.
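For anyone unsure what "the NOINDEX tag" refers to, it's the robots meta tag in the page's <head> section; while the tag is temporarily in place, the page looks something like this (the title is a made-up example):

```
<head>
  <title>My Page</title>
  <!-- Temporarily added so the removal tool sees "noindex" via the 302 URL -->
  <meta name="robots" content="noindex">
</head>
```

Remove that one meta line again as soon as the removal request has gone through.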
And a side note on this subject: if the URL removal tool is allowing you to remove a URL on someone else's site, then what does that tell you? It tells me that G thinks your content is part of that 302er's site...
somebody tell me I'm full of it on this point.
This is becoming muddier and muddier to me. ;) If only you (I) can remove our own URLs, then how can you put someone else's URL in the box and get it removed? What would stop someone else, like your competitor, from putting your URL in the removal box and getting it removed? This is why I mentioned that I THOUGHT, thought mind you, that G had to send you an email at an address on the domain you want removed, in order for it to be removed, since that's the only way they'd know you're the owner of the domain. Again, this is with the "urgent URL removal" method.
I know about the robots tag in the <head> tag of each page to do this, G mentions this, but this is the slow method. There is also actually an URGENT "URL removal tool" here: [services.google.com:8882...] and on that page they state: "A confirmation email will be sent to you once you have submitted your login information. Follow the instructions in that email to continue", but I never went that far to see just what those instructions are. I can't remember exactly where, nor am I even positive about the bit regarding the email address having to be on the domain that you want removed.
Well, I just tried the process on that page. I put an email address & pass in to create an account, I clicked the URL in their email they sent to me (I just used a free email account), and you are brought to a G webpage:
Remove pages or subdirectories using a robots.txt file.
You must create a robots.txt file before proceeding. If you haven't, please return to our Remove URL page for further information.
Note: Any changes will be temporary unless your robots.txt file is in your server's root directory.
URL to your robots.txt ......[text input box here]
e.g. [google.com...]
or [google.com...]
So, looking at that, it seems you can put ANYONE's robots.txt file in that box! But am I to understand that you FIRST have to go to your robots.txt file and CHANGE IT to reflect the pages you want removed, THEN put it in the text input box area? If so, then that now makes sense. However, what's to stop someone (a competitor) from making a robots.txt file on THEIR server, then putting the paths in it to YOUR website? Can that even be done? For example:
User-agent: Googlebot
Disallow: www.SomeoneElsesDomain.com
Or, is a robots.txt only capable of LOCAL commands? If so, then I guess this topic is closed, and I hope that clears this up for everyone that had a question on it. :) If not, and someone CAN do this above, then that's a very serious issue.
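For what it's worth, the robots.txt standard only takes local URL paths in Disallow lines, not other domains, so a file can only ever describe the site it lives on. A typical exclusion file looks like this (the paths here are made-up examples):

```
User-agent: Googlebot
Disallow: /private/
Disallow: /old-page.html
```

A line like "Disallow: www.SomeoneElsesDomain.com" would just be treated as a (meaningless) local path, not as a reference to another site.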
With either method, I would also have thought the process would take at least days, maybe weeks.
I too would like to know what Google considers spam.
1) We can assume that anything within the Google Webmaster Guidelines isn't considered to be spam.
2) As to what does constitute spam, the definition obviously changes as the spammers' techniques evolve. (For example, AdSense scraper sites didn't even exist until recently.)
3) Spam isn't the only reason why pages might be whacked or downranked in the Google index. Duplicate content is another. For example, an affiliate or e-commerce site might have thousands of boilerplate catalog pages for legitimate reasons, but Google may feel that, since those boilerplate pages aren't adding value for the reader, they don't deserve to be ranked high for their keyphrases. GoogleGuy mentioned the need for "value add" a week or two ago, and I believe he also referred to the need for "diverse" search results (presumably so that users don't have to dig through boilerplate clutter to find a range of information on a topic).
I don't know if you were directly replying to Johan007's post above it (page 89), or mine on a previous page. So, I'm stating this again that I put in my original post asking about G spam:
--------------
(From Googleguy) P.S. J., thanks for the spam report. I love to hear about any spammy sites that we're missing
GoogleGuy, would you be kind enough to please explain exactly just what G considers a "spammy site"? Do you mean a site or page with nothing but links; or, a site or page with repeated key words all over it (and if so where, meta tags or body, and how many is too many) ; or, what's in alt image tags; or, the "title" tag for links, etc.? I'm sure many would like to know this as per G's definition.
Before anyone replies with things such as "just use your head", or "well, I think it's....", etc., I reiterate: as per Google's technical definition of such. ;)
Thanks.
-------------
I don't want to "assume" anything, know what I mean? ;) I can appreciate your post, and thank you, but I don't want to be guessing here on this; none of us who need to know the exact and specific answers should be. Only with info straight from GoogleGuy will we know what he meant and the definites of this, since what's on the "Google Webmaster Guidelines" page is obviously not being followed by G themselves. Also, that info could be outdated, and G may put more precedence or weight on one taboo area than another.
The main reason *I* ask is that I don't want to use too many targeted words on my pages, their meta tags, alt image text, etc., and I don't know just how many is too many. I don't even know if this is an area that can be considered "spammy". If it IS, then we need to know just how many is too many in order to be in technical compliance with G.
Very good. There are other problems:
1. What about encrypted links? Only in script (program) code? Do bots ignore them?
2. Links that redirect from an affiliate provider to a merchant.
3. Syndicated mini sites within a site.
KBleivik
Make it simple, as simple as possible but no simpler.
[edited by: kgun at 1:55 pm (utc) on June 20, 2005]
In 1964, Justice Potter Stewart tried to explain "hard-c-re" p-rn-graphy, or what is obscene, by saying, "I shall not today attempt further to define the kinds of material I understand to be embraced . . . but I know it when I see it . . ."
It is not (in Norway) regarded as spam when you get your postbox filled with paper. Paper that is taken from our forests.
Money defines what is spam. Money speaks.
KBleivik
The only thing you own for ever is what you lost.
Henrik Ibsen.
You can submit any robots.txt file that you like, and Google will process it. You need to submit the whole URL of where that file is. The robots.txt file usually lives in the root folder of a site, but for the removal tool you can put it in a deep folder if you wish. You can submit the robots.txt file found on any other site to the removal tool.
The files mentioned in the robots.txt file are always files that are on the same domain as the robots.txt file itself, so you cannot remove files on other people's sites that they did not actually want removed (unless they accidentally put something in their own robots.txt file that they didn't mean to).
With a 302 redirect hijack, another site is redirecting to you and using a 302 redirect to do it. The URL will be something like www.theirsite.com/sneaky-redirect.php?where=www.yoursite.com and the bug in Google is that that URL will appear in the search results, with YOUR page title, and YOUR page description, and the cache will be of YOUR page. The real URL of your own page will drop out of the search results as being a duplicate. In this case you DO submit the URL of the other site to the removal tool, as Google actually thinks that page is a part of YOUR site. You put the "noindex" on your own page (as that is where Google is actually going to look because of the redirect) just long enough for Google to see the noindex attached to the rogue URL, and then you get rid of the noindex tag before Google crawls the real URL for your own page and sees the noindex tag also attached to that.
You cannot remove a normal page from the index on some other site, unless you can access that other site and amend their robots.txt file directly. You can remove a redirect URL, because there is no real page on the other site; there is only a URL that actually takes the user to your own site. The error is that Google indexes the "page" as being the URL found at the starting point of the redirect (i.e. www.theirsite.com/sneaky-redirect.php?where=www.yoursite.com) rather than the real location of your page.
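To make the shape of such a hijack URL concrete, here is a small Python sketch that pulls the redirect destination out of a sneaky-redirect-style URL. The domain names and the "where" parameter follow g1smd's hypothetical example above; real hijack URLs vary in form:

```python
from urllib.parse import urlparse, parse_qs

def redirect_target(url):
    """Extract the destination that a sneaky-redirect.php style URL
    points at. Assumes the target is carried in a 'where' query
    parameter, as in the hypothetical example in this thread."""
    query = parse_qs(urlparse(url).query)
    return query.get("where", [None])[0]

hijack = "http://www.theirsite.com/sneaky-redirect.php?where=www.yoursite.com"
print(redirect_target(hijack))  # www.yoursite.com
```

The point of the sketch is just that the "page" Google indexes is the redirecting URL on their domain, while the content it resolves to lives on yours.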
Education should be primary here. But myth and speculation run rampant, and the big EDU is nowhere to be seen.
I don't do Salem witch hunts and base my findings on the coding and the fact that goes with that coding.
At best, some speculation has perhaps become a bit more than speculation at this point, but there are still NO definites on anything as to what (issue) hurt person A and why it didn't hurt person B, or whether that issue even had anything to do with it in the first place, or if it was something else, and on and on and on.
Tromping through the wilderness, keeping a totally keen and watchful eye for any wildlife, and I come across the awesome post that lists what not's and what do's in handling the totally kewl burpin' update.....Dude's, this is totally wise to do?
I mean like, this is some really rich stuff, but wouldn't GG get a taste of that and run off to, like his dad maybe and spill the goods, thus in the end, making life even more difficult during the next algo shift?
The Do's and Don't's that Reseller is posting (if that is what you are referring to, or that anyone else is posting) are nothing concrete. They are only possible suggestions and possible guidelines; since, again, no one has anything definite due to the lack of real and specific technical info from Google.
If what you are suggesting is that GG will tell those at the G-plex what's going on here on these threads; yes, we HOPE that's what is happening! For there were and still ARE a lot of pi$$ed-off people here that had no business getting their sites trashed, yet they did, and Google MUST BE MADE AWARE of this and the outcry it has caused, along with the countless thousands of $ in lost wages, sales, lay-offs, business that went or are going down the tubes, etc.
Is it possible GG will tell the G-plex what some of us may have done or may be doing to get their G-SERPs back? Sure. I don't see that necessarily being a bad thing though; I guess it depends on GG's intentions, and no one knows what those may be. He has been helpful here on countless occasions, but also somewhat "indirect" and "not exactly to the point" on a few occasions. From looking at all the posts here, I don't recall seeing any that even vaguely alluded to doing something questionable or sinister to get back their G-SERPs. From what I have seen (but I can only speak for myself and a couple of others), we are all decent whitehat people with whitehat sites who simply want (or wanted) to get back our ranks in G, and are doing so (or have done so) using only legitimate and Google-accepted methods, in some cases methods suggested by GG himself. So I don't understand why he would "spill the goods" back to G, when the "goods" in most if not all cases were alluded to by GG, or specifically made by him in some instances.
I always try to first see the good in people and hope for that. We can only have faith in GG and Google that they will not take (to give just one example of several) the 301'ing of non-www to www and choose instead to penalize for that on their next update!
I have used it successfully on my site to remove some pages indexed with wrong urls (kind of www/non-www) problem, but it was my site.
However, I have just read through g1smd's explanation, and it seems to me that in the case of free hosting services, if you can put robots.txt in your directory and use it to remove any other directory on that site, this is a hole that allows you to remove URLs of pages that don't belong to you.
Or am I missing something?
Oh, BTW: my pages today definitely disappeared for some of my keywords from the search results. They have not been modified since May 23rd, yet they have been pushed down every week. I am not banned, as a search for my company name (and domain), which is a more or less unique word (about 4k hits, of these about 3k about me - forums, shareware sites and so on), shows me at #1.
[edited by: Borek at 3:19 pm (utc) on June 20, 2005]
their webmaster guidelines have been in place a long time. Follow them and you won't have to worry about exactly where the line is.
Well, this is certainly NOT true. Where have you been? ;) Most, if not ALL, of us here WERE and ARE following them, and sites still got trashed! Furthermore, sites that were NOT following them were and still ARE at the TOP of some of the G SERPs! This is what many, including myself, have been posting here for weeks now. Therefore, AGAIN, as I stated, the need for direct input straight from GG.
In my case, I don't need to know about "spamming techniques"; again, I only need to know how many of my sales product words I can put in my meta tags and page body before it's considered "spammy", or (again) whether this area even has anything to do with what G considers "spammy". If it is NOT such an area, then it's a moot topic and academic to continue. If it IS, then everyone needs to know how many times they can put "widget" on their pages before it has a negative effect in G.
To me, this is not even a spam area, since my intentions are good. Yes, I know some people's intentions are NOT; I'm all too aware of that. However, if we whitehat people are going to follow G guidelines, then it stands to reason that we are going to HAVE TO KNOW specifics in order to comply. I don't want to have on my pages or tags "blue widgets sales", "used and tested blue widgets", "refurbed blue widgets", and, let's say, several more mentions for other blue widget types and descriptions, if there is a G limit of, say, only 5 "blue widgets" mentions per page. See what I'm saying? This is what I need to know. I made my site to be very descriptive for my visitors and customers, for their convenience and for the sake of clarity, and also to cover variations of the specific "blue widgets" for which they would be searching. Therefore I, and I'm sure many others here in sales-related fields, need to know how descriptive and helpful we can be before it hurts us. Now, I personally didn't change anything on my pages regarding meta tags or body text, and I got back most of my G SERPs, but FAIK I could plummet again tomorrow, and again not know why!
IMO, this is ridiculous that this specific area of being descriptive could ever hurt anyone, but who knows. Hence the question. :) Again, no assumptions please, only definite facts. ;)
1. I was hit too...just go back and read Bourbon threads 1-3. And I have always followed G's webmaster guidelines.
2. G will NOT come here and tell you what they consider spam so you may as well stop chasing that wild goose.
Clint, do you see why G won't come here and say "yeah, four mentions is OK, but five is too far"? Do you understand the fact they would be shooting themselves in the foot if they ever did anything of that ilk?
The robots.txt file cannot address any webspace that is "higher order" than the location of where that robots.txt file is actually located.
That is, if it is located in a sub-folder, then it cannot address the root folder, nor can it address any other sub-folder except those sub-folders that are both at a lower level and are within the same branch.
If a robots.txt file is located on a sub-domain, then it cannot address folders and files on the main domain, nor can it address folders and files on other subdomains.
If a robots.txt file is located on a domain, then it cannot address other subdomains on the same domain, nor can it address other domains at all.
... except when a 302 redirect is involved, where Google seems to lose track of where the content actually is and allows the redirecting URL to be listed instead. In that case you can't use robots.txt to remove the offending URL, but you can use meta noindex tags on your own page for Google to see as it follows the redirect through to your own site.
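g1smd's scoping rules above can be sketched as a small Python check. This is my own illustration of the rules as stated in this thread, not anything Google publishes, and the example.com URLs are hypothetical:

```python
from urllib.parse import urlparse
import posixpath

def can_address(robots_url, target_url):
    """Rough sketch of the scoping rules: a robots.txt file can only
    cover URLs on the same host, at or below its own folder. An
    illustration only, not how Google's removal tool is implemented."""
    r, t = urlparse(robots_url), urlparse(target_url)
    if r.netloc.lower() != t.netloc.lower():
        return False  # other domains and subdomains are out of reach
    folder = posixpath.dirname(r.path)  # folder holding the robots.txt
    return t.path.startswith(folder.rstrip("/") + "/")

print(can_address("http://example.com/robots.txt", "http://example.com/page.html"))         # True
print(can_address("http://example.com/sub/robots.txt", "http://example.com/other/p.html"))  # False
print(can_address("http://sub.example.com/robots.txt", "http://example.com/p.html"))        # False
```

The 302 case is exactly the exception: the offending URL is on a host your robots.txt can never reach, which is why the meta noindex trick is needed instead.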
>The other site i run which is in my profile? It's now rapidly climbing in google, yahoo and MSN, no sign of a sandbox AT ALL.<Would you be kind to elaborate more on that?
Well, i kept hearing about this sandbox issue with new sites on the Google engine. Heck, i was TOLD that my new site would be sandboxed by people in here.
Never happened. The site was added to Zeal, Yahoo, MSN and DMOZ 2 weeks before Google, its the way i always do it. When i submitted to Google it was before bourbon kicked off.
Heck, I was even worried about the earlier discussions on the update part 1 thread where some people said that running AdSense would mess up your ranking, so ya know what I done? I sent an email to Google from my workplace, and they sent a reply within a couple of days stating plainly and clearly that running AdSense on a website WILL NOT affect in any way the ranking position of a site. All the spider can see with regards to the AdSense code is a few lines of script, so how can AdSense affect the rank?
I got bitten before by basing all a site's income on a search engine; that's why I got a job with a PC magazine and became self-employed running a business in the PC sales sector. Sure, a lot of people have written in to the magazine I work for and have complained; the complaints are now getting stronger and stronger from all over the UK, and it may get to the point where the magazine does end up running an editorial about the current state of Google, but the advice we send back to the people who complain is simple: diversify.
Just a few hours' work cleaning up your site's code, and a few days' work finding multisource places for either a customer base or hits to your site, can make a large difference. Everything works hand in hand. If a business phones up another business and asks to exchange links on their sites, then you will get people visiting from their site, and Google will latch onto the reciprocal linking.
Sponsoring works along with the old style way of advertising, through the media. TV and Radio is not as dead as people think it is.
As for the rapid climb of the site in my profile? For a few keywords we didn't even think were relevant, the site is now in the top 5. For some of the major keywords the site has climbed from page 30-odd to page 2, and in some cases page 1. Googlebot is deep crawling the site almost every day (which is really messing up the logs), and it's forcing the pages with major content up the rankings. Time is always on the side of the web designer and webmaster. Why?
Think of it: it's the web developer and designer that plays with the new technologies, constantly updating their site, constantly changing code. Never be afraid to make drastic changes to the base code of a site; as long as you have multisourcing for income and hits, then YOU can have Google chase your site and not the other way around. That's what I have been doing since the legend of google.com first appeared.
here are two definite facts for you:
1. I was hit too...just go back and read Bourbon threads 1-3. And I have always followed G's webmaster guidelines.
2. G will NOT come here and tell you what they consider spam so you may as well stop chasing that wild goose.
Clint, do you see why G won't come here and say "yeah, four mentions is OK, but five is too far"? Do you understand the fact they would be shooting themselves in the foot if they ever did anything of that ilk?
It could be the "pursuit of an untamed aquatic ornithoid", but I have to try. ;)
Well, in OTHER spam areas, sure, I understand that. But this is a logical, sensible, and legit question, and I don't understand why they can't answer it, since it has to do with the basic, intelligent layout of a webpage that YOU want THEM to index well. Again, how can anyone follow a guideline regarding how many times a word or phrase is mentioned on a page, if they do not know exactly what the guideline is? See the dilemma? ;) I want to follow a guideline, but if a guideline is non-specific, then I cannot fully follow it, nor do I know HOW to fully comply with it. I WANT to comply, but I don't know if I am, or if I am not.
If someone needs to sell their products online and be found by an SE, I think an SE has a duty to the webmaster to let them know precisely what they allow and what they do not allow, since they can remove your site from their index if you do not follow a guideline. It's like (I'm making this up of course) coming out with a new State Law in some Penal code, then seeing it's "punishable by $5000 fine and or 1 year in jail", then looking at the Law's text and all you see is "There is something you cannot do on Sundays, but we can't tell you what that is". Huh? Well, we all as law-abiding citizens certainly want to avoid this, but what the hell is it that we are supposed to avoid? Now that may not be an accurate analogy, but I think it at least gets my dilemma across. ;)
The robots.txt file cannot address any webspace that is "higher order" than the location of where that robots.txt file is actually located.
What is misleading is that this does not conform to the robots.txt definition. robots.txt should be put only in the root directory and describes the whole server - which happens to be the space 'below', but it is not defined that way (no links allowed - but it is at robotstxt org, just checked, just in case).
Perhaps Google defines it otherwise. If so, there is no problem.
Ya know what, we also sent an email to the AdSense side and got back the same response, except they also forwarded the email to the search engine side of the house. We haven't heard back from that side yet.
People may also be confusing an effect for a cause.
We also launched a new site in the middle of Bourbon, it has been partially indexed, as for there being a sandbox it may all depend.
However IANAGEAPNAY (I am not a Google expert and probably neither are you.)
But, if you want to pursue the analogy for a moment, the only types of "laws" that work from Google's perspective are vaguely worded rules that leave them with substantial discretion in deciding how to interpret the rule in any given factual situation. There are many laws like this, including laws that prohibit you from driving "recklessly" or driving under "the influence of alcohol".
That allows Google to judge whether or not someone is driving "recklessly" or not, or if they are driving under the "influence" or not, given the circumstances.
From Google's perspective, the problem with a more specific rule is that it becomes too easy for webmasters to create sites that are just short of the line--if sites start adapting to fit their guideline, it becomes harder for Google to decide which ones should appear at the top of the SERPs, and which ones farther down.
As well, the line might logically need to be drawn in different locations for different topics or keywords, or types of websites. If Google were to publish detailed numerical guidelines, it would hurt them in two ways: it would make it easier for competing search engines to obtain the benefit of Google's research and experience at minimal cost, and it would make it easier for SEOs to figure out ways to "beat the systems" (defeat the purpose of Google's algorithms).
I searched for a unique phrase from one of the subpages which is listed as supplemental.
G gave me 7 matches. All are scrapers using my snippets or regurgitated G SERPs.
My page does not appear.
Oh boy.
First, who is ever going to admit that they spammed, had hidden links etc? At least not in a thread like this.
Second,
Google can't be specific about the guidelines, because spammers would push the limit.
Third,
Mistakes happen with the algo, many innocent sites get caught in the middle. E-mail GG with the address he provided; it's really nice of him that he is willing to go through the e-mails and remedy them.
Those pages will show on Google, but not my pages. My home page shows in a Google search as URL-only, so I still hope it will get crawled and updated. That is THE ONLY reason, in my opinion; the second possibility is that you're screwed because of a dup penalty.
But are there any webmasters here who are now convinced they won't return in the SERPs because of a DUP penalty (on purpose or accidental)?
Alex
If someone needs to sell their products online, and be found by a SE, I think an SE has duty to the webmaster to let them know precisely what they allow and what they do not allow since they can remove your site from their index if you do not follow a guideline.
Clint, it isn't Google's job to help you sell products online. That isn't a responsibility that Google has asked for or accepted.
Google's mission is to serve users. Telling Webmasters how many times they can say "blue widgets" or "elbonia hotels" on a page wouldn't help users find more relevant search results. It would simply weight the SERPs more heavily toward sites that are trying to influence Google's search results.
What's more, even Google probably couldn't tell you how many times you can safely say "blue widgets" or "elbonia hotels," because the factors in its algorithm don't work in isolation. A high keyword density for "widgets" might help a site that doesn't practice other obvious SEO techniques, while the same keyword density might be enough to nudge a "grey hat" site into "black hat" territory. That's just common sense. One would have to be very naive to think that Google's algorithm consists of a simple checklist--or that Google would ever reveal secrets that would help webmasters and SEOs manipulate its search results.
>> I don't recall seeing any that even vaguely alluded to doing something questionable or sinister to get back their G-SERP's <<
Oh boy.
First, who is ever going to admit that they spammed, had hidden links etc? At least not in a thread like this.
Second,
Google can't be specific about the guidelines, because spammers would push the limit.
Mistakes happen with the algo, many innocent sites get caught in the middle. E-mail GG with the address he provided; it's really nice of him that he is willing to go through the e-mails and remedy them.
It is my understanding, after researching this, that you need 12% of the body text to be original, so if someone scrapes several paragraphs of your content and posts it as their own on a new page with nothing else on the page, either their page or yours will likely be penalized for duplicate content.
I've been dealing with stolen content for a client for several months. My recommendation re posting articles on other sites is that as long as you don't also post your articles on your own website, you should be fine. I would recommend posting the article in a newsletter that is dated; whenever there is a dispute as to who owns it, you can point to that dated newsletter (third-party unbiased proof) as to the original author and earliest date.
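Nobody outside Google knows how its duplicate-content detection actually works (the 12% figure above is one poster's own research, not anything Google has confirmed), but the general idea of measuring how much two texts overlap can be sketched with word shingles. This toy Python sketch uses made-up sentences:

```python
def shingles(text, k=4):
    """All runs of k consecutive words, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def overlap(a, b, k=4):
    """Fraction of a's shingles that also appear in b (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    return len(sa & sb) / len(sa) if sa else 0.0

original = "the quick brown fox jumps over the lazy dog near the river bank"
scraped  = "the quick brown fox jumps over the lazy dog near a shady tree"
print(round(overlap(scraped, original), 2))  # 0.7
```

A scraper page that lifts whole paragraphs would score very high against the source page; whatever threshold an engine actually uses, the point is that largely copied pages are easy to flag this way.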
Annej said:
Interesting what looking for duplicate copies of your webpages brings. I just found a site that sells papers to students. It seems they are selling at least one of my articles.
Write to the host of that company, giving them all the proof they need to tell whether the article is truly yours or not. I have an article on hijackings on my web site and what to do about them. You can get to it through my profile.
Thanks Kgun for posting my url to that article several pages earlier.
If someone needs to sell their products online, and be found by a SE, I think an SE has duty to the webmaster to let them know precisely what they allow and what they do not allow since they can remove your site from their index if you do not follow a guideline.
Clint, it isn't Google's job to help you sell products online. That isn't a responsibility that Google has asked for or accepted.
Hoo boy, HERE WE GO AGAIN. I don't recall saying it was G's JOB to HELP ME sell products online! Let me check again.......nope, I didn't state that. Please read the quote again s-l-o-w-l-y. If an SE removes your site due to a violation of something, all I'm saying is they SHOULD TELL YOU WHY, so you WILL KNOW what not to do in the future!
Google's mission is to serve users. Telling Webmasters how many times they can say "blue widgets" or "elbonia hotels" on a page wouldn't help users find more relevant search results. It would simply weight the SERPs more heavily toward sites that are trying to influence Google's search results.
It most certainly WOULD help users find their searched-for phrases at websites. I've mentioned this before: if some kid writes in their online diary "my computer is acting funny today, I think I may need computer repair", that is "computer repair" mentioned ONCE and ONLY once, and it is a TOTALLY NON-RELEVANT HIT if it shows for someone searching for "computer repair", now, is it not? Of course it is. If anyone has a computer repair website, then they are going to have to mention it more than ONCE, twice, three times, four times......(where does it stop?) in order for THEIR site to be included in the relevant SERPs for a search for "computer repair". So the number of times you mention "blue widgets" can indeed help users find your relevant product or service for their search.
What's more, even Google probably couldn't tell you how many times you can safely say "blue widgets" or "elbonia hotels," because the factors in its algorithm don't work in isolation. A high keyword density for "widgets" might help a site that doesn't practice other obvious SEO techniques, while the same keyword density might be enough to nudge a "grey hat" site into "black hat" territory. That's just common sense. One would have to be very naive to think that Google's algorithm consists of a simple checklist--or that Google would ever reveal secrets that would help webmasters and SEOs manipulate its search results.
Understood. I guess you are used to seeing and knowing about "borderline techniques" or blackhat techniques, and what people will do with them. I'm not. So it never occurs to me to try to dishonestly manipulate SERPs. I noticed lately, during this update, when I was searching for some of my monitored phrases, that I did indeed see some "questionable sites with questionable tactics" at the top of G SERPs. If I have, for example, an information-only website, I'm not about to put mentions of products for sale on my page or in my meta tags in order to get persons who searched for the sales of that product to my site. That will result in nothing but pi$$ing off the visitor, and SEs in the long run.
But it's not "common sense", for if it were, no one else would be wondering this. I've never claimed in the remotest way to be any "SEO guru" or "SEO genius"; I know very little about it, if anything. My field of expertise lies elsewhere. THAT is why I'm asking questions--to LEARN.
Every clue they give to spammers makes their job much harder next time around.
If they say "repeat your keyword exactly four times", then next month when faced with ten million pages all with the keyword repeated exactly four times, how the heck are they going to be able to rank them?
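A toy count shows how mechanical such a rule would be; anyone could pad or trim a page to hit a published number exactly (the page text and phrase here are hypothetical):

```python
import re

def phrase_count(page_text, phrase):
    """Count whole-phrase occurrences, case-insensitively."""
    pattern = r"\b" + re.escape(phrase.lower()) + r"\b"
    return len(re.findall(pattern, page_text.lower()))

page = "Blue widgets for sale. We refurbish blue widgets and test blue widgets daily."
print(phrase_count(page, "blue widgets"))  # 3
```

If "exactly four" were the published target, every spammer's page would read exactly four, and the count would stop distinguishing anything.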
Which is why people read something like WebmasterWorld daily. Some prefer to think they know it all and need no one else, but the rest of us know we can benefit from the perspectives... ranging from brilliant to bonehead... of people all over the world doing similar but definitely different types of work. People who scoff at learning more have a lot to learn.
"As for the rapid climb of the site in my profile?"
You mean the one that doesn't rank in the top 1000 for its prime three word search term? Very puzzling posts.
The analogy to laws is not really valid, for multiple reasons.
Sheesh. Like I said: "Now that may not be an accurate analogy, but I think it at least gets my dilemma across. ;)" Some really read more into statements than they should. It was NOT meant to be taken in that context, or literally! I didn't think I would have to explain that. :)