|Page Rank zero|
need quick tutorial
| 11:21 am on Nov 2, 2003 (gmt 0)|
Would someone give a pitiful newbie a little help on PR Zero? From what I've read from Google Guy and others, you get a zero PR when a site is brand new, or when Google penalizes you. What are the likely reasons for being penalized? Can someone offer (or point me to) a brief tutorial?
| 6:29 am on Nov 4, 2003 (gmt 0)|
Most importantly, I would say that most PR0 pages are natural PR0 pages.
There are no certainties about when a PR0 penalty might occur.
Furthermore, penalties might expire, and penalties might relate only to certain factors.
Best to keep to what Google refers to as its "Quality Guidelines".
You can also do a (Google) search on PR0 or PR zero penalty within the WebmasterWorld site for a historical overview.
Heavy interlinking between sites or participating in certain link farms used to trigger a PR0.
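To see why most PR0 pages are "natural" rather than penalized, it helps to look at the published PageRank formula itself. The sketch below is a minimal power-iteration implementation (the graph, the page names, and the damping factor d = 0.85 are illustrative assumptions, not anything Google has confirmed about its live system); it shows that a page with no inbound links receives only the small baseline share of rank, with no penalty involved.

```python
def pagerank(links, d=0.85, iterations=50):
    """Minimal PageRank power iteration.

    links maps each page to the list of pages it links to.
    d is the damping factor from the original PageRank paper.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform rank
    for _ in range(iterations):
        # every page gets the (1 - d) / n baseline each round
        new = {p: (1.0 - d) / n for p in pages}
        for p, outs in links.items():
            if outs:
                # a page splits d * rank evenly among its outlinks
                share = d * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:
                # dangling page: spread its rank evenly over all pages
                for q in pages:
                    new[q] += d * rank[p] / n
        rank = new
    return rank

# Toy web: "a" and "b" link to each other; "new" is a fresh page
# that links out but has no inbound links yet.
web = {"a": ["b"], "b": ["a"], "new": ["a"]}
ranks = pagerank(web)
```

In this toy graph, `ranks["new"]` settles near the (1 - d) / n floor, well below "a" and "b" -- exactly the situation of a brand-new site before anyone links to it. On the toolbar's logarithmic 0-10 scale, that floor displays as PR0 with no penalty in sight.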
| 11:59 am on Nov 5, 2003 (gmt 0)|
Thanks to Vitaplease for this info. Read the Google guidelines again. As some have addressed earlier, according to Google, "There Shalt Be No SEO" ... which is really silly at heart. Even Google would admit that SEO is ethical if it improves the relevance of a site that would otherwise be lost in the rankings to less relevant sites. And, given that the optimized site is AS RELEVANT as the other top listings, any advantage you'd gain would be based on something called "competition." Imagine applying Google's "guidelines" to the offline world; they'd be essentially saying "There Shalt Be No Advertising -- all products and services should speak for themselves." That might please many people, but would also defeat many legitimate vendors and causes ...
| 1:43 pm on Nov 5, 2003 (gmt 0)|
Google is happy for sites to be optimised. They even encourage it with their recommendations.
What Google doesn't allow is 'spamming': pages pretending to be what they're not, or pretending to have content that they don't.
Read around the subject a bit and I'm sure you'll come to understand the difference.
| 4:53 pm on Nov 5, 2003 (gmt 0)|
I have done a bit of reading and I'll hold my ground. Let's look at some of Google's specific language from their guidelines page:
Quality Guidelines - Basic principles:
|"Make pages for users, not for search engines. Don't deceive your users, or present different content to search engines than you display to users." |
-- No argument here; clearly one shouldn't optimize a site for "widgets" when it's really a porn site. No one is arguing with the standard of "relevance."
|"Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you. Another useful test is to ask, "Does this help my users? Would I do this if search engines didn't exist?" |
-- This depends on what you mean by "tricks." Does putting my keyword twice in a Title tag constitute a "trick"? If in fact my site IS relevant, and AS relevant as any other site, don't I have the right to compete as long as I don't deceive or do damage? If the search engines didn't exist, I'd be posting simple text and graphics pages. Relevance would be determined by human moderators. Chaos would ensue -- like DMOZ.org.
|"Don't participate in link schemes designed to increase your site's ranking or PageRank." |
-- What is a "scheme"? Does that include simply getting a link from a legitimate related source? What if other sites are already linking to me -- should I ask them to disconnect?
|Don't use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our terms of service. Google does not recommend the use of products such as WebPosition Gold™ that send automatic or programmatic queries to Google. |
-- Here Google is giving away their real concern: their interest is not so much in the quality of the user's experience as it is in saving on their "computing resources."
|Quality Guidelines - Specific recommendations: |
"Avoid hidden text or hidden links."
-- Since these don't directly affect the user's experience, what difference does it make? Is it that "computing resources" thing again?
|"Don't employ cloaking or sneaky redirects." |
-- Someone in another posting pointed out that Google is the king of using redirects. Again, what difference does it make other than the "computing resources" thing (call it CRT)?
|"Don't send automated queries to Google." |
|"Don't load pages with irrelevant words." |
-- How would that be possible? I challenge anyone to show me a page without an "irrelevant" word. What Google is trying to say is "Don't load pages with 'deceptive' keywords." But, again, no one is supporting deceit.
|"Don't create multiple pages, subdomains, or domains with substantially duplicate content." |
-- CRT; but also, consider the stupidity of this. (And I'll admit I'm still trying to figure out what Google means by 'duplicate content.') Any respectable product site most likely has nearly 'duplicate' pages, each illustrating and detailing one of many products. In fact, isn't it to the user's benefit that these pages have common nav elements and a consistent look, feel, etc.? Google is likely trying to dismiss "doorways," or pages that differ only in their focus on specific keywords, but, again, if such pages serve to bring users to a RELEVANT site, then the issue isn't the quality of the users' experience; it's CRT.
Google's primary interest is in profitability, just like the rest of us. But we shouldn't consider these to be "ethical" guidelines -- just a form of commercial self-interest.