
My Site went to PR0 Need Advice

         

Net_Warrior

7:56 pm on Jan 5, 2005 (gmt 0)

10+ Year Member



Hi everyone... I'm new to WebmasterWorld. I run a personals site. Being uninformed, I made the mistake of having a few rotten apples in my links list. Now my Google ranking is PR0, with no backlinks showing. I feel our site is quality. I removed all outgoing links to sites with PR0. I would appreciate anyone's advice or a site review. Can my site recover from this, or should I start over with a new URL?

If I start from the beginning again, should I redirect my old site to the new one? Should I just ban Googlebot from my present site, since it has great rankings in Yahoo, MSN, AltaVista, etc.? Do I need to ban the non-Google search engine bots from visiting the new site? I don't know the best strategy. Or should I try to get my site back on good terms with Google? I emailed a reinclusion request today. Thanks!
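[Editor's note: no opinion either way on whether banning Googlebot is wise, but for reference, blocking only Googlebot while leaving all other crawlers alone is a standard robots.txt pattern. A minimal sketch (paths and policy are up to you):]

```
# Block only Googlebot; every other crawler remains allowed
User-agent: Googlebot
Disallow: /

User-agent: *
Disallow:
```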

goodroi

6:14 pm on Jan 7, 2005 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



The reinclusion path will probably take some time, but it is your best bet. Just make sure that you have good inbound and outbound links and unique content. Also, it never hurts to start building up additional sites. While you are waiting on Google, you might want to build specialty sites, like a personals site for divorced people or for Spanish speakers. Having multiple sites will help diversify your traffic. Just be careful with interlinking them. Good luck.

Net_Warrior

6:39 pm on Jan 7, 2005 (gmt 0)

10+ Year Member



Thanks for the reply to my post. Someone on here gave me some good advice to use a code validator on my site. Sure enough, there were some serious code mistakes. Well, I hope that fixing this will help a little bit. I'd be grateful for a coder to check out my site and give me their opinion. Please email me privately and I'll send the URL. Thanks!

walkman

6:50 pm on Jan 7, 2005 (gmt 0)



Bad code generally doesn't cause problems... unless your pages don't load at all. Virtually all of the major sites have "code errors", yet they rank well.

DerekH

9:38 am on Jan 8, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



bad code generally doesn't cause problems

That's as may be, but h**p://www.google.co.uk/intl/en/webmasters/guidelines.html explicitly makes the point "Check for broken links and correct HTML"

I take notice of what they say, and poor HTML generally just means the webmaster couldn't spare a few seconds to validate the code. It's a few seconds well spent.

As a cynical shopper, I work on the principle that if a site's webmaster produces code that's rushed and not quite right, his site's products may be the same... <smile>

DerekH

nippi

9:46 am on Jan 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Bad code can be a problem. I accidentally deleted the </body> tag on one of my sites' home pages. PR went to 0, down from 5. I fixed the problem, and it recovered to 3, then 5, within 3 months. The whole time, though, the page rendered fine; I only realised the problem by doing a code check.

In your case, I'm betting it's bad links. Just delete ALL of them if you aren't sure which one is the bad one, then make sure you vet the new ones more carefully.

Forget gambling, viagra, etc. Those sites usually have no linking scruples, and you will get burnt.

Marcia

9:48 am on Jan 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>Bad code can be a problem, I accidentally deleted the </body> tag on one of my sites' home pages. PR went to 0, down from 5.

I didn't do it, but I've also seen a site go south with messed up code - even worse than the missing </body> tag, it had an extra <head> section by accident.

Net_Warrior

12:46 pm on Jan 9, 2005 (gmt 0)

10+ Year Member



I found that my homepage was missing a body tag. Gosh I hope that was my problem...thanks for sharing.

DerekH

1:39 pm on Jan 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Net_Warrior - a belated Welcome to WebmasterWorld.

Have you tried the W3C Validator (Google it)? It'll check your pages for all sorts of errors, and if you're using templates, it's worth validating one page before they all end up with the same fault <grin>

DerekH
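[Editor's note: the specific failures discussed in this thread (a missing </head> or </body>) can be caught even without the full validator. Below is a toy local check, my own sketch and not a W3C tool, that simply compares opening and closing tag counts for the structural tags:]

```python
# Crude sanity check for the structural errors discussed in this thread
# (missing </head> or </body>); not a substitute for the W3C validator.
import re

REQUIRED_PAIRS = ["html", "head", "body"]

def check_structure(html: str) -> list:
    """Return a list of warnings about missing open/close structural tags."""
    warnings = []
    lowered = html.lower()
    for tag in REQUIRED_PAIRS:
        # "<body ..." or "<body>" counts as an open; "</body>" as a close
        opens = len(re.findall(r"<%s[\s>]" % tag, lowered))
        closes = len(re.findall(r"</%s>" % tag, lowered))
        if opens and not closes:
            warnings.append("missing </%s>" % tag)
        elif closes and not opens:
            warnings.append("missing <%s>" % tag)
    return warnings

page = "<html><head><title>t</title></head><body><p>hi</body></html>"
print(check_structure(page))  # []

broken = "<html><head><title>t</title></head><body><p>hi</html>"
print(check_structure(broken))  # ['missing </body>']
```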

suidas

3:21 am on Jan 10, 2005 (gmt 0)

10+ Year Member



On the subject of bad code: I wrote a Perl script that ran the Alexa Top 100 through the W3C's HTML validator. It's easy to do, but the results may surprise you.

Not *one* of the 100 passed without errors -- anyway, what the W3C calls errors. XHTML and CSS extremists take note: the future is "invalid"; the most successful webmasters do not care for your religious views.

Google certainly didn't pass muster. And, like most others, they also "fail" to declare what version of HTML they were "failing" to write. I was glad to see it. Google appears to write code for users and browsers, not validators. They have also, like most successful sites, continued to use tables, another bête noire of the all-CSS set.

So, write good code. But good code is functional, compact and easy to maintain, not necessarily valid.

nippi

4:53 am on Jan 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's off topic, but I've got to respond.

It's one thing to have:

(1) Crucial HTML errors, such as no <body> tags.
(2) Minor errors, such as <p> tags inside a <span> tag.
(3) Code that is not DOM3 compliant.

I always try to get all three right, though it's not always possible. IE6 does not even render all DOM3-compliant code properly.

The point is, not being DOM3 compliant and having errors in your code are different things. I don't believe Google cares about DOM3 compliance, but they do care about errors.

DOM3 compliance often means less code on the page, so you can present more content per page to Google and come in under 100k, so it's worth doing.

my 2 cents

Boaz

9:42 am on Jan 10, 2005 (gmt 0)

10+ Year Member



Another error that (from personal experience...) managed to bring a PR6 down to PR0 was a missing </head> tag on every page of a site. Everything looked fine from a user's point of view, but understandably Google didn't like it. It took me a while to find the cause, but less than a month after the error was fixed, the site was reinstated in Google where it was before (PR and rankings).

suidas

10:46 pm on Jan 10, 2005 (gmt 0)

10+ Year Member



Well, if we're talking about errors of that level, I'm sure you're right.

It seems to me a penalty for pages with errors could come from two sources: (1) an actual bias against invalid code, or (2) Google's inability to make heads or tails of it. I don't think (1) is very important. If I were Google, I might watch for errors that hurt the browsing experience, just as Google should penalize lots of broken images or links. I suspect that's about the limit of it. With a missing <head> tag, I think Google ought to be in real doubt about what's going on. If they can't understand a page, they can't assess it right, and they could be manipulated.

I agree that, in general, DOM3 is good for size. And discipline is the handmaiden to creativity, so long as it doesn't replace creativity entirely.

Boaz

11:06 pm on Jan 10, 2005 (gmt 0)

10+ Year Member



By "Google didn't like it" I meant Google couldn't digest it; I agree prejudice has nothing to do with it. What I think actually happened was that, due to the missing </head> tag, Google decided all the pages of the site had no content and/or were identical.

Net_Warrior

5:06 pm on Jan 11, 2005 (gmt 0)

10+ Year Member



Okay, so yesterday I received an email from help@google. They assured me my site was not being penalized in any way. Here is a snippet of what they said. Now I'm convinced it was the missing <body> tag on my home page. I thought I'd share this email, since from what I'm told it is rare to get a response from Google at all.

---

From: help@google.com

Thank you for your reply. As we mentioned previously, your sites are not currently penalized, and are included in our search results. To see the results of our search, please visit the following links:

(Links removed)

As you may know, our search results change regularly as we update our
index. However, these processes are completely automated and not
indicative of wrong-doing or penalization of individual sites. Normal
changes you observe may include, but are not limited to, changes in the
ranking of existing sites, sites falling out of the index or getting
dropped for particular keywords, addition of new sites, and fluctuation
between old and new webpage content.

We realize these changes can be confusing. We currently include over eight
billion pages in our index, and it is certainly our intent to represent
the content of the internet fairly and accurately.

While we cannot guarantee that any page will consistently appear in our
index or appear with a particular rank, we do offer guidelines for
maintaining a 'crawler-friendly' site. You can find these guidelines at
h**p://www.google.com/webmasters/guidelines.html. Following these
recommendations may increase the likelihood that your site will show up
consistently in the Google search results.

We appreciate your taking the time to write to us.

Regards,

Rollo

9:10 pm on Jan 11, 2005 (gmt 0)

10+ Year Member



Hi, that's just a form letter... I'm sure it'll get edited out soon by a moderator, but real quick: I got the exact same one. Read nothing into it.

What was your PR before? I really don't think a little imperfect code or even a couple of bad links would cause this, unless you were linking to garbage systematically. Check the code of the top 5 sites for most money keywords and they're rife with minor coding errors.

If it does have a penalty, and it's not a major site that already has brand recognition or a big client base, I'd consider just starting a new site on a new domain. That would probably be the quickest way.

suidas

4:19 am on Jan 12, 2005 (gmt 0)

10+ Year Member



Newbie question. Does Google confirm it when you ARE penalized? If you go to PR0 and ask them about it, what happens?

JuniorOptimizer

12:02 pm on Jan 12, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Suidas,

Yes, they'll respond with something like: "a site can be dropped from the index for a lot of reasons, including penalization".

I think they're having some technical trouble. The last email I sent concerning all my links and PR resulted in them saying they "dispatched a notice to an engineer", which sort of sounded like they're doing something.

inbound

6:32 pm on Jan 12, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Going back to the W3C Validator: it's VERY strict; many top pages have over 100 errors flagged!

I have one site that was written to be extremely compliant, passes AAA Bobby tests and the American disability guidelines, and has code that was hand-checked so it works consistently on every platform it has been tested on. Even though it was given that much attention, the W3C validator still came up with 3 errors that other checks don't show.

It seems the differences between browser compatibility and compliance are irreconcilable. I fixed the errors and found that doing so caused display anomalies, which just goes to show that the whole push for compliance is too strict. I'd rather have pages that work in browsers than completely compliant code.

suidas

9:54 pm on Jan 12, 2005 (gmt 0)

10+ Year Member



Well, if I did websites for SEO chaps -- which I don't -- I'd suggest a Search Engine Validator. It would only flag problems known to trip up search engines.

I imagine there are lots of places where you can submit a page and get keyword density figures, etc.
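[Editor's note: as a toy illustration of the keyword-density idea above (my own sketch, not any particular tool's method; real tools also weight titles, headings, and anchor text):]

```python
# Toy keyword-density calculation: occurrences of each word divided by
# the total word count of the page text.
import re
from collections import Counter

def keyword_density(text: str) -> dict:
    """Map each word to its share of the total word count, rounded to 3 places."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    counts = Counter(words)
    return {word: round(count / total, 3) for word, count in counts.items()}

d = keyword_density("free personals site, the best personals site online")
print(d["personals"])  # 0.25  (2 occurrences out of 8 words)
```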

Rollo

2:25 am on Jan 13, 2005 (gmt 0)

10+ Year Member



A lot of the validator's output should be taken with a grain of salt... I get these on every site:

there is no attribute "BORDERCOLOR"... "HEIGHT"... "BACKGROUND" etc...

They don't mean much. These attributes are inserted by programs like Dreamweaver and work fine.

Forkbeard

5:43 am on Jan 14, 2005 (gmt 0)

10+ Year Member



I can verify that messed up HTML can drop you to PR0. I had a site that was PR4 after the last update that just today dropped to zero after some edits I did last week. When I went in to look at it, I had managed to delete the closing </body> and </html> tags during a sloppy cut-n-paste operation.

suidas

6:49 am on Jan 14, 2005 (gmt 0)

10+ Year Member



Well, personally, I think you should consider other possibilities, i.e., that you've been penalized. But I'll do better than talk.

I'm taking my privacy policy page, which is PR4, and removing the closing </html> and </body> tags. If, after a month or two, it's PR0, you owe me a beer.

Deal?

Rick_M

3:06 am on Jan 21, 2005 (gmt 0)

10+ Year Member



My main domain just went to PR0 yesterday, right around midnight. I've had the domain for several years, and on Sept 23rd I took a hit from a mild form of a filter, with all my pages ranking anywhere from 5 to 20 spots lower than usual, effectively cutting my traffic by 80%. I contacted Google and got a variety of canned responses about checking Google's guidelines, cloaking, and buying links (I think my site is fairly white hat, and there's certainly nothing that should cause a PR0). I did have a lot of pages generated from affiliate datafeeds, which I immediately blocked with my robots.txt file. The last communication I had from Google, in December, said the information was being forwarded to an engineer and that I should wait 6-8 weeks to see any potential changes (it was not clear they were doing anything). They never confirmed that my site had a penalty either.

In mid-to-late December, I thought it might be the canonical page / duplicate content issue, so I removed any pages that were showing as duplicate content, as well as a large number of similar-content pages (print-friendly versions, etc.). Finally, I made sure that my domain name without www redirected to the www.domainname.com version with a 301 redirect, as I was seeing both versions in the search results at times. I used the URL removal tool to get them immediately taken out.

So, now 2 days ago, my site completely drops out of Google, gone from the directory (I'm in several DMOZ cats), and PR0. My site is still getting spidered by googlebot, but I don't know if that is normal behavior for a PR0 site or not.

Anyone know if PR0 domains usually continue to get spidered by googlebot (but not ranked)?

I again emailed help@google.com yesterday but have gotten no response. I don't know if that is because they are all on the slopes this week, or because they won't respond as my domain is now banned.

I have a feeling I did something to get the boot and that pursuing this is going to ultimately be futile. At some point I'm going to just have to call it quits on this domain, and continue to develop my site from a new domain.

JuniorOptimizer

10:47 am on Jan 21, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Rick_M, when the site: command shows no pages and the spider stops visiting, it generally means you're banned.

Rick_M

1:54 pm on Jan 21, 2005 (gmt 0)

10+ Year Member



Well, for my site, the site: command shows nothing found, my site is PR0 everywhere, and I'm removed from the directory, BUT the spider is still visiting.

I think the domain got banned. I'm guessing that when I communicated with Google, they manually reviewed my site, which is what I had expected, and things that I felt were within the Google TOS they felt were too artificial, so they banned my site. The main thing I'm thinking of is the sitewide links to friends' sites that are off topic. I did not think that off-topic links would warrant a ban, but if they thought someone was buying links, then perhaps. I'm still not sure, and hopefully the domain will come back. There is a lot of unique content and a fairly busy discussion forum with very unique discussions, and I have to decide whether I want to move it all to a new domain or not. I'm thinking I will start now rather than waste more time, and be careful to be TOTALLY clean from the start. Maybe in 4-5 years the new site will rank where the old one did? Good thing I never quit my day job to do this full time.

Rick_M

2:42 am on Jan 22, 2005 (gmt 0)

10+ Year Member



About a month ago, I wanted to get rid of the indexed pages without www, so I added the following to my .htaccess file - does anyone think that could lead to a PR0 penalty and a ban based on a false assumption of duplicate content? (grasping at straws)

RewriteEngine on
RewriteCond %{HTTP_HOST} ^domainname\.net
RewriteRule ^(.*) [domainname.net...] [L,R=301]
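[Editor's note: the forum software truncated the URL in the rule above, so it is left as posted. For comparison, the textbook non-www to www form of such a rule, with "domainname.net" purely as a placeholder, would look something like this:]

```apache
# Hypothetical sketch: send every request for the bare host to the www
# host with a single permanent (301) redirect, preserving the path.
RewriteEngine on
RewriteCond %{HTTP_HOST} ^domainname\.net$ [NC]
RewriteRule ^(.*)$ http://www.domainname.net/$1 [L,R=301]
```

A correctly configured rule of this shape is routine and, by itself, should not trigger a duplicate-content penalty; it exists precisely to consolidate the two hostnames.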

If that is not the issue, and my site has indeed been penalized, I'm wondering what people think is the best of the following options:

1. Start all over, edit all of the unique content to be slightly different again, and then contact all people who have linked to me and ask for them to move their links to the new appropriate pages

2. Same as #1 above, but use a domain name I registered a few years ago and just haven't used

3. Use a new domain, just copy all of the content over to the new domain, then do a 301 redirect from the old site to the new, and also ask all people who link to me to change their links

4. Use a domain that is a few years old, but hasn't been used for anything and do the same as #3

5. Remove virtually everything from my current domain, except just some bare content, with absolutely nothing that could draw a penalty, and ask Google to re-include my domain?

How long do people think I should wait before doing any of the above to see if there is still some chance that Google just choked on my domain and it will get re-indexed without me doing anything?

OptiRex

2:59 am on Jan 22, 2005 (gmt 0)



suidas

>They have also, like most successful sites, continued to use tables, another bête noire of the all-CSS set.

What do you mean by this?

I use CSS and tables together and how I please...have I misunderstood your statement?

.incs inside tables inside .incs inside tables...wonderful...:-) CSS has made my life so much easier it is unbelievable.

Shurik

5:31 am on Jan 22, 2005 (gmt 0)

10+ Year Member



Hey Rick_M, believe it or not, exactly the same thing happened to my site a week ago. In cleaning up my dup pages, I submitted a request through the URL removal tool to remove all pages from my domain without www. I made sure that mydomain.com was not redirecting to www.mydomain.com. Google queued the request correctly (for mydomain.com only), but in 3 days it wiped out all the pages from both hostnames! I suspected something like that could happen, but the desire to restore recently lost rankings was too overwhelming.

And, same as in your case, Googlebot keeps coming, but the site is gone from the index without a trace. I've sent 2 reinclusion letters, but I guess we all know what follows. The good news is that the new MSN just started sending 10 times more traffic than I was ever getting from Google. My only regret is that if the delisting continues, I will lose most of my reciprocal links, which I spent lots of time acquiring, and that will eventually affect my position in Yahoo and MSN.

G does no evil -- it's just us stupid mortals who can't figure out its mysterious ways...

This 32-message thread spans 2 pages.