Google News Archive Forum

How can I recover from a penalty?
Combined PR0 threads.
tristan

5+ Year Member



 
Msg#: 6587 posted 10:26 pm on Nov 2, 2002 (gmt 0)

Hi,
I own a rather small specialized directory/search engine which was doing quite OK in Google over the last months (it had a PR5 last month), but it looks like the whole site (mainpage and all subpages) has been PR0'd this update...
This is obviously a penalty (last month I had 160+ backlinks, and they're still there).
The only reason I can come up with is that I must have linked to a bad neighbourhood or something like that...
Some weeks ago I posted here asking how you could distinguish between a PR0 as in low PR count and a PR0 as in a penalized site, and the answers I got were that I should analyse the site etc.
Unfortunately, since I'm running a directory/search engine it's almost impossible to do that for every submission I get, so I decided to list everything that was useful for my surfers... guess I was wrong... :(

I've now been busy replacing all direct links with links through a cgi-script, and blocking that script with robots.txt, but I've already tried that in the past with no success (Google still saw them as backlinks; the link format was href="/cgi-bin/link.pl?http://www.google.com/")
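(For readers who haven't set one up: a click-through redirect of the kind described above is only a few lines of code. The sketch below is Python rather than tristan's actual Perl link.pl, which isn't shown in the thread, and the names are made up; it just reads the target URL from the query string and issues a 302 redirect, so the outbound link on the page points at the script instead of the destination.)

# Minimal sketch of a click-through redirect CGI (hypothetical names;
# the link.pl mentioned above is a Perl script and is not shown here).
# Pages link to /cgi-bin/redirect.py?http://www.example.com/ and the
# script bounces the visitor to whatever follows the "?".
import os

def main():
    target = os.environ.get("QUERY_STRING", "")   # e.g. "http://www.example.com/"
    if not target.startswith("http://"):
        target = "/"                              # unknown input -> back to the homepage
    # A CGI script emits its response headers on stdout.
    print("Status: 302 Found")
    print("Location: " + target)
    print()

if __name__ == "__main__":
    main()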

Is there anybody who knows how long a PR0 penalty lasts, and whether you can contact Google so they can un-penalize my site and it can be back in the game next update?

Thanks!

 

Yidaki

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 6587 posted 10:35 pm on Nov 2, 2002 (gmt 0)

>I've been busy now changing all direct links with
> links through a cgi-script, and blocking that one
> with a robots.txt, but I've already tried that
> one in the past with no success (google still saw
> them as backlinks, link format was
> href="/cgi-bin/link.pl?http://www.google.com/")

welcome tristan,
did you check your robots.txt file with a robots.txt validator [searchengineworld.com] to make sure it's in the right format? Google should *not* see this type of link as a backlink!

tristan

5+ Year Member



 
Msg#: 6587 posted 10:53 pm on Nov 2, 2002 (gmt 0)

jep,
"No errors detected! This Robots.txt validates to the robots exclusion standard!"

It only contains this:

User-agent: *
Disallow: /cgi-bin/link.pl

Markus

10+ Year Member



 
Msg#: 6587 posted 11:05 pm on Nov 2, 2002 (gmt 0)

/cgi-bin/link.pl?http://www.google.com/

and

/cgi-bin/link.pl

are not the same. You'll have to disallow the whole cgi-bin folder:

Disallow: /cgi-bin/

Yidaki

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 6587 posted 11:18 pm on Nov 2, 2002 (gmt 0)

nope, markus! I don't have a cgi but other URLs like "here/go?". Google doesn't grab them at all - although the robots.txt says "Disallow: /here/go"! <added>Oops, the line doesn't include the "?" Could this be a reason?</added> But I never use something like "?http://www.andheretheurl.com". I use a database / table with numbered records and look them up and resolve them on click. Maybe Google takes the URL from the anchor link on your directory pages, strips everything before the "http://" and adds this URL to "the new urls" table? One never knows!? ;)
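(Yidaki's guess at the end is easy to illustrate. The snippet below is only a sketch of that speculation, not anything Google has confirmed: a crawler could pull a brand-new URL out of an href like /cgi-bin/link.pl?http://www.google.com/ without ever fetching the script, simply by keeping everything from "http://" onwards.)

# Sketch of the speculation above (not confirmed behaviour): extract a
# candidate URL from a redirect-style href without fetching the script.
def extract_target(href):
    marker = href.find("http://")
    return href[marker:] if marker != -1 else None

print(extract_target('/cgi-bin/link.pl?http://www.google.com/'))
# -> http://www.google.com/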

ciml

WebmasterWorld Senior Member ciml us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 6587 posted 11:42 pm on Nov 2, 2002 (gmt 0)

If /robots.txt says "Disallow: /foo" then you have restricted any URL beginning with /foo

"Disallow: /cgi-bin/link.pl?http://www.google.com/" does not exclude /cgi-bin/link.pl but "Disallow /cgi-bin/link.pl does exclude /cgi-bin/link.pl?http://www.google.com/

Black Knight

10+ Year Member



 
Msg#: 6587 posted 2:45 am on Nov 3, 2002 (gmt 0)

The only way to remove a PR0 for certain (as far as I have ever heard) seems to be to block Google entirely for a while via robots.txt - only once your PR is blank can you remove the disallow, letting Google spider you again from scratch (without the penalty). If whatever caused the penalty is still there, it's pointless to do.
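(For anyone who wants to try this, the usual way to shut out only Google's crawler while still letting other engines in is a Googlebot-specific record in robots.txt, for example:

User-agent: Googlebot
Disallow: /

Once the PR has gone blank, removing those two lines lets Googlebot spider the site again from scratch, as Black Knight describes.)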

startup

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 6587 posted 6:45 am on Nov 3, 2002 (gmt 0)

Are you using Dmoz data for your directory/search engine?
Have you changed hosts?
Did the site get crawled recently?
Is the site generated content?
Was the host down during a crawl?

Banning googlebot via robots.txt will not help. I just had two sites come back at the same time. They were both PR0'd at the same time too. One used banning by robots.txt, the other didn't.

2_much

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 6587 posted 7:08 am on Nov 3, 2002 (gmt 0)

Black Knight, will that work if a site is banned? Or do you think bans are manual?

tristan

5+ Year Member



 
Msg#: 6587 posted 9:11 am on Nov 3, 2002 (gmt 0)

>Are you using Dmoz data for your directory/search engine?

no, I just built my own system to accept/list submitted sites (if they follow some basic rules)

>Have you changed hosts?

no, been with the same host for almost a year; the PR for that site has risen steadily, up to a PR5 last month

>Did the site get crawled recently?

yes, the site even has a fresh tag (which I hope I won't lose, since it looks like a PR0 doesn't affect the listing of fresh-crawled pages)

>Is the site generated content?

It doesn't have pages of text, but it's a directory/search engine with unique content...

>Was the host down during a crawl?

not at all...the 213.* googlebot visited this month, and daily I get visited by the 64.* googlebots

tristan

5+ Year Member



 
Msg#: 6587 posted 9:16 am on Nov 3, 2002 (gmt 0)

>googlebot visited this month

this should be - it has visited me every month - I keep an eye on my logs for it, and it has always visited me for as long as the site has existed

as a side note on the robots.txt and link.pl:
disallow: /cgi-bin/link.pl
seemed to work, since googlebot never fetched links through link.pl, so my guess is that it saw a link (a href) with a script called "link(.pl)" and a URL (the parameter after the ?), and decided that that must be a link to that site.

Black Knight

10+ Year Member



 
Msg#: 6587 posted 9:59 am on Nov 3, 2002 (gmt 0)

Startup said: Banning googlebot via robots.txt will not help. I just had two sites come back at the same time. They were both PR0'd at the same time too. One used banning by robots.txt, the other didn't.

I didn't mean that sites never come back unless you use the robots.txt. Sometimes a site's PR is dropped without it being an actual penalty. Sometimes PR can return without you doing a thing. However, if you have the penalty, and you've already been through the site with a fine-tooth comb, removed everything Google might have penalised and still find the penalty hanging there, then using robots.txt to exclude Googlebot is a last resort that I have never heard fail (though if whatever was penalised originally is still present, the PR penalty can return).

I don't suggest this as a first line of defense. It is a last resort. It works, but then, you have to ban yourself from Google completely and start afresh to do it. Three months of zero Google traffic. Hardly the thing you'd try first. :)

Provisos
It is faster to buy a new domain and submit it than to wait until you are completely out of Google and then resubmit. Therefore this technique is for sites that have built up many links (that would not quickly change to the new domain), built up a brand, or have other sources of traffic that make a new domain more trouble than spending 3 months without Google traffic.

I have taken this from the reports of others, since I have never personally had a site get the PR0 penalty. I'm not gloating; I've simply always played it safe, because most of my clients are not the risk-taking types. Others told me this worked, and people who asked my advice after months of the PR penalty and then tried this found it worked when nothing else had. By all means do try everything else first - leave this till last.

djgreg

10+ Year Member



 
Msg#: 6587 posted 11:11 am on Nov 3, 2002 (gmt 0)

How do you ban a bot completely from a site by robots.txt?
I thought with robots.txt you can only disallow folders, but the index.html does not lie in a folder?

Black Knight

10+ Year Member



 
Msg#: 6587 posted 11:27 am on Nov 3, 2002 (gmt 0)

disallow: / to disallow everything from the root.
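(As a complete file, the minimal version of what Black Knight describes is just:

User-agent: *
Disallow: /

The single "/" is a prefix that matches every URL path on the site, including /index.html, so nothing has to sit in a folder to be covered.)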

djgreg

10+ Year Member



 
Msg#: 6587 posted 11:45 am on Nov 3, 2002 (gmt 0)

ah, of course - stupid question. Thanks, Black Knight

Dante_Maure

10+ Year Member



 
Msg#: 6587 posted 11:47 am on Nov 3, 2002 (gmt 0)

to exclude Googlebot is a last resort that I have never heard fail

I too have heard of this tactic working successfully while never having to (knock on wood) resort to it myself.

That being said, based on direct statements made by GoogleGuy there definitely seem to be instances where a domain can retain a penalty even if the site is taken out of the index for some time.

A few months back there was a lengthy discussion about the risk of inheriting penalties when buying an expired domain.

GoogleGuy confirmed it's possible, and warned webmasters against purchasing a used domain without thoroughly researching its potentially "black" past.

It seems possible that certain manually applied penalties can stick regardless of a site's being out of the index... while automated penalties may be responsive to the robots exclusion tactic above.

djgreg

10+ Year Member



 
Msg#: 6587 posted 11:55 am on Nov 3, 2002 (gmt 0)

I got a penalty on some domains; most of them have gone automatically with the last update, but there are 3 which are very persistent. I think I'll try the robots.txt method.

mellonhead

10+ Year Member



 
Msg#: 6587 posted 2:50 pm on Nov 3, 2002 (gmt 0)

I've recently joined a site which had innocently run duplicate content. It ran a page on www.site.com/offers.html as well as on www.offers.com. Clearly these were duplicate content and therefore both were PR0'd.

I had thought that by just removing the duplicate content and linking straight to www.offers.com, the penalty would be removed - but it doesn't seem to be working.

The problem is that the offers.com page linked back to our home page www.site.com, which is now being affected by it (PR is 5/6 in the directory, but only 2/3 on the toolbar).

Trouble is now, I'm worried that the whole site is going to be affected by this bad neighbourhood thing. Does Google not have the ability to spot when people have just inadvertently made a mistake and allow them to correct it?

All advice welcome!

julinho

10+ Year Member



 
Msg#: 6587 posted 3:34 pm on Nov 3, 2002 (gmt 0)

Yidaki wrote:
But I never use something like "?http://www.andheretheurl.com". I use a database / table with numbered records and look them up and resolve them on click. Maybe Google takes the URL from the anchor link on your directory pages, strips everything before the "http://" and adds this URL to "the new urls" table? One never knows!?


Someone wrote something related to this a few weeks ago, and did it in a fairly definitive way (sorry, can't remember who it was).
If the dynamic link includes the URL of the targeted page, the linked-to page gains PR (and so, I suppose, the dynamic link is seen as a backward link); if the dynamic link makes reference only to an entry in a table, the page linked to doesn't receive any PR (and so, I suppose, the link is not seen as a backward link).

The question and answer were both from the point of view of someone trying to get an inbound link; however, I think the same logic applies if you are trying NOT to give an outbound link.

I hope this was clear (I mean, sometimes it's difficult to express yourself in a foreign language).

Gregory

5+ Year Member



 
Msg#: 6587 posted 5:04 pm on Nov 3, 2002 (gmt 0)

I'm in a similar situation to Tristan. My home page dropped from PR6 to PR4 in September's update. My inner pages were either greyed out or PR0'd. (This I do not understand. Why did Google decide to PR0 my second-level pages while my third-level pages - where all the content is - got greyed out?)
When I go to Google and type

link:www.mydomainname.com - I get NO BACKLINKS whatsoever, which is not right as there are MANY HIGH QUALITY sites linking to my portal.

I believe the reason I got PR0 was shared (virtual) hosting.
Someone did something, now I have to pay for it...

:(

I was advised to move to a new host, which I did on October 12. As soon as I moved, GOOGLEBOT stopped coming (it used to visit my site on a daily basis). Today is November 3rd and I have not seen googlebot since my move on October 12. With the last update (October) my home page lost another PR unit. Now it stands at 3. (Interestingly enough, Google traffic recovered a bit. Unfortunately the traffic is not as targeted as it used to be.)

Now that you guys know my situation, I wonder if some knowledgeable person(s) could address the following points.

~~~~
Assume that the reason for my penalty is not a fault of my own. Assume that it was related to virtual hosting.
~~~~

1) Did I make the correct decision to move to a new host?

2) If the answer to the above question is "YES",

a. do I need to write a letter to GOOGLE explaining that I no longer host with the penalized host? (As if I'm going to get an answer... keep dreaming!)

or

b. There is no need for a letter. With the next update googlebot will see that I have a new host and will automatically remove the penalty.

3) Is it too optimistic to hope that googlebot will find me with the next update in November? I assume it will, because I'm listed in Yahoo, LookSmart and many other directories crawled by googlebot. I have a lot of incoming links (not to worry, they are all theme-related to my site).

4) Now that I've moved to a new host, should I pay an extra $4.00 a month for a unique IP address? If I do get a unique IP address, will it prevent this whole mess from happening again (getting penalized for somebody else's actions)?

Is there anything else I CAN DO? (Please, don't tell me to get a new domain and start from scratch. I'd rather shoot myself :( )

djgreg

10+ Year Member



 
Msg#: 6587 posted 7:17 pm on Nov 3, 2002 (gmt 0)

I think it was the correct decision to move to a new host.
I don't think writing a letter to Google will do any good. They get 1000s of letters every day.
Of course I don't know what will happen, but if the penalty was caused by your host, Google should notice the change and maybe the penalty will be lifted.

jayq

10+ Year Member



 
Msg#: 6587 posted 8:25 am on Nov 5, 2002 (gmt 0)

OK - I have a novel question... who knows FOR SURE that the all-white toolbar is indeed a penalty, and not just really low PageRank?

I've seen domains start all white and go up. I have a couple of sites where the index page is all white but the inside pages show some rank, and you can also freak yourself out by removing the www and appearing to be PR0!

Any thoughts?

tristan

5+ Year Member



 
Msg#: 6587 posted 8:33 am on Nov 5, 2002 (gmt 0)

Well, for my site I know for sure it's a penalty - last index I had a PR5 on my mainpage (and 160+ backlinks in Google); now I have a PR0 on every page of my site (and the links to my site still exist)...

tchannon

10+ Year Member



 
Msg#: 6587 posted 11:32 pm on Nov 3, 2002 (gmt 0)

If someone could review the following site and provide some insight to the below issue, it would be appreciated.

www.example.com

The above site was designed to provide a sense of locality to those seeking widgets service. Our marketing message is that we screen through multiple widget providers that service a certain area (e.g. city, state) and provide our customers with the best service option for that particular area.

We create a static HTML page for each city and state and have designed the web site to make it possible for any person to browse through any state and then city from any page on the web site. We also provide some information on various widget service options as well as some advice on how signing up for widgets can save you money.

Regarding search engine placement, we were appearing in the top 10 for phrases such as "widget location1" or "widget location2". However, literally over the past day or so, we have totally dropped off the search engines, and someone noted that our page rank is now "zero" (they said it was a 4 before). I was informed that this sometimes occurs because of a penalty, or perhaps because an error occurred when Google performed its monthly update (e.g., our server was down).

Also noteworthy: some pages are a "ZERO" while others are not "ranked".

I did some research, and regarding the possibility of a penalty, the consensus seemed to be that a PageRank of zero could be assigned if someone were using automated rank-checking or submission programs, or was automatically and dynamically generating "doorway" pages. Another cause was linking to a site that had a PageRank of zero. On www.example.com, each static HTML page is created independently and is not automatically or dynamically generated; doorway pages are not used; nor could I find anyone on our web site that we were linking to that may have brought this about.

Could someone please advise or provide some insight on this issue? We do not feel that we have used techniques that merit a "penalty." If our server was down, perhaps the next update in 4 weeks will rectify the situation. However, if there is something that we are doing, most likely inadvertently or ignorant of its possible consequences, please advise or provide comments.

Thank you for your cooperation and attention to this matter.

Terence Channon

[edited by: heini at 11:41 pm (utc) on Nov. 3, 2002]
[edit reason] sorry, no urls/specifics please - thanks! [/edit]

fathom

WebmasterWorld Senior Member fathom us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 6587 posted 11:45 pm on Nov 3, 2002 (gmt 0)

Hi tchannon and welcome to WebmasterWorld.

Do these web pages have any backlinks (links from outside your site) pointing to them?

PageRank is built up from external links, so that's a good place to start.

hurlimann

10+ Year Member



 
Msg#: 6587 posted 11:59 pm on Nov 3, 2002 (gmt 0)

Brett will pass by soon and remove your url and contact details and also move your post. See the TOS.

In the meantime, yes, your site is PR0 or greyed out. A "+www.XYZ.+com" search shows the URL but you appear to have no backlinks.

As 99% of the content, other than the links, appears to be the same for 99% of the pages, I would reckon it is a pretty heavy penalty.

My advice:

1) Ditch domain
2) Redo the site and contract someone who knows about SEO to do it.
3) Read the Google threads here. (Many SEs would also ban this site.)

Hope this helps

hurlimann

10+ Year Member



 
Msg#: 6587 posted 12:10 am on Nov 4, 2002 (gmt 0)

1 more thought:
Worth waiting for 3 days just in case it is an update dance problem. I doubt it but it could be.

fathom

WebmasterWorld Senior Member fathom us a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 6587 posted 1:05 am on Nov 4, 2002 (gmt 0)

Although hurlimann brought up some valid issues that should be considered, the fact that "no backlinks are present" means there is no PageRank to receive a penalty.

This doesn't mean that "no penalty exists", nor does it mean a penalty could not be applied in the future.

In all likelihood -- right now: 0 backlinks equals 0 Pagerank, and no penalty has been applied.

Can I be 100% certain of this? No.

But I am 99.9% certain, since you need to be on "Google's radar" - with sizeable SERPs, existing PageRank, exposure and, most importantly, backlinks - to get a penalty.

If none of these exist: you are an orphan and google doesn't penalize "orphans", it just doesn't use them as "quality sites for Google users".

IMHO :)

tchannon

10+ Year Member



 
Msg#: 6587 posted 4:57 am on Nov 4, 2002 (gmt 0)

Everyone, thank you for your replies. I considered the fact that since the content on each page is just about the same, there could be an issue.

However, here is something worth considering:

Take a look at the following URLs:

<url's snipped>

This site appears to have no real backward links either and is essentially the same page. However, this site has not received a "penalty" or anything, and it has retained its position longer than mydomain.com, as it was there before mydomain.com was indexed.

Why has that one not been penalized?

Regards

[edited by: NFFC at 5:16 am (utc) on Nov. 4, 2002]

ScottM

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 6587 posted 5:14 am on Nov 4, 2002 (gmt 0)

It should be.

Bismarck is not in Eastern North Dakota:>)

(By the way - I'd remove the URLs... see the TOS at the bottom of the screen.)
