
Fetch as Googlebot to remove over-optimization penalty?

   
10:28 am on Dec 10, 2013 (gmt 0)

5+ Year Member



Hi

I was hit on Nov 26th, with several subdirectories being removed from the index.

I had not been doing any external link building etc.; all content is unique.

But there is one thing that had been done to the site, based on a Matt Cutts video on internal linking: I added keywords to a proportion of the internal links within the site.

4 days later, I lose several subdirectories.

I have to assume I have crossed the Penguin boundaries and hence the punishment.

However
I corrected the links almost immediately, and yes, I know Google will not access these pages as regularly as normal.

So I have been testing using Fetch as Googlebot to tell Google that I have changed the pages back.

Of course these pages have not yet returned, so I am wondering if anyone has done this before and whether it worked?

Is it still a case of just needing to wait until Google indexes naturally, or would Fetch as Googlebot have worked, or are there other things at play?
12:54 pm on Dec 10, 2013 (gmt 0)

WebmasterWorld Administrator 5+ Year Member Top Contributors Of The Month



I do not think it was Penguin, as Penguin has to be run separately.

You have probably hit some Over Optimisation algo threshold or perhaps you were hit with something like Google's Rank Modifying Patent for Spam Detection [webmasterworld.com] where the algo is trying to identify changes that attempt to manipulate rankings. I suggest you read the above linked thread for details.

Since you have changed your pages back, all that you can do is now wait - if rank modification is in play, your patience may be tested a bit.
1:45 pm on Dec 10, 2013 (gmt 0)

5+ Year Member



mmmm I thought Penguin was part of the algorithm now.

There were a few sites that were hit on that date, although Google denies anything was happening.

This was not extreme linking; I can see competitors with similar thresholds, and there is nothing else on the page that I could suggest would be seen as spam.

Looks like I will have to wait and see.

Thanks
Mark
2:03 pm on Dec 10, 2013 (gmt 0)

WebmasterWorld Administrator 5+ Year Member Top Contributors Of The Month



mmmm I thought Penguin was part of the algorithm now.
Panda is. From what I understand, Penguin still needs to be run separately.

Looks like I will have to wait and see.
Yes, this is what I would do.