Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Lurching from one penalty to another

         

rosebres

6:57 pm on Sep 10, 2009 (gmt 0)

10+ Year Member



We seem to be lurching from one penalty to another.

In June, it was a 'duplicate content problem', and we managed to trace it to a directory in China that had hijacked our website. After we filed a DMCA complaint with Google and the offending webpage was subsequently removed, our traffic and sales returned to normal.

This time, we're not sure where the problem is. We discovered that our whole inventory has been loaded onto a relatively new shopping search engine without our knowledge.

Can this trigger a 'duplicate content / excessive link' penalty from Google? There is a link from that website to each of our product pages, plus a pop-up copy of each of our product descriptions. We have blocked that shopping site's spider in our robots.txt but nothing happens.

Do short meta descriptions and short / duplicate title tags on our website invoke some form of penalty? Any advice most appreciated.

[edited by: tedster at 8:50 pm (utc) on Sep. 10, 2009]

tedster

9:17 pm on Sep 10, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There are several relatively new shopping search engines that spider other sites for products - I've never heard of them causing a problem for the original site, especially not their back links. Google has no problem knowing who they are.

Do short meta descriptions and short / duplicate title tags on our website invoke some form of penalty?

Not a true penalty, no - but it can make for some problems getting good rankings, or seeing the snippet you want (in the case of meta descriptions).

We have blocked that shopping site's spider in our robots.txt but nothing happens.

Are they ignoring the robots.txt and continuing to spider your server? Take it up with them directly.

Blocking a spider is not the same as getting already spidered content taken down. If that's what you want to see happen, again you will need to deal directly with that site.
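For reference, a minimal robots.txt rule for shutting out a single spider looks like the sketch below - the User-agent token is a placeholder, since the actual name the shopping site's bot sends isn't given in the thread:

```
# Block only the offending shopping spider (placeholder name)
User-agent: TheShoppingBot
Disallow: /
```

Keep in mind this only asks well-behaved crawlers to stay away; it does nothing about pages they have already copied.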

jd01

9:32 pm on Sep 10, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Adding to what tedster said about taking it up with them directly: if you know the User-Agent string they use (it will probably contain the spider's name) and they are not obeying robots.txt, you can use this in your .htaccess file, provided you are on an Apache server with mod_rewrite enabled. (It's one of my favorites.)

RewriteEngine on
# Match requests whose User-Agent contains the bot's string (case-insensitive)
RewriteCond %{HTTP_USER_AGENT} TheBadBotUserAgentHERE [NC]
# Answer any such request with 410 Gone
RewriteRule .? - [G]
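If mod_rewrite isn't available on your host, a mod_setenvif sketch does much the same job - it returns 403 Forbidden rather than 410 Gone, and the User-Agent string is again a placeholder:

```
# Tag requests from the bot, then refuse them (Apache, mod_setenvif + mod_access)
SetEnvIfNoCase User-Agent "TheBadBotUserAgentHERE" bad_bot
Order Allow,Deny
Allow from all
Deny from env=bad_bot
```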

rosebres

8:33 pm on Sep 17, 2009 (gmt 0)

10+ Year Member



Thanks Tedster and jd01,

Update on our situation: I discovered today that in the middle of August our website received a huge number of spammy links.

As soon as the links were spidered by Google, the pages were taken down; I can only see the 'cache' version of those pages.
The perpetrator has used several domains - the same name with different prefixes, such as
ucthedodgysite.com, obthedodgysite.com and twthedodgysite.com

What is the best way to resolve this problem - spam report or reinclusion request?

The only problem is that, according to Google Webmaster Tools, our website has 400 products with short descriptions that need amending. Can I still send a reinclusion request?

tedster

9:52 pm on Sep 17, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sure, you can send a reconsideration request even though you have some short meta descriptions. Document the spammy backlinks on a URL and include that URL in your request. Check out the reconsideration tips video [webmasterworld.com] so you do the best job you can.

rosebres

10:56 pm on Sep 27, 2009 (gmt 0)

10+ Year Member



Thank you Tedster. The penalty has been lifted, but it seems 'someone' is definitely making a concerted effort and spending a lot of time trying to damage our website.

Today I discovered that our site has received another 280+ links from the comments section of a website. I have notified the owner of that website and sent another reinclusion request to keep the Google team busy.

As you can imagine, we are absolutely exasperated, since this problem is bordering on the ridiculous. But I would like to give the perpetrator - a professional SEO hitman - 'a run for his money'.
Any idea how I can sabotage his efforts? If I were to pursue more quality links to create a firewall around our website, would that help?

Our link profile is weak because I have great difficulty getting one-way links from other relevant websites. It is far easier for me to get quality one-way links from unrelated websites such as universities and professional organisations.

tedster

11:03 pm on Sep 27, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Glad to hear your penalty was lifted. Yes, backlinks from quality, trusted sites are a good thing - and a protective element, too - even if the topic of the linking page is a bit of a stretch.

rosebres

8:23 pm on Sep 30, 2009 (gmt 0)

10+ Year Member



Tedster,

The 1st reinclusion request has been reviewed and the penalty appears to have been lifted, but Google has still not removed the first batch of spammy links from its SERPs.
Should I resubmit?
I have uploaded the list of spammy links to a new URL (a new blog on blogspot.com).
Perhaps it would have been better if I had listed these links on our actual website and blocked spiders from crawling that page, to avoid two-way links?

tedster

8:50 pm on Sep 30, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You definitely should not let Google spider and index that list of bad links. How you take care of that technically is up to you.
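One common way to handle that - a sketch, assuming you host the list on a page you control - is a robots meta tag in the page's head, so Google staff can still fetch the URL for the reconsideration review but the page stays out of the index:

```
<!-- in the <head> of the bad-links list page: keep it out of the index -->
<meta name="robots" content="noindex, nofollow">
```

Don't also block that page in robots.txt in this case - Googlebot has to be able to fetch the page to see the noindex tag.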