Forum Moderators: Robert Charlton & goodroi


Post Update Site Repair Planning


robert76

2:10 am on Mar 1, 2011 (gmt 0)

10+ Year Member



In searching through our data to try and find clues to our downfall in position, I've come across numerous spam-type sites, each linking to a different keyword on our site. I'm not sure whether it's the cause of the downfall or just a contributing factor, but I doubt it is good for us.

Many of these sites are blogs with generic product names like widgetsrus.net. They are heavy with AdWords and/or affiliate links. The link to our site points to a product or category related to the name
of the blog, such as red widgets, and appears on the blog under headings such as More Info, Related Sites, and Helpful Links.

The same link appears on every page of the blog and some of them have many thousands of pages.

What do I do about this? Is this important enough to be part of my site repairs? If they are doing something wrong, what exactly is it? It's not as if they are using our site name or our original product descriptions, which would be an intellectual property infringement. Can you prevent somebody from merely linking?

Of course, the Whois data is all hidden so it's not even easy to contact them and ask them to stop.

Could these be related to our downfall in Google?

tedster

2:32 am on Mar 1, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I wouldn't worry about backlinks at all for this update. If they're directly under your control, sure fix anything that looks bad. Otherwise, I don't see the gain.

Instead, I would focus on content. Anywhere I had cut corners with the content, I would improve it. Especially prioritize the pages that lost rankings the most - my feeling is that this update did not target by site but rather by page.

We computer folk are always looking for improved efficiency through automation. But that motive, on its own and left unchecked, can start to mess with the real humans in our audience.

tedster

2:57 am on Mar 1, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



One of our members messaged me with an idea. Remember the leaked internal training documents from Google? A couple of leaked versions of the Spam Recognition Guide for Raters have made the rounds over the past few years.

Now, Google has no doubt moved on in the intervening time, and they understandably did not like having those guides leaked. But even today, the language they used gives us a top-level idea of how Google thinks about content that deserves to rank.

The idea is that quality raters were supposed to grade the URLs that appeared in a SERP for a specific query phrase. The top three grading categories back in those days were called Vital, Useful, and Relevant.

Even back then, quality was more important than relevance - for human raters.

Vital: Pretty clear. It essentially means "this search result should definitely include this page."

Useful: They often have some or all of the following characteristics: highly satisfying, high in quality, authoritative, entertaining and/or recent. Useful pages are pages you trust.... Useful pages are usually high quality: they are well organised and up-to-date. Useful pages are 'as good as it gets' for queries that do not have Vital pages.

Relevant: Might be less comprehensive, come from a less authoritative source or cover only one important aspect of the query. Relevant pages must have some utility for the user, in addition to being on-topic. These pages are average to good, but are not 'as good as it gets.'

-----

There's no special sauce leaked in there, just a mindset. But you might well examine any pages that took a rankings hit with those ideas in mind. Do those pages aim to be the very best they can be? Or are they just trying to do the bare minimum needed to rank?

So I think the biggest message here is "Don't cut corners" when it comes to content.