
Moving Hosting To Get From Under Penalty

         

Super_Chunk

11:56 am on Jul 1, 2010 (gmt 0)

10+ Year Member



Hi, I won't bore everyone with the nitty-gritty details (I posted a thread in the WW Supporters forum about the issue and got some useful advice, but we are still struggling). To cut a long story short, we took a big rankings hit with Google in November 2008 (everyone who has looked into our problem agrees we are being penalised) and have been working hard to get it back ever since. We are very proactive ourselves and have spent more time than I would like to admit trying to fix the problem, eventually seeking advice in forums and hiring an SEO, to no avail.

We feel we have exhausted all avenues and are getting to the point now that we are thinking perhaps Google has taken a disliking to our server (dedicated server but we host a few sites – the site in question has a dedicated IP) and we are considering moving hosting in the hope that this might provide a clean slate (or we could change the IP of our site and stick with our current hosting). We have invested 11 years in our domain name so changing that is not really an option for us.

I am wondering if anyone has any experience of this? Or any advice on possible issues and things to consider? It is an e-commerce site, so there will be some synchronisation issues to iron out while the DNS propagates if we did move hosts - but things have gotten to the point where we are prepared to try almost anything.

Thanks in advance for any input.

tedster

5:35 pm on Jul 1, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Have you investigated your current IP address to see if its record is "poisoned" in some way? It's rare, but it happens.
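If it helps, here's a rough sketch of one way to check that yourself (this assumes a Python environment; the IP address and the blacklist zones are just common examples, not a definitive list). A lookup that resolves against a blacklist zone means the IP is listed there:

import socket

# Rough sketch: query a few public DNS blacklists for an IP address.
# The zones below are common examples - substitute the lists you trust.
# The IP address at the bottom is a placeholder; use your dedicated IP.
BLACKLISTS = [
    "zen.spamhaus.org",
    "bl.spamcop.net",
    "dnsbl.sorbs.net",
]

def check_ip(ip):
    # Reverse the octets (1.2.3.4 -> 4.3.2.1) and query 4.3.2.1.<zone>.
    reversed_ip = ".".join(reversed(ip.split(".")))
    results = {}
    for zone in BLACKLISTS:
        try:
            socket.gethostbyname(reversed_ip + "." + zone)
            results[zone] = "LISTED"
        except socket.gaierror:
            results[zone] = "not listed"
    return results

if __name__ == "__main__":
    for zone, status in check_ip("203.0.113.10").items():
        print(zone, ":", status)

A clean result here doesn't prove the IP is fine in Google's eyes, but a listing is a strong hint the neighbourhood is bad.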

Also, have you checked your pages with the "Fetch as Googlebot" tool to look for any cloaked parasite content?
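Outside of WMT, a quick spot-check is to fetch the same page with a normal browser user-agent and with a Googlebot user-agent and compare the two. A minimal sketch, assuming Python and a placeholder URL (note that user-agent spoofing won't catch cloaking keyed to Googlebot's IP ranges):

import urllib.request

URL = "http://www.example.co.uk/"  # placeholder - use one of your own pages

def fetch(user_agent):
    # Request the page with the given user-agent string.
    req = urllib.request.Request(URL, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

browser_html = fetch("Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
googlebot_html = fetch("Mozilla/5.0 (compatible; Googlebot/2.1; "
                       "+http://www.google.com/bot.html)")

if browser_html != googlebot_html:
    print("Responses differ - diff them and look for injected links or scripts.")
else:
    print("Responses are identical for both user-agents.")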

RP_Joe

12:34 am on Jul 2, 2010 (gmt 0)

10+ Year Member Top Contributors Of The Month



"but things have gotten to the point where we are prepared to try almost anything."

How about a new URL? Just a thought.

Super_Chunk

1:29 pm on Jul 12, 2010 (gmt 0)

10+ Year Member



Apologies for the delay in replying to the helpful advice guys.

Hi tedster, I did try looking into this and didn't find anything to suggest that the IP is poisoned - but I wasn't quite sure where to look. Are there any tools or sites you could recommend for checking whether the IP has been poisoned? Also, I do keep an eye on the "Fetch as Googlebot" tool and everything is squeaky clean.

RP_Joe, thanks for the suggestion; however, as I mentioned in the post, with 11 years invested in the domain name this would be like completely starting from zero. We are doing so badly in the search engines that we fear changing the URL might reduce our sales even further - which is something we cannot risk.

Thanks again for your suggestions.

bwnbwn

2:04 pm on Jul 12, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Super_Chunk, you said "search engines" - so all three are knocking the site down?
If all three are doing it, have you taken a hard look at the other sites hosted on the server to see if there is a possible hack coming from one of them?
I do know we had one site on a dedicated server that had bad links added to it, and that knocked the other sites on the server down. I never could discover how this site was breached. Only one site on that server was hacked, but I don't feel it was a server-wide hack. I think there was an old file on the site that the hack came through, so checking all the old files on the server and deleting the ones not being used (see the sketch below) might be something worth looking into.
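As a starting point, a rough sketch like this (assuming Python and a placeholder document root) will list files that haven't been touched in over a year, which are good candidates for review or removal:

import os
import time

WEB_ROOT = "/var/www"  # placeholder - use your actual document root
CUTOFF = time.time() - 365 * 24 * 3600  # anything older than a year

for dirpath, dirnames, filenames in os.walk(WEB_ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            if os.path.getmtime(path) < CUTOFF:
                print(path)
        except OSError:
            # Broken symlinks and the like - skip them.
            pass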

Super_Chunk

3:28 pm on Jul 12, 2010 (gmt 0)

10+ Year Member



Hi bwnbwn, it fluctuates but we actually do quite well in Yahoo and Bing - it is just Google where we have seen a dramatic decrease over the last 18 months.

We did have a forum which was on a separate domain name but hosted on our dedicated server that had a vulnerability and was receiving spam posts. We got advice on this in another part of WW and took the site offline and manually requested the pages be removed from the index in Google Webmaster tools. This was a couple of months ago however and we have seen no change.

When you had this problem, how did you resolve it? I guess you fixed the site in question and plugged the hole - but did you have to move hosting or take any further measures?

bwnbwn

3:50 pm on Jul 12, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Super_Chunk, the sites on this server are not sites we really work on - it is more of a testing server than anything else - so I wish I could help ya but can't. I totally deleted all of that site's files from the server.

Did you delete the files off your server?

I do know that if you had problems with the server and took the site offline, two months really isn't a long time. If the others are right and the site is under a penalty, it could come out in anywhere from two months to a year.

Do you know how long the bad links were on the site?

HuskyPup

3:53 pm on Jul 12, 2010 (gmt 0)



I see you're in the UK.

Are your sales mainly UK, European or global?

If global are you using a .co.uk?

Where is your hosting based?

I'm just trying to get a handle on what Google may have done to you, since I have a .com site hosted in the UK for an Indian-based company and Google has deemed it to be Indian-only - I can't get it ranking well in the .com SERPs! It's there, but it may as well not be.

A similar time-scale to yours as well.

Super_Chunk

4:47 pm on Jul 12, 2010 (gmt 0)

10+ Year Member



Hi HuskyPup, thanks for the reply. That is very interesting, in answer to your questions -

I see you're in the UK.
Are your sales mainly UK, European or global?

Yes we are UK based. Our sales are mainly for the UK, but we do receive Europe/Worldwide orders. The geographic target in Google Webmaster tools shows United Kingdom.

If global are you using a .co.uk?

Yes, we use a .co.uk

Where is your hosting based?

UK based.

I have read a little about the factors used by Google to decide the geographic target but it is Google UK that we are not doing well in. I could understand a poor ranking in .com if we were doing well on .co.uk

If you are dealing with a similar kind of time scale, I'm sure you can appreciate how frustrating and damaging this sort of problem is.

HuskyPup

5:06 pm on Jul 12, 2010 (gmt 0)



Hi Super_Chunk...quick response since I am going out.

At first glance it looks as though you are being "smart geo-targeted", however that would not explain the poor performance in Google UK.

Anyone else with an angle on this whilst I have a ponder?

Super_Chunk

5:15 pm on Jul 12, 2010 (gmt 0)

10+ Year Member



bwnbwn, yes, all the files are deleted and we followed all of Google's guidelines for removing a site from the index.

I do know that if you had problems with the server and took the site offline, two months really isn't a long time. If the others are right and the site is under a penalty, it could come out in anywhere from two months to a year.


The problem was actually fixed in January, but on further advice we completely did away with the forum. I appreciate that these things can take time to come back, but the frustrating thing is we don't know for sure whether this was the problem.

Do you know how long the bad links were on the site?


I would say at worst the spam links would not have been online for more than 2 months. We were originally deleting the spam posts manually but they increased in volume over time and some were slipping through.

Super_Chunk

5:18 pm on Jul 12, 2010 (gmt 0)

10+ Year Member



HuskyPup, could you or someone else explain "smart geo-targeted" and whether it is a good or a bad thing?

When you click on "learn more" it mentions that you can select "Unlisted" but there is no option to do so - is this what you mean when you say you can't change your own geographic target?

Many thanks.

pageoneresults

5:26 pm on Jul 12, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



We did have a forum which was on a separate domain name but hosted on our dedicated server that had a vulnerability and was receiving spam posts. We got advice on this in another part of WW and took the site offline and manually requested the pages be removed from the index in Google Webmaster tools. This was a couple of months ago however and we have seen no change.


I'm guessing a vulnerable server comes into play here. Are you absolutely positively sure that the current server has no more vulnerabilities? Have you checked your cache pages (if available)? What are the cache dates? Have you fetched your site as Googlebot to be sure that everything is in order?

Two months does seem like an extended period of time for recovery. What type of errors do you see being reported in GWT?

mhansen

6:44 pm on Jul 12, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Just thought I would chime in on the "change of server" aspect, when a penalty is thought to be present.

I have had my fair share of sites slapped around by Goog over the years. I guess you're not really trying hard enough if you don't have a failure every now and again, right?

Anyhow... I have never had a server move alone make a difference to a site.

IE:

- Site seems to be penalized
- Moved it to a dedicated or different server
- Site no longer penalized

Moving it to a new dedicated server made no difference by itself, in my case, unless the issue was a speed issue.

I guess the only thing I wanted to convey with that is: make sure you know, or at least have a better idea of, the issue before spending time on things that may not make a difference (like we did).

Have you used WMT to request reinclusion? It sounds like your forum may have something to do with it... so it may be worth doing a site:domain.com checkup, making sure all references to the forum are gone, and explaining everything about why you think it was penalized, and what you did to fix it... through the reinclusion request form.

MH

HuskyPup

8:12 pm on Jul 12, 2010 (gmt 0)



could you or someone else explain "smart geo-targeted" and whether it is a good or a bad thing?


It's a phenomenon I've seen a few times whereby Google, for whatever reason, gets its knickers in a twist and can't comprehend where a site is from or where it is targeted.

I've also seen this with Bing when it couldn't recognise .eu as being in Europe!

Whether or not the same person created the AdSense Smart-Pricing algo I have no idea but any AdSense publisher here will tell you just how crazy that thing is.

The geographic target in Google Webmaster tools shows United Kingdom.


Certainly you need to change this.

Yes, we use a .co.uk


I'm finding these days that Google does not love a .co.uk for the global market, no matter how old your site is. Obviously some will disagree, but you only have to look at Google.co.uk to see how .co.uk sites dominate the SERPs in many widget sectors, yet try to find them under other Google TLDs and they usually do not exist, let alone compete.

I'm guessing a vulnerable server comes into play here.


Have you used WMT to request reinclusion?


I'm reckoning that you have several issues here and combined they're crucifying you by the sound of it.

Super_Chunk

10:09 am on Jul 13, 2010 (gmt 0)

10+ Year Member



Hi everyone, we really do appreciate the advice so thanks for taking the time. I will answer in order :

@pageoneresults

I'm guessing a vulnerable server comes into play here. Are you absolutely positively sure that the current server has no more vulnerabilities?


The server is fine - the issue was unpatched forum software that had the vulnerability. This is now completely gone, but on your advice we are today going to double-check every single account on the server to ensure there is nothing else we are missing.

Have you checked your cache pages (if available)? What are the cache dates?


Do you mean in the Google cache? New pages are getting cached and cached versions of our pages seem to be no more than a week old.

Have you fetched your site as Googlebot to be sure that everything is in order? .... What type of errors do you see being reported in GWT?


Yes, everything is in order when fetched as Googlebot. No major GWT errors - just a few 404s that I need to get on top of (our domain name is 11 years old so this is inevitable). There are a lot of pages "Restricted by robots.txt", but these are all pages we don't want crawled, so that is correct as far as I am concerned.

@mhansen - thanks for joining in

Moving it to a new dedicated server made no difference by itself, in my case, unless the issue was a speed issue.


This is really good to know from someone who has tried it.

Have you used WMT to request reinclusion?


Yes - two over the last 18 months since the problem started, after pretty much working full time to fix the issue.

so it may be worth doing a site:domain.com checkup


Thankfully since following Google's guidelines this now brings no results :)

making sure all references to the forum are gone, and explaining everything about why you think it was penalized, and what you did to fix it... through the reinclusion request form.


Thanks for this - I believe at the time of the last request we sent there were some bad forum pages still in the index, but we did explain that we thought this was the issue.

We are planning our third request now and were wondering what to put, BUT we can't be sure it was actually the forum.

@HuskyPup

Certainly you need to change this.


Could you explain further? I mean we are in the UK after all, but are you saying we don't want any geographic target set?

A big thanks again to everyone helping out with this, especially with such useful advice. We really do appreciate the help; it's been a long battle!

tedster

6:30 pm on Jul 13, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Certainly you need to change this.


AFAIK, a .co.uk TLD automatically is assigned a UK target in Webmaster Tools and you cannot change that.

bwnbwn

7:59 pm on Jul 13, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Thanks for this - I believe at the time of the last request we sent there were some bad forum pages still in the index, but we did explain that we thought this was the issue.

This should not continue to be the reason if the pages in the index were throwing a 404 when clicked.

Were these forum links in the index throwing a 404 at the time of the last request?
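If you want a quick way to confirm it, something like this rough sketch (assuming Python; the URLs are placeholders for your old forum pages) will show the status code each removed URL now returns - you want to see 404 or 410:

import urllib.request
import urllib.error

REMOVED_URLS = [
    "http://forum.example.co.uk/viewtopic.php?t=123",  # placeholder URLs
    "http://forum.example.co.uk/viewtopic.php?t=456",
]

for url in REMOVED_URLS:
    try:
        resp = urllib.request.urlopen(url)
        print(url, "->", resp.getcode(), "(still serving content)")
    except urllib.error.HTTPError as e:
        # A 404 or 410 here is what you want for pages removed from the index.
        print(url, "->", e.code)
    except urllib.error.URLError as e:
        print(url, "-> connection error:", e.reason)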

Super_Chunk

10:08 am on Jul 14, 2010 (gmt 0)

10+ Year Member



AFAIK, a .co.uk TLD automatically is assigned a UK target in Webmaster Tools and you cannot change that.


Just trying to get my head around this - knowing this, would you always recommend that a business dealing worldwide try to obtain a .com? It seems that if a .co.uk is going to be treated in this way then the targeting can't be beneficial. I mean the company would want to do well in all Google search results, not just UK specific... or am I missing something?

Were these forum links in the index throwing a 404 at the time of the last request?


You hit the nail on the head, bwnbwn. No, they were not returning a 404, which is exactly what is prompting us to send an updated reconsideration request. We were originally ill-advised on how to treat the forum, and since the last request we have successfully followed Google's guidelines on removing all of the pages from their index.

While I have the attention of you guys (thanks) I would like to ask a slightly more general question if I may. Reading above everyone seems to agree that a vulnerable forum posting spam on our server is likely to have caused our current Google situation - but what if it wasn't this? What should we be looking to do?

I am not exaggerating when I say we have spent the last 18 months trying to fix this, and as it stands now no one can see anything wrong with our site. We are an 11-year-old e-commerce business and the thought of walking away from our domain name worries us. I guess what I'm asking is: what would YOU do when you seem to have exhausted all other avenues?

CainIV

7:06 am on Jul 15, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Just trying to get my head around this - knowing this, would you always recommend that a business dealing worldwide try to obtain a .com


Long and short - yes. If you expect to do business globally and want organic positioning in Google.com, you need to consider the options. I own lots of .ca domains (I live in Vancouver, Canada) and the same applies to my websites - consider the full audience first, because Google does lock in those domains on the assumption that you chose them based on the country you wish to do business in.

In terms of the issue, I can tell you that I have had more than my fair share of what appear to be IP-related issues. Switching to a different dedicated IP resolved the problem in more than half of those cases (one of which is recent and is posted here in WebmasterWorld).

It is an easy switch that is worth trying first before proceeding with a host switch.

pageoneresults

1:51 pm on Jul 15, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



There are a lot of pages "Restricted by robots.txt", but these are all pages we don't want crawled, so that is correct as far as I am concerned.


How many pages? I ask because the last thing I would recommend is using robots.txt to prevent crawling. Actually, robots.txt invokes crawling of those URIs and they end up in the index as URI only listings, that is not what you want.

When I find large numbers of robots.txt entries, I typically know that the site is losing equity in this scenario. For example, if you have 10,000 documents that you didn't want indexed and have a robots.txt Disallow for them, that's 10,000 URI entries in many instances. That's 10,000 URI only listings that appear to suck equity from the domain overall.

I'd noindex, nofollow those and get them completely out of the index. Also, if I were a low life down and dirty competitor, I might take all of those robots.txt entries and link to them from a disposable domain with some PR. That should upset your crawl routines a little bit.

No, robots.txt is a black hole. Google broke the protocol when they started showing URI only listings.

tedster

2:10 pm on Jul 15, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Actually, robots.txt invokes crawling of those URIs and they end up in the index as URI only listings, that is not what you want.

I'm going to pick on you here a bit, pageone. It's true that URI-only listings may appear for addresses that are disallowed in robots.txt, but that is definitely NOT because those URIs are crawled.

It happens because there are links that point to those URIs. Once Google records a robots.txt disallow rule, they will not crawl that URI. It might happen rarely because of a bug in the crawl team's code, but that is only about .00000001% of the time.

A noindex,nofollow robots meta tag will keep the URI out of the index, but in order for that to happen, the URI must be crawled regularly, or else the meta tag would never be seen.

So what we have here is probably a vocabulary cross-up between "crawled" and "indexed".

Crawled = request the content from the server
Indexed = make that content available to the search results

Super_Chunk

11:21 am on Jul 19, 2010 (gmt 0)

10+ Year Member



@CainIV

Thanks for the targeting info - I didn't quite appreciate the full impact of this before.

Switching to a different dedicated IP resolved the problem in more than half of those cases


Thanks - I found your post and this is really interesting. Unfortunately this is complicated by synchronisation issues for our database-driven e-commerce site - but if it could fix the issue then it should be worth it.

@pageoneresults / tedster

Thanks both for this - so am I hearing you right: remove the robots.txt disallow rules in favour of a noindex,nofollow robots meta tag, so the pages get crawled and are therefore instructed not to be indexed?
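If that is the plan, a minimal sketch like this (assuming Python; the URL is a placeholder for one of our disallowed pages) should let us confirm a page is actually serving the directive once the robots.txt rule is lifted - either as a meta robots tag or an X-Robots-Tag header:

import re
import urllib.request

URL = "http://www.example.co.uk/basket.php"  # placeholder page

req = urllib.request.Request(URL, headers={"User-Agent": "Mozilla/5.0"})
with urllib.request.urlopen(req) as resp:
    html = resp.read().decode("utf-8", errors="replace")
    # The X-Robots-Tag response header works the same way as the meta tag.
    header = resp.headers.get("X-Robots-Tag")

meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I)
print("X-Robots-Tag header:", header)
print("Meta robots tag:", meta.group(0) if meta else "none found")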

Super_Chunk

1:51 pm on Jul 19, 2010 (gmt 0)

10+ Year Member



Just a correction to my last post. If we are just changing dedicated IP address and not hosting (no nameserver change) then there won't be any synchronisation issues. So if this has worked for you CainIV then it is definitely worth trying. Thanks.
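For anyone trying the same thing, here's a rough sketch of how we plan to confirm the A record has actually moved to the new address (assuming Python; the hostname and expected IP are placeholders):

import socket

HOSTNAME = "www.example.co.uk"   # placeholder hostname
EXPECTED_IP = "198.51.100.25"    # placeholder new dedicated IP

# gethostbyname_ex returns (hostname, aliases, ip_list).
ip_list = socket.gethostbyname_ex(HOSTNAME)[2]
print("A records:", ip_list)
print("Expected IP present:", EXPECTED_IP in ip_list)

Bear in mind resolvers may keep serving the old record until the TTL expires.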

RP_Joe

10:48 am on Jul 22, 2010 (gmt 0)

10+ Year Member Top Contributors Of The Month



Tedster, are you saying that pages not in the index never show up in the SERP?

So if I have a website with 100 pages, but WMT says "11 URLs in web index", are those the only ones that will show up in the SERPs?

Duvash

12:20 pm on Jul 22, 2010 (gmt 0)

10+ Year Member



Yeah, that's what it says... that's the way it should be.
(Although I have heard voices lately saying that pages get indexed no matter what you do.)

tedster

4:11 pm on Jul 22, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



you saying that pages not in the index never show up in the SERP?

That's how I define "not in the index" - you paste the URL into the search box and get back no results. However, WMT data is not accurate enough to know whether a URL is in the index.