
Added a duplicate test page - big ranking drops followed


gouri

7:50 pm on Jan 31, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Say you have a 10-page site. I added a page to it that was a copy of one of the ten pages (an inner page), with a file name slightly different from the original's. The purpose was to test font appearance and layout. I tested a few possibilities by adding to the template's CSS (the template already comes with some CSS, and I added to it and changed a little of the existing code).

I then published, so the site had 11 pages. Then I would remove the page I had added and publish again, so the site was back to the original. I did this several times, testing different fonts and layouts, and at the end of all the tests went back to the 10-page site. So now only the original CSS that came with the template is there. During this time the site could have been visited by Googlebot; I am not sure, and I am not sure if that matters.

The next day I noticed a drop for some long-tail keywords the site ranks for. They were not necessarily keywords targeted by the page I copied, but keywords another page targets. A couple of days later, the main keywords the site is trying to rank for (the ones the homepage ranks for) saw big drops: one of 100 positions and one of over 250 positions.

Is this something that can happen? What could be the cause? Is there something that I can do about this?

I don't know what I have done that Google doesn't seem to like.

tedster

1:40 am on Feb 1, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



First, as I am sure you understand, it's not a good idea to serve a test page in a way that Google can index it - the best practice is to put it in a password protected area.

That said, creating and removing one "duplicate content" page should not affect other rankings. I have trouble seeing how one duplicate page could possibly cause this kind of trouble.

More likely is one of these reasons:

1. You are making too many and too frequent changes altogether and are being hit with a penalty for trying to game the results.

2. Your site is relatively new and lost some of its "honeymoon" factor.

3. Google changed the algo in a way that affected your particular rankings.

caribguy

1:48 am on Feb 1, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



remove the page I had added and publish again,

I have experienced this, and I can understand why that page (and its copy) would drop, but not the site as a whole. It's generally not a good idea to do your testing on a live, public server.

g1smd

1:50 am on Feb 1, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



For testing, set up a password-protected folder or subdomain on your site, or install Apache on a local PC and use it as localhost.

gouri

2:27 am on Feb 1, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



1. You are making too many and too frequent changes altogether and are being hit with a penalty for trying to game the results.

2. Your site is relatively new and lost some of its "honeymoon" factor.

3. Google changed the algo in a way that affected your particular rankings.

About No. 1: that day (Wednesday), I was making changes for a period of about two or three hours, constantly testing. I must have published many times in that period, whether I was testing different possibilities on the test page and publishing, or removing the test page and publishing the site so it was back to the original. The next day, Thursday (it was Thursday morning that I noticed the drop for the long-tail phrases), I must have also tested a couple of times in the afternoon, and I actually made a change to the original page (not the duplicate; that change has stayed). Today, Saturday, I noticed the big drop for the keyword phrases the homepage is trying to rank for.

The day before I began testing this page (Tuesday), I noticed that one of the site's inner pages did not have a cached version in Yahoo when I used the link: operator. So I added a word to that page and published, hoping Yahoo would see the change and cache the page. I removed the word later that day and published again. The next day (which I believe is the first day I started testing the duplicate page), I still did not see a cached version of the page in Yahoo, so I think I added a word again, published, and then removed it.

To summarize, there was a lot of publishing of the site this week, a lot more than usual, with the majority of it done on Wednesday and Thursday. Do you think that, even though I wasn't trying to game anything, Google might see all this publishing differently?

About No. 2, I don't think that could be it, because just at the beginning of this week I moved up for some long-tail phrases I was targeting, and the site is over a year old.

About No. 3, I really don't know enough about the algo to say much about it, but has there been some sort of change in the algorithm between the beginning of this week and Thursday?

If you can tell me what you think about what I wrote about No. 1, it would be a big help, and maybe No. 3 as well, though I am not sure No. 3 is it. I am trying to figure out what may have happened and whether there is anything I can do about it.


gouri

7:11 pm on Feb 2, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I don't think I can set up a folder on my site, so I was wondering: could I make a copy of a page and password protect it? Or, if that is not enough, could I password protect the copied page and also add a disallow rule for it in robots.txt?
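
The rule I have in mind would look something like this (the file name here is only an example, not my real page):

# robots.txt -- file name is a placeholder for the test copy
User-agent: *
Disallow: /test-copy-of-inner-page.html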

g1smd

7:13 pm on Feb 2, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If you have FTP access you can make a folder.

If the server runs Apache, it is very easy to set up password protection: you put a .htaccess file inside the folder and point it at a .htpasswd file holding the username and password.
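
As a rough sketch, assuming a test folder called /private-test/ and a typical shared-hosting path (adjust the folder name and path to your own account):

# .htaccess placed inside the /private-test/ folder
AuthType Basic
AuthName "Private Test Area"
AuthUserFile /home/youraccount/.htpasswd
Require valid-user

The .htpasswd file itself is best kept outside the web root. If you have shell access, the htpasswd tool that ships with Apache can create it; otherwise many hosts provide a control-panel equivalent:

htpasswd -c /home/youraccount/.htpasswd testuser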

gouri

7:14 pm on Feb 2, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I don't have access to the host's root files. I think that might be necessary in order to do what you said, but I am not sure.

g1smd

7:42 pm on Feb 2, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



You don't need that. You only need FTP access to your webspace.

ov3rrun

6:19 am on Feb 3, 2009 (gmt 0)

10+ Year Member



Just put the test page behind password protection, or use this meta tag on your test page:

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
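
It goes in the head section of the test page, for example (the title text is just a placeholder):

<html>
<head>
<title>Test copy - not for indexing</title>
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
</head>
<body>
... page content being tested ...
</body>
</html>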

cheers,

gouri

11:30 pm on Feb 3, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I don't believe that I have FTP access so I have to go with the suggestion of adding the meta tag to the duplicated page or password protecting the page.

Does password protecting a page prevent it from being visited by Googlebot, or does it have some sort of effect in the SERPs?

If I made a lot of changes while testing and published many times, would a password-protected page have no negative effect in the SERPs? Would Google not see this as too-frequent changes? If so, I could do things this way.

Robert Charlton

11:49 pm on Feb 3, 2009 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I don't believe that I have FTP access so I have to go with the suggestion of adding the meta tag to the duplicated page or password protecting the page.

gouri - To password protect the page, you need to set up a folder, and you need FTP access to do that and to upload the page. To add the meta tag and upload the page you also need FTP access.

I think you probably have FTP access but don't quite have the jargon down... though it's possible you're using some sort of site building application with a web interface that loads files directly from your local computer.

What application do you use to upload pages to your web server? This might help us guide you.

gouri

12:57 am on Feb 4, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I use a site building application. I don't have FTP access but I believe that I can password protect a page and add a meta tag to it.

If you are using a site building application and cannot create folders, is it possible to password protect individual pages, since you can't set up a directory for testing purposes?

I think sometimes you have to test things by creating a duplicate page so I have to find a good way of doing this.

Robert Charlton

3:06 am on Feb 4, 2009 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I use a site building application.

That's going to limit your options a lot, not just with regard to blocking spider access, but probably for most of what you need to do to create a "well-tuned" and optimized site.

No telling what your host means by password protecting a page. For now, with regard to temporary pages that you work on, I would suggest the robots meta tag approach, but only for a copy of the page you're working on, and only while you're working on it. Remember to remove the tag from your final copy.

I would also suggest, though, that you change web hosts. It's going to be a big dislocation and learning curve for you, but it's the only way you're going to be able to have enough control of your site to do what's needed to optimize it properly.

Get a hosting account on a "shared" Apache server, with enough server access so you can use .htaccess. I'd probably begin asking questions in the New To Web Development [webmasterworld.com] forum here on WebmasterWorld.

Robert Charlton

7:45 am on Feb 4, 2009 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



PS to the above... gouri, there's currently a discussion in New To Web Development that's touching on setting up a site using FTP....

Just HTML?
[webmasterworld.com...]

I've noticed you've been participating in the HTML and CSS forums. I also recommend checking the WYSIWYG and Text Code Editors [webmasterworld.com] forum, as you're going to need software to build your pages.

gouri

1:13 pm on Feb 4, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Robert,

Thank you for your excellent advice.

I really appreciate it. I think I will go with using the robots meta tag for pages that I am testing.

Robert Charlton

7:28 pm on Feb 4, 2009 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



gouri - It may well be, btw, that your current host also offers hosting with adequate features for what you need. I'm seeing that some hosting companies offering business-grade Apache hosting also make optional site-building software available, even on advanced hosting plans.

So it's very possible that your current account offers FTP access, and that you've just chosen not to use it. It would certainly be simpler not to have to move your hosting account. The host may even provide free FTP software for you to use.

Since I've gotten this discussion a bit off topic, let me get back to two earlier questions you had....

Does password protecting a page prevent it from being visited by Googlebot, or does it have some sort of effect in the SERPs?

The point of using password protection or a noindex robots meta tag is to keep Googlebot from indexing the pages you protect. You would protect only private dupe pages that you are working on. This enables you to test layout, appearance, etc, and to publish them privately without Google seeing them.

Of course, when you want Google to see your changes, you have to replace the original page with your new page, and... if you've used the robots noindex meta tag... you need to remove it from your final publicly published version.

If I made a lot of changes while testing and published many times, would a password-protected page have no negative effect in the SERPs? Would Google not see this as too-frequent changes? If so, I could do things this way.

As I describe above, Google should not be seeing your private test page at all. This is the point of blocking it while you're testing it.

I would not recommend making frequent SEO changes on publicly published versions.

gouri

8:57 pm on Feb 4, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



This is a lot of very useful information that you have given me. Thanks.

I feel the features that are offered are excellent, and I am going to check with the web host whether FTP is a feature I might be able to add on. If not, maybe they offer another hosting plan that allows FTP access; it would still be with the same web host, so that would be cool.

I will password protect, or put a noindex robots meta tag on, pages that I duplicate, so Googlebot will not index them but I can still see how different things would look. After testing, if I decide to go with the changes, I will replace the old page with the new one and remove whichever protection I used.

What I did last week was duplicate a page, make changes, keep trying new things, and keep publishing, but the page was not blocked, so it was out there for Google to index. The site with the additional page was public. I am not sure that Google picked up the duplicate page, but I think the frequent changes were probably not looked at favorably.

Maybe this will help the analysis a little further. I looked at Webmaster Tools, and the number of pages visited by Googlebot in a day was a lot higher than usual at the end of January (which was when I did the duplicate test). Do you think this unusually high number was something that Google didn't like, and that is why I saw a drop in rankings?

But I don't think the duplicate page was indexed.