
Forum Moderators: Robert Charlton & goodroi


No. 1 on Google became No. 34 after a site change

12:20 am on Nov 26, 2008 (gmt 0)

Junior Member

10+ Year Member

joined:Nov 26, 2008
votes: 0

Hi everyone,

Been a lurker for some time now, but I've got a problem, so I decided to post. I was doing well on many of my main keywords, no. 1 on Google for several months, with several of my other keywords at no. 1, 2, or 3, and suddenly I got dropped down to no. 34 on Google. This includes most of the good keywords we were getting traffic on.

The most recent change we made was adding a feature that tracks the prices of our products and shows price changes and related items. This increased our site's page count by about 70%, and it's all loaded onto one of our subdomains. This is the main change we made.

Does anyone know what options we have to get our good rankings back?

1. Maybe put in a robots.txt rule to limit access to those pages.
2. Maybe get rid of all those pages altogether and redeploy them at a slower rate relative to the number of current site pages.
3. Other options?
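For option 1, a robots.txt rule on the subdomain could look something like this (the subdomain and directory names here are invented placeholders; they would need to match the actual URL structure of the price-tracking pages):

```
# robots.txt served at the root of the price-tracking subdomain
# (hypothetical paths for illustration only)
User-agent: *
Disallow: /price-history/
Disallow: /related-items/
```

Note that a disallow rule only stops future crawling; it does not by itself remove pages that are already in the index.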

Any help is appreciated. It's just a bummer when the holidays are around the corner and we suddenly take this hit. In hindsight we should not have deployed all the pages that quickly, but we think those pages are really useful for our users.

2:26 am on Nov 26, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
votes: 0

Hello herculano, and welcome to the forums.

In practice there are many features that benefit end users but are not wise to expose to Google indexing, especially if they involve generating new URLs whose content largely duplicates existing data. One example I thought of immediately was different "sorts" for various kinds of product lists.

I think you've got the right idea - either remove the pages from your site or block them from being indexed.

If you do let them stay live, you'll probably want to add a robots meta tag with a noindex instruction to the source code of those URLs. A robots.txt disallow rule would block them from future spidering, but since they've already been indexed, those URLs might hang around for quite a while and continue to wreak havoc.
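A minimal sketch of that meta tag, placed in the head section of each affected page:

```html
<!-- In the <head> of each page you want dropped from the index.
     "noindex" removes the page from results; "follow" still lets
     crawlers follow its links. -->
<meta name="robots" content="noindex, follow">
```

Because the page stays crawlable, Googlebot can revisit it, see the noindex, and drop it from the index, which is exactly what a robots.txt disallow would prevent.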

12:57 pm on Nov 26, 2008 (gmt 0)

Junior Member

10+ Year Member

joined:Sept 1, 2003
posts: 91
votes: 0

How do you know those changes are the reason for the drop? If I wash my car today, it doesn't mean it is going to rain tomorrow. You have to tie the cause and effect together more firmly. Why not change everything back?

2:44 pm on Nov 26, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Oct 25, 2005
votes: 48

Sounds to me like a manual review most likely bumped the site. Being at the top really puts you under the Google eye, and a manual review may certainly expose issues with the site.

Google has tools at its disposal that can expose any linking issues and other things that may get you filtered. If you read up on the 30/60/90/900 position filters, this looks like what you were hit with.


3:57 pm on Nov 26, 2008 (gmt 0)

Junior Member

10+ Year Member

joined:June 17, 2003
votes: 0

I would suggest either blocking access to the new pages via robots.txt, or adding your keywords to the new pages' meta tags and content.

5:19 pm on Nov 26, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 6, 2002
votes: 22

Have you also added 70% more content to existing URLs?

6:09 pm on Nov 26, 2008 (gmt 0)

Junior Member

10+ Year Member

joined:Nov 26, 2008
votes: 0

Hi guys,

Thanks for the comments. Let me go back and do some research. I'm not aware of the 30/60/90/900 filters; I'll read up on this and report back.

incrediblehelp, good point. Let me do some more research.

webmeister, I had the meta tags and content filled in properly before I deployed the pages, so the robots.txt option is definitely at the top of my list.

seopti, no, I did not do that. I did add very specific content to several of my main pages that are near the top on Google.

FYI, most of my pages have:
1. Unique title tags.
2. Unique meta descriptions with the correct number of characters to prevent truncation ("...").
3. Typically one h1, one or two h2s, and some h3s depending on the need.
4. Not all keywords are the same; the top 3 keywords for each page are integrated into its content.
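As a sketch, the on-page setup described in that list might look like this (the page topic, title, and description text are invented placeholders, not the poster's actual pages):

```html
<head>
  <!-- 1. Unique title per page (placeholder text) -->
  <title>Widget Price Tracker - Example Store</title>
  <!-- 2. Unique description, short enough to avoid "..." truncation in the SERPs -->
  <meta name="description" content="Track widget prices and see recent price changes for popular models.">
</head>
<body>
  <!-- 3. One h1, with h2/h3 subheadings as needed -->
  <h1>Widget Price Tracker</h1>
  <h2>Recent Price Changes</h2>
</body>
```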

The keywords are the result of AdWords testing.

A few stats: from 300-400 clicks a day down to probably 10 or 20 clicks.

1:24 am on Nov 30, 2008 (gmt 0)

Junior Member

10+ Year Member

joined:Nov 26, 2008
votes: 0

A few updates on this: I have since added a robots.txt and had the pages removed via Google WMT.

I noticed my site went down to around minus 60 a few days ago, and today it was in the range of minus 45. I'm hoping the changes I made will push it back up to the first page, if not the top position.

My best keyword, which is also my domain name, is at no. 45 so far. It's been about 15 days since I deployed those pages, and I've lost about 300-400 clicks a day. Ouch!

4:40 pm on Nov 30, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Dec 9, 2001
votes: 0

You don't say exactly how long you've been online with this particular site, so my observations may not directly apply. But here's what I've seen with Google a number of times over the past few years:

- A new site goes up that is well designed -- meaning validated code, good text content, not much duplication, some inbound links, no black-hat SEO.

- After a reasonable time Googlebot finds the site and indexes it, and shortly thereafter it starts showing up in the SERPs with a surprisingly strong position.

- The site owner starts to see good visitor stats and becomes very hopeful. It is at this point that I say "don't quit your day job".

- Then at some further point down the road, the site drops back to a lower level, and panic sets in, with everyone wondering how that can be, as any additional changes were (seemingly) minimal.

So I've come to the conclusion -- rightly or wrongly -- that as a matter of policy Google will often give new, well designed sites a boost at the front end, but then after a difficult-to-determine time period the algo will re-position them into another slot in the "natural order" of things.

As the months/years pass, if that site continues to attract quality inbound links and continues to add good quality content -- so they become more of an authority -- then they will rise again. If not, then they tend to stay back in that lower position.

So what throws people off is the "Google Gift" at the start ... but to paraphrase the old saying, "what the [Google] gods giveth, they can taketh away".

So as I said, this may not apply to you if you've been online for a few years, but on the other hand if your site is relatively new, you may be falling into the pattern.


7:38 pm on Nov 30, 2008 (gmt 0)

Preferred Member

10+ Year Member

joined:July 25, 2006
posts: 460
votes: 0

Reno, yes, that Google Gift has been seen by many, and it's in line with a statement made long ago, somewhere (probably on Matt Cutts's blog), that Google makes an effort to give new sites a chance at visibility.

My take on it is that, in addition to preventing old sites from dominating a search result so completely that newer and possibly better sites face an uncrackable barrier, putting a new site high in the rankings also gives Google a chance to measure the site's performance metrics: bounce rate, whether people continue down the SERPs after visiting the site (possibly indicating they didn't find what they wanted), and so on. In other words, if a site never gets a referral, there are some measurements Google can't obtain.

