Been a lurker for some time now, but I've got a problem so I decided to post. I was doing well on many of my main keywords, sitting at no. 1 on Google for several months, with several of my other keywords at no. 1, 2, or 3, and suddenly I got dropped down to no. 34 on Google. This includes most of the good keywords we were getting traffic on.
The most recent change we made was adding a feature to our site that tracks prices of our products and shows product price changes and related items. This increased our site's page count by about 70% or so, and it's all loaded onto one of our subdomains. This is the main change we made.
Does anyone know what options we have to get our good rankings back?
1. Maybe put a rule in robots.txt to limit crawler access to those pages.
2. Maybe get rid of all those pages altogether and redeploy them at a slower rate relative to the number of current site pages.
3. other options?
Any help is appreciated; it's just a bummer when the holidays are around the corner and we suddenly take this hit. In hindsight we shouldn't have deployed all the pages that quickly, but we think those pages are really useful for our users.
In practice there are many features that benefit end users but are not wise to expose to Google indexing - especially if they involve generating new URLs whose content largely duplicates existing pages. One example that comes to mind immediately is the different "sort" orders for various kinds of product lists.
I think you've got the right idea - either remove the pages from your site or block them from being indexed.
If you do let them stay live, you'll probably want to add a robots meta tag with noindex instructions to the source code of those URLs. A robots.txt disallow rule would block them from future spidering, but since they've already been indexed, those URLs might hang around for quite a while and continue to wreak havoc.
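For reference, the two mechanisms look like this; the /price-tracker/ path is just a hypothetical stand-in for wherever the new price-tracking pages actually live:

```html
<!-- In the <head> of each price-tracking page: asks search engines to
     drop the URL from the index while still following its links -->
<meta name="robots" content="noindex, follow">
```

```
# robots.txt at the root of the subdomain: blocks future crawling,
# but does NOT remove URLs that are already in the index
User-agent: *
Disallow: /price-tracker/
```

Note the two don't combine well: if robots.txt blocks the page, Googlebot can never fetch it to see the noindex tag, so pick the meta-tag route if the goal is getting already-indexed URLs out.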
Google has tools at their disposal that can expose any linking issues and/or other things that may get you filtered. If you read up on the 30 60 90 900 filters, this looks like what you were hit with.
thanks for the comments.. let me go back and do some research... I'm not aware of the 30 60 90 900 filters.. I'll read up on this and I'll report back...
incrediblehelp... good point.. let me do some more research...
webmeister, I had the meta tags and content filled in properly before I deployed the pages... so the robots.txt option is definitely at the top of my list.
seopti, no I did not do that... I did add very specific content to several of my main pages that are near the top on google...
FYI... most of my pages have
1. unique title tags
2. unique meta description with the correct number of characters to prevent "..."
3. typically 1 h1, 1 or 2 h2, and probably some h3 depending on the need.
4. not all keywords are the same; typically the top 3 keywords for the page are integrated within the content.
Keywords are results of adwords testing...
A few stats: from 300-400 clicks a day down to probably 10 or 20 clicks...
I noticed my site went down to minus 60 a few days ago... and today it was in the range of minus 45... I'm hoping the changes I made will push it back up to the first page, if not the top position.
My best keyword, which is also my domain name, is at no. 45 so far... it's been about 15 days since I deployed those pages... I've lost about 300-400 clicks a day... ouch!
- A new site goes up that is well designed -- meaning validated code, good text content, not much duplication, some inbound links, no black-hat SEO.
- After a reasonable time the Googlebot finds the site, indexes it, then shortly thereafter it starts showing up in the SERPs with surprisingly strong position.
- The site owner starts to see good visitor stats and becomes very hopeful. It is at this point that I say "don't quit your day job".
- Then at some further point down the road, the site drops back to a lower level, and panic sets in, with everyone wondering how that can be, as any additional changes were (seemingly) minimal.
So I've come to the conclusion -- rightly or wrongly -- that as a matter of policy Google will often give new, well designed sites a boost at the front end, but then after a difficult-to-determine time period the algo will re-position them into another slot in the "natural order" of things.
As the months/years pass, if that site continues to attract quality inbound links and continues to add good quality content -- so they become more of an authority -- then they will rise again. If not, then they tend to stay back in that lower position.
So what throws people off is the "Google Gift" at the start ... but to paraphrase the old saying, "what the [Google] gods giveth, they can taketh away".
So as I said, this may not apply to you if you've been online for a few years, but on the other hand if your site is relatively new, you may be falling into the pattern.
My take on it is twofold. First, it prevents old sites from dominating a search result so completely that they create an uncrackable barrier for newer and possibly better ones. Second, putting a new site high in the rankings gives Google a chance to measure the site's performance metrics: bounce rate, whether people continue down the SERPs after visiting that site (possibly indicating they didn't find what they wanted), and so on. In other words, if the site never gets a referral, there are some measurements Google can't obtain.