In the middle of August our traffic and rankings dropped significantly (horribly, actually). We were getting approximately 3,000 unique visitors a day; now we get around 600. This is mainly Google traffic, as most of our visitors come from Google anyway. This apparently happened to others, and I participated in some related threads. As time went on, other webmasters' traffic came back, but ours stayed stagnant.
I have been working on this for over 3 months trying to figure out WHAT could be wrong, but I can't find anything at all.
The odd thing is that our PR is now a PR6, and we have over 3,000 pages, most of which are now PR5 and PR4. We even have quite a few internal pages that are PR6! Now, I realize PR doesn't mean too much... but it's odd that our PR is a 6 now and our Alexa rating improved as well... while at the same time our rankings are HORRIBLE / non-existent!
When our rankings were very good, we were a PR4/5 with internal pages of PR1, 2, and 3.
Again, I have spent over 3 months trying to find the cause and have been unable to. Our site has done really well since 2001, but I'm afraid this little mom-and-pop shop will have to close the site if I can't figure it out, because we've used our funds to keep it going the past 3 months and spent money on AdWords trying to generate sales.
....we NEED our rankings back. Originally I would have said I wanted better rankings... but at this point, I just want them back to where they were.
I would really be humbled by any advice that could be offered on how to track this down. I've used many ideas posted on WebmasterWorld over the past 3 months, but I'm sure there are a few fresh ideas out there.
Also, if you know of a reputable company that is experienced in researching why a site dropped in rankings, please sticky me. I've tried a couple of SEO companies who say they can help... but then they just say everything looks good.
Sorry if it sounds dire... but it really has come down to that point for me.
A solution may be to rephrase the duplicate content.
You have a site with a few thousand pages.
Your traffic is high due to each page receiving some search engine traffic.
Google updates and your traffic drops off.
This is because the few thousand pages you had are no longer competitive with the new pages that have replaced them.
(Sites are listed solely based on SEO. Don't let anyone tell you differently.)
It's a pretty simple concept. As time goes on, all keywords, no matter how obscure, will become competitive.
Now there is only one method to get your traffic back.
Check your logs for keywords and entry pages.
Then optimize each page for those terms.
You will notice an increase in traffic if you work hard at it.
The days of easy traffic are coming to an end.
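To make the log check above concrete, here's a rough Python sketch. It assumes an Apache "combined" log format and the old-style Google referrer with a q= parameter; the sample lines and paths are made up, so adjust the regex for whatever your server actually writes.

```python
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Hypothetical sample lines in Apache "combined" format.
LOG_LINES = [
    '1.2.3.4 - - [10/Nov/2004:10:00:00 +0000] "GET /widgets/blue.html HTTP/1.1" 200 5120 '
    '"http://www.google.com/search?q=blue+widget" "Mozilla/4.0"',
    '1.2.3.5 - - [10/Nov/2004:10:01:00 +0000] "GET /widgets/blue.html HTTP/1.1" 200 5120 '
    '"http://www.google.com/search?q=widget+washing" "Mozilla/4.0"',
]

# Captures the request path and the referrer field.
LOG_RE = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" \d+ \d+ "(?P<ref>[^"]*)"')

def keyword_entries(lines):
    """Count (search keyword, entry page) pairs from Google referrers."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.search(line)
        if not m:
            continue
        ref = m.group('ref')
        if 'google.' not in ref:
            continue
        query = parse_qs(urlparse(ref).query).get('q')
        if query:
            counts[(query[0], m.group('path'))] += 1
    return counts

for (kw, page), n in keyword_entries(LOG_LINES).most_common():
    print(n, kw, page)
```

The output tells you which phrase sent visitors to which entry page, which is exactly the per-page picture you need before deciding what to re-optimize.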
I have some clean pages on a decent site that are on page 40 now instead of page 1 of the serps. After page 2 of serps there are VERY few sites that are even close to relevant. Say I have a page that used to rank well for 'widget washing'. I will see a site ranking well for that term now even though they don't mention widgets, just washing.
I could understand being pushed down by more relevant sites, but that's not what I and others are seeing.
You need to compare with other sites that now rank better and figure out what you are lacking. A few tips:
-backlinks: are they still there, and are there enough?
-SEO: use the cache tool, look at the HTML, and figure out how your page differs from the others
-are all your pages still in the index? Maybe some were lost; maybe some need more links to them
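For the backlink check in particular, you can at least automate the "are they still there" part. This is a minimal sketch: it only greps fetched HTML for a link to your domain (the pages and domain here are invented examples), and in practice you'd pull each partner page with urllib first.

```python
import re

def still_links_to(html, domain):
    """Rough check: does this HTML contain a hyperlink pointing at domain?"""
    pattern = re.compile(r'href=["\']?[^"\'\s>]*' + re.escape(domain), re.IGNORECASE)
    return bool(pattern.search(html))

# Hypothetical examples: a partner page that still links to us, and one that dropped the link.
page_a = '<a href="http://www.example.com/widgets.html">Widgets</a>'
page_b = '<a href="http://www.other-site.com/">Other</a>'
print(still_links_to(page_a, 'example.com'))  # True
print(still_links_to(page_b, 'example.com'))  # False
```

Run it over your list of link partners once a month and you'll spot silently dropped reciprocal links, which is one of the quieter ways backlink counts erode.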
I used to own a domain that sold a fruit drink
I let the domain expire and a new company bought it.
This company changed the entire layout and theme to be just information about some company.
(The name of the fruit drink is not on the page or in the metatags.)
More than a year later the site still shows up for some of the fruit drinks search terms.
This shows that Googlebot is looking for specific SEO techniques.
Therefore a site that is irrelevant can show up if the person doing the SEO wants it to show up.
I might add that other search engines fall for this as well.
An example would be typing in 'widget washing' and getting a few sites that are relevant, and then a site where some high school kids are washing a car for a fundraiser. Some teacher just put up the pic, and there is no mention of widgets at all, low PR and few if any links to the page, just the word washing appearing once or twice.
I could maybe see my site coming up on page 40 because I use too many SEO techniques, but rewarding sites that match only one keyword of a two-word phrase seems counterproductive too.
guitaristinus -
I have done a test and did find some sites that have stolen text from our site. I got pretty upset and started to write copyright infringement emails... but you know, there will always be people out there stealing content. I can't even think of an analogy to portray how hard it would be to continuously test 3,000 pages to see if someone is stealing content. I really do not know what to do about this. I know it happens and has most likely been going on since we started the site... if I were to devote myself to writing legal threats, I would be wasting my day every day because nothing else would get done. Thoughts?
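One way to make 3,000 pages tractable is not to check every page, but to pull one distinctive "fingerprint" sentence per page and spot-check those in a quoted search. Here's a sketch of the fingerprint step; the sample page text is invented, and the heuristic (longest sentence above a word-count floor) is just one plausible choice.

```python
import re

def fingerprint(text, min_words=8):
    """Pick one distinctive sentence from a page, for a quoted search-engine spot-check."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    candidates = [s for s in sentences if len(s.split()) >= min_words]
    # The longest sentence is the least likely to appear on unrelated sites.
    return max(candidates, key=len) if candidates else None

page_text = ("Our purple silly mountain fountain widget is machine washable. "
             "It ships in a recycled box. "
             "Each unit is hand-inspected by our family before it leaves the shop in Vermont.")
print(fingerprint(page_text))
```

Paste each fingerprint, in quotes, into a search engine a handful at a time: any result that isn't your site is a candidate scraper. A few minutes a week covers the whole catalogue on a rotating basis instead of eating every day.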
Also, there were a couple of comments suggesting that competition may just be moving up. I'm not sure I agree with that in this case because our traffic/rankings dropped to the floor very quickly. Our rankings did not slowly go down.
Regarding links, you know... now that we have a PR6 and strong PR on all internal pages - we can basically pick and choose who we want to exchange links with. When we had a PR4/5 we had to work hard looking for people to link to us... now higher PR sites are happy to link to us and we have a ton of people asking for us to link to them... We try very hard to make sure we deal with sites that are not questionable or in bad neighborhoods.
Hugene -
I took your suggestion and went back and did a search on some of our product names (of course we didn't come up at all), and it makes me sick to see the sites that are at the top (they are not consistent in SEO method). This is where we used to shine: product names, because they were so specific... not many people search for the specific product names (purple silly mountain fountain widget). In one example the #1 site had 15 words of text... that's it! In the #3 position was a site that wasn't describing the product; it was just mentioned once among many other product names (so no meta tags or title mentioning it).
I have checked and all of our pages are indexed by google.
The benefits to Google of such a system are obvious. They kick out a bunch of spam sites, and they make many commercial & SEO-savvy sites pay for AdWords. Their results lose a lot of good sites in the process, but it's a trade-off and they don't anticipate losing too many users. That's also why we are seeing a lot of semi-relevant sites in the SERPs: they don't receive any OOP (over-optimization) penalties because they aren't optimized for those phrases.
Obviously, there are lots of exceptions. There are the most clever spammers which defeat the penalty, and there are sites that have established authority status and therefore are immune to the penalties.
Altogether, it's a very smart strategy for Google. But it is frustrating for many SEO-types who are optimizing their sites but haven't yet achieved authority status (and may never get there). This can also partially explain the sandbox theory, as well as how & when sites come out of the sandbox.
This isn't a perfect theory, but it does seem like it could explain a lot of weirdness that is going on.
I'm starting to wonder if multiple hyphens (I have 3) are a problem in my case.
Both of our domains (AprilS's and mine) are without a single hyphen.
Not sure about this theory of an SEO filter. I want my pages to have accurate titles and link anchor text so I'm afraid I won't be changing these for Google.
I use too many SEO techniques
Google might be penalizing both of your sites, as it could consider them to be spam.
One suggestion: simple HTML still seems to be very much liked by G; keep the <h1> tags.
Avoid jamming ALT attributes with keywords.
Therefore, for example, Yahoo can spam all it wants and it avoids the penalties (not that it would actually do that. . .).
In my theory, a site can receive "negative points" for even basic SEO (e.g., Brett's guide, which many of us follow), possibly just for competitive phrases.
However, if that site has 1000 quality inbound links from other "trusted/authority sites" (determined by Google), the site receives a golden star and the penalty is turned off. The site can still be penalized for cloaking and other more serious spam techniques.
The penalties apply only to new sites added since this algorithm element was introduced (Feb?) and to sites that are similar in form to new sites (e.g., an existing site to which many new pages have been added and which has been radically re-SEO'd since Feb, effectively a "new" site).
This cuts down on spam, while keeping the big players (and unfortunately the best spammers) in the SERPS. Regular SEO-types with decent sites are the collateral damage, but as noted above, it is in Google's $$ interest to keep us out and make us pay, at least until we manage to earn the gold star.
Given the size of some of these authorities' sites, and the number of white-hat links and the fresh content that they have, it is no wonder they have enough legit positive points to weather a few penalties.
Even so, it IS possible to outrank these sites from time to time. The difficult bit is working out how you did it!
The two basic premises of my theory are that (1) it is possible to receive negative points in the algorithm for SEO/OOP (which is entirely plausible), and (2) that if you manage to achieve authority status, Google stops applying the penalty.
To me, this seems very simple and logical (from both a technical and business standpoint), and it can partially explain the so-called sandbox.
I have other sites in Google for very competitive (one and two) words in the top 10 that are basically optimized the same way. The major difference is that the successful sites were launched/re-optimized prior to Feb '04. I am generally quite conservative in my optimization, but I do use the keyphrase in the title, description, H1, on-page text, inside a 3-word internal link, and in the majority of the anchor text of my independent backlinks. I didn't realize that this was a "crime" - I thought this was SEO 101.
If I had to score my "sandboxed" site against the competition by Brett's SEO 101 rules, I would say that it beats the top 10 sites by a score of 50 to 1 (yet I am #128). 75% of the sites above mine mention my targeted phrase one time and, as I noted earlier, have no backlinks containing that anchor text.
The only logical explanations that I have been able to come up with are (1) that this is a penalty as I described, or (2) that the sandbox does exist and does not let in new (or newly-SEO'd) sites. And if it is the sandbox, I believe that the sandbox uses a severe penalty to keep sites out of the top (50/100) until they overcome the increasingly steep hurdle to join the main index (i.e., overcome the penalty).
It is the 'gold star' stuff that I am less inclined to believe without it coming openly from either Google or a 'gold star' holder. What does Google say a site/company must have to earn this award?
I actually also agree with Scarecrow's posts in the official sandbox thread, and I believe that it may work in conjunction with the new SEO penalty I describe.
Really, I think this toolbar data is now a major component of their algorithm, maybe receiving greater weight than PageRank, a measure of "importance" calculated from link probabilities. Toolbar data would provide a different, more immediate measure of "importance" which would enable G, for example, to highly rank new content describing a new virus sweeping the Internet, within minutes with enough traffic. Or to identify & rank top news, fads or other time-sensitive content.
Anyway, if I were you, I'd consider what the toolbar measures and look at your usage stats to identify your site's weak entry points. Maybe some top entry pages could use content not geared directly to sales but to keep visitors on-site-- for example providing content to help visitors understand products, how to evaluate, how to use or care for, etc etc.
I've experimented with google myself and know the toolbar picks up on changes fast, in some tests within a day or two.
Another thing that I think is the case but haven't proven is that spam filtering is in effect in the google algo-- such that keyphrases listed too often or in too close proximity, or in a manner that doesn't sound "natural" draws a penalty. The point here is to try not to over-optimize individual pages...much better to design them to read well.
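If you want a rough number to go with that "keyphrases listed too often" worry, a simple density check works as a sanity gauge. This is a sketch with invented example text, and where the line for "too often" sits is anyone's guess; the point is just to compare your page against copy that reads naturally.

```python
import re

def keyphrase_density(text, phrase):
    """Fraction of the page's words that belong to occurrences of the phrase."""
    words = re.findall(r'[a-z0-9]+', text.lower())
    target = phrase.lower().split()
    n = len(target)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return hits * n / max(len(words), 1)

natural = "We wash widgets carefully. Our widget washing service covers all sizes."
stuffed = "Widget washing widget washing widget washing best widget washing."
print(round(keyphrase_density(natural, 'widget washing'), 2))  # ~0.18
print(round(keyphrase_density(stuffed, 'widget washing'), 2))  # ~0.89
```

If a page of yours scores anywhere near the "stuffed" example, rewriting it to read well for humans is the safe move regardless of what the algorithm is actually doing.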
One last thing that's easy to ignore is link text, which has a strong effect on how pages rank on specific keyphrases/words. Check for lame links-- they should speak accurately of the target page, using targeted words. I've also tested this and it matters quite a bit, even if the links are on your own site.
Good luck scrapping it out!
Also, we all know that Google doesn't like popups. So why not use the toolbar to find out which pages use popups? The pages with popups get a small penalty. You are still indexed, but your rankings drop because your site was placed into the supplemental index.
On top of that, a few people click the frownie face because of a popup/under that annoyed them. Or a competitor stumbles on your site and clicks the frownie face because you rank higher than they do.
Who knows? But it wouldn't take much to add those factors into your page's ranking. Just 2 bytes of information can change your ranking that quickly.
I have a site with a PR5 homepage that got at least 6,000 uniques a day, and suddenly it just dropped in rankings. The site has one popunder per visitor. Do you think someone clicked a frownie face one too many times?
There is nothing wrong with the site, and it was doing fine a month and a half ago. It has just your basic SEO, nothing special or black hat. I haven't done any major updates to the site since I built it, and I update the site's pages at least twice a month.
How could a site do fine for months and months without any major changes and then get put into the supplemental index for no reason? I think they are using the toolbar data.
A site that ranked #1 in Google for months then falls to around the #120 mark for a keyword phrase.
I obviously think my site deserves a higher ranking, but the idea that more relevant sites have been added - hence my decline - is not the case. Most of the sites that rank above me aren't remotely relevant, starting from the first page.
In fact, the second page of results has eight entries, all with strings of random words and the phrase BLUE WIDGET appearing. By the third page you are hard pressed to find even this level of relevance.
One thing that has changed though - a big increase in the number of "sponsored links" being displayed for BLUE WIDGETS.
Which makes me think this.
A page is served, and Google makes money only if sponsored links are clicked. What if they served a page and found that nobody was clicking on the sponsored links, but instead going straight to the #1 listing, finding what they needed and ordering there?
Could it be that by burying very relevant sites and feeding the public mainly spam, directories and other sites of little relevance, those "pay per click" links (which are nearly always very relevant, since webmasters choose the keywords they want to be relevant for) become more attractive, and Google's earnings go up?
May sound like sour grapes on my part, but something is penalizing my BLUE WIDGET site while another of my sites - built exactly the same way - remains at #1 for its keyword phrase where there are no sponsored links (or one at most from time to time).
Elgrande, does your keyword phrase have lots of sponsored links?
What toolbar data are you thinking of? This is interesting, but I would like to know what you think they are gathering and how they are using it.
The data I'm thinking of is click data-- where people go-- the toolbar sends a little note to google for every page you visit. If I were Google, I'd add backbutton data to it to know exactly how long people stay, but this info can be approximated from click data.
BTW, I think Microsoft is starting to do the same thing from its browser, though it may be just coincidence that MSNbot visited a few pages right after I updated them. I'll have to experiment with this more thoroughly. This is very revealing data for a search to have.