| 12:51 pm on Nov 23, 2004 (gmt 0)|
I'll come back and post again later with a fuller reply, but are you completely dependent on Google natural listings? Has the demand for your product(s) dropped for external reasons? Are you doing any PPC? What are the competition doing?
| 1:40 pm on Nov 23, 2004 (gmt 0)|
Around Sept 5 a few of my sites took a big drop (66%) in traffic from Google. I think it is because Google turned up the "duplicate content filter". Meaning my pages were very similar to others in their database. To see if this has happened to you, take a sentence or two from one of your pages and do a search in Google for that exact phrase (put it in quotes). Are there any other sites with that phrase? Is much of their other content the same as yours? If so, your site may have dropped because of the "duplicate content filter".
A solution may be to rephrase the duplicate content.
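To make that spot check less tedious, here's a rough sketch of the idea in Python (the search URL format and the "longish sentence" heuristic are my own assumptions, not anything official): pull the first reasonably long sentence from a page's text and turn it into a quoted-phrase Google query you can paste into your browser.

```python
import re
import urllib.parse

def exact_phrase_query(text, min_words=8):
    """Pick the first reasonably long sentence from page text and
    build a quoted Google search URL for it."""
    # Split into rough sentences on ., ! or ?
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    for s in sentences:
        if len(s.split()) >= min_words:
            phrase = s
            break
    else:
        phrase = sentences[0]
    quoted = '"%s"' % phrase
    return "http://www.google.com/search?q=" + urllib.parse.quote_plus(quoted)
```

If other sites show up for your exact sentence, that's when you'd suspect the duplicate content filter.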
| 1:51 pm on Nov 23, 2004 (gmt 0)|
Most drops I've seen are caused by linkage rather than on-page stuff, you can PM me the Url if you like, and I'll have a look...
| 1:05 pm on Nov 24, 2004 (gmt 0)|
Your site may not have dropped as much as others have moved up. Look at the top listings for your search terms. What do these sites have in common?
| 5:52 pm on Nov 24, 2004 (gmt 0)|
My 2 cents would be to look for and change the duplicate content you have on those 3,000 pages, and get some links to those inner pages.
| 10:20 pm on Nov 24, 2004 (gmt 0)|
Just want to try to clear something up for you.
When your pages went deeper into the index, other pages moved up.
This happens for many reasons but here is an example.
See if this fits your situation.
You have a site with a few thousand pages.
Your traffic is high due to each page receiving some search engine traffic.
Google updates and your traffic drops off.
This is because the few thousand pages you had are no longer competitive with the new pages that have replaced them.
(Sites are listed solely based on SEO. Don't let anyone tell you differently.)
It's a pretty simple concept. As time goes on, all keywords, no matter how obscure, will become competitive.
Now there is only one method to get your traffic back.
Check your logs for keywords and entry pages.
Optimize each page and then SEO it.
You will notice an increase in traffic if you work hard at it.
The days of easy traffic are coming to an end.
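For the "check your logs for keywords and entry pages" step, here is roughly what that looks like in practice, assuming Apache-style combined logs and a q= parameter in the referrer (the regex is a simplification of real log lines, not production-grade):

```python
import re
import urllib.parse
from collections import Counter

# Matches the request path and referrer fields of an Apache
# "combined" log line (a simplification; real lines vary).
LINE_RE = re.compile(r'"(?:GET|POST) (\S+)[^"]*" \d+ \S+ "([^"]*)"')

def search_referrals(log_lines):
    """Count (keyword phrase, entry page) pairs from search-engine
    referrers that carry a q= query parameter."""
    hits = Counter()
    for line in log_lines:
        m = LINE_RE.search(line)
        if not m:
            continue
        path, referrer = m.groups()
        parsed = urllib.parse.urlparse(referrer)
        if "google" not in parsed.netloc:
            continue
        params = urllib.parse.parse_qs(parsed.query)
        for phrase in params.get("q", []):
            hits[(phrase, path)] += 1
    return hits
```

The most common (phrase, entry page) pairs tell you which pages to optimize first.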
| 2:51 am on Nov 25, 2004 (gmt 0)|
Bak, I'm having a similar problem to April, and while I see your point that some sites go down while others rise, I don't think it reflects what's happening.
I have some clean pages on a decent site that are on page 40 now instead of page 1 of the SERPs. After page 2 of the SERPs there are VERY few sites that are even close to relevant. Say I have a page that used to rank well for 'widget washing'. I will see a site ranking well for that term now even though they don't mention widgets, just washing.
I could understand being pushed down by more relevant sites, but that's not what myself and others are seeing.
| 3:30 am on Nov 25, 2004 (gmt 0)|
I totally agree with mumbledawg.
You need to compare with other sites that now rank better and figure out what you are lacking. A few tips:
-backlinks: are they still there, are there enough?
-SEO: use the cache tool, look at the HTML, figure out how different your page is from others
-are all your pages still in the index? maybe some were lost, maybe some need more links to them
| 4:02 am on Nov 25, 2004 (gmt 0)|
I think you hit the nail on the head.
What is relevancy according to Google?
The only way for Google to check is with Googlebot.
And if you know SEO, then you can feed Googlebot whatever it takes to get a site high in the SERPs.
I used to own a domain that sold a fruit drink
I let the domain expire and a new company bought it.
This company changed the entire layout and theme to be just information about some company.
(The name of the fruit drink is not on the page or in the metatags.)
More than a year later the site still shows up for some of the fruit drink's search terms.
This shows that Googlebot is looking for specific SEO techniques.
Therefore a site that is irrelevant can show up if the person doing the SEO wants it to show up.
I might add that other search engines fall for this as well.
| 4:46 am on Nov 25, 2004 (gmt 0)|
But in my example, the sites showing up for "widget washing" don't actually benefit at all from any traffic they might get for this term. While I could understand that Google might reward them for not having obvious SEO, the sites I'm talking about don't really have any SEO, or profit model to make SEO worthwhile.
An example would be typing in 'widget washing' and getting a few sites that are relevant, and then a site where some high school kids are washing a car for a fundraiser. Some teacher just put up the pic, and there is no mention of widgets at all, low PR and few if any links to the page, just the word washing appearing once or twice.
I could maybe see my site coming up on page 40 because I use too many SEO techniques, but rewarding sites that have only one keyword of a two-word phrase seems counterproductive too.
| 6:39 am on Nov 25, 2004 (gmt 0)|
I'm starting to wonder if multiple hyphens (I have 3) are a problem in my case.
| 7:55 am on Nov 25, 2004 (gmt 0)|
encyclo - Yes, we are very dependent upon Google. Yahoo (along with the many search engines that use Yahoo) has not listed us from day one. When we first went live in 2001 the session ID was being displayed in the URL... so Yahoo came and saw the same pages many times - so we got penalized. Of course we immediately solved that issue in 2001... but Yahoo has still penalized us. I have spent 3 years trying to get back in with them. In regards to demand, no - demand for our product has not gone down at all. We are not doing any PPC programs except for AdWords (can't afford more right now). Our competition is doing VERY well.
I have done a test and did find some sites that have stolen text from our site. I got pretty upset and started to write copyright infringement emails... but you know, there will always be people out there stealing content. I can't even think of an analogy to portray how hard it would be to continuously test 3,000 pages to see if someone is stealing content. I really do not know what to do about this - I know it happens and has most likely been going on since we started the site... if I were to devote myself to writing legal threats to people - I would be wasting my day every day because nothing else would get done. Thoughts?
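One way to make it tractable: instead of continuously testing all 3,000 pages, spot-check a small random sample each week, so the whole site gets covered over a year or so. A rough sketch (the sample size and the "longest sentence is most distinctive" heuristic are my own guesses):

```python
import random
import re
import urllib.parse

def weekly_spot_check(pages, sample_size=20, seed=None):
    """Sample a handful of pages and build a quoted-phrase search URL
    for each, to check by hand for stolen copies.
    `pages` maps URL -> page text."""
    rng = random.Random(seed)
    urls = rng.sample(sorted(pages), min(sample_size, len(pages)))
    checks = []
    for url in urls:
        sentences = re.split(r'(?<=[.!?])\s+', pages[url].strip())
        # Longest sentence is usually the most distinctive one.
        phrase = max(sentences, key=len)
        query = urllib.parse.quote_plus('"%s"' % phrase)
        checks.append((url, "http://www.google.com/search?q=" + query))
    return checks
```

Twenty searches a week is a lot more realistic than legal threats every day.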
Also, there were a couple of comments suggesting that competition may just be moving up. I'm not sure I agree with that in this case because our traffic/rankings dropped to the floor very quickly. Our rankings did not slowly go down.
Regarding links, you know... now that we have a PR6 and strong PR on all internal pages - we can basically pick and choose who we want to exchange links with. When we had a PR4/5 we had to work hard looking for people to link to us... now higher PR sites are happy to link to us and we have a ton of people asking for us to link to them... We try very hard to make sure we deal with sites that are not questionable or in bad neighborhoods.
I took your suggestion and went back - did a search on some of our product names (of course we didn't come up at all) - and it makes me sick to see the sites that are at the top (they are not consistent in SEO method). This is where we used to shine... on product names, because they were so specific... not many people search for the specific product names (purple silly mountain fountain widget). In one example the #1 site had 15 words of text... that's it! In the #3 position was a site that wasn't describing the product - it was just mentioned once among many other product names (so no meta tags or title mentioning it).
I have checked and all of our pages are indexed by google.
| 9:36 am on Nov 25, 2004 (gmt 0)|
Things are starting to come into focus. It seems to me that Google just has a very aggressive spam (OOP) filter/penalty. If you trip it *and don't have an established authority status* (which is determined by them), you get a -100 or -50, etc. After all, why would an algorithm include only positive "points" - why couldn't a site just as easily receive negative points for doing certain things?
The benefits to Google of such a system are obvious. They kick out a bunch of spam sites, and they make many commercial & SEO savvy sites pay for Adwords. Their results lose a lot of good sites in the process, but it's a trade-off and they don't anticipate losing too many users. That's also why we are seeing a lot of semi-relevant sites in the SERPS - they don't receive any OOP penalties because they aren't optimized for those phrases.
Obviously, there are lots of exceptions. There are the most clever spammers which defeat the penalty, and there are sites that have established authority status and therefore are immune to the penalties.
Altogether, it's a very smart strategy for Google. But it is frustrating for many SEO-types who are optimizing their sites but haven't yet achieved authority status (and may never get there). This can also partially explain the sandbox theory, as well as how & when sites come out of the sandbox.
This isn't a perfect theory, but it does seem like it could explain a lot of weirdness that is going on.
| 9:52 am on Nov 25, 2004 (gmt 0)|
|I'm starting to wonder if multiple hyphens (I have 3) are a problem in my case. |
Both of our domains (AprilS's and mine) are without a single hyphen.
Not sure about this theory of an SEO filter. I want my pages to have accurate titles and link anchor text so I'm afraid I won't be changing these for Google.
| 3:03 pm on Nov 25, 2004 (gmt 0)|
elgrande - what exactly is "authority status"? How is it gained, and what affects it?
| 6:27 pm on Nov 25, 2004 (gmt 0)|
I agree with elgrande, and with quotes like these it really does come into perspective:
|I use too many SEO techniques |
Google might be penalizing both of your sites as it could consider them to be spam.
One suggestion: simple HTML still seems to be very well liked by G; keep the <h1> tags.
Avoid jamming ALT attributes with keywords.
| 9:42 pm on Nov 25, 2004 (gmt 0)|
buvar, I'm not the best person to define "authority status", so try searching on the term at WW. I was using that term basically to label sites that have received the gold star of immunity from Google's penalties. In my theory I am assuming that if a site has XXX "points" of links from other qualified, independent, on-topic sites, that they get their gold star (i.e., are considered authorities themselves).
Therefore, for example, Yahoo can spam all it wants and it avoids the penalties (not that it would actually do that. . .).
| 11:30 pm on Nov 25, 2004 (gmt 0)|
|received the gold star of immunity from Google's penalties |
You have a hypothesis that you can be *immune* to penalties?
You must explain this!
| 12:24 am on Nov 26, 2004 (gmt 0)|
sure. . .
In my theory, a site can receive "negative points" for even basic SEO (e.g., Brett's guide, which many of us follow), possibly just for competitive phrases.
However, if that site has 1000 quality inbound links from other "trusted/authority sites" (determined by Google), the site receives a golden star and the penalty is turned off. The site can still be penalized for cloaking and other more serious spam techniques.
The penalties apply only to new sites added since this algorithm element was introduced (Feb?) and to sites that are similar in form to new sites (e.g., an existing site to which many new pages have been added and which has been radically re-SEO'd since Feb, effectively a "new" site).
This cuts down on spam, while keeping the big players (and unfortunately the best spammers) in the SERPS. Regular SEO-types with decent sites are the collateral damage, but as noted above, it is in Google's $$ interest to keep us out and make us pay, at least until we manage to earn the gold star.
| 9:14 am on Nov 26, 2004 (gmt 0)|
I'm not sure, elgrande, this all sounds a bit like a conspiracy theory. Do you have any hard evidence for this? Does Google admit to this? If so, do they have published criteria?
Given the size of some of these authorities' sites, and the number of white hat links and fresh content that they have, it is no wonder they have enough legit positive points to weather a few penalties.
Even so, it IS possible to outrank these sites from time to time. The difficult bit is working out how you did it!
| 9:47 am on Nov 26, 2004 (gmt 0)|
Well, the only evidence I have is my experience with my own sites and from reading about other people's experiences at WW. As for my own experience, my site is now #1 in Yahoo and #128 in Google - it is also the ONLY site in the SERPS optimized for this particular phrase, and I now have 30 links from sites that rank above mine for this particular term (80 total on-topic backlinks), with the 2-KW term in the anchor text (30% add an additional word, with some deeplinks). NONE of the other sites even have this term in a SINGLE external backlink. As far as I know, my site is white hat - boilerplate optimized, yes, but nothing sneaky.
The two basic premises of my theory are that (1) it is possible to receive negative points in the algorithm for SEO/OOP (which is entirely plausible), and (2) that if you manage to achieve authority status, Google stops applying the penalty.
To me, this seems very simple and logical (from both a technical and business standpoint), and it can partially explain the so-called sandbox.
I have other sites in Google for very competitive (one and two) words in the top 10 that are basically optimized the same way. The major difference is that the successful sites were launched/re-optimized prior to Feb '04. I am generally quite conservative in my optimization, but I do use the keyphrase in the title, description, H1, on-page text, inside a 3-word internal link, and in the majority of the anchor text of my independent backlinks. I didn't realize that this was a "crime" - I thought this was SEO 101.
If I had to score my "sandboxed" site against the competition by Brett's SEO 101 rules, I would say that it beats the top 10 sites by a score of 50 to 1 (yet I am #128). 75% of the sites above mine mention my targeted phrase one time and, as I noted earlier, have no backlinks containing that anchor text.
The only logical explanations that I have been able to come up with are (1) that this is a penalty as I described, or (2) that the sandbox does exist and does not let in new (or newly-SEO'd) sites. And if it is the sandbox, I believe that the sandbox uses a severe penalty to keep sites out of the top (50/100) until they overcome the increasingly steep hurdle to join the main index (i.e., overcome the penalty).
| 1:09 pm on Nov 26, 2004 (gmt 0)|
let me clarify - I fully accept the notion of a sandbox - the evidence for google applying a handicap to new sites (almost certainly due to some major problem they have rather than design) is too great.
It is the 'gold star' stuff that I am less inclined to believe without it coming openly from either google or a 'gold star' holder. What do google say that a site/company must have to earn this award?
| 8:58 pm on Nov 26, 2004 (gmt 0)|
well, Google cannot disclose the "gold star" because it is part of their algorithm. I was thinking about the gold star because established authority sites can get away with very aggressive SEO. But new sites seem to be penalized for SEO until they get over a steep hurdle (i.e., become authority sites).
I actually also agree with Scarecrow's posts in the official sandbox thread, and I believe that it may work in conjunction with the new SEO penalty I describe.
| 10:22 pm on Nov 27, 2004 (gmt 0)|
April - I think google now relies a great deal on data retrieved from its widely-installed toolbar, which measures what pages/sites draw traffic and on average whether visitors seem to like the content better than related sites. I also think G shifts the rankings arbitrarily from time to time to run little "experiments" built into their algorithm, to check whether a site has improved its measure of "usefulness" or at least internal "attractiveness" for visitors to stay on-site. This is good because it enables competition to move up in the rankings.
Really, I think this toolbar data is now a major component of their algorithm, maybe receiving greater weight than PageRank, a measure of "importance" calculated from link probabilities. Toolbar data would provide a different, more immediate measure of "importance" which would enable G, for example, to highly rank new content describing a new virus sweeping the Internet, within minutes with enough traffic. Or to identify & rank top news, fads or other time-sensitive content.
Anyway, if I were you, I'd consider what the toolbar measures and look at your usage stats to identify your site's weak entry points. Maybe some top entry pages could use content not geared directly to sales but to keep visitors on-site-- for example providing content to help visitors understand products, how to evaluate, how to use or care for, etc etc.
I've experimented with google myself and know the toolbar picks up on changes fast, in some tests within a day or two.
Another thing that I think is the case but haven't proven is that spam filtering is in effect in the google algo-- such that keyphrases listed too often or in too close proximity, or in a manner that doesn't sound "natural" draws a penalty. The point here is to try not to over-optimize individual pages...much better to design them to read well.
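For what it's worth, you can at least measure how "unnatural" a page looks before Google does. Here's a crude phrase-density check, a sketch only: the 5% threshold is a pure guess on my part, nothing Google publishes.

```python
import re

def phrase_density(page_text, phrase):
    """Fraction of the page's word windows that exactly equal the
    target phrase. A crude proxy for 'listed too often'."""
    words = re.findall(r"[a-z']+", page_text.lower())
    target = phrase.lower().split()
    n = len(target)
    if not words or n == 0 or n > len(words):
        return 0.0
    matches = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == target
    )
    return matches / (len(words) - n + 1)

def looks_over_optimized(page_text, phrase, threshold=0.05):
    # Threshold is a guess; the real point is to compare your
    # pages against ones that read naturally.
    return phrase_density(page_text, phrase) > threshold
```

If your density is several times higher than the pages that rank, that's the "doesn't sound natural" signal I'm talking about.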
One last thing that's easy to ignore is link text, which has a strong effect on how pages rank on specific keyphrases/words. Check for lame links-- they should speak accurately of the target page, using targeted words. I've also tested this and it matters quite a bit, even if the links are on your own site.
Good luck scrapping it out!
| 4:47 am on Nov 28, 2004 (gmt 0)|
What toolbar data are you thinking of? This is interesting, but I would like to know what you think they are gathering and how they are using it.
| 7:59 am on Nov 28, 2004 (gmt 0)|
Don't let google fool you! They would love to be able to use their toolbar data for ranking sites and I bet they are at least using the happy and frownie face data in their algo right now.
Also, we all know that Google doesn't like popups. So why not use the toolbar to find out which pages use popups? The pages with popups get a small penalty. You are still indexed but your rankings drop because your site was placed into the supplemental index.
On top of that, you get a few people clicking on the frownie face because of a popup/under that annoyed them. Or your competitor stumbled on your site and thought they should click the frownie face because you rank higher than them.
Who knows? But it wouldn't take much to add those factors into your pages ranking. Just 2 bytes of information can change your ranking that quick.
I have a site with a PR5 homepage that got at least 6,000 uniques a day, and suddenly it just dropped in the rankings. The site has one popunder per visitor. Do you think someone clicked a frownie face one too many times?
There is nothing wrong with the site and it was doing fine a month and a half ago. It has just your basic SEO, nothing special or black hat. I haven't done any major updates to the site since I built it, and I update the site's pages at least twice a month.
How could a site do fine for months and months without any major changes and then get put into the supplemental index for no reason? I think they are using the toolbar data.
| 10:55 am on Nov 28, 2004 (gmt 0)|
I share elgrande's experience almost down to a letter.
A site that ranked #1 in Google for months then falls to around the #120 mark for a keyword phrase.
I obviously think my site deserves a higher ranking, but the idea that more relevant sites have been added - hence my decline - is not the case. Most of the sites that rank above me aren't remotely relevant, starting from the first page.
In fact the second page of results has eight entries, all with strings of random words and the phrase BLUE WIDGET appearing. By the third page you are hard pressed to find even this level of relevance.
One thing that has changed though - a big increase in the number of "sponsored links" being displayed for BLUE WIDGETS.
Which makes me think this.
A page is served and Google makes money only if sponsored links are clicked. What if they served a page and found that nobody was clicking on the sponsored links but going straight to the #1 listing, finding what they needed and ordering there.
Could it be that by burying very relevant sites and feeding the public mainly spam, directories and other sites of little relevance, those "pay per click" links that are nearly always very relevant (after all, webmasters choose the keywords they want to be relevant for) become more attractive, and Google's earnings go up?
May sound like sour grapes on my part, but something is penalizing my BLUE WIDGET site while another of my sites - built exactly the same way - remains at #1 for its keyword phrase where there are no sponsored links (or one at most from time to time).
Elgrande, does your keyword phrase have lots of sponsored links?
| 2:12 pm on Nov 29, 2004 (gmt 0)|
|What toolbar data are you thinking of? This is interesting, but I would like to know what you think they are gathering and how they are using it. |
The data I'm thinking of is click data-- where people go-- the toolbar sends a little note to google for every page you visit. If I were Google, I'd add backbutton data to it to know exactly how long people stay, but this info can be approximated from click data.
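To illustrate what I mean by approximating stay time from click data alone, here's a toy sketch. The event format is invented; nobody outside Google knows what the toolbar really sends. The idea is just that each page's dwell time is the gap until the next page event.

```python
def dwell_times(click_stream):
    """Approximate time-on-page from a toolbar-style click stream of
    (timestamp_seconds, url) events: each page's dwell is the gap
    until the next event. The last page's dwell is unknown (None)."""
    if not click_stream:
        return []
    dwells = []
    for (t, url), (t_next, _) in zip(click_stream, click_stream[1:]):
        dwells.append((url, t_next - t))
    dwells.append((click_stream[-1][1], None))
    return dwells
```

So even without back-button data, long gaps between events look like visitors who liked the content.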
BTW, I think Microsoft is starting to do the same thing from its browser, though it may be just coincidence that MSNbot visited a few pages right after I updated them. I'll have to experiment with this more thoroughly. This is very revealing data for a search to have.