Forum Moderators: Robert Charlton & goodroi


Graybar, sandbox effect and a 3-day success


darkyl

12:02 am on Dec 16, 2008 (gmt 0)

10+ Year Member



Here's my experience with the graybar/sandbox/-950 effect (I don't know which filter Google applied, or why).
I own a directory site listing the stores of well-known brands; the site has around 2,300 pages, most of them info pages for single shops.

The structure is a very classic one: from the homepage you can see the whole directory skeleton, and all the categories and subcategory pages are listed (that's around 25 total internal links).

Each category (brand page) shows an original textual description of the brand and the links to the single stores pages.

At the beginning, when the site was 2 months old, almost all the pages ranked well; then suddenly most of them disappeared from the SERPs or were listed beyond the 100th position.

The same thing had happened to another site of mine with an almost identical structure (not content); that site reappeared without any changes after a few months and is now very stable in the first positions for all its keywords.

So I assumed it was something related to the sandbox effect (both sites were new).

In the past 2 months I started to fear it was something else: the site is at the top of the SERPs for some keywords (city + brand name) and practically non-existent for others, and many pages are affected by the graybar phenomenon.

I decided it was worth trying some de-optimization; here's what I did:

- Decreased keyword density in meta descriptions, in H1s and H2s, image alts and in the other parts of the pages.
- Removed the keywords meta tag completely
- Removed the left menu, which was a copy of the top menu (it contained links to the main brands)
- Shortened some text that was the same on every store page, to reduce the repeated-content percentage of each page.

... and several other smaller modifications.
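For anyone who wants to put a number on the "keyword density" being reduced here, a rough sketch of how it can be measured (this is my own illustration, not a tool the poster used; the sample text and phrase are invented):

```python
import re


def keyword_density(text: str, phrase: str) -> float:
    """Rough keyword density: occurrences of the phrase divided by total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count every position where the phrase appears as consecutive words
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return hits / len(words)


sample = "Brandname London store. Visit the Brandname London shop in London."
print(round(keyword_density(sample, "brandname london"), 3))  # → 0.2
```

Running it before and after a de-optimization pass gives a crude but comparable figure per page.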

I made these modifications over the span of ten days and then, boom, ALL pages went from oblivion to ranking in the very first positions, with many, many of them at Google's number 1 spot.

Unique visitors skyrocketed 1000%, from 180 daily to 1800-2000.

Bad part of the story: it stayed like that for 3 days (the site was making good money too), then the SERPs reverted to non-ranking (or ranking very badly, in the 100+ or even 1000+ positions).

I basically think I somehow managed to momentarily lift the filter and the site started to rank as it deserved.
The problem is that, having implemented the modifications sequentially but without waiting to see the results of each one, I can't tell what brought the site up and what brought it down again.

I thought sharing my story would be constructive...
Any comment? Ideas?

tedster

12:51 am on Dec 16, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Here's what I think was effective:

no - Decrease keywords density in meta descriptions
yes - in H1 and H2 image alt's and in the other parts of the pages.
no - Remove completely the keywords meta
YES! - Remove the left menu, it was a copy of the top menu (it contained links to the main brands)
yes - Shortened some text that was the same in every store pages to reduce the repeated content percentage of each page.

darkyl

1:35 am on Dec 16, 2008 (gmt 0)

10+ Year Member



Thanks for your reply tedster.

I assume you think removing the left menu was the best of the modifications done...

As I said, the homepage contains the top menu, with links to the main categories and subcategories, AND the directory skeleton, which contains basically the same links.

Do you think I should further remove the duplication?

Also, do you have any idea on why the good rankings lasted 3 days only?

tedster

1:43 am on Dec 16, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I assume you think removing the left menu was the best of the modifications

Without seeing the site, that was my first take. Repeated important keywords in anchor text can be a devil of a thing.

Do you think I should eliminate further the duplication?

If it's not needed, yes. Some redundancy can be important for usability but you want to avoid too much, especially in links.

do you have any idea on why the good rankings lasted 3 days only?

There are too many variables to say for sure. A lot of the time these days, Google awards short-term good rankings as a test.

How old is the site altogether now, and how long ago did your three days of good rankings happen?

darkyl

2:05 am on Dec 16, 2008 (gmt 0)

10+ Year Member



Thx again,

the site is 8 months old and the good days were December 11-13... why do you ask?


If it's not needed, yes. Some redundancy can be important for usability but you want to avoid too much, especially in links.

I think you're right.
I'm also worried about the "brand" pages listing the stores.
They contain the links to the single store pages, and the anchor text of those links (hundreds of them, split across pages 1, 2, 3, etc.) is always like this:

Brandname London
Brandname Manchester
Brandname Liverpool

... you get the idea. So basically "brandname" is repeated hundreds of times in the anchor text on the same page.

The alternative would be to use only the city as the anchor text, but that would be less descriptive of the linked page (consider that "brandname city" is exactly the phrase keyword I'm targeting), and I would end up with many identical anchor texts sitewide, since brandname A, brandname B and brandname C mostly have stores in the same cities.
What do you think?
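As a quick way to see how heavily one word dominates a page's anchor text, here's a small stdlib-only Python sketch (the HTML sample is invented to mimic the brand pages described above):

```python
from collections import Counter
from html.parser import HTMLParser


class AnchorTextCollector(HTMLParser):
    """Collects the visible text inside every <a>...</a> on a page."""

    def __init__(self):
        super().__init__()
        self._in_link = False
        self.anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_link = True
            self.anchors.append("")

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_link = False

    def handle_data(self, data):
        if self._in_link:
            self.anchors[-1] += data


# Invented sample mimicking a brand page listing store links
page = """
<a href="/brand/london">Brandname London</a>
<a href="/brand/manchester">Brandname Manchester</a>
<a href="/brand/liverpool">Brandname Liverpool</a>
"""

collector = AnchorTextCollector()
collector.feed(page)
counts = Counter(w.lower() for a in collector.anchors for w in a.split())
print(counts.most_common(1))  # → [('brandname', 3)]
```

On a real brand page with hundreds of links, the top count would show exactly how many times "brandname" is repeated in anchors.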

tedster

6:39 am on Dec 16, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think you risk not being able to rank well for "brandname" - it may not necessarily affect the longer-tail queries, though it might even do that.

the site is 8 months old and the good days have been december 11-12-13 ... why are you asking?

Many of Google's patterns today unfold over time, rather than once and done, at least until the next off-page or on-page changes occur. You are at the time where a newer site can begin to come into its own on Google, but this doesn't happen all at once. The improved ranking can yo-yo [webmasterworld.com] in and out, especially if there's any element of borderline over-optimization.

Google does want to serve their users search results that make them happy, and they wouldn't lightly exclude any site that can help them do that job. But the patterns they now use to "prove" a site's mettle and bona fides can be a good bit different than in earlier times.

darkyl

6:03 pm on Dec 20, 2008 (gmt 0)

10+ Year Member



A small but interesting update:

After the modifications I listed above, and some minor changes to the keyword density in the meta descriptions, the graybar penalty is being lifted from some pages.

SERPs for keywords that already ranked are fluctuating up and down (between positions 1 and 10) right now.

The pages that had the graybar lifted finally show a white bar but still seem "filtered" out; anyway, I think it might be a first step towards "normalization".

I will post more updates when google recrawls more pages.

darkyl

11:10 am on Dec 25, 2008 (gmt 0)

10+ Year Member



Another update:

Today, December 25th, the site rose again and it's ranking on the 1st or 2nd page for basically ALL of its pages' intended keywords.

I did some more modifications, more de-optimization (unlinked images, removed link titles), but at this point I'm not sure whether anything I do is connected to the site going up and down.

If this is Google's way of testing a relatively "new" site, what are the good factors they're looking for?

I think user behaviour should be the number 1 factor... bounce rate, internal navigation and so on.

If that's the case, my site has a bounce rate that I consider very low for this kind of directory, between 15 and 20%, so it should pay off in the long run.
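For reference, bounce rate as typically defined is just single-page sessions over total sessions (analytics tools may compute it slightly differently); a minimal sketch with invented numbers in the range mentioned above:

```python
def bounce_rate(single_page_sessions: int, total_sessions: int) -> float:
    """Bounce rate: share of sessions that viewed only one page."""
    if total_sessions == 0:
        return 0.0
    return single_page_sessions / total_sessions


# Invented figures roughly matching the ~1800 daily visits mentioned above
print(f"{bounce_rate(320, 1800):.1%}")  # → 17.8%
```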

Let's wait and see.

tedster

1:46 pm on Dec 25, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Recently both John Mueller and Matt Cutts from Google have stated that Google is not using bounce rate as a factor in organic ranking. They point out that such a factor would be 1) very noisy and 2) open to abuse.

So while bounce rate can clearly be a useful analytic for the webmaster to take note of, it's apparently not part of getting a good ranking on Google.

darkyl

4:48 pm on Dec 25, 2008 (gmt 0)

10+ Year Member



Yes, but while they said bounce rate is not a factor for organic ranking, it may still be (in my opinion) a factor for Google in deciding whether a site considered "borderline" has to be filtered out or not.

So it doesn't influence your ranking position, but it may influence whether you rank at all... does that make sense to you?

darkyl

2:46 am on Dec 27, 2008 (gmt 0)

10+ Year Member



Update:

After 2 days with a 1000% (one thousand percent) increase in traffic, the site went down again, back to ranking for only a fraction of its keywords.

This yo-yo thing is making me nervous.

Whitey

6:03 am on Dec 27, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Brandname London
Brandname Manchester
Brandname Liverpool

... you get the idea. So basically "brandname" is repeated hundreds of times in the anchor text on the same page.

Are you using this in the meta title & description?

Also, how many original words of content do you have per page?

As a broad generalisation, which may or may not apply to your site, I think Google is looking for sufficient differentiation between pages to rank the overall site better. A good start is the document structure, then the links pointing to it, both internal and external.

Tweaking these elements may see the site re-scoring - too many tweaks may endanger its stability [IMO]. I'd be interested to hear what others think.

darkyl

11:12 am on Dec 27, 2008 (gmt 0)

10+ Year Member



The title is "Brandname London"
Meta descriptions are something like: "Brandname London: info, map and contact details of the shop located in *street*"

As I said, the original text on each shop page is not very rich, since they basically contain only the main info about that specific shop.

All shop pages are linked from the main brand pages, and the anchor texts of the links are all "brandname city" (what else could I call them?).

Please note that I have another site which is identical in terms of structure (documents, navigation and linking) and is ranking VERY well.

Whitey

9:02 pm on Dec 27, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I have another site which is identical in terms of structure (documents, navigation and linking) and is ranking VERY well.

How old? Older?
Same links? Interlinked?
Number of pages compared to the other?
Has this site always been stable?
Same TLD?

Your meta titles seem to contain too many identical characters as a % of the total characters. Coupled with similar page content, Google may be trying to say you have "duplicate content". In your case it might be called "similar content".

Your linking strategy is probably another aspect to look at. However, I'd be interested to see how your successful site compares.
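One crude way to put a number on "too many identical characters" between titles is a character-level similarity ratio; a minimal sketch using Python's stdlib (any threshold for "too similar" would be a judgment call, not something Google has published):

```python
from difflib import SequenceMatcher


def title_similarity(a: str, b: str) -> float:
    """Ratio of matching characters between two titles (0.0 to 1.0)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


# Invented titles matching the pattern described in this thread
t1 = "Brandname London"
t2 = "Brandname Manchester"
print(round(title_similarity(t1, t2), 2))
```

Run pairwise over a sample of page titles, this gives a rough picture of how much of each title is boilerplate versus unique.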

[edited by: Whitey at 9:07 pm (utc) on Dec. 27, 2008]

darkyl

12:07 am on Dec 28, 2008 (gmt 0)

10+ Year Member



Yes, same TLD.
The successful site is almost 1 year old, and it also suffered for 4 months from a similar penalty or filter (but without the yo-yo effect). The site then came back into the SERPs (without any action at all) and its rankings have been very strong for 4 months now. So strong that it outperforms the brands' official sites for all intended and competitive keywords. The site has almost no incoming links.
It has around 400 pages.
The site is VERY optimized for search engines, but this doesn't seem to be triggering any filter or penalty.
PR (for what it's worth) is 1.

The other site is younger (8 months); it similarly fell into a filter 3 months after its birth and basically disappeared from the SERPs (like the other site did) for 3 months.
Then it came back, but only partially: it's ranking for 10% of its pages (usually on the 1st or 2nd page) and completely absent for the other 90%.
The site has 2,200 pages and PR2.

The "successful" site has 301 redirects to the second site for its single-store pages. The redirects were put in place after the second site partially came back.
The goal was to abandon the first site completely and concentrate on the second.
The second site doesn't link towards the first.

Now, in December, the second site started to reappear 100% in the SERPs for short periods, 2-3 days. This happened after the de-optimization I described previously.

There are a ton of elements to analyze and many theories could be drawn.

I'm pretty sure part of the problem is caused both by the scarcity of text on the store pages and by the similarity of titles and anchor texts.

The problem is that I can't think of a more coherent way of creating titles and anchor texts: it's a yellow-pages-style site, so "Brandname City" seems the most appropriate way of naming them.

darkyl

11:48 pm on Jan 11, 2009 (gmt 0)

10+ Year Member



Here's a quick overview of the 2 sites:

Site A doesn't suffer from any penalty or filters.
Site B seems to suffer from the -950, but now and then the filter is lifted and it ranks incredibly well (yo-yo).

Site A has PR0
Site B has PR3

Site A is really over-optimized. Hasn't been modified in months. Very thin content.
Site B has been carefully deoptimized and has somewhat improved. Content is less thin and more original content is being added to all the pages.

Site A ranks 1st for very, very competitive keywords. The pages targeting the 3 "perfect keywords" have more than 100 links on them, basically no content, and 120 (120!) repetitions of that keyword on each page. They all rank #1.
Site B, as I said, is yo-yoing very often: keyword density has been reduced and all aspects of SEO have been taken care of (content, density, inbound and outbound links).

Site A has only 1 good thing that site B doesn't have: an incoming link from a PR8 site.
Can such a link be enough for google to "forget" about all the flaws?

tedster

1:19 am on Jan 12, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Site A has PR0

To be precise, the toolbar PR is gray - no information available, correct? A true PR0 would be an all-white toolbar.

Site A has only 1 good thing that site B doesn't have: an incoming link from a PR8 site.
Can such a link be enough for google to "forget" about all the flaws?

A really good backlink can go a long way in helping a URL, but that does sound surprising. The first time I saw a URL get released from the -950 (this goes back nearly two years) it happened within 48 hours of a strong new backlink.

darkyl

1:52 am on Jan 12, 2009 (gmt 0)

10+ Year Member



No, it is actually PR0: it had PR1 and now it has PR0, a white bar.
The whole site has PR0, except for the page receiving the backlink from the PR8 site, which has PR1.

It sounds surprising to me too (and frustrating).
The only real "good" aspect of siteA is that backlink.

The 3 top pages I mentioned that rank #1 rank for VERY competitive terms, top searches for a particular brand.
I can't believe they're in the #1 spot (for 3 months now) when the main keyword is repeated 120 times on each page! (I would penalize those pages myself!)

darkyl

6:20 pm on Jan 17, 2009 (gmt 0)

10+ Year Member



Another update:

The site (site B) keeps yo-yoing, and I'm noticing that other sites are also alternating in the first 10 positions.

But it is interesting to notice that almost all pages are "healing" from the graybar syndrome, and I think eliminating the second, duplicated nav menu may have helped.