Forum Moderators: open
I thought the sandbox filter applied only to new sites and/or new links, which is not the case here. Does anyone have an explanation for this?
[edited by: Marcia at 11:05 pm (utc) on May 21, 2004]
[edit reason] Fixed generic keywords. [/edit]
No, that's not being sandboxed, it's something else. Sandboxing refers to new sites that are included in the index but can't rank for any keyword searches until a certain amount of time has elapsed.
We have a 4-year-old site (PR 5), and most links to it are older than one year, but... for the search keyword1 keyword2 we are #8, while for keyword1 keyword2 -widget1 -otherword... we are #1.
Back on topic, can anyone explain what is happening, and what that search is showing?
I'm curious myself; I've never understood the "nonsense filter" and what it is meant to do...
Thanks,
TJ
I haven't come across any new penalties lately.
To extend your analogy:
"Children are made to stay in the sandbox when they're not being allowed out to play. New sites seem to be held back for a period of time before they're allowed out on the playing field to rank for search terms."
Then, while one is in the "sandbox", one needs to learn how to play within these confines... keep working on your site, building out content, acquiring links, until you are "released" from the sandbox... count the grains of sand...
Google deploys the "sandbox" for obvious reasons: way too many ready-made, ready-linked, PR-rich sites showing up for only one purpose... to dominate the SERPs. Some aggressive ones exist just to dish out AdSense... yikes... but they don't offer real value to the targeted end users; usually a bunch of gibberish. Troubling.
My company has a similar network structure, and after being #1 for many highly competitive keywords we began to drop for the main keywords, staying #1 for "keyword + some other term" but not for the keywords alone.
If you are experiencing a constant drop, I'm afraid you can expect this to continue and to get even worse if you don't make some changes.
I couldn't tell you exactly what to do, as I'm experimenting myself, but so far the 3 approaches I'm testing are:
1- Consolidating all of my sites into 1 megasite
2- Eliminating excessive interlinking, basically not using more than 1 link to a specific page per page (widgets.html shouldn't have more than 1 link to widgets2.html)
3- Pull out the wallet and spend some money on AdWords.
So far I can tell you that the last 2 approaches have at least stopped the drop, and have even gained me back some top positions for "solo" keywords; as for the 1st, I don't know yet, but anyway, I'm convinced that consolidating everything into 1 megasite has much better long-term benefits.
Hope this helps.
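A quick way to audit rule 2 above (no more than 1 link to a specific page per page) is to count repeated hrefs on each page. This is a minimal sketch using only Python's standard library; the `LinkCounter` class name and the sample HTML are my own, not anything from the thread:

```python
from collections import Counter
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts how many times each href appears on a single page."""
    def __init__(self):
        super().__init__()
        self.hrefs = Counter()

    def handle_starttag(self, tag, attrs):
        # Only anchor tags with an href contribute to interlinking.
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs[href] += 1

# Hypothetical page fragment with a duplicated internal link.
page_html = (
    '<a href="widgets2.html">widgets</a> '
    '<p><a href="widgets2.html">more widgets</a> '
    '<a href="index.html">home</a></p>'
)

parser = LinkCounter()
parser.feed(page_html)

# Any href appearing more than once violates the 1-link-per-page rule.
repeats = {h: n for h, n in parser.hrefs.items() if n > 1}
print(repeats)  # {'widgets2.html': 2}
```

Running this over every page of a site would flag exactly the kind of excessive interlinking the post describes.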
I'm not sure what measures you are talking about, but I assume they must be at least similar, as you might consider the use of excessive interlinking and the use of duplicate content/nav structure as "overoptimization".
Remember that it is not the use of a single magic key that will achieve and maintain good positions on SEs; it is the combination and balance maintained across several optimization elements.
Remember too that those elements increase or decrease in value from time to time, according to the SEs' momentum, and your site must be flexible enough to adjust as they change.
My advice... don't worry too much about optimizing for the "momentum"; instead, try to use the most balanced combination of "magic keys": balance every part of your pages at approximately 66% keyword density, always use fair marketing tactics, build a content-rich site, and work as hard as you can on building a strong network of high-quality external links, focusing on sites relevant to your own, and always trying to avoid excessive-linking sites.
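For what it's worth, "keyword density" is usually computed as the share of words on a page that match the target keyword. A minimal sketch (the `keyword_density` function, its single-word tokenizer, and the sample text are my own illustration; multi-word phrases would need different handling):

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` that match `keyword`, case-insensitive."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

page = "widgets for sale: quality widgets, cheap widgets, widgets delivered"
print(round(keyword_density(page, "widgets") * 100, 1))  # 44.4 (percent)
```

By this common measure, even the spammy sample line above is under 50%, which is why 66% reads as implausibly high to the posters below.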
I am trying the following measures to "deoptimize" my site:
- Avoid some cross-linking among domains
- Lower target keyword density.
- Erase all alt tags
- Change internal links text from "target keyword" to "home"
66% keyword density
...really?
HarryM:
I know this number isn't absolute for all page areas, but in my experience, if you stay close to or below this density, you can play around with all the other optimization elements, always keeping them within a decent range to avoid "over-optimization". Other than that, there's not much you can do in regard to optimizing your pages; the overall success or failure of your site will depend on many "external" tactics, such as external link structure, banner and pay-per-click ads, etc.
Samba:
I am trying the following measures to "deoptimize" my site:
- Avoid some cross-linking among domains
- Lower target keyword density.
- Erase all alt tags
- Change internal links text from "target keyword" to "home"
I agree with the first two measures, but I'm not so sure about the last 2: normal use of alt tags doesn't seem to be damaging my SERPs, though overuse might; keywords in link text should be balanced like any other part of the page. IMO the key is BALANCE.
Good luck.