I have to believe that the ultimate goal of a search engine is to reward good content. Regardless of whether a sandbox exists, I just keep writing.
I figure that my content is up to around 250,000 words. If I reach a million and I am still in the sandbox, I will back off. Until then... good thing I have a day job.
By keeping design and content KISS-simple, I've been able to endure update after update for years, always with moderate gains and zero tragedies.
If I'm doing a search for 'widgets', am I looking for information about widgets, or for a site that sells them?! If I were shopping for widgets online, I'd do a 'widgets shop' search, wouldn't I?!
New Google algos are moving in the direction of forcing us to make sites valuable for users. Take the outbound-links issue: if you want your site to rank high, you'd better link to some valuable on-topic sites. That means you have to provide some valuable information.
It's only a matter of time before Google includes an HTML parser able to see all the CSS and JS tricks used by SEOs, or maybe they are doing it already? Maybe some of these people complaining of falling in the SERPs just forgot to mention that they used CSS to hide stuffed keywords or JS to hide affiliate links?
If you're really advanced in all web-related technologies, and you're a real SEO expert, you could still devise some tricks to boost your SERPs. But you'd never know when they would lead to a penalty, because Google has smart guys working to detect the tricks, and knowing the algo, they can easily predict what tricks SEOs will play.
So in long-term web projects it's much wiser to stick to plain optimisation: make the site readable for the crawler, concentrate PageRank on the desired sections of your page, and focus on natural content building that will attract users.
(1) professional web developers often put content in graphics instead of text; unfortunately Google can't "read" graphics. I can't tell you how many companies have home pages that—as far as Google is concerned—are about *nothing*.
(2) writers write for readers, not search engines. For example, a writer might title (<title> and <h1>) a short essay on tourism in Paris, "Getting the most from the city of lights." But Google is not aware that the "city of lights" is Paris, and nobody looking for "tourism info" is going to be putting "getting the most" into Google. (Gerunds are search-engine losers.)
(3) after learning the canonical <h1>, <h2> ... tags, many web developers decide they're ugly to start and a pain to change, and go with <font> tags or DIVs with names like "primaryhead," "secondaryhead," etc. But Google doesn't know that "primaryhead" is any more important than the DIVs "footnotetext" or "minoraside." So, use semantic coding—h1, h2, h3, blockquote, ul, ol, p, etc.—to tell Google what's really important to you.
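A minimal illustration of point (3): once styled with CSS, both snippets can look identical to a visitor, but only the first tells Google which text is the page's main heading. (The class names here are made up for illustration, not taken from any real site.)

```html
<!-- Semantic: Google knows this is the most important text on the page -->
<h1>Tourism in Paris</h1>
<h2>Getting around the city</h2>
<p>Paris, the city of lights, offers...</p>

<!-- Non-semantic: to Google, "primaryhead" means nothing special -->
<div class="primaryhead">Tourism in Paris</div>
<div class="secondaryhead">Getting around the city</div>
<div class="bodytext">Paris, the city of lights, offers...</div>
```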
Anyway, this list could be extended quite a bit, but the idea is to understand how Google sees your site and do whatever you can to help it. I like to imagine Google as a very literal-minded 6-year-old.
More advice along these lines can be found in Peter Kent's "Search Engine Optimization for Dummies."
Google's Webmaster Guidelines are a good starting point for people who want to provide easily digestible "spider food" without getting themselves into trouble.
However, the last years of being a member here haven't passed without my learning something. I didn't notice it at once, but now I know a whole lot more about SEs than before. As a consequence I took baby step after baby step to change my site: switched to CSS to reimplement the <Hx> tags while maintaining a certain look, implemented a site map, implemented internal linking, kept trying to get external links. A little tweak here, a little tweak there.
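The CSS switch mentioned above can be done with a few rules. This is only a sketch (the selector, sizes, and color are my own guesses, not the poster's actual stylesheet): it restyles a real <h1> so it keeps the modest look of old <font> markup while staying semantic for the spider.

```css
/* Make h1 render like the old <font size="4" color="#336699"> heading */
h1 {
  font-size: 1.3em;      /* smaller than the browser's default h1 */
  color: #336699;
  font-weight: bold;
  margin: 0 0 0.5em 0;   /* trim the large default margins */
}
```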
Apart from content, I think it's these little tweaks that kept me in the top-5 for any given keyword of my niche. Without these, I guess I'd still be on page 1 or 2, but not in prime spots.
The good thing, though, is that it was for the benefit of my visitors too! They can read my site on slow connections now. They can resize the text now. They have a much better navigation now. They can print the pages now with a clean printout.
Only by learning all this here on WW did my site become a better site - both for visitors and for spiders.
If only it were that easy; for some categories it is virtually impossible to get a link.
I agree with all the usual stuff about writing quality content etc, but the DMOZ listing is something that we have no control over (unless you are the editor of that category).
Also, it does no harm to know as much as possible about the search engines and SEO; for me the most important lessons are not "what to do", but "what not to do".
They don't really seem to help the content-empty sites for long - I have seen some be listed high for a while, then gradually fade away.
DMOZ does not carry much weight any more from what I have seen. We have one site listed there, and two sites not listed, and the listed site does not do as well as the others.
I think the main trick is in learning what NOT to do and what mistakes to avoid, and how to make your site Google-friendly. Beyond that I think any gains drop off quite a bit.
[edited by: Wlauzon at 11:49 pm (utc) on Feb. 22, 2005]
Brett wrote the 26 steps a while back, and I for one feel it's dead on.
But since Allegra, rankings suddenly dropped after 2 years of smooth sailing with the white-hat 26-steps approach.
Naturally, one interprets that the rules must have changed at G.
4. Make sure that non-www redirects to www with a 301 redirect so you don't get mistaken for having duplicate content.
5. Keep your code light: headings, paragraphs, lists, tables and forms, styled using an external CSS file. Get rid of all font tags, inline styles, spacer GIFs, divs and spans, and all other clutter.
6. Use the title tag, meta description, and headings wisely. They are quite important.
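Step 4 above takes only a couple of lines on a typical Apache host. This is a sketch using a placeholder domain, not anything from the thread; it assumes mod_rewrite is enabled and goes in the site's .htaccess:

```apache
# 301-redirect non-www requests to the www hostname
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

With this in place, a request for example.com/page.html gets a permanent (301) redirect to www.example.com/page.html, so crawlers see a single canonical host instead of two copies of every page.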
Umm, just read and implement Brett's 26 steps and you'll be fine.
Funny how so many report seeing a lot of the really spammy sites disappear with this update, but then complain that their site is gone - not sure they realize the correlation between the two.
We have sites that were designed exactly the way your comments suggest they shouldn't be, yet they dominate highly competitive industries, without comparison.
Sites can and do benefit from longevity and content as opposed to CSS or XHTML coding - they do, however, fly under the radar when it comes to 'over-optimisation'.
[edited by: conor at 12:24 am (utc) on Feb. 23, 2005]
Looks like you guys will all go to webmaster heaven.
I'd say it's an excellent strategy for hobby webmasters and employees.
But seriously: yes, it definitely works. The downside is that time is money, and you have to ask yourself if the marginal time you spend on your site could be better spent somewhere else.
My approach is to create multiple sites for the same vertical. Some are safe, and hopefully these KISS projects stay at the top. Others are aggressive and used to learn how to go beyond the fundamentals. I lost about 30 million pages that were indexed in Google due to the Allegra update, but I'm glad it happened. I would much rather be in a situation where sites get banned and I'm learning from it than one where I put all my faith in Google's guidelines and learn nothing. Plus it was pretty funny.
[edited by: iblaine at 1:13 am (utc) on Feb. 23, 2005]
Just as many people report seeing rubbish sites appearing at the top of the SERPs that were not there before.
There are many posts in the forum here about "legit" sites (MIA) having dropped severely in ranking, with no question of over-optimization or 26 steps negligence.
That is the puzzle of Allegra, and creates questions about what to do, if anything.
That is the subject of this post, not how-to-make-a-good-website-for-top-ranking-in-google.
Rephrasing the question: blame Google (and sit it out), or blame yourself (and change your website).
Sitting it out is honorable, but fretful.
Changing your website, if you follow the 26 steps plus some of those other well-known tips, sounds like rewriting the rules somehow.
If only we all agreed/concluded "poor" sites ("poor" being..... just fill it in) got dumped, we could all be talking about the weather and congratulating G on an excellent upgrade of the algo.
(1) EVF and I do pretty well, thank you, and never hang by our hairs.
(2) My family's financial security doesn't depend upon not getting caught.
(3) If you have to worry about SEs "finding" your site, you've already lost.
(4) As your post count indicates, you get those "life pleasures" at Webmaster World.
(5) Does fleeting advantage in search engine positioning really make up for being unable to communicate in English?
And there are 2 members here who are on track to do 7 figures from AdSense alone this year. They rarely worry about their rankings (they have so many, they simply can't track them all).
I didn't even begin to mention the major SEO firm whose strategy is massive quality content and nothing else. Neither one is really white or black.
I think it is every webmaster's dream to get to the point where they can pull the plug on the search engines (addiction is a two-way street).
A lot of one-liners and off-topic comments were removed. Just as the SEs want... so do we want quality content.
It all comes back to one central question every real webmaster has to ask themselves: will this help my visitors?
I'm sure MSN or Yahoo doesn't do any search engine optimization.
Unfortunately, last time I checked, we're not all MSN or Yahoo..
Another example is Google versus Procter and Gamble.
Google has a pretty minuscule advertising budget (they don't SEO television, I guess) - "just build it and they will come" is their attitude.
P&G, the company that just swallowed up Gillette, would toss you out the door so quick if you tried that kind of craziness around them.
They SEO the heck out of television, magazines, etc.
[edited by: blaze at 3:29 am (utc) on Feb. 23, 2005]
And for 2 members here that are on track to do 7 figures from adsense alone this year
There are people on this board that regularly have 7-figure MONTHS gaming SEs. Not sure what your point is. If it is that you can do OK with largish content sites that rely heavily on SE traffic, sure, I would agree. If it is that gaming the SEs is not lucrative, I would disagree.
However, it won't be search engines that will do it. It will be blogs.
I believe more and more people will use blogs to refer them to content and products they are interested in (buy and do what smart people you respect suggest to you).
Gaming blogs will be very very very difficult.
Which means that what Brett says is accurate, in that scenario.
Currently, though, short term hit and runs are very juicy, indeed!