I have static content and I've just checked 62 search phrases.
On average my placings in the SERPs are up about 15%. One or two top spots have gone and I'm now second; other top spots have been gained where I used to be number 2.
But as I wrote in another thread yesterday, one page has vanished from one search, unless I add -waffle (or -any other word that isn't on the page). It's still there for other searches...
So I'm not sure that the distinction you ask for actually exists!
I was curious to know whether people here whose sites have remained top have limited visibility (according to the traditional view) - perhaps it's better to ask whether any site that has remained top has no tables, or has its text outside the tables.
We have a lot of older pages that use tables. However, most of the content is outside the tables. The style tags have all been stripped out and moved to an external CSS file.
New content doesn't use tables (thanks to all of you here @WW). Older content loses its tables whenever it needs updating.
There doesn't appear to be an advantage to either method.
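To be clear about what "stripped out" means in our case, the change is roughly along these lines (the class name is just an illustration, nothing special):

Old markup in the page:
<td style="font-family:Arial; font-size:12px; color:#333333">Product description text</td>

New markup in the page:
<td class="bodytext">Product description text</td>

And in the external .css file:
.bodytext { font-family: Arial; font-size: 12px; color: #333; }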
Visits are 2 1/2 times what they were 2 mos. ago (approaching 1000/day), with 2/3 of the hits on internal pages.
Very few keywords in anchor text (anchor text might once have been an indicator of relevant content; now it's an indicator of SEO spam used to artificially inflate page rankings).
CONTENT is King. EVERYTHING else revolves around the Content.
With Florida we initially lost a couple of our major keywords but over the last 24 hours these seem to be coming back.
We have one site that has lost one major commercial keyword. This site has the keyword real estate in a file name, i.e. www.xyzxc.com/realestate (an exception to our usual rule).
Our sites are very content-laden and generally have domain names with keywords, e.g. Geographic-generic.com. These have done well. Domains with several hyphens, e.g. geographic-generic-widget.com, have not done well... but we only have two sites with this type of domain architecture, so it is possible that this is an aberration.
Stemming has also markedly helped us, particularly with plural and non-plural forms, e.g. maps and map.
The negative experience that many sites are now having with Florida happened to us with Dominic, and it's taken us 7 months to recover. For our site, Florida has been a real life saver.
I have a simple layout and design methodology using tables and an external CSS. Tables usually aren't the problem when there is good use of a CSS. The problem comes when tables' attribute tags are used entirely too heavily.
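To give a rough idea of what I mean by attribute tags used too heavily - markup along these lines (the values here are just made up for the example):

<td bgcolor="#CCCCCC" width="200" align="left" valign="top"><font face="Arial" size="2">Widget prices</font></td>

versus a table that leans on the external CSS instead:

<td class="cell">Widget prices</td>

with something like this in the stylesheet:

td.cell { background-color: #ccc; width: 200px; text-align: left; vertical-align: top; font: 12px Arial, sans-serif; }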
Now my URLs may be static but there is a great deal of dynamic content on my site.
Also, I thought the end result of all Google's algorithm tweaking (probably not there for several years yet) was to rate sites as if they were 'naturally created' - as if you were, say, writing a book or a newspaper article on a subject.
After briefly looking for people to link with, we have now stopped and are leaving people to find and link to us if they want - from what I've read about reciprocal linking, it just seems too risky and asking for the wrath of Google.
We have no Java, no Flash, and a few tools driven by ASP.
Tables usually aren't the problem when there is good use of a CSS. The problem comes when tables' attribute tags are used entirely too heavily.
Would you mind expanding on that please, and maybe pointing to evidence of it? I use a visual (WYSIWYG) editor for some of my sites. It has been criticised in the past for producing too many table tags. I assumed Googlebot only read what was outside of a <tag> when indexing page content, so it didn't matter what was inside the <tag> in terms of the content indexed.
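For example, as far as I understand it, once the markup is stripped both of these leave exactly the same text to be indexed - "Widget prices for 2003" - however bloated the tag itself is:

<td bgcolor="#FFFFCC" width="300" align="center"><font face="Arial" size="2">Widget prices for 2003</font></td>

<td class="prices">Widget prices for 2003</td>

(That's just my assumption about how the indexing works, which is why I'd like to see some evidence either way.)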
original quality content + smart subtle keyword placement = long term positive results.