The panic is settling down, the whine of worry is receding to a steady hum in the back of my head, and several recovery plans are forming...
I lost my index page entirely, due to lazy keyword stuffing. My fault! Unfortunately, mine is a very small business: no listing = no food (let alone xmas).
I was planning on overhauling the website anyway, and I've given myself until 1/1/04 before I accept an opening with another business and abandon my own. The question now is whether to:
1. Overhaul the index page and resubmit to Google immediately,
2. Overhaul the entire website and resubmit the whole thing in a few weeks, or
3. Overhaul the website (starting with the index page, of course) and wait for Googlebot.
Time is most definitely a factor.
...are any of these plans likely to restore my index page to the directory before I have to throw in the towel in January?
There are also longer-range options, such as starting over with a new website and closing the old one.
Mahalo Nui Loa! (Thank you very much!)
I don't think it's a keyWORD density penalty. I think it's a keyPHRASE density penalty.
I just checked a three-word search with 3,300,000 results. None of the top ten results had the exact search phrase on the page more than twice. Half of them didn't even have the exact phrase on their page at all.
Sorry, but:
For one of my key phrases, I'm number one with three occurrences of the exact phrase (density 10.34%), with 5,560,000 hits for the query.
For another phrase, I'm #5 with six occurrences of the exact phrase (16.22%), and 3,910,000 results for that query.
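(Assuming "density" here is the usual exact-phrase measure, i.e. occurrences × words in the phrase, divided by total words on the page, a figure like 10.34% would correspond to something like three occurrences of a three-word phrase on an 87-word page: 3 × 3 / 87 ≈ 10.34%. The phrase length and page size are illustrative assumptions, not figures from the post.)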
Didn't see much of a change actually.
Laurenz
I agree; this is exactly what I have been seeing.
Titles that contain a KW phrase more than once are hammered. However, a title with the KW phrase and then a supporting KW is doing very well.
This raises the question: what is now too relevant? It seems crazy that you now have to have obscure relevance to be the most relevant...
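To make that contrast concrete, here is a hypothetical pair of titles (the phrase "blue widgets" is invented for illustration, based on the observation above):

<title>Blue Widgets - Cheap Blue Widgets - Blue Widgets Shop</title> (repeated KW phrase: reportedly hammered)
<title>Blue Widgets - Prices, Reviews and Accessories</title> (KW phrase plus supporting KWs: reportedly doing well)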
Our site may have gotten penalized for duplicate pages. We were in the top 10 for at least 20 keywords. We provide a text version of our site for low-bandwidth purposes.
Should we have disallowed robots from indexing such pages? We have a PR 6 for our home page, but we are not even in the top 100 for 19 out of 20 keywords.
Could this have been the reason?
Sure - if it was duplicate content.
Well, I suppose I should have, but I thought the point of the robots.txt file was to keep sensitive information from being indexed by crawlers, not to help crawlers determine what is duplicate content for valid reasons as opposed to a spam technique.
If they have all these filters to detect spam, why can't they detect VALID DUPLICATE CONTENT?
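For what it's worth, the disallow itself is simple. A minimal robots.txt sketch, assuming the text-only pages live under a hypothetical /text/ directory:

User-agent: *
Disallow: /text/

Any spider that honors the Robots Exclusion Protocol will then skip everything under /text/, leaving only the full-bandwidth versions to be indexed.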
Did I disrespect you in any way, bro?
Was I talking to you?
I won't start a flame war here; I just find your post offensive and in bad taste.
If you would like to talk to me, sticky me and I will gladly forward you my phone #.
This isn't the place for insults.
Dan
Please look at the earlier posts - news.bbc.co.uk has duplicate content all over the place - graphics-intensive and text-only news pages with the same textual content. It's not a problem for them...
DerekH
If you read the guidelines, Google points out that its robots read pages much as they would appear in a text browser, so the robots would see the graphical and text-only versions the same way. Which would make them duplicates.
Google Guidelines
Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would.
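A quick way to run that check yourself, assuming Lynx is installed (the URL is a placeholder):

lynx -dump http://www.example.com/

The -dump option prints the rendered text of the page to standard output, which is roughly what a text-only spider sees; run it against both the graphical and text-only versions of a page and compare.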
Hmmm... www-sj (the home data center) is offline and has been for some time, and in just the last 18 hours or so www-zu has gone offline as well...
As I have said before, there are two and only two reasons for Google to be acting the way it has been:
1. Google is broken.
2. Google is acting this way on purpose. (I personally find this hard to believe.)
Take your pick...
"There is a tool out there that gets close to doing this, but I won't mention it by name ;P"
Just tried to find this tool by searching on Google and gave up after a few attempts - all I kept getting was bulletin board postings.
Don't I remember reading in these threads that scholarly searches are better now that all the spam has been cleared out?
WBF
Oct 17, 2003: SJ Datacenter is down? [webmasterworld.com]
Of course Google is not broken, they're just doing an update/upgrade/tweak/something...
/claus