
Update Florida - Nov 2003 Google Update Part 3

         

LaBonne

5:41 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



continued from: [webmasterworld.com...]

The panic is settling down, the whine of worry is receding to a steady hum in the back of my head, and several recovery plans are forming...

I lost my index page entirely, due to lazy keyword stuffing. My fault! Unfortunately, mine is a very small business: no listing = no food (let alone xmas).

I was planning on overhauling the website anyway, and I've given myself until 1/1/04 before I accept an opening with another business and abandon my own. The question now is whether to: overhaul the index page and resubmit to Google immediately; overhaul the entire website and resubmit the whole thing in a few weeks; or overhaul the website (starting with the index page, of course) and wait for Googlebot. Time is most definitely a factor.

...are any of these plans likely to restore my index page to the directory before I have to throw in the towel in January?

There are also longer range options of starting over with a new website and closing the old.

Mahalo Nui Loa! (Thank you very much!)

Jakpot

9:30 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



Everything I have looked for in Google in the last few days I have found with no problem - the results look good and relevant for what I have searched for.

Google is better for me. My pages are the same or up in the SERPs, and traffic from Google is up, so it's much better for me.

lbobke

9:34 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



I don't think it's a keyWORD density penalty. I think it's a keyPHRASE density penalty.
I just checked a three word search with 3,300,000 results. None of the top ten results had the exact search phrase on the page more than twice. Half of them didn't even have the exact phrase on their page at all.

Sorry, but:
for one of my key phrases, I'm number one with three occurrences of the exact phrase (density 10.34%), 5,560,000 hits for the query.

For another phrase, I'm #5 with 6 occurrences of the exact phrase (16.22%), 3,910,000 results for that query.
Didn't see much of a change actually.

Laurenz
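For what it's worth, the density figures being traded above are consistent with the usual definition of keyphrase density: occurrences of the exact phrase, times the number of words in the phrase, divided by total words on the page. A minimal sketch of that calculation (the phrase and page below are made up, and real density tools may count differently):

```python
# Keyphrase density as commonly defined:
# density = occurrences * words_in_phrase / total_words.

def phrase_density(text: str, phrase: str) -> float:
    """Percentage of the page's words accounted for by exact-phrase matches."""
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count every position where the exact phrase appears.
    occurrences = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * occurrences * n / len(words)

# 3 occurrences of a 3-word phrase on an 87-word page -> 10.34%,
# the same kind of figure reported above.
page = ("blue widget store " * 3) + ("filler " * 78)
print(round(phrase_density(page, "blue widget store"), 2))  # 10.34
```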

c1bernaught

9:36 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



selkirk:

I agree this is exactly what I have been seeing.

Titles that contain a KW phrase more than once are hammered. However, titles with the KW phrase plus a supporting KW are doing very well.

This raises the question: what is now too relevant? It seems crazy that you now have to have obscure relevance to be the most relevant...

Chndru

9:37 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>With the amount I spend on adwords every month they owe me their souls!

Yeah, Right! :)

agent10

9:37 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



What is the answer, though, for good sites that were well placed in Google for over a year and have now just vanished? I did as GG said and reported our site, but I have heard nothing and can't seem to find an answer here either.

Any advice? The site was definitely bona fide - what has gone wrong?

fashezee

9:39 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



DUPLICATE CONTENT - LOW BANDWIDTH VERSION

Our site may have been penalized for duplicate pages. We were in the top 10 for at least 20 keywords. We provide a text version of our site for low-bandwidth purposes.

Should we have disallowed robots from indexing such pages? We have a PR 6 home page, but we are not even in the top 100 for 19 out of 20 keywords.

Could this have been the reason?

synergy

9:41 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



Should we have disallowed robots from indexing such pages?

Yup.

Could this have been the reason?

At this point, who knows?

Chndru

9:41 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hmm... duplicate content... the BBC has a text version of their site, and they seem to be doing not badly at all.

bcolflesh

9:41 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Should we have disallowed robots from indexing such pages?

Sure - if it was duplicate content.

bcolflesh

9:43 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



bbc.co.uk/robots.txt

Disallow: /cgi-bin
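The mechanics being discussed can be sketched with Python's standard urllib.robotparser. The `/text-only/` path here is hypothetical, standing in for a low-bandwidth duplicate section like the one described above:

```python
# Sketch: a robots.txt Disallow line keeping a duplicate text-only
# section out of crawlers' reach. The /text-only/ path is made up.
from urllib.robotparser import RobotFileParser

robots_txt = [
    "User-agent: *",
    "Disallow: /text-only/",
]

rp = RobotFileParser()
rp.parse(robots_txt)

# The full graphics version stays crawlable...
print(rp.can_fetch("Googlebot", "http://example.com/widgets.html"))            # True
# ...while the duplicate low-bandwidth version is off limits.
print(rp.can_fetch("Googlebot", "http://example.com/text-only/widgets.html"))  # False
```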

webjockey

9:45 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



They should rename this update the "hokey cokey". One minute we're in then the next we're out, in out, in out shake it all about.

Chndru

9:48 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Ahh, thanks bcolflesh. I misunderstood the question, and I stand corrected.

fashezee

9:49 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sure - if it was duplicate content.

Well, I should have, but I thought the point of the robots.txt file was to keep sensitive information from being indexed by crawlers, not to help crawlers determine what is duplicate content for valid reasons as opposed to a spam technique.

If they have all these filters to detect spam, why can't they detect

skipfactor

9:52 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Jakpot, I saw your last post word-for-word on another forum. Who signs your paycheck? ;)

killipso

9:54 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



Pretty soon I'm going to tell my Dad - then you'll all be sorry!

Did I disrespect you in any way, bro?

Was I talking to you?

I won't start a flame war here; I just find your post offensive and in bad taste.

If you would like to talk to me, sticky me and I will gladly forward you my phone #.

This isn't the place for insults.

Dan

DerekH

9:54 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



fashezee wrote
If they have all these filters to detect spam, why can't they detect
VALID DUPLICATE CONTENT?

Please look at the earlier posts - news.bbc.co.uk has duplicate content all over the place - graphics intensive and text-only news pages with the same textual content. It's not a problem for them...

DerekH

lasko

9:55 pm on Nov 19, 2003 (gmt 0)

10+ Year Member




Because everyone would make Valid Duplicate pages to help their rankings.

If you read the guidelines, Google points out that robots read pages much as they would appear in a text browser, so the robots would see both versions the same way - which would make them duplicates.

Google Guidelines

Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would.

[edited by: lasko at 10:02 pm (utc) on Nov. 19, 2003]
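The Lynx point above can be illustrated: strip the markup the way a text browser renders it, and a graphics-heavy page and its text-only twin collapse to the same content. A rough sketch using Python's standard html.parser (the two example pages are invented):

```python
# Sketch: approximating what a text browser (and, per the guideline
# above, a spider) sees - markup stripped, only the text kept. Two
# pages that differ only in presentation reduce to the same text.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the text content of a page, ignoring tags and attributes."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

def page_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

# Hypothetical graphics-heavy page and its low-bandwidth twin.
graphics_page = ('<div class="fancy"><img src="logo.gif">'
                 '<h1>Widget News</h1><p>Prices fell today.</p></div>')
text_page = '<h1>Widget News</h1><p>Prices fell today.</p>'

print(page_text(graphics_page) == page_text(text_page))  # True
```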

BradBristol

9:55 pm on Nov 19, 2003 (gmt 0)



Google is having some "problems" (that's a nice way of saying Google is broken).

Hmmm... www-sj (the home data center) is offline and has been for some time, and in just the last 18 hours or so www-zu has gone offline as well...

As I have said before, there are two and only two reasons for Google to be acting the way it has been:

1. Google is broken.

2. Google is acting this way on purpose. (I personally find this hard to believe.)

Take your pick...

<edited for grammar>

agent10

10:03 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



Any ideas as to which datacenter is the favourite to take the helm?

MyWifeSays

10:04 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



This could be a very interesting tool by the way: enter a URI and the keyword density is analyzed for all incoming local and/or external links. There is a tool out there that gets close to doing this, but I won't mention it by name ;P

Just tried to find this tool by searching on Google, and I gave up after a few attempts - all I kept getting was bulletin board postings.

soapystar

10:04 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



1. Google is broken.

2. Google is acting this way on purpose.(I personally find this hard to believe)

3. It's all a bad dream.

Take your pick.

seaboy

10:05 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



Just another oddity to throw into the mix.
A search for 'Foobar company' (no quotes) and a client is #1, but search for 'Widgettown Foobar company' (again with no quotes) and they are not in the top 200.
Now in this case Widgettown is a very competitive tourist destination, which I think may be a factor, but it is also the location of this particular Foobar company.
Any ideas/theories?

willybfriendly

10:07 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Getting on to other work, I was researching widget history for an article I am writing. Of the top ten results, 1 is an empty page, 6 are e-commerce sites (1 publisher, 2 bookstores, the others sell widgets), one is a museum page and 2 are ads for forums...

Don't I remember reading in these threads that scholarly searches are better now that all the spam has been cleared out?

WBF

moomelman

10:09 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



I'm sort of wondering why Google waited almost six months to do a hardcore update.

There must be a reason why they have waited so long.

And does this mean the fresh updates will now be replaced by the old once-a-month Google Dance?

c1bernaught

10:16 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



BradBristol:

umm... I know... I meant anyone other than you and me...

James_Dale

10:19 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



There are many different ways to get relevant results using algorithms. IF Google were to pick a few of these and rotate them, then that would confuse SEO people looking for a quick exploit. Instead, page quality would become the primary focus. In my opinion, if none of us change anything and leave it three months, we'll be back in our former positions. The same applies to the current top listed results. The wheels on the Google go round and round, round and round...

seaboy

10:19 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



c1bernaught - yes, I can.

claus

10:20 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Datacenter sj has been down for a long time...

Oct 17, 2003: SJ Datacenter is down? [webmasterworld.com]

Of course Google is not broken, they're just doing an update/upgrade/tweak/something...

/claus


Added:
As to why they've taken so long, I think they've been preoccupied with distributing the spidering and the index, and getting the AdWords thingy running... and a few more things, I suppose, among them testing what they would do next, i.e. now.

[edited by: claus at 10:31 pm (utc) on Nov. 19, 2003]

James_Dale

10:22 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



sj has been offline for a month or so. It may be that this is no longer being used. It crashes my ABCPR :(
;)

c1bernaught

10:25 pm on Nov 19, 2003 (gmt 0)

10+ Year Member




Looks like the "mistake" theory gains some credence from this latest confirmed news of sj and zu being offline...

Any comment? Does anyone see this as normal?

This 688 message thread spans 23 pages.