Forum Moderators: open
1. Good for webmasters with new sites, because they can be included more quickly.
2. Possibly bad. I think everybody will be constantly tweaking their pages to improve ranking (yes, I know that happens now, but my point is: what effect does a quicker turnaround for changes have?). It's similar to what happens with sites that have paid to be included in Inktomi, for example, and it would effectively turn the Google database into a constant test index.
I suppose my main question is: is point 2 really that much of a concern? Can anybody else see a reason for or against it?
We always want the latest and greatest. We live in a world of live TV. (Just look at the coverage of Sept 11 and this war.) We want information on demand. FAST just isn't enough. It only makes sense that search engines should update more often.
What do you mean? Is it just me (maybe I have been under a rock somewhere), but I don't really notice much spam on the web (e-mail is a whole other story). As far as SEs go, sure, there are a lot of listings that are not relevant to my search, but that is usually because the words I use have double meanings and I am not specific enough. Maybe I have just been lucky.
Around here, the word SPAM is often used as a synonym for the word competitor.
That may be the problem. Actually, I don't see much search engine spam, either. Especially off-topic spam. Admittedly, I do see a lot of competitors, and of course I like my sites better than theirs.
I think a constant update would be very helpful for webmasters and users.
PRO: No more gigantic monthly update thread. Not to mention the "I think it might be starting" threads for the week before... ;)
Exactly ;)
The downside for Google is how many more searches they will have to handle each day, from all the webmasters checking how their sites rank on all their chosen keywords each day (or multiple times a day!).
Wake up, coffee, WebmasterWorld, Check Google rankings...
Also
>>>>Around here, the word SPAM is often used as a synonym for the word competitor.
Perhaps that isn't such a good thing to do? It's kind of disrespectful to them, isn't it? I mean, I wouldn't want to wrongfully accuse my worst enemy of spamming, let alone my worst competitor. Although I will admit I do sometimes complain doggedly about them.
So true. I define spam as dominating the SERPs with multiple domains, such as one guy having 8 of the top 10 spots. Or a site coming up high, by design, on an irrelevant SERP, such as a porn site in the top 10 for "travel Europe". However, all too often people here use spam to mean "a site that sells the same thing I do, but has far less content and is of lower quality in my opinion, and is beating me out." I see far less spam on commercial SERPs than all the whining by people here suggests. Thus, I suspect that spam as used here often just means competitors doing well.
How about hidden text, noscript tags filled with keywords, and redirects?
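For the curious, here is a toy sketch of how one of those tricks (text colored to match the background) could be flagged. This is purely illustrative; the regex, the function name, and the inline-style assumption are all made up, and real engines use far more sophisticated analysis.

```python
import re

def has_hidden_text(html: str) -> bool:
    """Naive check: flag an inline style that sets the text color equal
    to the background color, a classic hidden-text spam trick.
    Only handles inline styles with color listed before background."""
    pattern = re.compile(
        r'color:\s*(#[0-9a-fA-F]{6}|\w+)\s*;?[^"\']*'
        r'background(?:-color)?:\s*\1',
        re.IGNORECASE,
    )
    return bool(pattern.search(html))

# White text on a white background stuffed with keywords:
spammy = '<p style="color: white; background-color: white">widgets widgets widgets</p>'
clean = '<p style="color: black; background-color: white">widgets for sale</p>'
print(has_hidden_text(spammy))  # → True
print(has_hidden_text(clean))   # → False
```

Of course, spammers moved on to external stylesheets, tiny fonts, and off-screen positioning long ago, which is exactly why detection is a moving target.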
As for a rolling update, this would be a good idea, as sites would become more relevant (e.g., fewer 404 errors) and Google would carry the latest information on trends, products, services, etc.
[edited by: sem4u at 7:15 pm (utc) on May 8, 2003]
Is this an example?
[edit]
(Not sure if I'm allowed to post this; if posting it is a violation, sorry.)
Also, what do SEs do about this? Do they actively pursue this matter, or do they just say "be honest and don't do it"?
[edited by: rcjordan at 1:03 am (utc) on May 9, 2003]
[edit reason] sorry, no specific references to sites. [/edit]
:::Is this an example?
A list of web sites that get an award isn't, but if you find a place where you can add a link and it's added right then, then it's a link farm, like FFA pages.
[edited by: Jesse_Smith at 9:26 pm (utc) on May 8, 2003]
G has been quite clear that this is where they are going. G has also been quite clear that we should look for more activity from Freshbot in the future.
They want constant updates; all major engines do (because fresh means the potential for more timely and relevant info). Whether it can be done well is the question, and Freshbot is the testing ground.
My site sells widgets. But lots of people misspell it "whigets". I'd never put the word "whigets" on my site, because it's wrong. But I wish I could list it as a keyword so more people would find me.
I would guess that Google can measure the success of a new algo by the decrease in spam reports. Once the spam reports exceed a certain threshold, or a certain technique becomes more widespread, they tweak the algo and the counter starts over at zero.
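That feedback loop can be sketched in a few lines. To be clear, this is a guess dressed up as code: the threshold value, the report stream, and the function name are all invented, and Google has never said it works this way.

```python
def run_feedback_loop(daily_reports, threshold=100):
    """Accumulate spam reports; when the running total passes the
    threshold, 'tweak' the algo and reset the counter to zero.
    Returns the (zero-based) days on which tweaks fired."""
    counter = 0
    tweak_days = []
    for day, reports in enumerate(daily_reports):
        counter += reports
        if counter > threshold:
            tweak_days.append(day)  # algo tweak goes out
            counter = 0             # counter starts over at zero
    return tweak_days

# Example: reports spike as a new spam technique spreads,
# then fall off after the tweak on day 3.
print(run_feedback_loop([10, 20, 30, 50, 80, 5, 5]))  # → [3]
```

The interesting implication is the one raised below: with a continuously updated index, the report stream could spike much faster than a monthly cycle ever allowed.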
If the index is continuously updated, something akin to guestbook spamming could become an "overnight" problem for a site like Google.
I imagine that a technique, especially, say, a bug exploit, could instantly sour the SERPs for weeks while Google attempts to rewrite the algo to compensate.