
Google News Archive Forum

This 161-message thread spans 6 pages; this is page 5.
Google seems to be getting more difficult to use for me
Brett_Tabke | msg:135879 | 7:12 pm on Feb 17, 2003 (gmt 0)

I am finding Google more and more difficult to use. I don't think Google is any worse or any better in that regard than it was a few years ago. However, I do expect more from them today than I did previously.

This isn't anything new really; it's just a level of frustration setting in while waiting for Google to come up with an easier way to search than what they currently offer. It feels like they have stopped dead in the water on search and settled for what they have.

I really didn't even know there could be a faster, more accurate way to search until using that 'other engine' with its nifty suggestions. They cut search time by multiples and make the actual process much less of an intellectual brain teaser. Instead of figuring out the right keyword combos to get Google to generate the result I want, I could be putting that effort into viewing the information I want, or getting back to work.

In a lot of ways, Google today feels more like an engine from the pre-Google days. I sure hope they have something up their sleeve.

photon | msg:135999 | 3:31 pm on Feb 20, 2003 (gmt 0)

This is a great thread! I was recommending Google to people years and years ago. Back then I was constantly amazed at how often exactly what I was looking for popped up at or near the top of the listings.

Now I find myself looking through pages and pages of results, adding and subtracting keywords, trying to find relevant results.

Two recent (and related) examples:

#1--I'm planning a trip to Paris this weekend and was trying to find information about my hotel, particularly independent reviews of it. No problem, I thought: I can enter the specific name of the hotel and avoid a lot of chaff. Unfortunately, what I got back was tons of affiliate sites wanting me to make the reservation through them, 90% of which used the exact same description of the hotel. I never did find the hotel's home page in the listings (I eventually found it buried in another site), and out of all the listings I was only able to find one or two that told me about individuals' experiences there.

#2--I was looking for sound and video cards for a new computer I bought, as well as other add-ons such as memory, etc. I wanted to read information about the devices, get reviews from users, etc. Unfortunately, the majority of results were just sites that had the items for sale. Adding the keywords "review" or "opinions" didn't do much to reduce the number of selling sites.

I wish there were some way to say "ignore the sites that are trying to sell this to me unless they have significant information about it". I often have to go a few pages in (and I have my defaults set to 50 results per page!) to get past the commercial sites. Too often these days the "optimized" commercial sites are front-loaded, while the really significant, much smaller sites are way down the list, if they're there at all.

Thanks for letting me get that off my chest.
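
photon's "ignore the sites that are trying to sell this to me" wish can be read as a re-ranking heuristic. A minimal sketch of that idea in Python follows; the result fields and the signal word lists here are invented for illustration and have nothing to do with how Google actually ranks:

    import re

    # Hypothetical signal lists; a real system would learn these, not hard-code them.
    COMMERCE_SIGNALS = {"buy", "order", "cart", "checkout", "price", "reservation"}
    CONTENT_SIGNALS = {"review", "experience", "opinion", "comparison", "spec"}

    def rerank(results):
        """Push purely commercial pages down unless they also carry information."""
        def sort_key(result):
            words = set(re.findall(r"[a-z]+", result["snippet"].lower()))
            selling = len(words & COMMERCE_SIGNALS)
            informing = len(words & CONTENT_SIGNALS)
            # Penalize pages that sell but say nothing; leave hybrid pages alone.
            penalty = selling if informing == 0 else 0
            return (penalty, result["rank"])
        return sorted(results, key=sort_key)

    results = [
        {"rank": 1, "snippet": "Book now! Best price, easy reservation."},
        {"rank": 2, "snippet": "My honest review of this hotel: a great experience."},
    ]
    print([r["rank"] for r in rerank(results)])  # [2, 1]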

gopi | msg:136000 | 9:15 pm on Feb 20, 2003 (gmt 0)

GG, the changes I suggest are not directly related to users, but they have an indirect impact on search relevancy.

Remove that little green bar from the directory and toolbar... it has done enough damage. It encourages SEOs to buy high-PR links, affects the web's natural linking structure, and makes it a bit easier to understand the algo (this is bad; you want to keep the algo as mysterious as possible!)...

Also remove the link:<url> command... I don't think any "real" searcher is using it... it only helps webmasters analyze the rankings :)

I believe Google search will be far better if it doesn't include any feature (not used by normal, non-webmaster users) that helps (even remotely) to crack the algo...

daroz | msg:136001 | 9:25 pm on Feb 20, 2003 (gmt 0)

Gopi:
you want to keep the algo as mysterious as possible!

Without wanting to derail this conversation: one can (IMHO easily) make the argument that more open algos, formulas, processes, functions, etc. tend to survive longer, and either sink or swim due to public scrutiny.

While I think that, in general, PR is A Good Thing(TM), it does have its flaws. And the little insight that the Googlebar and link: searches give us both helps and hurts us.

The bigger issue that you raise is that there is a flaw in the rankings that is allowing link farmers / spammers / etc. to get abnormally high SERP rankings. The root cause here is not the insight into the algo, but the algo itself.

The challenge here is not to hide the information, but to produce an algo that can stand up to the full barrage of attempts to circumvent it even when the algo is publicly known.

gopi | msg:136002 | 9:46 pm on Feb 20, 2003 (gmt 0)

Daroz, IMHO, an open-source-like model doesn't work as well for an SE algo as it does in software/OS development... because in the latter, people help to develop a more robust system, as nobody has a direct "monetary" advantage in breaking the system.

But in the case of an SE algo, a lot of people's business/living depends on good rankings, so there will always be a large group of people (including me!) who try to crack the system, whatever the SE is!

So the more closed the system is, the more difficult it is for people to break in. Nobody can make a foolproof SE, but the goal is to minimize the number of people/sites exploiting it as much as possible!

I agree PR is not perfect, but it's still the best system we have, and Google has to protect it at any cost.

annej | msg:136003 | 10:05 pm on Feb 20, 2003 (gmt 0)

Waaayyy back in this thread someone wrote:

1) Give significant weight to the relevance of entire web sites rather than just the relevance of individual pages.

I'd like to see something done that would give extra weight to the homepage of a site that has several other pages relating to the topic. If I want to find out more about widget history, an in-depth, multi-page site on the topic would be far more useful to me than a single page that just summarizes it.

Anne

awoyo | msg:136004 | 10:46 pm on Feb 20, 2003 (gmt 0)

I agree PR is not perfect, but it's still the best system we have, and Google has to protect it at any cost.

That may have been true back when PR first hit the scene. But when 3 of the top 10 SERPs contain marginally relevant content and the rest are either dupes or link farms, I have to wonder if PR is still the best system we have. This thread really has me thinking about the way I use Google.

If I'm not using Google to scope out my competition, then I'm using it to look for movies, or recipes, or other things on a regular basis. But when I begin to look for things I search for on a seasonal basis, say, a new fishing hole for my annual spring vacation, suddenly I find the SERPs are far inferior to what they were a year ago. For example, I love to fish for catfish. Last year when searching for sites about catfish fishing, I remember being impressed with some good sites with great local and national content, great on-line communities, etc. This year I'm shocked at the difference. 3 out of the top 10 SERPs are marginal at best (and that's being nice).

Could it be that since I use Google every day for a, b, and c, I'm missing the bigger picture? I mean, back in the day when Alta was king, they did a little of this, changed a little of that, tweaked an algo or two, and the next thing you know they lost darn near their entire user base. So if Google's "normal" user (the non-SEO type person) is finding the same type of results I am for my newest fishin' hole, then where does that leave Google a year or so down the road?

daroz | msg:136005 | 10:52 pm on Feb 20, 2003 (gmt 0)

Just to throw out a question....

A few people (myself included) have mentioned some features to help refine or relate searches... but I wonder what people think the biggest flaw in the SERPs is now.

Personally, I would answer keyword stuffing/spamming, and link farms making pages that seem to have little (if any) relevance rank highly.

Jakpot | msg:136006 | 11:51 am on Feb 21, 2003 (gmt 0)

Spamming infringes on the fair competition rights of a competitor.

Many of the pages now placed in the top ten have hidden text, hidden links and keyword stuffing in graphic images.

They have invisible text (keywords) at the top and bottom of the page that only becomes visible when highlighted on the cached page.
They have stand-alone keywords visible at the top and bottom of the page, which is blatant keyword stuffing.
They have graphic images stuffed with keywords.

The sites are abusing Google's quality guidelines - Specific Recommendations (http://www.google.com/webmasters/seo.html):
Avoid tricks intended to improve search engine rankings.
Avoid hidden text or hidden links.

So, it is not a mystery why the SERPs return bad results.

Perhaps a massive spam reporting campaign might get Google's attention.
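
The first abuse Jakpot lists, text set in the same color as the page background, is at least partly machine-detectable. A minimal sketch, assuming only old-style <body bgcolor> / <font color> markup; real pages (CSS tricks, layered images) would take far more work:

    import re

    def flags_hidden_text(html):
        """Flag pages where a font color matches the body background color."""
        body = re.search(r'<body[^>]*bgcolor=["\']?(#?\w+)', html, re.I)
        if not body:
            return False
        background = body.group(1).lower()
        # Any run of text set in the background color is suspect.
        for color in re.findall(r'<font[^>]*color=["\']?(#?\w+)', html, re.I):
            if color.lower() == background:
                return True
        return False

    page = '<body bgcolor="#ffffff"><font color="#ffffff">widgets widgets</font></body>'
    print(flags_hidden_text(page))  # True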

steveb | msg:136007 | 12:41 pm on Feb 21, 2003 (gmt 0)

Suggestions to improve Google (already by far the best search engine on Earth):

1) Diminish the value of pages that have been completely unchanged for more than two years (a rough sketch of one possible decay scheme follows this post). Some pages make sense static, but as the Internet ages, abandoned sites pollute the results not just by showing up themselves, but by casting obsolete "votes" in terms of PR. Essentially, some sites continue to vote for McGovern.

2) Small point, but the Google toolbar is an invaluable tool to people trying to build quality, themed sites. A way to change search results for the better is to not change the toolbar one bit.

3) Buy DMOZ, hire 10 full-time editors to manage volunteer editors. That's the big one. That is THE thing that would make the search results as good as they can get. Look around for categories that have been honestly, fairly, well-edited for a year; do Google searches on that subject category; you will find outstanding search results returned time and again. The human element must mix prominently with the algorithm. I'll say it again... everything else completely pales in comparison to the value to search that a 98% updated DMOZ would bring.
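
For point 1, one simple way to implement a staleness discount is to decay the weight a page passes through its outgoing links with the time since the page last changed. A back-of-the-envelope sketch; the two-year half-life is an arbitrary illustration, not anything Google has published:

    def vote_weight(pagerank, years_since_update, half_life=2.0):
        """Halve a page's outgoing 'vote' for every half_life years of staleness."""
        return pagerank * 0.5 ** (years_since_update / half_life)

    for age in (0, 2, 4, 8):
        print(f"{age} years stale: passes {vote_weight(1.0, age):.2f} of its PR")
    # 0 years stale: passes 1.00 of its PR
    # 2 years stale: passes 0.50 of its PR
    # 4 years stale: passes 0.25 of its PR
    # 8 years stale: passes 0.06 of its PR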

gopi | msg:136008 | 1:05 pm on Feb 21, 2003 (gmt 0)

>>Buy DMOZ, hire 10 full-time editors to manage volunteer editors.

I believe a DMOZ purchase by Google makes much more sense than buying a blogging company :) ... Also, DMOZ volunteer editors would get a big morale boost working for a "good, ethical" company :)

SlyOldDog | msg:136009 | 1:15 pm on Feb 21, 2003 (gmt 0)

If Google bought DMOZ, why should anyone do volunteer work for it? Its power is that nobody owns it.

Unless of course, they have a vested interest.....

atadams | msg:136010 | 2:16 pm on Feb 21, 2003 (gmt 0)

Its power is that nobody owns it.

I consider that its weakness.

born2drv | msg:136011 | 2:20 pm on Feb 21, 2003 (gmt 0)

Nobody owns it? Last time I checked, AOL owned it; they just couldn't care less about it ;)

annej | msg:136012 | 2:25 pm on Feb 21, 2003 (gmt 0)

There is a difference between devaluing abandoned sites and dropping the value of unchanged pages. For example, a good history article or a news item may never change, yet both would still be of value to someone researching a topic. Now that I think about it, occasionally even an abandoned site has some valuable information.

As for DMOZ, it's just not working as it is. Things just aren't getting updated, and a lot of excellent sites are never even submitted.

Anne

gopi | msg:136013 | 2:31 pm on Feb 21, 2003 (gmt 0)

DMOZ is owned by AOL, and the editors will be more than happy if it's owned by a more ethical company like Google.

Liane | msg:136014 | 2:53 pm on Feb 21, 2003 (gmt 0)

diminish the value of pages that have been completely unchanged for more than two years.

Pfffffft. I haven't changed my home page for nearly two years ... and why should I?

What I do and the products I offer don't change in substance, so why should I change my home page? That is the page on which I tell people what the site has to offer. Sorry, can't agree with that one. It just doesn't make any sense.

jimmykav | msg:136015 | 3:00 pm on Feb 21, 2003 (gmt 0)

Agreed, Liane. It does not make sense to penalise mature pages whose data does not change over time.

Giacomo | msg:136016 | 5:28 pm on Feb 21, 2003 (gmt 0)

What other improvements could we make?

1. Better spam filtering (it's just too easy to fool Google, or so it seems).

2. More action taken against individual spammers (how many spam reports does one have to submit before anything is done?)

3. A responsive abuse desk for ethical webmasters who wish to report abuses without remaining anonymous.

I have repeatedly submitted the same spam reports, month after month, about a handful of pages using a combination of hidden text/links, cloaking, deceptive redirects, shadow domains and so on.

I feel very disappointed now to see that those pages are not only still in Google's index, but have actually improved their rankings after the latest Google update: a couple of them have even climbed up to the #1 slot for some very competitive keyphrases (the exact same keyphrases for which those pages have been "optimized"!) :-(

GoogleGuy, I am pretty sure that the Google team actually reads every single spam report. The problem is that reading them just isn't enough.

I really wish that Google's anti-spam squad could be more responsive towards ethical webmasters reporting their competitors' blatant abuses.

As a webmaster and Google AdWords advertiser, it hurts me to see result pages for my own strategic queries getting spammed with impunity.

As a Google user, I get very frustrated whenever I get JavaScript-redirected to a page that does not match Google's description. So, I hit the back button, which of course will not work. I often end up having to use SamSpade's Safe Browser [samspade.org] (or disable JavaScript in my browser) in order to see the redirecting page's source code, only to find it is nothing but a keyword-stuffed doorway page: pure spam.

Please do kick spammers out of Google now.

I'm sure you can. And I honestly "believe the end result will be a better web experience for all users."
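
The JavaScript redirects Giacomo describes are often visible in the raw page source, which is exactly what he inspects by hand. A crude sketch of automating that check by scanning for common redirect idioms; the patterns are illustrative, and determined cloakers would evade matching this simple:

    import re
    import urllib.request

    REDIRECT_PATTERNS = [
        r"location\.href\s*=",
        r"location\.replace\s*\(",
        r"window\.location\s*=",
        r'http-equiv=["\']?refresh',  # meta refresh counts too
    ]

    def looks_like_doorway(url):
        """Fetch a page and flag it if the source contains a redirect idiom."""
        html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
        return any(re.search(p, html, re.I) for p in REDIRECT_PATTERNS)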

steveb | msg:136017 | 12:37 am on Feb 22, 2003 (gmt 0)

"so why should I change my home page"

Duh, so it doesn't get devalued.

If you can't be bothered to change a period and re-upload the page, knowing Google had such a rule, the page would be unimportant anyway. It's a trivial amount of work, even for sites with tens of thousands of pages. If a webmaster can't be bothered to re-upload their site once every year or two, that site should not pass current PR. Votes should be current and reaffirmed. Four-year-old votes don't count in current elections (outside Chicago).

dwhite | msg:136018 | 2:33 am on Feb 22, 2003 (gmt 0)

Some good points all round; I'll support a few already suggested features, and add a few more...

All suggestions are listed in order of increasing complexity:

1: Numbered SERPs: A very small addition, but it would be useful to know.

2: Definable description length: I agree with John316 that it should be possible to define the length of a description. Often, I wish I could see more of a sentence before visiting the site. Naturally, this would probably make its way into the advanced preferences - along with the number of results per page.

3: Keyword weighting: The ability to give a certain keyword more weight if desired (e.g. put "*2", "*3", "*10" etc. afterwards); see the sketch at the end of this post. Also, I agree with Brett that the 10-keyword limit is unnecessary.

4: Caching of images: A big drain on Google's bandwidth, but an obvious addition when the technology supports it.

5: Site thumbnails: An option in the prefs to include a mini-picture next to the search results. Again, a bit of a bandwidth drain, and it will slow results down, but the option would be nice.

6: Keyword theming based on click-through rate: I think other search engines are trying this. If a searcher visits a site, then that site should rank higher for those search terms. This might have the undesired effect of causing higher-ranked sites to remain high (and low ones to stay low), but not if you also take into account the original position: for example, if someone visited a site ranked 150th, that site would receive a disproportionate 'weighting' increase (if you see what I mean).

7: Finally, and this is the hardest one of all: Maybe a thesaurus-based approach could be good, where certain terms mean the same as (or similar to) other keywords. Very difficult to implement, and I'm not sure about the quality gained from such a system, but it would save me from trying out the same search while constantly changing one of the words. The same goes for the plural/singular business.

Lastly, an option to try one of Google's old algorithms from the previous months would be great. I have confidence in Google only improving the algorithm as time goes on, but this would preserve the quality 'forever'. Also, it would give people a chance to compare and contrast the advantages and possible disadvantages in the results. How would G's 2001 algorithm look with today's web? :)
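
The weighting syntax in point 3 is easy to prototype on the query side. A sketch of a parser for the proposed "*2" / "*3" suffixes (the grammar is dwhite's suggestion, not a real Google feature; what the engine would do with the weights is left open):

    import re

    def parse_weighted_query(query):
        """Turn 'paris hotel*3 review*2' into [(term, weight), ...]."""
        terms = []
        for token in query.split():
            match = re.fullmatch(r"(.+?)\*(\d+)", token)
            if match:
                terms.append((match.group(1), int(match.group(2))))
            else:
                terms.append((token, 1))  # unweighted terms default to weight 1
        return terms

    print(parse_weighted_query("paris hotel*3 review*2"))
    # [('paris', 1), ('hotel', 3), ('review', 2)]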

SlyOldDog | msg:136019 | 8:28 am on Feb 22, 2003 (gmt 0)

>>>>"so why should I change my home page"
>>Duh, so it doesn't get devalued.

Err, Steveb. What about all the people who are not webmasters? Why should they be penalized because they don't spend all day at Webmasterworld?

So only four-year-old sites owned by non-savvy web page owners get the kick, right? I think I prefer it the other way round.

Powdork | msg:136020 | 9:18 am on Feb 22, 2003 (gmt 0)

Thinking of users: it seems to me it would help to put a "how to surf with Google" link on the homepage that explained how to use the various choices (quotes, minuses, plurality, how to drill down); a few examples of what such a page might cover follow this post.
I also think the "search within results" function is an inferior method compared to other engines, as far as results and usability are concerned.

Maybe just "How to Google" instead.

If you use this I want a mousepad (or a wireless optical mouse minidvd stereo thingy). :)
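
For what it's worth, the choices Powdork mentions already exist; a "How to Google" page would mostly need to surface them. A few made-up examples of the kind of thing it could show (phrase quoting, exclusion, and OR are long-standing Google syntax):

    "widget history"             finds the exact phrase, not scattered words
    widget -buy -shop            drops results containing selling terms
    widget OR widgets            works around the plural/singular problem by hand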

steveb | msg:136021 | 11:07 am on Feb 22, 2003 (gmt 0)

SlyOldDog, you prefer dead sites in the SERPs? Okay, to each his own. In a system that values votes, though, current votes should matter more than obsolete ones.

Another positive step would be to ignore the links on any page if more than 25% of the total links on that page are dead.

Spider the content on those ancient pages, that's fine, but don't pretend links from them have the merit of current, up-to-date links.
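
steveb's 25% rule is concrete enough to sketch. A rough Python version using only the standard library; the link extraction is naive and the HEAD-request check simplistic, so this is the shape of the idea rather than a production crawler:

    import re
    import urllib.request
    from urllib.error import URLError

    def link_is_dead(url, timeout=5):
        """Treat any error or 4xx/5xx answer to a HEAD request as a dead link."""
        try:
            request = urllib.request.Request(url, method="HEAD")
            return urllib.request.urlopen(request, timeout=timeout).status >= 400
        except (URLError, ValueError):
            return True

    def should_pass_pagerank(html, dead_ratio_cutoff=0.25):
        """Ignore a page's outgoing votes if too many of its links are dead."""
        links = re.findall(r'href=["\'](https?://[^"\']+)', html, re.I)
        if not links:
            return True
        dead = sum(link_is_dead(url) for url in links)
        return dead / len(links) <= dead_ratio_cutoff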

Hefner | msg:136022 | 11:37 am on Feb 22, 2003 (gmt 0)

Yes, I've been noticing it too, and I've switched to ATW for deeper searches as well... they just have better finds than Google. Google will cut the search results off at 200 or so on a lot of keywords, while ATW will keep going until it's in the thousands.

SlyOldDog | msg:136023 | 1:17 pm on Feb 22, 2003 (gmt 0)

Steve

>>slyolddog you prefer dead sites in the serps? Okay, to each his own. In a system that values votes though, current votes should matter more than obsolete ones.

Well, yes. Is there any rule that says new votes are better than old ones? How about research papers? They never change. In fact, new just means new; it certainly doesn't mean good. New may even mean immature.

Sounds like skewing the results in favour of the members of this forum to me.

I like the idea of demoting sites with too many dead links though. Easy to implement and a good measure of quality.

hakre | msg:136024 | 2:02 pm on Feb 22, 2003 (gmt 0)

I like the idea of demoting sites with too many dead links though. Easy to implement and a good measure of quality.

If these are external links, it does not automatically mean that the page itself is out of date; it only means the linked sites are. If you created a website one year ago and did really thorough link research for days or weeks, and the external links were fine, and then you stopped editing and caring about the site, this can happen after a year. Does that mean the rest of the site has 'bad quality'?

I think not. It's always bad to get stuck with outdated links, but you can't watch every link on a site all the time.

steveb | msg:136025 | 2:23 pm on Feb 22, 2003 (gmt 0)

Missing the point. Whether the site itself is bad quality, or a good research paper, is not the issue. The issue is that the links from these sites should be depreciated. It doesn't matter if the content is the Magna Carta. The way it works is that the Magna Carta page gets its PageRank from OTHER sites that view it as ongoingly valuable. Fine, and wholly irrelevant to how it PASSES PageRank. It is simply absurd to think that the Magna Carta page knew four years ago where valuable sites would exist *today* to link to.

Outgoing links on pages not updated for a couple of years or more should not pass on votes/PageRank in the same way as links from currently updated sites. The old pages are simply clueless in their linking. They could be linking to 404s, new porn sites, or even very valuable content. It's nearly a random crapshoot.

Good content can easily live forever. Good linking seldom will.

GoogleGuy | msg:136026 | 6:07 pm on Feb 22, 2003 (gmt 0)

Okay, let's call this a few votes to investigate the tradeoffs between staleness and ongoing value. It's definitely interesting because different types of pages attract different types of links. Are there other things (as a user) that you would like to see?

SlyOldDog | msg:136027 | 7:30 pm on Feb 22, 2003 (gmt 0)

How about giving each site a vote out of 10 (PageRank) in the SERPs? That would certainly educate surfers that Google measures site quality.

spinmaster | msg:136028 | 5:43 am on Feb 23, 2003 (gmt 0)

I'm not sure, maybe this was mentioned, but one thing I struggle with is how to find information on "how to do things" when my search terms involve common words, like "building a tab delimited file", or how to find good comparisons and reviews on sites with Amazon-type relationship links... although in the second example I usually have better search terms to use. I usually end up searching the newsgroups... and might find a thread...

What would be a way to identify and tag content-rich stuff like this... sort of an Ask Jeeves, but better in terms of delivering on the esoteric. You know... like Jeeves meets the Amazon?

rfgdxm1 | msg:136029 | 5:58 am on Feb 23, 2003 (gmt 0)

>Outgoing links on pages not updated for a couple of years or more should not pass on votes/PageRank in the same way as links from currently updated sites. The old pages are simply clueless in their linking. They could be linking to 404s, new porn sites, or even very valuable content. It's nearly a random crapshoot.

Depends on the nature of the sites. In particular, consider those cases where the sites in question are on a very narrow topic. With the topic of my 2 sites (both are on aspects of the same topic), there are only a handful of other sites worth mentioning on the net. It wouldn't surprise me in the least if 3 years down the road I'm still linking to the same 4 sites I link to currently, as it wouldn't surprise me if no other worthwhile site on this topic came onto the Net in the next 3 years.
