
Google News Archive Forum

This 109 message thread spans 4 pages; this is page 1.
Google TO DO list
We all think Google has problems, so let's make some HELPFUL suggestions.
kaled - msg:62693 - 11:24 am on Oct 15, 2003 (gmt 0)

It seems to me that, almost without exception, WW members believe that Google is suffering right now. So, instead of just blowing off steam, whinging and complaining, why don't we make some suggestions as to what they need to do to help keep webmasters and users happy.

Now, for the most part, I think users are still happy, but there is no doubt that computer-literate folks I know are beginning to use other search engines if they don't immediately find what they want on Google. This trend is likely to continue unless Google gets a grip on its problems.

I will kick off with the following suggestions.
1: Read javascript links (or publicly admit PR is broken and won't be fixed). Also, where possible, read cgi-links. (Where the link contains a URL beginning http:// or www. this is easy. However, this might open a door to SEO cheating, so it's debatable.)
2: Fix the link: tool so that all links are displayed. Also change it so that it works like ATW and AV and can show backlinks for an entire site. It need not show more than, say, 200, but it should give the total count. If nothing else, this would allow webmasters to feel more confident that Google is working properly.

Kaled.

PS
Please vent unhelpful comments and waffle in other threads.
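Suggestion 1 boils down to scanning page source for URL-like strings, even inside script blocks. As a rough illustration only (not Google's actual method; the regex and function name are made up for this sketch), a crawler could pull candidate links out of JavaScript like this:

```python
import re

# Matches absolute URLs and bare www. hostnames embedded anywhere in
# a page's source, including inside <script> blocks and onclick handlers.
URL_PATTERN = re.compile(r"""(https?://[^\s"'<>)]+|www\.[^\s"'<>)]+)""")

def extract_candidate_links(page_source):
    """Return de-duplicated URL-like strings found in raw page text."""
    seen, found = set(), []
    for match in URL_PATTERN.findall(page_source):
        url = match.rstrip(".,;")          # trim trailing punctuation
        if url not in seen:
            seen.add(url)
            found.append(url)
    return found

js = "function go() { window.location = 'http://example.com/page'; }"
print(extract_candidate_links(js))   # ['http://example.com/page']
```

Note this finds URLs as text; whether an engine then treats them as real links (counting toward PR) is a separate question, which comes up again below.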

 

Brett_Tabke - msg:62694 - 4:21 pm on Oct 15, 2003 (gmt 0)

> WW members believe that Google is suffering right now

What!? Nah - it is doing as well as it ever has. The zoom lens on Google just keeps getting twisted a bit more day by day.

> computer-literate folks I know are beginning to use other search engines

The real computer-literate folks never stopped using other engines.

> Read javascript links

They already do read any text with http in it - including js links.

> Fix the link: tool so that it shows all links are displayed.

That would only encourage algo investigation.

--------------
My suggestions:

Make the "did you mean" search suggestions a full-time part of Google (with some minor display tweaks). Love them, but they look like spelling suggestions.

Get rid of all the extraneous code on the Google pages and get back to a clean/pure interface that works in all browsers.

Bring back the monthly update before the results degrade any further. The current "results problems" are nothing compared to what is going to hit in the next few months. They got your number, G - at least the monthly update made them 30 days behind you.

jimbeetle - msg:62695 - 5:04 pm on Oct 15, 2003 (gmt 0)

I suggest that size doesn't matter. "Searching 3,307,998,701 web pages" doesn't mean a thing unless quality results are returned.

I would suggest:

--Stop including pages that can't be spidered. Sure, there might be a link to the page with relevant link text, but if the page can't be spidered how the heck do you know if it's actually relevant or not.

--In a similar vein, don't use link text as a sole criterion. The "The search term only appears in links to this page" note (or however it's phrased) is a symptom of how link text can be used to manipulate the SERPs. If the search term ain't somewhere on the page, don't return the page in the results.

--Knock out the PDFs. Unless you can tell me which page of a 200-page PDF document the information I want is on, it's almost worthless to me (along with my dislike of reading PDFs in a browser anyway). If the page with the link to the PDF file is relevant then return that; otherwise can 'em.

There are more ways to slim down G's index, these are just my pet peeves.

And, along with Brett, get that drill down stuff working. As long as PR and link text skew results for many terms it makes it very difficult to find that small fact or bit of information I'm after. (Yeah, it's there, but on page 20 of the SERPs). Take a lesson from Teoma and make it easier for me to find what I'm after.

'Nuff for now.

Jim

EarWig - msg:62696 - 5:10 pm on Oct 15, 2003 (gmt 0)

Good idea kaled
Brett - apply for one of the jobs with G and go sort them out!
An Idea:
Google to provide professional technical support to webmasters via paid membership.
Could be implemented via direct contact by Google with web design companies who can provide proof of, and conform to, a minimum business requirement level and have a multiple, varied and satisfied client base.
Client references to be supplied and verified.

EW

Kirby - msg:62697 - 5:37 pm on Oct 15, 2003 (gmt 0)

Google Certification - sounds like a good idea for webmasters, but it still opens up the algo too much. I don't see Google jumping on that one.

Macro - msg:62698 - 5:57 pm on Oct 15, 2003 (gmt 0)

I'd like them to sort out the millions of affiliate sites that show up in the SERPs whenever I'm searching for any product. How about limiting it to just one or two affiliates? Or having them in a separate box?

Those sites almost never give me info on the product I'm looking for :-(

Allergic - msg:62699 - 5:59 pm on Oct 15, 2003 (gmt 0)

Put more of a human factor into cleaning up the results.

They always try to solve problems by tweaking the algo. But two times out of three, that just creates new problems.

Ivana - msg:62700 - 6:14 pm on Oct 15, 2003 (gmt 0)

I find it very annoying that when you type in www.google.com, you automatically get the version that fits the country you are in; i.e. I get www.google.dk because I'm in Denmark. I would like it to be the other way around, so that the default is the .com in English and from there you can click on Google 'In Your Language'.

And on that note, I'd also prefer that Google used professional translators rather than volunteers, because some translations (at least the Danish) are really not very good.

bwelford - msg:62701 - 6:19 pm on Oct 15, 2003 (gmt 0)

Have both the singular and plural versions of nouns to count in the search algorithm. In other words, have limited "stemming" taken into account.
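For a sense of what "limited stemming" could mean, here is a deliberately naive sketch that folds common English plurals onto their singulars so that both query forms hit the same index key. The function is invented for illustration; real stemmers (e.g. Porter's algorithm) handle far more cases:

```python
def fold_plural(term):
    """Very naive singular/plural folding for English query terms.
    Illustrative only - real stemmers cover many more inflections."""
    if term.endswith("ies") and len(term) > 4:
        return term[:-3] + "y"      # "berries" -> "berry"
    if term.endswith("es") and term[-3] in "sxz":
        return term[:-2]            # "boxes" -> "box"
    if term.endswith("s") and not term.endswith("ss"):
        return term[:-1]            # "widgets" -> "widget"
    return term

# Singular and plural queries map to the same key:
print(fold_plural("widgets"), fold_plural("widget"))  # widget widget
```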

Yidaki - msg:62702 - 6:19 pm on Oct 15, 2003 (gmt 0)

>How about limiting it to just one or two affiliates?

Absolutely - as long as my site is one of the two!

Nuff said.

Macro - msg:62703 - 6:24 pm on Oct 15, 2003 (gmt 0)

Absolutely - as long as my site is one of the two!

Valid point. But they like algos. Perhaps they can work one out to select a limited number of affiliate sites.

Yidaki - msg:62704 - 6:31 pm on Oct 15, 2003 (gmt 0)

>Perhaps they can work one out to select a limited number of affiliate sites.

And which ones? I can't see how they could *fairly* limit the display of any portion of the search results.

kaled - msg:62705 - 6:35 pm on Oct 15, 2003 (gmt 0)

Brett said of javascript links
They already do read any text with http in it - including js links

Are you saying that, assuming the link: tool worked as we would all hope, javascript links would be displayed? Currently, the link: tool is only showing a few internal links for my website, so I can't quickly test this.

Given the PR algo, there is clearly a difference between spidering pages of urls it finds on a webpage and recognising urls (in javascript) as links.

Kaled.

Brett_Tabke - msg:62706 - 7:15 pm on Oct 15, 2003 (gmt 0)

> javascript links would be displayed.

No - but followed and indexed (i.e. the whole point) - yes.

martinibuster - msg:62707 - 7:30 pm on Oct 15, 2003 (gmt 0)

Correct me if I'm wrong, but the way I understand it is that JS link indexing won't count as a link per se; the URL can be indexed the same way the plain, non-href text www.yourdomain.com on a webpage would be.

Bring back the monthly update before the results degrade any further.

What results are those, WW monthly traffic surges? ;)

What Google really needs to work on is those phony-baloney search result pages from fake search engines that keep littering the serps. Pure dreck.

Napoleon - msg:62708 - 7:34 pm on Oct 15, 2003 (gmt 0)

>> Get rid of all the extraneous code on the google pages and get back to a clean/pure interface that works in all browsers. <<

Agree strongly. I keep getting a code error on XP for a particular search... I couldn't believe it at first. It shouldn't happen.

>> Bring back the monthly update before the results degrade any further <<

This was an incredible marketing dimension... and they just blew it away. I still can't believe it happened.

Others? How about:

a) PDF's (perhaps combined with Word and other docs) under their own tab

b) Try to buy custodianship of the ODP. A more reliable platform for editors will lead to more edits, which will lead to higher quality, which will improve Google. The ODP is excellent, but could be even better if Google contributed.

c) Stop the penalty on domain names that long since changed hands from black hats to white (having expired). It's not only unfair, but eliminates many excellent sites from the index.

d) If someone types in www.google.com... they actually mean it! Ivana's issue is actually a big one.

Apart from that... keep the basics sound. Keep ads WELL distinct from true SERPs... if you create a gray area you are doomed. Non-monetization of your returns is actually one of your prime differentiators.

HayMeadows - msg:62709 - 7:47 pm on Oct 15, 2003 (gmt 0)

Do not display the first page of a redirect, and if the page it redirects to has a penalty, don't display that page either!

Update daily, not monthly. Getting much much better <g>.

GranPops - msg:62710 - 7:51 pm on Oct 15, 2003 (gmt 0)

Ensure that only visible page text is indexed, thereby removing many of the top 10 for a variety of keywords.

plasma - msg:62711 - 7:54 pm on Oct 15, 2003 (gmt 0)

My suggestion:

Synchronize the bots, so that only 1 copy of a page exists.

At the moment the bots are requesting pages that are exactly 50 days old, making If-Modified-Since useless for dynamic pages. The bots come to me daily and do a full crawl => they have at least 50 unsynchronized indexes.
What's the point in having an old index, anyway? (Or do they want to clone the WayBackMachine?)
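For reference, this is how If-Modified-Since is supposed to pay off: the server compares the bot's header against the page's last-modified time and answers 304 Not Modified when nothing changed, so a well-behaved bot skips the re-download. A minimal sketch (the function name and logic are illustrative, not any particular server's code):

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime, format_datetime

def conditional_get(if_modified_since, page_last_modified):
    """Return the HTTP status a server should send for a conditional GET:
    304 if the page is unchanged since the supplied header, else 200."""
    if if_modified_since:
        try:
            since = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            return 200                      # malformed header: send full page
        if page_last_modified <= since:
            return 304                      # Not Modified: bot can skip it
    return 200

last_mod = datetime(2003, 10, 1, tzinfo=timezone.utc)
header = format_datetime(datetime(2003, 10, 10, tzinfo=timezone.utc), usegmt=True)
print(conditional_get(header, last_mod))   # 304
print(conditional_get(None, last_mod))     # 200
```

If the bots all carried the timestamp of the freshest shared index, dynamic pages would only be re-fetched when they actually change.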

jimbeetle - msg:62712 - 7:54 pm on Oct 15, 2003 (gmt 0)

phony-baloney search result pages from fake search engines that keep littering the serps

Should be one of the simpler problems for those PhDs to work on. Actually, I don't think this one is rocket science; maybe a tech school grad could handle it.

PDF's (perhaps combined with Word and other docs) under their own tab

Good way to handle them.

HayMeadows - msg:62713 - 7:54 pm on Oct 15, 2003 (gmt 0)

Eliminate the ability for this to show an arrow in the search results: &#9658;

plasma - msg:62714 - 8:43 pm on Oct 15, 2003 (gmt 0)

Eliminate the ability for this to show an arrow in the search results: &#9658;

How shall they distinguish between legitimate foreign characters and funny symbols?
Ignoring all Unicode characters?
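One plausible answer: filter by Unicode character category rather than by code point, so letters in any script survive while dingbat-style symbols (category "So", which includes the &#9658; arrow) are dropped. A hedged sketch, not anything Google documented:

```python
import unicodedata

def strip_decorative(text):
    """Drop characters in the Unicode 'Symbol, other' (So) category,
    which covers arrows and dingbats, while keeping letters in any
    script (accented Latin, Cyrillic, CJK, ...)."""
    return "".join(ch for ch in text
                   if unicodedata.category(ch) != "So")

print(strip_decorative("\u25BAGr\u00f6\u00dfe"))  # 'Größe' - arrow gone, German letters kept
```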

HayMeadows - msg:62715 - 10:10 pm on Oct 15, 2003 (gmt 0)

Just a suggestion to ignore this one, at least in the <title> tag. Too easily abused.

twilight47 - msg:62716 - 11:24 pm on Oct 15, 2003 (gmt 0)

Google,
Please, update the Google directory.
Thank You.

leedslad73 - msg:62717 - 11:56 pm on Oct 15, 2003 (gmt 0)

I think people are forgetting what Google is about.

Google didn't design a search engine to make webmasters rich; they made the search engine so quality information and content can be found through Google.

PDFs are classed as information. Like it or not, PDFs will be seen even more within the SERPs in the coming months.

austtr - msg:62718 - 12:04 am on Oct 16, 2003 (gmt 0)

Higher rankings for pages that have actual content relevant to the search term... it is, after all, a pretty basic assumption that the user expects to be offered pages that actually contain information about the search topic.

By all means factor in those off-page considerations of backlinks and anchor text etc to fine tune the results... but not to the point where they smother on-page relevance.

Hardwood Guy - msg:62719 - 12:26 am on Oct 16, 2003 (gmt 0)

PDF's (perhaps combined with Word and other docs) under their own tab

Fabulous idea!

GoogleGuy - msg:62720 - 4:22 am on Oct 16, 2003 (gmt 0)

Ivana, if it helps, there should be a link in the lower right corner that will let you go back to Google.com. I think it sets a cookie so that you won't get country-redirected again.

I'm reading through this thread and dropping off notes to various people in Google to check it out. :) Three notes off so far. Keep the (helpful) suggestions coming..

mcavic - msg:62721 - 5:33 am on Oct 16, 2003 (gmt 0)

Google Images seems to be quite stale lately. (I know that everyone at Google has their hands full with other things).

Kratzy - msg:62722 - 5:37 am on Oct 16, 2003 (gmt 0)

Better handling of 301 Redirects

Over the past several months I have been trying to merge several domains together in an effort to simplify things and do the right thing (tm) by Google, i.e. not have duplicate content on duplicate domains.

Based on logs of Googlebot's crawling of the site I have done this to, it really doesn't handle 301 redirects all that well. For the past two months all it has done is hit the old domains, get a 301 back to the new domain, then hit robots.txt, and that's it. At present the only page Google indexes is the home page; Googlebot isn't going any deeper.

I implemented these changes about 4 months ago and Google still isn't indexing the site correctly. All the other search engines seem reasonably happy with it.
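For anyone attempting the same merge, the server-side contract is simple: every request for an old-domain path should answer 301 Moved Permanently with a Location on the new domain that preserves the path, so deep links and (in theory) link equity follow the move. A minimal sketch with made-up domain names; in Apache the equivalent is a one-line `Redirect permanent / http://www.new-domain.example/`:

```python
def redirect_location(old_path, new_host):
    """Map a request path on the old domain to a 301 'Moved Permanently'
    target on the new domain, preserving the path so deep links survive."""
    return 301, new_host.rstrip("/") + old_path

status, location = redirect_location("/widgets/page.html",
                                     "http://www.new-domain.example")
print(status, location)   # 301 http://www.new-domain.example/widgets/page.html
```

Whether a given crawler then transfers the old pages' standing to the new URLs, and how quickly, is up to the engine, which is exactly the complaint above.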

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
© Webmaster World 1996-2014 all rights reserved