> WW members believe that Google is suffering right now
What!? Nah - it's doing as well as it ever has. The zoom lens on Google just keeps getting twisted a bit more day by day.
> computer-literate folks I know are beginning to use other search engines
The real computer-literate folks never stopped using other engines.
They already do read any text with http in it - including js links.
> Fix the link: tool so that all links are displayed.
That would only encourage algo investigation.
Make the "did you mean" search suggestions a full time part of google (with some minor display tweaks). Love them, but they look like spelling suggestions.
Get rid of all the extraneous code on the google pages and get back to a clean/pure interface that works in all browsers.
Bring back the monthly update before the results degrade any further. The current "results problems" are nothing compared to what is going to hit in the next few months. They got your number, G - at least the monthly update made them 30 days behind you.
I suggest that size doesn't matter. "Searching 3,307,998,701 web pages" doesn't mean a thing unless quality results are returned.
I would suggest:
--Stop including pages that can't be spidered. Sure, there might be a link to the page with relevant link text, but if the page can't be spidered how the heck do you know if it's actually relevant or not.
--In a similar vein, don't use link text as a sole criterion. The "The search term only appears in links to this page" message (or however it's phrased) is a symptom of how link text can be used to manipulate the SERPs. If the search term ain't somewhere on the page, don't return the page in the results.
--Knock out the PDFs. Unless you can tell me which page of the 200-page PDF document the information I want is on, it's almost worthless to me (along with my dislike of reading PDFs in a browser anyway). If the page with the link to the PDF file is relevant then return that; otherwise can 'em.
There are more ways to slim down G's index; these are just my pet peeves.
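The filtering rules suggested above can be sketched in a few lines. This is a hypothetical simplification, not Google's actual logic; the function name and shape are made up for illustration:

```python
# Hypothetical sketch of two of the suggestions above: drop results whose
# pages could not be spidered, and drop results where the query term only
# appears in inbound link text, never on the page itself.

def keep_result(query: str, page_text) -> bool:
    """page_text is None when the page could not be spidered."""
    if page_text is None:
        return False                       # unspidered: relevance unknown
    return query.lower() in page_text.lower()  # term must be on the page
```

Under this rule, anchor text could still help rank a page, but never get a page into the results on its own.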
And, along with Brett, get that drill down stuff working. As long as PR and link text skew results for many terms it makes it very difficult to find that small fact or bit of information I'm after. (Yeah, it's there, but on page 20 of the SERPs). Take a lesson from Teoma and make it easier for me to find what I'm after.
'Nuff for now.
Good idea kaled
Brett - apply for one of the jobs with G and go sort them out!
Google to provide professional technical support to webmasters via paid membership.
Could be implemented via direct contact by Google with web design companies that can provide proof of, and conform to, a minimum business requirement level and have a broad, varied and satisfied client base.
Client references to be supplied and verified.
Google Certification - sounds like a good idea for webmasters but still opens up the algo too much. I don't see Google jumping on that one.
I'd like them to sort out the millions of affiliate sites that appear in SERPs whenever I'm searching for any product. How about limiting it to just one or two affiliates? Or having them in a separate box?
Those sites almost never give me info on the product I'm looking for :-(
Put more of a human factor into cleaning the results.
They always try to solve problems by tweaking the algo. But 66% of the time, that approach just creates new problems.
I find it very annoying that when you type in www.google.com, you automatically get the version that fits the country you are in, i.e. I get www.google.dk because I'm in Denmark. I would like it to be the other way around, so that the default is the .com in English and from there you can click on Google 'In Your Language'.
And on that note, I'd also prefer that Google used professional translators rather than volunteers, because some translations (at least the Danish) are really not very good.
Have both the singular and plural versions of nouns count in the search algorithm. In other words, take limited "stemming" into account.
>How about limiting it to just one or two affiliates?
Absolutely - as long as my site is one of the two!
|Absolutely - as long as my site is one of the two! |
Valid point. But they like algos. Perhaps they can work one out to select a limited number of affiliate sites.
>Perhaps they can work one out to select a limited number of affiliate sites.
And which ones? I can't see how they could *fairly* limit the display of any portion of the search results.
|They already do read any text with http in it - including js links |
No - but followed so the target gets indexed (i.e. the whole point) - yes.
Correct me if I'm wrong, but the way I understand it is that the JS link indexing won't count as a link per se, but can be indexed in the same way as the non-href letters www.yourdomain.com on a webpage will be indexed.
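That distinction can be illustrated concretely. In the sketch below (regexes and names are my own simplifications, not how Googlebot actually parses pages), an href in an anchor tag is a crawlable link, while a bare URL string inside text or JavaScript is just more words to index:

```python
import re

# Illustrative sketch: separate real anchor links from bare URL strings
# (e.g. inside a JS call) that would only be indexed as page text.

HREF_RE = re.compile(r'<a\s[^>]*href=["\']([^"\']+)["\']', re.IGNORECASE)
BARE_URL_RE = re.compile(r'(?:https?://|www\.)[^\s"\'<>)]+')

def crawlable_links(html: str) -> list:
    """URLs that appear as proper <a href> links."""
    return HREF_RE.findall(html)

def indexable_url_text(html: str) -> list:
    """URLs that appear only as plain text, e.g. in a JS string."""
    hrefs = set(crawlable_links(html))
    return [u for u in BARE_URL_RE.findall(html) if u not in hrefs]
```

On this reading, `www.yourdomain.com` inside a script gets treated like any other word on the page, not like a vote-carrying link.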
|Bring back the monthly update before the results degrade any further. |
What results are those, WW monthly traffic surges? ;)
What Google really needs to work on is those phony-baloney search result pages from fake search engines that keep littering the serps. Pure dreck.
>> Get rid of all the extraneous code on the google pages and get back to a clean/pure interface that works in all browsers. <<
Agree strongly. I keep getting a code error on XP for a particular search... I couldn't believe it at first. It shouldn't happen.
>> Bring back the monthly update before the results degrade any further <<
This was an incredible marketing dimension... and they just blew it away. I still can't believe it happened.
Others? How about:
a) PDFs (perhaps combined with Word and other docs) under their own tab
b) Try to buy custodianship of the ODP. A more reliable platform for editors will lead to more edits, which will lead to higher quality, which will improve Google. The ODP is excellent, but could be even better if Google contributed.
c) Stop the penalty on domain names that long since changed hands from black hats to white (having expired). It's not only unfair, but eliminates many excellent sites from the index.
d) If someone types in www.google.com... they actually mean it! Ivana's issue is actually a big one.
Apart from that.... keep the basics sound. Keep ads WELL distinct from true SERPs... if you create a gray area you are doomed. Non-monetization of your returns is actually one of your prime differentiators.
Do not display the first page of a redirect, and if the page it redirects to has a penalty, don't display that page either!
Update daily, not monthly. Getting much much better <g>.
Ensure that only visible page text is indexed, thereby removing many of the top 10 for a variety of keywords.
Synchronize the bots, so that only 1 copy of a page exists.
At the moment the bots are requesting pages that are exactly 50 days old, making if-modified-since useless for dynamic pages. The bots come daily to me and do a full crawl => They have at least 50 unsynchronized indexes.
Where's the point in having an old index, anyways? (Or do they want to clone the WayBackMachine)
|phony-baloney search result pages from fake search engines that keep littering the serps |
Should be one of the simpler problems for those PhDs to work on. Actually, don't think this one is rocket science, maybe a tech school grad could handle it.
|PDF's (perhaps combined with Word and other docs) under their own tab |
Good way to handle them.
Eliminate the ability for this to show an arrow in the search results: ►
|Eliminate the ability for this to show an arrow in the search results: ► |
How shall they distinguish between legitimate foreign characters and funny symbols?
Ignoring all Unicode characters?
Just a suggestion to ignore this one, at least in the <title> tag. Too easily abused.
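One crude way to draw that line: Unicode assigns every character a general category, and the decorative arrows live in the symbol categories (those starting with "S"), while legitimate foreign letters do not. A sketch, with the caveat that this would also strip things like currency signs:

```python
import unicodedata

# Sketch: drop symbol-category characters (like the arrow above) from a
# <title> while keeping letters from any language. Crude: it also strips
# legitimate symbols such as "$" and "+".

def strip_symbols(title: str) -> str:
    return "".join(ch for ch in title
                   if not unicodedata.category(ch).startswith("S"))
```

So "►" (category So) goes, while accented letters like "é" (category Ll) stay.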
Please, update the Google directory.
I think people are forgetting what Google is about.
Google hasn't designed a search engine to make webmasters rich; they made the search engine so quality information and content can be found through Google.
PDFs are classed as information; like it or not, PDFs will be seen even more within SERPs during the coming months.
Higher rankings for pages that have actual content relevant to the search term.... it is after all, a pretty basic assumption that the user expects to be offered pages that actually contain information about the search topic.
By all means factor in those off-page considerations of backlinks and anchor text etc to fine tune the results... but not to the point where they smother on-page relevance.
|PDF's (perhaps combined with Word and other docs) under their own tab |
Ivana, if it helps, there should be a link in the lower right corner that will let you go back to Google.com. I think it does a setting in your cookie so that you won't get country redirected again.
I'm reading through this thread and dropping off notes to various people in Google to check it out. :) Three notes off so far. Keep the (helpful) suggestions coming..
Google Images seems to be quite stale lately. (I know that everyone at Google has their hands full with other things).
Better handling of 301 Redirects
Over the past several months I have been trying to merge several domains together in an effort to simplify things and do the right thing (tm) by Google, i.e. not have duplicate content on duplicate domains.
Based on logs of Googlebot's trawling of the site where I have done this, it really doesn't handle 301 redirects all that well. For the past two months all it has done is hit the old domains, get a 301 back to the new domains, then hit the robots.txt, and that's it. Presently the only page Google does index is the home page; Googlebot isn't going any deeper.
I implemented these changes about 4 months ago now and google still isn't indexing the site correctly. All the other search engines seem reasonably happy with it.
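For anyone doing the same kind of merge, the behavior the bot should be seeing is simple: every request to an old hostname gets a single permanent (301) redirect to the same path on the new domain, with no chains. A sketch of that logic (hostnames are placeholders, not the poster's actual domains):

```python
# Sketch of a clean domain merge: one 301 hop from any old hostname to
# the same path on the new domain. Hostnames below are placeholders.

OLD_HOSTS = {"old-domain.example", "www.old-domain.example"}  # hypothetical
NEW_HOST = "www.new-domain.example"                           # hypothetical

def redirect_for(host: str, path: str):
    """Return (status, Location) for a 301, or (200, None) to serve normally."""
    if host.lower() in OLD_HOSTS:
        return 301, f"http://{NEW_HOST}{path}"
    return 200, None
```

Worth double-checking in the logs that the old domains really answer in one hop like this; a 301 that lands on another redirect is a common reason bots stall at the front door.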