Now, for the most part, I think users are still happy, but there is no doubt that computer-literate folks I know are beginning to use other search engines if they don't immediately find what they want on Google. This trend is likely to continue unless Google gets a grip on its problems.
I will kick off with the following suggestions.
1: Read javascript links (or publicly admit PR is broken and won't be fixed). Also, where possible, read cgi-links. (Where the link contains a URL beginning http:// or www. this is easy. However, this might open a door to SEO cheating, so this is debatable.)
2: Fix the link: tool so that all links are displayed. Also change it so that it works like ATW and AV and can show backlinks for an entire site. It need not show more than, say, 200, but it should give the total count. If nothing else, this would allow webmasters to feel more confident that Google is working properly.
Kaled.
PS
Please vent unhelpful comments and waffle in other threads.
What!? Nah - it is doing as well as it ever has. The zoom lens on Google just keeps getting twisted a bit more day by day.
> computer-literate folks I know are beginning to use other search engines
The real computer-literate folks never stopped using other engines.
> Read javascript links
They already do read any text with http in it - including js links.
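For illustration, here's a rough Python sketch of what "reading any text with http in it" could look like - plain pattern matching over the raw source, script blocks included. The regex and function name are made up for this example; nobody outside Google knows the actual mechanism.

    import re

    # Rough sketch: pull anything that looks like an absolute URL out of raw
    # page source, including text inside <script> blocks. This is plain
    # pattern matching, not link parsing - the behaviour described above.
    URL_RE = re.compile(r'https?://[^\s\'"<>)]+')

    def extract_url_candidates(page_source):
        """Return every http(s) URL-looking string found anywhere in the source."""
        return URL_RE.findall(page_source)

    # A javascript "link" is picked up even though there is no <a href> tag.
    html = "<script>function go() { window.location = 'http://example.com/page1'; }</script>"
    print(extract_url_candidates(html))  # ['http://example.com/page1']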
> Fix the link: tool so that all links are displayed.
That would only encourage algo investigation.
--------------
My suggestions:
Make the "did you mean" search suggestions a full time part of google (with some minor display tweaks). Love them, but they look like spelling suggestions.
Get rid of all the extraneous code on the google pages and get back to a clean/pure interface that works in all browsers.
Bring back the monthly update before the results degrade any further. The current "results problems" are nothing compared to what is going to hit in the next few months. They got your number, G - at least the monthly update made them 30 days behind you.
I would suggest:
--Stop including pages that can't be spidered. Sure, there might be a link to the page with relevant link text, but if the page can't be spidered, how the heck do you know whether it's actually relevant or not?
--In a similar vein, don't use link text as a sole criterion. The "The search term only appears in links to this page" message (or however it's phrased) is a symptom of how link text can be used to manipulate the SERPs. If the search term ain't somewhere on the page, don't return the page in the results (a rough sketch of this rule follows the list).
--Knock out the PDFs. Unless you can tell me which page of a 200-page PDF document the information I want is on, it's almost worthless to me (along with my dislike of reading PDFs in a browser anyway). If the page with the link to the PDF file is relevant then return that; otherwise, can 'em.
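Just to make the second point concrete, here's a toy Python sketch of that rule - drop any result whose own page text doesn't contain every query term. The data structures are invented for the example; this is not how Google's index actually looks.

    def filter_on_page_relevance(results, query_terms):
        """Keep only results whose own page text contains every query term.

        `results` is assumed to be a list of (url, page_text) pairs - a toy
        stand-in for a real index, invented for this example.
        """
        kept = []
        for url, page_text in results:
            text = page_text.lower()
            if all(term.lower() in text for term in query_terms):
                kept.append(url)
        return kept

    # A page found only via anchor text (no spidered body) gets dropped.
    results = [
        ("http://example.com/a", "widget prices and widget reviews"),
        ("http://example.com/b", ""),  # only linked-to, never spidered
    ]
    print(filter_on_page_relevance(results, ["widget"]))  # ['http://example.com/a']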
There are more ways to slim down G's index; these are just my pet peeves.
And, along with Brett, get that drill-down stuff working. As long as PR and link text skew the results for many terms, it's very difficult to find that small fact or bit of information I'm after. (Yeah, it's there, but on page 20 of the SERPs.) Take a lesson from Teoma and make it easier for me to find what I'm after.
'Nuff for now.
Jim
EW
And on that note, I'd also prefer that Google used professional translators rather than volunteers, because some translations (at least the Danish) are really not very good.
They already do read any text with http in it - including js links
Are you saying that, assuming the link: tool worked as we would all hope, javascript links would be displayed? Currently, the link: tool is only showing a few internal links for my website, so I can't quickly test this.
Given the PR algo, there is clearly a difference between spidering pages of URLs it finds on a webpage and recognising URLs (in javascript) as links.
Kaled.
Bring back the monthly update before the results degrade any further.
What results are those, WW monthly traffic surges? ;)
What Google really needs to work on is those phony-baloney search result pages from fake search engines that keep littering the serps. Pure dreck.
Agree strongly. I keep getting a code error on XP for a particular search... I couldn't believe it at first. It shouldn't happen.
>> Bring back the monthly update before the results degrade any further <<
This was an incredible marketing dimension... and they just blew it away. I still can't believe it happened.
Others? How about:
a) PDFs (perhaps combined with Word and other docs) under their own tab
b) Try to buy custodianship of the ODP. A more reliable platform for editors will lead to more edits, which will lead to higher quality, which will improve Google. The ODP is excellent, but could be even better if Google contributed.
c) Stop the penalty on domain names that have long since changed hands from black hats to white (having expired). It's not only unfair, but it eliminates many excellent sites from the index.
d) If someone types in www.google.com... they actually mean it! Ivana's issue is actually a big one.
Apart from that... keep the basics sound. Keep ads WELL distinct from true SERPs... if you create a gray area you are doomed. Non-monetization of your returns is actually one of your prime differentiators.
Synchronize the bots, so that only one copy of a page exists.
At the moment the bots are requesting pages that are exactly 50 days old, making If-Modified-Since useless for dynamic pages. The bots come to me daily and do a full crawl, so they have at least 50 unsynchronized indexes.
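For anyone curious what synchronized bots would look like on the wire, here's a minimal Python sketch of the conditional GET being described: send If-Modified-Since for the copy you already hold and skip the download on a 304. The URL and date are placeholders.

    import urllib.error
    import urllib.request

    # Conditional GET: ask for the page only if it changed since the copy we
    # already hold. A 304 response means "not modified" - skip the download.
    req = urllib.request.Request(
        "http://example.com/page.html",  # placeholder URL
        headers={"If-Modified-Since": "Sat, 01 Mar 2003 00:00:00 GMT"},
    )
    try:
        with urllib.request.urlopen(req) as resp:
            body = resp.read()  # 200: page changed, re-index it
            print("modified - re-fetched", len(body), "bytes")
    except urllib.error.HTTPError as err:
        if err.code == 304:
            print("not modified - keep the cached copy")
        else:
            raise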
Where's the point in having an old index, anyway? (Or do they want to clone the WayBackMachine?)
phony-baloney search result pages from fake search engines that keep littering the serps
Should be one of the simpler problems for those PhDs to work on. Actually, I don't think this one is rocket science; maybe a tech school grad could handle it.
PDFs (perhaps combined with Word and other docs) under their own tab
Good way to handle them.
Google hasn't designed a search engine to make webmasters rich; they made the search engine so that quality information and content can be retrieved from Google.
PDFs are classed as information; like it or not, PDFs will be seen even more within the SERPs during the coming months.
By all means factor in those off-page considerations of backlinks and anchor text etc. to fine-tune the results... but not to the point where they smother on-page relevance.
I'm reading through this thread and dropping off notes to various people in Google to check it out. :) Three notes off so far. Keep the (helpful) suggestions coming...
Over the past several months I have been trying to merge several domains together in an effort to simplify things and do the right thing (tm) by Google, i.e. not have duplicate content on duplicate domains.
Based on Googlebot's logs for the site where I did this, it really doesn't handle 301 redirects all that well. For the past two months all it has done is hit the old domains, get a 301 back to the new domain, and then hit robots.txt - and that's it. At present the only page Google indexes is the home page; Googlebot isn't going any deeper.
I implemented these changes about 4 months ago and Google still isn't indexing the site correctly. All the other search engines seem reasonably happy with it.
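For what it's worth, here's a small Python sketch for sanity-checking what the bot actually sees on such a move: request the old URL without following redirects and confirm a single 301 pointing at the new domain. The domain names are placeholders, not anyone's real sites.

    import http.client
    from urllib.parse import urlparse

    def check_301(old_url):
        """Fetch old_url WITHOUT following redirects; print status and Location."""
        parts = urlparse(old_url)
        conn = http.client.HTTPConnection(parts.netloc)
        conn.request("GET", parts.path or "/")
        resp = conn.getresponse()
        print(old_url, "->", resp.status, resp.getheader("Location"))
        conn.close()

    # olddomain/newdomain are placeholders for the merged sites.
    check_301("http://www.olddomain.example/somepage.html")
    # Hoped-for output: ... -> 301 http://www.newdomain.example/somepage.html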