This 179 message thread spans 6 pages.
2002, Part II
Ready for the rest of the year?
The first half of 2002 was pretty eventful at Google. We rolled out major new products like Enterprise Search and AdWords Select, pulled back the curtain on innovations like Google Answers and labs.google.com, partnered with some great companies, and launched tons of little things that most people--except for maybe the posters here--never notice, but that really improve search. What sort of things do you want to see Google doing next? And are you ready for the rest of the year? :)
- news articles are listed instead of the news item they are reporting about
- your search results should be more (or less) grammar sensitive, i.e., it should find declensions/stems of words
- port the Google Toolbar to Opera/Mozilla/Linux/Mac OS
- show some newsgroup articles on regular SERPs or add a "search all media" (similar to Amazon) and group them by "The Web", "Groups", "Images", ...
- Use valid HTML
- Beneath each listing, name the page elements, e.g., (t)ext, (i)mages, (js), (j)ava, (f)lash, (d)html
- offer some refine options, e.g., if someone searches for "pop-up windows", offer refine for MSIE, Netscape, Opera
- send that Googlebot out more often
- offer hints, e.g., "don't forget to visit the city library." ;)
- extend your News search to other countries (hint: Germany ;))
- put a date next to each search result
Please excuse me if this has already been said, I didn't read all 7 pages.
What I would like most of all from Google as a business owner is the ability to pay a fee to get an answer directly from the source. I would pay $25-$50 to turn in a competitor whose spamming was stealing my traffic, if I could get an email reply letting me know whether they are taking action.
I am sure many people would pay money for definitive answers to questions they have about Google, e.g., what is, and what is not, allowed?
Seems like a money-making operation if done properly. With an IPO in the future, it makes sense.
|If you look closely enough at Google's HTML, much of the 'invalid HTML for speed and bandwidth' argument fades away, IMO. |
Most of the validation errors on Google's front page are designed to make the page load faster and save bandwidth:
- no doctype
- no type on style element
- no type on script element
- unquoted attributes (lots of these)
- no src on img element (clever)
- • is invalid (what does this mean?)
- unknown entity ie (the most dangerous error)
The only questionable error on the front page is "id must begin with a letter".
I think you're right when it comes to Google's serps. Ironically, the only HTML error in the serps that causes me trouble is not reported as an error by the validator: the paragraph tag for a spelling suggestion is not closed. As a result, the table containing adwords ads is inside a paragraph when there is a misspelling, and not inside a paragraph when there is not a misspelling. I set paragraphs to max-width:30em in my user style sheet, so pages that contain both spelling suggestions and ads look ugly. (The W3C considers <table> to automatically close a <p> tag, even in strict; browsers don't, even in strict.)
• is an HTML special character (bullet)
I'd like to see modal-based search [boston.internet.com] for the masses. Google has Outride [google.com] and the toolbar is in place, so it seems going modal would be the next step.
GoogleGuy... I would ban useless "content sites"; they take up huge amounts of bandwidth.
Most of these "content" sites are nothing more than drivel times 50 pages, 100 pages, 1,000s of pages.. that may impress some, but surely Google is wise to that trick?
It seems if companies and individuals can't compete, they resort to spamming Google with literally thousands of pages of useless "content" and then attack companies and individuals that are providing products and services people are actually looking for.
Think of the money Google would save by banning any site with more than 20 pages.. I mean really.. anyone writing 300 or 1,000 pages should be paying Google a PUBLICATION FEE.. hey now, there's an idea!
GG - What irks me is when people put HUGE (and I mean HUGE) font text with keywords - but when you go to the site, it is irrelevant.
Also when people repeat keywords HUNDREDS of times.
Want to increase the popularity of google?
GET RID of PAGE RANK ALTOGETHER!
I think it serves absolutely no purpose other than to upset every one of the webmasters. Pray tell, how does it help the surfer find sites he or she is interested in?
And, how does the poor sap doing the searching even know the page rank if they have no wonderful googlebar?
I think this may be an upsetting ploy to ultimately scare the webmaster into buying adwords....could this be the case?
I have been hanging at a PR1 for months on a site that used to be ranked well... I can't afford to buy your adwords and I have no idea what kind of penalty I am under, if any at all....
So prove your good faith in fair listings and drop the PR, it is a totally unfair and discriminating practice that makes Google look bad.
Better yet, just get rid of all the PR1 sites. Send them over to Inktomi.
1) Place less weight on domain names, as above. Except for one instance - where it matches perfectly. Someone searching for Ford, would expect ford.com to be at the top of the results.
2) On local searches (Google.co.uk) you could put the UK flag next to the UK results if they search worldwide so the user can immediately see the differences.
One more: on AdWords it takes a good thirty seconds to pause one advert in a campaign, and yet you recommend we pause it if our site goes down. Now if I have thirty adverts, do you think I am going to spend 15 minutes to pause the adverts and 15 minutes to start them up again a few hours later? Give us a big pause/unpause-all-campaigns button, and a pause/unpause option on the front page next to the view/edit link. That would save us money (and time) when our site does go down and also give you more reliable AdWords links.
Or even better, how about a 'site down' button on the front page; upon clicking, all adverts are paused. You could have a server repeatedly checking the site (every 30 or 60 minutes would be good) and automatically unpause the campaigns when the site is accessible again.
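A rough sketch of that availability checker in Python. The `next_action` helper and the idea of a single pause-all control are hypothetical (no such AdWords API existed), but the polling logic would look something like this:

```python
import urllib.request

def site_is_up(url: str, timeout: float = 10.0) -> bool:
    """Return True if the site answers with a 2xx/3xx status."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except Exception:
        return False

def next_action(up: bool, ads_paused: bool):
    """Decide what the hypothetical pause-all control should do next."""
    if not up and not ads_paused:
        return "pause"      # site just went down: stop spending
    if up and ads_paused:
        return "unpause"    # site is back: resume the adverts
    return None             # no change needed
```

Run that every 30-60 minutes, as suggested, and the adverts would pause themselves while the site is unreachable.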
I did not cast aspersions on any post you or anyone else made nor do I appreciate having my right to an opinion slammed.
BTW, I do quite well on ink, thank you.
The only way to do well with INK is to spam it with keywords. If you're doing well with INK it may explain why you're not doing so well with Google. Many things one needs to rank well in INK will get them penalized in Google.
A little asperity is good for the character and equally, you just slammed my right to an opinion.
I do not spam.. and I do not have to defend myself from the likes of you.
I have been a good citizen on this board and expect the courtesy to be extended to me.
Why you chose me to contend with I have no idea. I simply made a post the same as others..go pick on someone else...I'm not interested.
My websites speak for themselves...no spam-ever!
Sorry, Googleguy. I won't post in this thread anymore as I do not like to get off topic like this and spoil your thread.
I thought about this a couple of weeks ago but then forgot about it. Anyway, I'd like a "zip it up" button/link on some file format results, e.g., .txt, .doc, .bmp. A 101 K .txt file would be approx. a 10 K .txt.zip file and worth downloading.
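For a sense of the ratio: plain text really does compress well under DEFLATE, which is what a .zip uses. A quick check with Python's zlib (the 10:1 figure depends entirely on how repetitive the text is):

```python
import zlib

def zipped_size(data: bytes) -> int:
    """Size of the data after DEFLATE compression (the method .zip uses)."""
    return len(zlib.compress(data, 9))

text = b"the quick brown fox jumps over the lazy dog\n" * 2000  # ~86 KB of text
print(len(text), "->", zipped_size(text))
```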
I want to buy, let's say, a Leatherman tool. How could Google help me a) find information about it, b) actually buy it? Google's pretty good at a) but can't really help me with b).
It would be cool to get search results displayed as mind maps (similar to TouchGraph), so searching for "Harry Potter" would put "Harry Potter" in the center and show different clusters, e.g., Books, Movies, Merchandising, Fan Pages. Clicking "Books" would show edges for each book. Clicking on the first book would show edges for reviews. To make a long story short: allow for visual browsing/surfing of the results. I guess it's too early for this since Java sucks and there is no cross-browser/cross-platform language/tool available yet (so make it stand-alone, open source, a more powerful API, whatever, ...).
Split crawling and ranking. Sell both to other search engines, so they can either apply your ranking to their crawl results or rank the stuff you crawled using their ranking engine.
How about establishing or using more metadata (either Dublin Core or something else, e.g., site-wide metadata)?
How about something that makes the following information readily available: incoming and outgoing links of a specific domain, theme, topics, pages crawled, status, file types, popularity, whois information, web server, availability, 404s, customer reviews, stock information, ...
Yeah, I like brainstorming. And remember: you must not criticize during brainstorming. ;) And you must stay on topic.
The only way to do well with INK is to spam it with keywords? HUH?
Doesn't work, I've tried it LOL!
When you find the right formula for INK please post it though ;)
Seriously, I do not understand this attack on keywords.. keywords ARE very important, and should be, when doing a search on any engine..
the PROBLEM is that the SEs have not figured out the formula for separating rightful keywords matching the content of a web site from keywords used that have no bearing on a web site...
SEs CAN use algos that can determine if a site is abusing keywords.. keyword density, for example: your site is using a "keyword" at a 40% clip rate... an SE can send it to the back of the list, so to speak..
It can be done... what's worse is determining a site by inbound links... just as easily abused... and inbound links combined with PR from another site is simply pyramid building.. we all remember those schemes now, don't we..
Rankings should IMO be based upon some vote-by-visitation method.. length of visits, repeat visits, percentage of new traffic against new visits, number of visitors as a percentage of the size of the site (# of pages, bandwidth, etc.)..
Bottom line: if you attract traffic.. attract repeat traffic for whatever keywords, then you maintain your rankings or climb.. you don't maintain and update your web site and your traffic drops.. then so does your ranking..
PR rankings are simply a means for webmasters to scratch each other's backs.. I'll trade you a PR 6 link to so-and-so web site if you'll do the same, etc... so the rankings are determined by the webmasters INSTEAD of the traffic.. you know, the people ACTUALLY shopping for the product/service or information..
I want the customer to determine if I deserve to be on the first page or 2nd or 50th.. not a gang of other webmasters with more resources determining my rank..
that's the way it SHOULD be..
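The keyword-density check described a few posts up is easy to sketch. The 40% threshold is the poster's number, not anything Google has confirmed, and a real engine would weight titles and headings differently:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the words on a page that are the given keyword."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

def looks_stuffed(text: str, keyword: str, threshold: float = 0.40) -> bool:
    """Flag a page where one keyword hits the 40% 'clip rate' mentioned above."""
    return keyword_density(text, keyword) >= threshold
```

The "send it to the back of the list" decision could then hang off a score like this one.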
Luma..mindmaps..I "see" it and really like the idea..
Flowchart SE ?
Google is shooting itself in the foot by not warning or helping webmasters if they make a mistake or over optimise etc.
Many site owners are very dependent on Google for their income. So at the moment they have to have 2 or 3 sites as a backup in case one gets slammed.
If Google warned site owners of problems with a site they would not need to have other sites as a fall back position. This would avoid semi duplicate sites clogging up the serps.
Maintain a fair chance for small business sites, some of which find difficulty generating quality links. I do a lot for one-man bands on a limited budget with 5 to 50 pages.
If a site has been "banned" for whatever reason, make it easier for me/us to find out when commissioned to fix it. Sometimes the worst has been corrected and no record exists of the ingenious attempt to rule the SERPs.
Give us a more certain means to get re-spidered (possibly with acknowledgement) when we have fixed previous transgressions whether ours or some previous clever sods.
PR ranking doesn't work anyway.. searching for "baby names", the first position has a PR 5; going 23 pages deep I am finding 6s and 7s (with 4s in between)..
so what's the point of PR then..?
Ann >> BTW, I do quite well on ink, thank you
Ditto, Ann, and Ink does quite well by me ... it's been a win-win situation. And the surfers love my site (ka-ching ka-ching) too, making it a three-way win.
This might be off the wall, googleguy, but I'd be interested in a search option that considers geographic proximity of a business, institution or resource to the searcher. It seems to me such a capability could replace difficult organizational structures such as Yahoo's & ODP's regional categories, while providing a completely new class of relevance. It may be possible, for example, for an algorithm to evaluate an address meta tag in relation to an address-based or map-based search; and/or, perhaps it could evaluate custom (standardized) geocoded shapes representing service areas or other areas of relevance? It may also be possible to cross-reference business address claims to yellow page entries in order to flag location spam.
One of the problems with current search technology is the difficulty of searching resources, businesses, and information that are geographically relevant without resorting to manually browsing through manually organized categories.
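Once an address meta tag (or a yellow-pages entry) has been geocoded to a latitude/longitude, the proximity ranking reduces to great-circle distance. A minimal sketch, assuming results already carry hypothetical "lat"/"lon" fields:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two points (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

def rank_by_proximity(results, searcher_lat, searcher_lon):
    """Sort geocoded results nearest-first relative to the searcher."""
    return sorted(results,
                  key=lambda r: haversine_km(searcher_lat, searcher_lon,
                                             r["lat"], r["lon"]))
```

Blending that distance into the usual relevance score, rather than sorting on it alone, would keep travel sites and the like from being wrongly excluded.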
IMHO -- I think that Google should set up a webmaster FAQ that includes a concise step-by-step method to identify problematic websites or code that is causing PR0. Google has their dos and don'ts page, but nothing stating HOW TO eliminate the PR0 penalty. There are a lot of legit SEOs who stumbled into the penalty and would like to get their site back on the map. Either set up a step-by-step process that would eliminate it, OR provide a process where a site can get evaluated (on a TIMELY basis) by an editor or editorial team. If Google wants to critique a site based on content, links, coding practices or whatever... then a clear, concise set of standards or rules should be set, or at the very least a process for evaluating whether a site should have the PR0 ban lifted.
That's just my two cents.
From what I have read, everyone thinks keyword domains are a pain. But it's more than the domain that makes them targeted, as I am sure you all know.
I think what Google needs to do is first discover which domains are non-profit and which are ecommerce. Then give low-PR/high-relevancy sites good ranking for non-profit or very specific searches, and high-PR/less-relevant sites good ranking for ecommerce searches. This way, if someone is looking for something very specific or doing research, they will get the good results Google currently serves up, and if someone is searching for products to purchase, they will get high-PR, well-established sites (and not fly-by-night spam pages).
My 2 cents :)
Would it be possible to make an XML document available for Google to slurp in that would contain the site structure, URLs, modification dates, etc.? That could lower the bandwidth used up by the Googlebot, and maybe eliminate the need to spider some sites altogether if the files have not changed.
I'm up to four cents now.
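Something like the file described could be generated straight from a site's own records. The element names here are made up (nothing standard existed for this yet), but the shape of the idea is:

```python
import xml.etree.ElementTree as ET

def build_site_index(pages):
    """Serialize (url, last_modified) pairs into a crawler-friendly XML doc."""
    root = ET.Element("site")
    for url, modified in pages:
        page = ET.SubElement(root, "page")
        ET.SubElement(page, "url").text = url
        ET.SubElement(page, "modified").text = modified
    return ET.tostring(root, encoding="unicode")

xml_doc = build_site_index([
    ("http://example.com/", "2002-07-01"),
    ("http://example.com/products.html", "2002-05-12"),
])
```

Googlebot could fetch this one file, compare the modified dates against its last crawl, and skip everything unchanged.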
Modification date yes, that would be great.
> port the Google Toolbar to Opera/Mozilla/Linux/Mac OS
There must be some way to obfuscate the checksum generation outside of Internet Explorer. The toolbar PageRank graph is the only reason I have left for sitting in front of a Windows box.
Don't change a thing. :)
Just a reply on geographic filtering. What about travel web sites? They are normally selling to someone on the other side of the planet. Geographic filtering would exclude the most relevant results. There is already a geographic filter built in, when you search your local google results only.
Folks, this is crazy!
I come back from holiday and now I want to post my suggestions... impossible to run a dupe check
Brett: if we have direct line to googleguy here, it might be necessary to split suggestions for google into subcategories...
With a voting system?
Ok, here we go: some webmasters suggested a login/personalized MyGoogle. I would like that too, with a filter for certain keywords...
(new entries for your keyword "foobar")
I'd like to see the catalog stuff in Germany as well, and like weblamer suggested:
split the PDF and DOC serps into a separate window:
1234 web results found, 234 in PDF, 321 in DOC files
leave the PDFs and DOCs out and make a link to them?
And please: could I have a grey PR tip on the left of the result (4/10), please ;-))
Keep up the good work! Keep in touch... and greetings to the bay area... been a while :-)
red eyed pontifex after reading 119 messages in a thread!
Tweak the algo to be a bit more sensitive to the proximity of words in the search query
search for: Google searches from your web page (without "")
sadly for Google, the number one result is Altavista..
and the Google pages presented are not the ones I copied the exact sentence from (it was in the first sentence of [google.com...] )
I know I could find that exact page by using "Google searches from your web page",
but how many people do that?
(Googleguy, just in case you do not have access to the internal stats [webmasterworld.com]) it's approx. 1% ;)
And while on the subject of proximity, what about a variable control bar in the toolbar that allows you to adjust the proximity factor?
GG, as always, just add the regular stock options bonus for suggestions to my sticky, thanks.
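A proximity factor like the one being asked for can be scored very simply: measure how far apart the query terms sit in the document and decay the score with that span. A toy version (word positions only, no stemming), just to make the idea concrete:

```python
def proximity_score(doc_words, query_terms):
    """1.0 when all query terms are adjacent, smaller as they spread out,
    0.0 if any term is missing from the document."""
    positions = {t: [i for i, w in enumerate(doc_words) if w == t]
                 for t in query_terms}
    if any(not p for p in positions.values()):
        return 0.0
    # smallest word gap between each pair of terms; penalize the worst pair
    spans = [min(abs(a - b) for a in positions[t1] for b in positions[t2])
             for i, t1 in enumerate(query_terms)
             for t2 in query_terms[i + 1:]]
    return 1.0 / (1 + max(spans)) if spans else 1.0
```

A page containing the exact phrase "Google searches from your web page" would score higher than one that scatters the same words, which is exactly the reranking being asked for.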