Forum Moderators: open
rogerd posted a great one that inspired me to start this thread.
[webmasterworld.com...]
If Google could take one step that would mitigate webmaster claims of unfairness, it would be to have a better reinclusion/penalty-lifting response mechanism. Despite claims that Google is devoting more resources to this issue, we still get many reports of months going by with no reply to desperate inquiries. A simple reply of "lose the crosslinks, you bonehead, and we'll restore you" would go a long way for the well-intentioned webmaster who gets caught in a spam purge.
Parse plurality
Nah, leave the benefit of optimising with plural terms in your body text to those who have recognised that the answer to the question "What are you searching for?" is, most of the time, by definition the plural version of the word.... ;)
Seriously though, I second this, because it would save you from sounding stupid just to get plural versions of keywords into your body text; plurals often don't make sense at the single-site level, whilst they do at the search level.
Become (even) more international:
Google news in the other main non-English languages
Google glossary in other languages
Google search results offering of link to Dictionary in other languages
Google stock quotes link offering also for non-US countries
Google map offering also for non-US countries
Google telephone also for non-US countries
Google catalog for other languages (plus a put-it-in-a-box version for my intranet at $750)
also mentioned here: [webmasterworld.com...]
Make an update FAQ page on the Google site, with a link to it from the Google index page.
"I'm searching for information on..."
"I'm shopping for..."
(or words to that effect).
The Google toolbar search window could offer a similar choice via radio buttons.
The "information" results would filter out e-commerce pages; the "shopping" results would display only e-commerce pages.
II. Harsher penalties for spammers who knowingly employ deceptive practices (e.g., hidden text) to influence SERPs.
III. Deploy a posting mechanism so webmasters will know if a site has been penalized. Posts would contain simple categories and durations (e.g., www.BlueWidgets.com: Excessive cross linking. Duration 12 months). Guarantees of reconsideration, one-time-only, but no guarantees of explanation or response from Google, other than updated posts.
Benefits of Point III above:
1) Spammers
No difference expected here. Spammers already know which rules they are breaking, and posts like this won't change their behavior, nor will they enlighten or help them; face it, most of these guys are way beyond that. Once they see they have been nailed (and they don't need posts to tell them that), they move on and start again anyway.
2) Overly Aggressive SEO'ed Sites
This will *encourage* people to stay on the right side of the "law" (same logic as publishing laws in civilized societies, rather than arresting people in the middle of the night and not charging them ;-) )
3) Well Intentioned SEO'ed Sites
This will *help* people to stay on the right side of the "law" (because these people want to do that)
4) People Who Don't Know Any Better
Gives them a chance to learn and clean up their sites.
If Google said to you:
"OK, no problem, we will do all of those things, but in doing so Google will be running on three cylinders for a month or two; expect hazy results for a bit while we build it, test it, and roll it out."
Would you still want them to go ahead?
In other words, until Google stabilises, do we really know whether or not the new algo already takes care of these things?
I suspect a large chunk of what people have asked for in this thread's Google wish-list is being created as we speak.
Spam is Google's number 1 enemy and they know it. It will be as high on their wish-list as it is on ours.
I'm not picking a fight here I promise (so don't start flaming me rfgdxm1!) but I am curious...
TJ
Google should support one standard for describing a site's location in a META tag. Blogs already do that.
After a bit of botting, Google could offer: 'widgets near you'
Simple yet powerful. The last coding contest suggested extracting the location automatically from TIGER data when an address could be found. Supporting a META tag for location would be so much easier.
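The META convention blogs were already using put coordinates in geo.position (or the older ICBM tag). A minimal sketch of how a bot might pull a location out of a page, assuming that convention:

```python
from html.parser import HTMLParser

class GeoMetaParser(HTMLParser):
    """Collect a location hint from geo.position / ICBM META tags."""
    def __init__(self):
        super().__init__()
        self.position = None  # (lat, lon) once found

    def handle_starttag(self, tag, attrs):
        if tag != "meta" or self.position is not None:
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = attrs.get("content") or ""
        if name == "geo.position":      # convention: "lat;lon"
            parts = content.split(";")
        elif name == "icbm":            # convention: "lat, lon"
            parts = content.split(",")
        else:
            return
        try:
            lat, lon = (float(p.strip()) for p in parts[:2])
            self.position = (lat, lon)
        except ValueError:
            pass  # malformed tag: ignore rather than crash the bot

def page_location(html):
    """Return (lat, lon) from the page's location META tag, or None."""
    parser = GeoMetaParser()
    parser.feed(html)
    return parser.position
```

With coordinates per page, "widgets near you" becomes an ordinary distance filter over the index.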
FEEDBACK
One of Google's biggest untapped assets is their huge traffic.
The idea to evaluate the count of "NEXT" clicks, as dmorison suggested, is really great! It's already there; it just needs a feedback loop.
Google could easily 'recruit', let's say, 50,000 people signing up for a special toolbar that would allow high-quality feedback. The top spammers as seen by 50K people are probably a good starting point for manual spam penalties. (I do not think that spam is the #1 problem for Google (yet).)
The #1 problem is people's inability to formulate meaningful queries.
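The NEXT-click feedback loop could be as simple as tracking, per query, how often users page past the first results; a high next-rate suggests the first page failed. A minimal sketch (class and method names are hypothetical, and a real system would also weigh click position, dwell time, and so on):

```python
from collections import defaultdict

class NextClickFeedback:
    """Aggregate "NEXT" clicks per query as a dissatisfaction signal."""
    def __init__(self):
        self.searches = defaultdict(int)     # query -> times searched
        self.next_clicks = defaultdict(int)  # query -> times "Next" clicked

    def record_search(self, query):
        self.searches[query] += 1

    def record_next(self, query):
        self.next_clicks[query] += 1

    def dissatisfaction(self, query):
        """Fraction of searches that led to a "Next" click."""
        n = self.searches.get(query, 0)
        return self.next_clicks.get(query, 0) / n if n else 0.0

    def worst_queries(self, min_searches=100):
        """Queries whose first page most often fails, worst first."""
        return sorted(
            (q for q, n in self.searches.items() if n >= min_searches),
            key=self.dissatisfaction, reverse=True)
```

The queries at the top of `worst_queries` are exactly where the first result page is failing the most users, which is where manual review effort would pay off first.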
TYPE GUESSING
Right now the web falls into a couple of pieces: boards, blogs, and other stuff. It is not that hard to detect 80-90% of the boards and blogs, since there is a limited number of tools that generate them. An option to include/exclude certain types of sites could make results neater.
Can we have it by June?
:-)
This one is probably impossible, but what I'd really like to see: when I search for "cold feet", ask me if I mean the TV show, wedding jitters, or ways to deal with having cold feet. When I search for "moonlight", ask me if I mean working a second job or the light of the moon. Often search terms have two or more completely different subject categories, one of which dominates the results (usually not the one you're looking for!).