Forum Moderators: open

Message Too Old, No Replies

Google TO DO list

We all think Google has got problems, so let's make some HELPFUL suggestions

         

kaled

11:24 am on Oct 15, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It seems to me that, almost without exception, WW members believe that Google is suffering right now. So, instead of just blowing off steam, whinging and complaining, why don't we make some suggestions as to what they need to do to help keep webmasters and users happy.

Now, for the most part, I think users are still happy, but there is no doubt that computer-literate folks I know are beginning to use other search engines if they don't immediately find what they want on Google. This trend is likely to continue unless Google gets a grip on its problems.

I will kick off with the following suggestions.
1: Read JavaScript links (or publicly admit PR is broken and won't be fixed). Also, where possible, read CGI links. (Where the link contains a URL beginning http:// or www. this is easy. However, this might open a door to SEO cheating, so this is debatable.)
2: Fix the link: tool so that all links are displayed. Also change it so that it works like ATW and AV and can show backlinks for an entire site. It need not show more than, say, 200, but it should give the total count. This would allow webmasters to feel more confident that Google is working properly, if nothing else.
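The easy cases in suggestion 1 really are mechanical: even without executing JavaScript, a crawler can often recover link targets just by scanning script or attribute text for strings that look like URLs. A minimal sketch (the regex and function name are my own illustration, not anything Google has published):

```python
import re

# Candidate URLs embedded in JavaScript: absolute http(s) URLs,
# or bare www. hosts that still need a scheme prepended.
URL_RE = re.compile(r"""(https?://[^\s"'<>)]+|www\.[^\s"'<>)]+)""")

def extract_js_urls(fragment):
    """Return candidate link targets found inside JavaScript text."""
    urls = []
    for match in URL_RE.findall(fragment):
        # Normalise bare www. hosts so they can be fetched directly.
        urls.append(match if match.startswith("http") else "http://" + match)
    return urls

snippet = "<a href=\"javascript:go('http://example.com/page')\">go</a>"
print(extract_js_urls(snippet))  # ['http://example.com/page']
```

Anything built from a variable or concatenated at runtime would still be missed, which is exactly why the suggestion is hedged to "where possible".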

Kaled.

PS
Please vent unhelpful comments and waffle in other threads.

Tobias

9:15 am on Oct 16, 2003 (gmt 0)

10+ Year Member



1) Kick out all affiliate links - instead follow those tracking links and show the merchant site in the search results instead. Filtering all affiliates would really improve the index. If I search for a product or for information on a product, I want a merchant where I can buy it or where I can get a detailed product description. What I don't want is "buy it here!" NO "buy it here!" No here..... where actually those sites don't even sell that product and just have a link to the product. Usually they don't even give you information or a price.

2) Kick out sites with too high a keyword density.

Monus

9:40 am on Oct 16, 2003 (gmt 0)

10+ Year Member



1) Google's SafeSearch for other languages (I know they have it for English and Dutch, but I miss it in Spanish, the 2nd language in the world).

GranPops

10:04 am on Oct 16, 2003 (gmt 0)

10+ Year Member



MONUS: "Spanish, 2nd language in the world"
Hombre, no es así. (Man, that's not so.)

Another point: is the power of backlinks too high?
Re: one of the sites on which I am allowed to write articles to reach page one of G (information, not product).
For a 2-word key, the home page is listed at No. 2 with the actual article at No. 1. As the home page has absolutely no SEO, just the 2 words as the title of the article, is it justifiable that it appears at No. 2 purely because of good-PR backlinks to the home page, with absolutely no mention of the keywords in those links?
There are 18 articles in exactly the same situation, although for one competing against 250,000 results the rankings are 3 and 4.

HayMeadows

3:31 pm on Oct 16, 2003 (gmt 0)

10+ Year Member



1. Improve your response time on spam reporting. This would encourage people to use this feature more.

2. Come up with a way to drop redirects out of your results immediately.

3. Hand check sites that use "Nosnippet".

4. Faster full updates of sub-pages and less popular websites. I'm even willing to pay for this!

5. Froogle Froogle Froogle

6. Froogle

kire1971

3:31 pm on Oct 16, 2003 (gmt 0)

10+ Year Member



If the search term ain't somewhere on the page don't return the page in the results.

Totally agree. The results have gone downhill since that started. Link text is not always correct and is subject to manipulation.

I find it very annoying that when you type in www.google.com, you automatically get the version that fits the country you are in.

I'd like to take that one step further and get a true US version and keep the UK, AU, etc. sites in their own index. I get way too many of them in my search results.

Kick out all affiliate links - instead follow those tracking links and show the merchant site in the search results instead.

Affiliate links are a necessary evil. Without them, half the merchants would be out of business and all that would be left is Amazon.com.

Another point, the power of backlinks too high?

Certainly the anchor text in backlinks is too powerful, especially from the ODP. I have an ODP-listed page that happens to be listed with my site name (which is also my URL) as the anchor text. For the site name as a keyword, the ODP-listed page outranks my homepage.

--------
From what I’ve seen over the past year it looks like the small niche sites are being pushed lower and lower in the rankings. Large, high PR sites seem to be coming up for searches when they may be relevant but not relevant in a specific way. They’re often not what I’m looking for or I need to search the larger site again to find what I’m looking for. I used to get the specific results first and the best results came from the smaller niche sites. In my opinion it’s due to PageRank and the way Google now handles phrases. It seems keyword proximity or an exact phrase match is less meaningful than it used to be. Now, the phrase sometimes isn't even on the page. PageRank dominates too much.

kire1971

3:35 pm on Oct 16, 2003 (gmt 0)

10+ Year Member



PageRank and a link from another site is supposed to be a “vote” for the site being linked to. Unfortunately, buying listings, link exchanges, and a variety of other factors devalue the impartiality of PageRank. In addition to PageRank and all other algorithm factors, why not base popularity on bookmarks? Adding a site to your favorites or bookmarking it is the only true and unbiased “vote” for a site. We don’t bookmark spam or junk sites; we only bookmark the best and most relevant sites. Something like this could be so specific that it not only measures popularity but also relevancy, by tracking what the search terms or link text were when a page got bookmarked. It would also be very tough to manipulate, since you would need thousands of individuals to make an impact.
How can it be done? I’ll leave that to the brilliant Google and Microsoft engineers. If they can’t read bookmarks already saved I’m sure there are ways around that. Maybe Microsoft can come up with a new IE that can capture info on pages being bookmarked. For Google, how about a bookmark list in the Toolbar or a “bookmark this result” link in the search results display?

Has such an idea been discussed before? If not, Google and Microsoft can feel free to chip-in and make me a millionaire… or at least list my site at the top of the rankings :)
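For what it's worth, the bookmark-vote idea above fits in a few lines. This is purely illustrative: the class, the one-vote-per-bookmark rule, and the popularity-plus-relevancy scoring are my own assumptions, not anything Google or Microsoft has described:

```python
from collections import defaultdict

class BookmarkVotes:
    """Count bookmark events as votes, and record the query that led to
    each bookmark so the signal carries relevancy, not just popularity."""

    def __init__(self):
        self.popularity = defaultdict(int)   # url -> total votes
        self.relevancy = defaultdict(int)    # (url, term) -> votes

    def record(self, url, query):
        self.popularity[url] += 1
        for term in query.lower().split():
            self.relevancy[(url, term)] += 1

    def score(self, url, query):
        # Overall popularity plus per-term relevancy for this query.
        return self.popularity[url] + sum(
            self.relevancy[(url, t)] for t in query.lower().split())

votes = BookmarkVotes()
votes.record("http://example.com/widgets", "blue widgets")
votes.record("http://example.com/widgets", "widgets")
print(votes.score("http://example.com/widgets", "widgets"))  # 2 + 2 = 4
```

Note how the per-term table answers the "relevancy" half of the idea: a page bookmarked after a search for "widgets" only gains score for widget queries, not for everything.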

kaled

3:59 pm on Oct 16, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Provide a lookup service so that webmasters who think a site has been penalised can discover whether it is true or not. Presumably, Google keeps such a list for internal use, so the only real issues are how to achieve it technically and whether the system could be abused. Frankly, I cannot see any way this could be abused, and it would presumably save some time in email support (provided the lookup service gave a detailed explanation of why a penalty was in place).

One thing is certain: it would reduce the number of "Have I been penalised?" threads on WW.

Kaled.

jimbeetle

6:50 pm on Oct 16, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



One thing is certain: it would reduce the number of "Have I been penalised?" threads on WW.

Trying to count how many times people can use "Google" and "penalty" in the same sentence is half the fun of WW.

jon80

7:33 pm on Oct 16, 2003 (gmt 0)

10+ Year Member



>"spanish, 2nd language in the world"
Hombre, no es así.

Off topic but:
English is second, Spanish is third, Mandarin Chinese is first.

Monus

10:32 pm on Oct 16, 2003 (gmt 0)

10+ Year Member



Off topic but:
English is second, Spanish is third, Mandarin Chinese is first.

Well, not everybody agrees with that.
www.google.com/search?hl=en&q=most+spoken+language

kaled

12:24 am on Oct 17, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Money talks. The commonest language of money is English. The Chinese and Spanish speaking peoples of this world have fewer bucks to spend and fewer computers with which to spend them (from the point of view of professional webmasters).

From the first message of this thread

Please vent unhelpful comments and waffle in other threads.

Kaled.

Dave_Hawley

2:06 am on Oct 17, 2003 (gmt 0)



why not base popularity on bookmarks? Adding a site to your favorites or bookmarking is the only true and unbiased “vote” for a site

I can see it now on all homepages: "bookmark us and get a 10% discount". I think this one would be easier to manipulate than inbound links.

Dave

Monus

10:37 pm on Oct 18, 2003 (gmt 0)

10+ Year Member



Money talks. The commonest language of money is English. The Chinese and Spanish speaking peoples of this world have fewer bucks to spend and fewer computers with which to spend them (from the point of view of professional webmasters).

This is about Google and its to-do list, not about money or which language is the most common. That there are no adult filters for the most common languages is very bad and strange for a company like Google.

otnot

12:50 am on Oct 19, 2003 (gmt 0)

10+ Year Member



I think that a rolling update that works would be a good idea. LOL.

superportal

2:18 am on Oct 19, 2003 (gmt 0)



Each result should be customized to ME not to ALL PEOPLE. Google has it upside-down right now. They're ranking the answer when they should be ranking the relation between the question and the answer. No two people should get the same results on the same terms over time. Like a fingerprint.

If I always search for the MySQL manual and always click on that mysql.com link, then my LOCAL, PRIVATE file should check that before I send the query and move that result closer to the top. Repeat: sort the results based on my previous results (unless I say differently). If I spend a lot of time on DPREVIEW and then type in "Canon d300", I usually want DPREVIEW results to show first. Are we on computers or are we on Atari 2600s? How many Google rocket scientists, Marketing VPs and IPO mavens does it take to make this happen? Of course, some people may want STEVES to show first....

I should be able to manually edit my LOCAL, PRIVATE preferences file. This is like a bookmark system but much more sophisticated: it's a personal filtering system, a web document firewall that can learn from my queries, rank my queries both against my other queries and against a relative value based on those queries, allowing meta-filtering rules, free beer and whatnot.

If I follow the wrong link and end up in a vortex I want to hit the big red "THIS SUCKS" button and not see that domain again in my search results (or at least severely downgrade it, perhaps there would be an OBLITERATE button for the really egregious cases).
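A local, private preference file of this kind could be sketched as a simple re-ranker: boost domains the user habitually clicks, bury domains on the "THIS SUCKS" list. Everything here (the click-count table, the blocked-domain set, the sort key) is an invented illustration of the idea, not any real Google feature:

```python
from urllib.parse import urlparse

def rerank(results, click_counts, blocked):
    """results: URLs in engine order; click_counts: domain -> past clicks;
    blocked: domains the user has 'obliterated'."""
    def key(position_url):
        position, url = position_url
        domain = urlparse(url).netloc
        if domain in blocked:
            return (1, 0, position)   # blocked domains sink to the bottom
        # Habitual domains rise; ties keep the engine's original order.
        return (0, -click_counts.get(domain, 0), position)
    return [url for _, url in sorted(enumerate(results), key=key)]

results = ["http://spam.example/x", "http://other.example/manual",
           "http://mysql.example/manual"]
prefs = {"mysql.example": 40}       # user clicks mysql.example constantly
print(rerank(results, prefs, blocked={"spam.example"}))
# ['http://mysql.example/manual', 'http://other.example/manual', 'http://spam.example/x']
```

The key property superportal asks for falls out naturally: with an empty preference file the engine's order passes through untouched, so no one else's clicks affect your results.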

Rank the QUERY not the web page.

Please do not give me some goody-goody, know-nothing-about-me web "editor" dictating what links are relevant to me and what I should look at. That would be the absolute worst-case nuclear-winter scenario and would be the end of Google as a respectable search engine. Perhaps there can be a suggestions drop-down or something as an extra; I'm willing to support that as long as I can choose which editor(s) I like and those results are secondary, not my main results (or at least I have a choice). Several sites let you rank the result. That's fine for Google to do, but each person's choices should have NO impact on anybody else's results.

What is Rainbow_Mama_69 doing ranking my web links anyway? ...or Googleplex Corporate HQ for that matter....

kamikaze Optimizer

5:29 am on Oct 19, 2003 (gmt 0)

10+ Year Member



"Faster full-update of sub-pages, and less popular websites."

Absolutely.

"Even willing to pay for this!"

No, no, and no... period. Just no.

pegaweb

5:41 am on Oct 19, 2003 (gmt 0)

10+ Year Member



Here are my suggestions:

1: Increase the value of PageRank greatly. In my "topic area", a list of all the relevant sites, ordered by PageRank, would produce an almost-perfect list of the best sites. A lot of people (including myself) are being outranked by some very garbaginous pages these days. :)

2: Massively downgrade sites as they fail to be updated. (Sites that haven't been updated for a year should be sent down about 25%, two years - 50%. Sites that haven't been updated for five years should be riiight at the bottom.)

3: Downgrade abandoned pages (i.e. pages with excessive broken links, or broken Image links.)

4: Ignore anchor text.

5: Let me get rid of those little binoculars on the Google Toolbar :)

6: Bring on the Voting Buttons! I write lots of quality stuff, but the Googlebot doesn't know it. I know who the best Photoshop sites are, and they certainly aren't 1, 2 and 3 on the list.

7: Let me change my email address in Adsense. (I'm sure you know about that one though :)
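Suggestion 2 is easy to state precisely. A sketch using pegaweb's own numbers (25% per year, applied linearly, which bottoms out at zero after four years; the function shape itself is my own assumption):

```python
def staleness_factor(years_since_update):
    """Multiplier for a page's score: 1.0 when fresh, 0.75 after one
    year, 0.5 after two, and zero from four years on."""
    return max(0.0, 1.0 - 0.25 * min(years_since_update, 4))

for years in (0, 1, 2, 5):
    print(years, staleness_factor(years))
```

Whether such a blunt decay is a good idea is exactly what the replies below debate; the sketch just makes the proposal concrete enough to argue about.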

steveb

7:03 am on Oct 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



1. Update the Directory so expired/hijacked domains in highish pagerank categories aren't a commodity and don't get unmerited benefits from being listed long after they have been removed from the rdf dump.

2. Update the Directory so quality sites that have been in the rdf dump for the past seven months can get the benefit they deserve.

3. Update the Directory so you don't look like a bunch of dorks.

victor

7:20 am on Oct 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



4. Just update the Directory -- who needs a reason!?

markdidj

8:32 am on Oct 19, 2003 (gmt 0)

10+ Year Member



Beat the rival in their lawsuit against Google (or is that over?). Some things shouldn't even have a copyright applied. Good luck, G....

markdidj

8:43 am on Oct 19, 2003 (gmt 0)

10+ Year Member




2: Massively downgrade sites as they fail to be updated. (Sites that haven't been updated for a year should be sent down about 25%, two years - 50%. Sites that haven't been updated for five years should be riiight at the bottom.)

I don't agree with this one, sorry. What about historic subjects? This suggestion is only useful for people looking for current, up-to-date info, probably relating to computers, stocks or money-making. What about students looking into history? A good paper might have been written by a professor 5 years ago who's gone on to write others on different subjects. Then a student comes along and writes a new paper as a college project, and that gets a higher PR! That's daft.

Putting more emphasis on voting buttons, and timing how long a visitor stays at a site (or doesn't stay), would, I think, be useful.

rfgdxm1

9:56 am on Oct 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>I don't agree with this one, sorry. What about historic subjects? This suggestion is only useful for people looking for current, up-to-date info, probably relating to computers, stocks or money-making. What about students looking into history? A good paper might have been written by a professor 5 years ago who's gone on to write others on different subjects. Then a student comes along and writes a new paper as a college project, and that gets a higher PR! That's daft.

Yep. The flaw in this logic is that there are topics that are static.

Jakpot

10:35 am on Oct 19, 2003 (gmt 0)

10+ Year Member



Return to the monthly updates. The weekly update destabilizes the SERPs for 2-3 days and has resulted in sales/commissions decreasing up to 40% since this weekly disruption started. I have not changed any of my web pages. I missed the Google objective justifying the weekly updates and question their added value.

pegaweb

10:50 am on Oct 19, 2003 (gmt 0)

10+ Year Member



Would you want to read something historical that was written this year, or 25 years ago? Even history isn't static.

I think the vast majority of topics (if not all) prefer up-to-date information.

And even in the case of history, the professor's old paper would be beaten by the student's new paper. But that would in turn be beaten by a professor's new paper. (The top positions would all be occupied by professors' new papers, not professors' new and old papers.)

It remains my opinion that Google should downgrade very old and abandoned sites.

kaled

11:35 am on Oct 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



My suggestion for the Google directory is to scrap it. It is and always will be an out-of-date copy of an original work. Just exactly what purpose does it serve? Well, I suppose it has two uses.
1: It allows webmasters to check website PR (out-of-date PR?).
2: It reduces the load on the original ODP.

Assuming that the ODP could manage the extra workload that would result if Google scrapped their copy, doing so would be no bad thing.

There you are, a simple suggestion to save money, manpower and resources.

Kaled.

Lightfoot

1:19 pm on Oct 19, 2003 (gmt 0)

10+ Year Member



Google should run and edit its own directory.

John_Caius

6:05 pm on Oct 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Surely not... "Google should run and edit its own directory" implies "Google should employ 10,000 more people to do what the dmoz volunteers are already very experienced at doing."

My helpful suggestions:

1) Alongside the spam report form, have an expired content alert form. In one search, the number one result in 3 million contains only the title "No More" and the text "It has become obvious that I don't have time to maintain this page. Yahoo is an excellent starting place for information of all kinds." - nothing else. Its position is entirely due to incoming link anchor text and having that site at number one is of no use to anyone. It's been there for at least a year.

2) On the reducing affiliate sites in the SERPs issue, something that might be worth looking at is the now pretty extensive database of sites submitted to dmoz (and rejected) that have been hand-noted as containing exclusively affiliate content. I see no way other than human editing to recognise affiliate sites and the dmoz list of red-noted sites must be the largest around. If nothing else, it might be worth using as a start for Google employees to hand-pick affiliate sites or identify common factors.

Ok, back to measuring several kilos of radioactive excrement - yes, that's really what I'm doing... :(

Macguru

6:53 pm on Oct 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Offer users living in multilingual countries the ability to select regional AND language filters right under the search field. For the time being, it's one OR the other.

>>Ok, back to measuring several kilos of radioactive excrement

It's all Marcia's fault [webmasterworld.com]! (msg #15) ;)

markus007

7:07 pm on Oct 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Penalize all sites with only 1-10 pages in competitive SERPs; chances are the site is just an affiliate or a scammer.

wmburke

7:11 pm on Oct 19, 2003 (gmt 0)

10+ Year Member



I agree with many of the suggestions...

My $0.02...

a) De-emphasize the <title> tag in the SERPs. Too much baloney out there to continue supporting this, and it's far, far too easy to manipulate.

b) Again, as said before: do something about spam reports! I've reported sites with BLATANT abuse: hidden text, reciprocal link farming, etc., etc. Three months later... nothing. The same sites are all still well-ranked as a reward for their cheap, deceptive tricks.

c) De-emphasize the importance of link text, unless it is again reflected well on the page linked to.

Hope you're still tracking this thread, Google Guy.. :-}

This 109-message thread spans 4 pages.