Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

How about a new ranking method?

Something to consider.


Frequent

6:55 pm on Feb 14, 2005 (gmt 0)

10+ Year Member



When Google started relying heavily on inbound links for their ranking it worked great and gave them the superb results we all used to know and love. Then everyone started to SEO with this (links) in mind and Google started tweaking to keep their results in line with what they consider good quality. This has resulted in the results we now complain about regularly.

Perhaps Google or some other upstart needs to come up with a way to rank based on something that is truly relevant to the end user.

This something is bookmarks (or favorites, or whatever your browser chooses to call them). It's time for Google to unleash its own browser that allows them to manage a user's bookmarks Google-side (G-marks, perhaps?). This may be easier for a company that is already associated with both a browser and a search engine...insert your least favorite M onster S earch N etwork here.

I know I don't bookmark things that are useless to me. I also immediately clean out irrelevant built-in bookmarks when I encounter them.

Is there any better indicator of relevance and quality than the pages a user bookmarks?

Please feel free to hash this out at your leisure. All relevant opinions and viewpoints welcome.

pmkpmk

7:14 pm on Feb 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The issue of Google creating a web browser has been discussed in [webmasterworld.com...] and in [webmasterworld.com...] (even though the latter seems to be inaccessible at the moment), and probably in other places as well.

I disagree on the bookmarks. My bookmarks are stuffed full of useless, probably nonexistent links. Only the top 10 URLs in my bookmarks are ones I frequently access. The rest have been acquired over years and never cleaned out. I typically put pages in there which I find interesting at that very moment, and which - at that moment - seem to require additional reading during some spare time. Since "spare time" never happens, these links get older and older and in the end obsolete. Maybe I'm a bookmark hoarder, but I guess I'm not the only one.

What would be a huge step for Google (or ANY search engine) would be two indexes: one "I want to buy or research prices" index, and one "I just want to research a certain topic" index.

I recently experienced weird problems with a satellite TV decoder. When I entered its name into Google, I got literally 15 pages of buying-related results (and a lot of scrapers and eBay/Amazon affiliate sites), including these annoying price-comparison portals, before I found the very first result which talked about issues with the device.

Those 15 pages should go to the "I want to buy" index (and the scrapers filtered out altogether), and that site I finally found should have been #1 result in the "I need to know more" index.
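The two-index idea above amounts to classifying results by commercial intent. As a rough illustration only (the keyword list and threshold are invented, not anything a real engine uses), a naive splitter might look like this:

```python
# Naive illustration of splitting results into "buy" vs. "research"
# buckets based on commercial signals in the page text.
# The signal list and threshold below are hypothetical.
BUY_SIGNALS = {"price", "buy", "cart", "checkout", "compare prices", "shipping"}

def classify(page_text: str) -> str:
    """Return "buy" if the page shows enough commercial signals, else "research"."""
    text = page_text.lower()
    hits = sum(signal in text for signal in BUY_SIGNALS)
    return "buy" if hits >= 2 else "research"
```

A real engine would need far stronger signals than keyword matching, but even this toy version would route a price-comparison portal and a troubleshooting forum into different buckets.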

limitup

7:15 pm on Feb 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



For most topics or categories of bookmarks, I have the top 20+ related sites bookmarked. How is this going to help Google rank them when I have them all bookmarked?

Also, there are many other problems with ranking sites this way. Most notable is that people who are good at promoting their sites will get bookmarked a lot, while sites that don't get a lot of traffic won't be bookmarked nearly as much, even though they may be 10x better.

In general, the site that gets the most traffic/bookmarks is usually not the *best* site on that topic.

helleborine

7:22 pm on Feb 14, 2005 (gmt 0)

10+ Year Member



That's one thing.

Ask yourself: when you find a page you like and that is relevant to your search, what might you do? Bookmarking is but one type of response. You will bookmark a site you expect you will use regularly.

Here are some search behaviours I've noticed:
- I spend more time on a good site than a bad one.
- If a site is not bookmark-worthy, I search for a more specific term, like "Auntie Em's Polka Dot Widgets" instead of just "widgets," if I want to find Auntie Em again.
- If I search for "widgets," I might scan the SERPs for Auntie Em and click on her site.
- I might click more on Auntie Em's internal links.
- After I found what I wanted, I might close my search window.

These are all measurable.

But it would be darned difficult for the little guy in position #3558 to make it to page 10.

pmkpmk

7:24 pm on Feb 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> After I found what I wanted, I might close my search window

How would you measure THAT?

BigDave

7:37 pm on Feb 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I can see it now, the exploding whale video coming up for any search related to Oregon.

When I think about what I have bookmarked, it is almost always things that I am very unlikely to find again in a search engine. Given my eclectic tastes, I really doubt they would be of much interest to very many other people.

Do you really want one of your searches for "diesel engine parts" to go to the site that I have bookmarked that will custom machine pre-WWII Enterprise diesel tugboat engine parts?

My bookmarks are all really weird stuff, or local stuff like weather, tides and crab season pages.

pmkpmk

7:46 pm on Feb 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Nevertheless, it COULD be one thing to go into a new equation; to what percentage is open to discussion, though. And with a Google browser (or at least a Google bookmark service - which is actually the only reason I have the Yahoo toolbar installed), you could even measure repeat visits to bookmarked sites. And THAT would actually be data with value in terms of popularity.

kevinpate

7:59 pm on Feb 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> data with a value in terms of popularity
Or not. Why wouldn't the issue of Alexa manipulation be applicable to a (for now) fictional Gooooolexa mechanism?

zjacob

8:00 pm on Feb 14, 2005 (gmt 0)



At this point in time, I wish that Google had stuck to the old (pre-mid-Dec 2004) algorithm, but had interfered with the influx of new junk domains in other ways.

How? My 2 cents: take the profit motive away from spammers.

A large majority of the newer spam domains that I've seen have AdSense on them (most with nothing else in terms of ads). I don't expect G to manually inspect the 8 billion web pages in their index, but the universe of AdSense account holders is much, much smaller.

Say a single spammer with AdSense has a network of 1,000 autogenerated-content spam domains (with thousands upon thousands of pages on them); by terminating such an account, Google would take away much of the motivation to generate new ones.

I don't know how people in general define spam content, but personally, I'd like to see at least the "directories" with copy-pasted results from Google search results go away.

And you don't need to find all 1,000 spam domains from one spammer to terminate an account; just one that breaks the G AdSense TOS is enough.

And if, for example, the auto-generated "directories" don't violate the AdSense TOS, maybe G should tweak the TOS instead of any algo.

Furthermore, there should be a (functioning) channel for white hat webmasters to report spam AdSense domains, in effect doing much of the post-approval quality control check for the AdSense team.

Naturally, any report should be manually OK'd by a human on G's side before any action. But they are already hiring new people to do quality control for SERPs, so why not increase the power to weed out spammers from AdSense?

To summarize my take on the issue: stop turning the knobs on the algo (which wasn't broken pre-Dec 2004 in my view), and start taking the (AdSense) bread-and-butter motivation away from generating spam.

Frequent

8:14 pm on Feb 14, 2005 (gmt 0)

10+ Year Member



All excellent points and exactly what I had hoped for.

Most certainly there are relevant arguments to be made for and against any ranking option, and no one source can be the one and only factor with regard to who should or shouldn't rank well for a given search.

I kept my suggestion oversimplified for a reason. Letting you all pick and suggest what would and wouldn't work is more important than having everyone say "YES they should do that right away!"

Perhaps I cull my bookmarks more often than most. I have very few that aren't visited at least monthly. This problem could be handled by the browser "archiving" unused bookmarks periodically. They would still exist unless the user deleted them, but would pass less "rank".

Those that are used very regularly would stay active and pass more rank.

A search engine could monitor the content of the page to determine what terms it should rank for. It could even use the name of the folder to determine what specific content was deemed particularly valuable. (Assuming I'm not the only one who categorizes their bookmarks.)
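The archiving idea above is essentially a time-decayed weighting: recently used bookmarks pass full rank, stale ones fade toward nothing. A minimal sketch of what that might look like (the half-life and function names are hypothetical, not anything Google built):

```python
# Hypothetical decay-weighted bookmark signal: a bookmark's "vote" for a
# page halves every HALF_LIFE_DAYS since the user last visited it, so
# long-"archived" bookmarks pass almost no rank.
HALF_LIFE_DAYS = 30.0  # assumed half-life for a bookmark's weight

def bookmark_weight(days_since_last_visit: float) -> float:
    """Weight in (0, 1]: 1.0 for a bookmark visited today, 0.5 after 30 days."""
    return 0.5 ** (days_since_last_visit / HALF_LIFE_DAYS)

def page_score(days_since_visits: list[float]) -> float:
    """Sum the decayed weights of every user's bookmark of this page."""
    return sum(bookmark_weight(d) for d in days_since_visits)
```

Under this scheme a bookmark untouched for ten months contributes under a thousandth of a fresh one, which matches the "still exists but passes less rank" behavior described above.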

Keep 'em comin'!

Freq---

BigDave

8:34 pm on Feb 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



While I don't think that it is necessarily a bad idea to look at bookmarks and their usage, it just does not seem to be of great value in the grand scheme of things.

The biggest problem is that all it tells you is that the page made it into someone's bookmarks, and they might consider it important enough to go there every so often, but the URL is too difficult to remember. For example, I come to webmasterworld daily. Rather than have bookmarks for it, I just type "web" and hit the down arrow a couple of times to get to the right forum. It is much faster than messing with bookmarks.

What it does not tell you is why they go there and what they go there for. Sure you can edit the default of using the page title, but how often do you actually do that?

Then consider who is likely to have bookmarks, and who is not. It will greatly skew the results towards sites that appeal to technical users.

And this seems incredibly easy to spam. I have 13 computers here in my computer room. There are 6 others around the house and I have around 40 more out in the garage. I could put them to work surfing around the clock to my sites.

Frequent

9:23 pm on Feb 14, 2005 (gmt 0)

10+ Year Member



BigDave,

Admittedly, any ranking technology can be spammed. Not very many users are going to have 60-ish computers that they would be willing to network and set to browse 24/7. When you consider the millions of home users with 1 or 2 machines online, I don't think it would be much of an issue.

Of course there would have to be safeguards in place. Of course people would find their way around them.
Of course there are many other reasons this wouldn't work as the sole source of determining rank.

Thanks for bringing more info to the table. Please feel free to suggest some other options besides adding Bookmarks to the equation.

pmkpmk

9:28 pm on Feb 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



BigDave: the bookmark (or not bookmark) for WebmasterWorld is actually a good example. It's among my top-10, because I use it several times a day. However it reads like this:

http://www.webmasterworld.com/stickymail.cgi?action=mymsgs&imode=16

Probably not what is relevant to a search engine...

zjacob

9:39 pm on Feb 14, 2005 (gmt 0)



frequent, I'd like to expand on your original idea of user-generated SERP quality control via bookmarks.

Bookmarking could be a pro-active indication on your part that you want sites that are bookmarked to rank higher than others, at least in your searches.

The other side of the coin could be a feature where you mark a SERP result (a domain or page) as spam. For example, this could be a feature integrated into the Google toolbar.

Once you've marked a domain/page as spam, that result wouldn't appear in *your* searches anymore. Also, the spam "vote" could be reported to the G search quality team, which could investigate the domain once a pre-determined number of spam "votes" had accumulated for it.

This would empower the SE user in very much the same way as spam is eliminated from email.
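The mechanism described above has two parts: a per-user filter that takes effect immediately, and a global counter that escalates to human review past a threshold. A toy sketch (class name and threshold are invented for illustration):

```python
from collections import defaultdict

SPAM_REPORT_THRESHOLD = 50  # assumed number of votes before human review

class SpamVotes:
    """Sketch of per-user spam filtering plus a global vote counter."""

    def __init__(self) -> None:
        self.user_blocked = defaultdict(set)   # user -> domains they flagged
        self.global_votes = defaultdict(int)   # domain -> total flags

    def report(self, user: str, domain: str) -> bool:
        """Record a flag; return True if the domain now warrants human review."""
        if domain not in self.user_blocked[user]:  # one vote per user
            self.user_blocked[user].add(domain)
            self.global_votes[domain] += 1
        return self.global_votes[domain] >= SPAM_REPORT_THRESHOLD

    def filter_serp(self, user: str, results: list[str]) -> list[str]:
        """Drop flagged domains from *this* user's results only."""
        return [d for d in results if d not in self.user_blocked[user]]
```

Note the one-vote-per-user rule: without it, a single spammer (or a single annoyed competitor) could push a domain over the review threshold alone.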

I love Yahoo! mail because every time I report an email as spam, the email "results" on my inbox are better the next day.

HughMungus

9:43 pm on Feb 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Good idea, but it could be given only partial weight...and it'd be hard to determine how much, because most people (like me) only bookmark pages they want to reference later...if they ever do reference them.

However, a search engine based on user interaction would be great (just like most search engines know who's clicking on what links in the search results).

phpdude

9:45 pm on Feb 14, 2005 (gmt 0)

10+ Year Member



I can just see the spam emails now!

BOOKMARK this site and we'll Bookmark yours!

It would never work. Not enough people bookmark sites in enough quantity to give any search engine a good index to work with.

Not to mention the issue of privacy.

Nice idea, but in my opinion it's not the answer and could never work.

pmkpmk

9:55 pm on Feb 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> The other side of the coin could be a feature where you mark a SERP result (a domain or page) as spam. For example, this could be a feature that could be integrated to the Google toolbar.

There is actually a similar feature on the Google toolbar, though it must be switched on in the advanced preferences.

See here: [google.com...]

Voting buttons: If you especially like or dislike a web page you're visiting and want to share your opinion with Google, you can vote thumbs up by clicking the happy face or thumbs down by clicking the unhappy face. These buttons can also be used to report especially useful or unsatisfactory results after searching with Google. Just click the appropriate button while you're still on the results page. This feature is currently in test mode, so you will not notice any immediate effects based on your action, other than experiencing a warm sense of satisfaction from having shared your feelings with people who really do care.

Dunno where the results end up, and whether they have any influence on SERPs.

Frequent

10:02 pm on Feb 14, 2005 (gmt 0)

10+ Year Member



zjacob,

I like what you are saying with regard to being able to filter your search results based on sites that you've given a previous bad rating, and passing this info back to the search engine for a closer review of the site (preferably by a human).

phpdude,

With regard to not having enough people bookmarking sites to create an index: I think there would be plenty to work with. I've always had a healthy number of active bookmarks spanning a variety of topics and interests, and I'm sure I'm not alone. However, bookmarks would not be used to create the index; they would supplement it.

If you have any other suggestions for making search results more relevant to users, please feel free to pitch them for review.

Thanks again for your input guys and gals!

Freq---

Frequent

2:16 pm on Feb 15, 2005 (gmt 0)

10+ Year Member



pmkpmk,

Thanks for the info. I never noticed that feature on the toolbar. I guess I better do some investigating and see what other features I don't know about.

Freq---

julinho

2:25 pm on Feb 15, 2005 (gmt 0)

10+ Year Member



"Simple experiments indicate PageRank can be personalized by increasing the weight of a user's home page or bookmarks".

Larry and Page wrote this in the Section 6.1 Future Work of "The Anatomy of a Search Engine".
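The quote above maps onto what is now called personalized PageRank: instead of the random surfer teleporting uniformly to any page, it teleports to the user's bookmarks, which pulls rank toward them and toward the pages they link to. A toy sketch (the graph and all names are invented for illustration; this is the standard power-iteration formulation, not Google's actual code):

```python
def personalized_pagerank(links, personalization, damping=0.85, iters=100):
    """Power iteration where teleportation follows `personalization`
    (e.g. weights on the user's bookmarks) instead of a uniform vector.

    links: dict mapping each node to the list of nodes it links to.
    personalization: dict of nonnegative teleport weights (need not sum to 1).
    """
    nodes = list(links)
    total = sum(personalization.get(n, 0.0) for n in nodes)
    teleport = {n: personalization.get(n, 0.0) / total for n in nodes}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - damping) * teleport[n] for n in nodes}
        for n in nodes:
            out = links[n]
            if out:
                share = damping * rank[n] / len(out)
                for m in out:
                    new[m] += share
            else:  # dangling node: redistribute its rank via the teleport vector
                for m in nodes:
                    new[m] += damping * rank[n] * teleport[m]
        rank = new
    return rank
```

On a tiny three-page graph, putting all teleport weight on a bookmarked page roughly triples its score compared with uniform teleportation, which is exactly the "increasing the weight of a user's bookmarks" effect the paper describes.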

pmkpmk

2:44 pm on Feb 15, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Personalized SERPS will be the end of SEO as we know it.