Forum Moderators: open


If you were Google?

Which un- or underdiscussed algo-factors should be considered?

         

vitaplease

8:34 am on Sep 10, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



We have chewed on the ranking, crawling and freshness factors of Google, such as PageRank, link text, title, headings, keyword density, alt text, bold type etc.

Which un- or underdiscussed factors would you think need inclusion in the Google Algo?
(never mind how difficult to implement)

Linkless citation
Many pages honour the originator of an idea with a citation at the bottom of the page (e.g. "source: Dr. Creator in Creative Science"). The creator is not always honoured with a link, even though the originator may have a website or page that could be linked to.
Would it be fair to add points for this virtual link, even though it would be difficult to find out which site or page to honour?
Should the virtual anchor text be the words nearest to the left of the little referral note number in the body text?

The specificality of directories or hubs
Should a directory or page receive extra points if it deep-links more to external sites? That is, if it offers its visitors links to more specific (deeper) on-topic pages rather than to general index pages?

Deep spidering - if deep linking
Some sites get spidered deeper (more completely) than others. The consensus here is that this is mainly PageRank related. Would it make sense to grant a bit of extra deep spidering if a site's pages consistently offer deep links to external sites? (That is, offering visitors possible extra resources other than only your own.)

Bookrank
Should pages that get bookmarked (added to favourites) more often get extra points? More so if they stay bookmarked for a while? Even more so if they are at the top of the list? And get clicked on daily?
(toolbar extra spy deluxe)
Do not forget that even though links offer the biggest access to pages, direct access (i.e., among other ways, via bookmarks) is still more frequent than access through search engines. [webmasterworld.com...]

martin

8:39 am on Sep 10, 2002 (gmt 0)

10+ Year Member



>Bookrank

That would be too much to ask of normal people.

Would you share your favorites with Google? If so, how many bookmarks would SEOs have?

ukgimp

9:03 am on Sep 10, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I could see situations where longevity of the site would be a factor.

As in: Established 1875 to offer credibility.

Cheers

<added> Of course, what was I thinking; what about planetary alignment :) </added>

chiyo

9:11 am on Sep 10, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Deep spidering, and specificity of links to external sites... interesting ideas.

But I can't see how linkless citation or bookrank could be achieved technically, and even if they could, whether they could be implemented without major cost.

To me, the one major change that would move Google SERP relevance to the next stage is congruent/related theming of external hyperlinks. It is strange that you get PR (points) for links from totally off-topic sites; it goes against the whole idea of hyperlinking as the engine of the Web, which PR is meant to exploit.

Markus

9:27 am on Sep 10, 2002 (gmt 0)

10+ Year Member



What about traffic? It's already tracked by the Toolbar, and it could easily be incorporated into PageRank. From the PageRank patent specification:

"Real usage data, when available, can be used as a starting point for the model and as the distribution for the alpha factor. This can allow this ranking model to fill holes in the usage data, and provide a more accurate or comprehensive picture."

Note: alpha is the damping factor.

>theming

I can't see how theming, in any sense of content analysis, can be implemented within the next few years. However, Haveliwala's Topic-Sensitive PageRank is a very interesting approach. But theming is definitely not underdiscussed :)
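The patent passage Markus quotes can be sketched as a PageRank power iteration in which usage data biases the teleport distribution (what the patent calls "the distribution for the alpha factor"). The three-page graph, the visit counts, and all names below are invented for illustration; this is not Google's implementation.

```python
# Minimal PageRank power iteration where observed usage data can replace
# the uniform teleport distribution. Graph and visit counts are made up.

def pagerank(links, teleport, alpha=0.85, iters=50):
    """links: {page: [outlinks]}; teleport: {page: probability}."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - alpha) * teleport[p] for p in pages}
        for p, outs in links.items():
            if not outs:                      # dangling page: follow teleport
                for q in pages:
                    new[q] += alpha * rank[p] * teleport[q]
            else:
                share = alpha * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
        rank = new
    return rank

links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}

# Uniform teleport vector = classic PageRank.
uniform = {p: 1 / 3 for p in links}

# "Real usage data ... as the distribution for the alpha factor":
# bias the teleport vector by observed visit counts instead.
visits = {"A": 70, "B": 20, "C": 10}
total = sum(visits.values())
usage = {p: v / total for p, v in visits.items()}

print(pagerank(links, uniform))
print(pagerank(links, usage))   # A's rank rises with its traffic share
```

With the usage-biased teleport, heavily visited pages gain rank even when the link graph alone would not favour them, which is exactly the "fill holes in the usage data" idea the patent describes.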

ciml

1:04 pm on Sep 10, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I would love to see the happy and sad faces used to give us a "people who like the pages that you like also like..." function.

Ideally, we'd have a bit of context, too. I'd set up personality 1 for my local geographical browsing, personality 2 for my SEO information browsing, personality 3 for widget price comparison browsing, etc. Each personality would then acquire a profile, allowing a cross-match for similar profiles.

Even with just one personality and no use of the happy/sad buttons, I'm sure that the data Google receives from my Toolbar could help me find some content of interest.

fathom

1:55 pm on Sep 10, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Do not forget that even though links offer the biggest access to pages, direct access (i.e., among other ways, via bookmarks) is still more frequent than access through search engines. [webmasterworld.com...]

Although true, there is a big distinction here between types of visitors. A "bookmark" visit is obviously a return visitor, as opposed to an SE user, who would normally be a new visitor who can turn into a bookmarking one.

Direct access may come either via another marketing medium (brochure, magazine ad) or from remembering the brand.

Personally, though, I think SE users stepping over listings #1 and #2 and clicking on #3 (click-through %) holds weight as a factor in how responsive Google's users are to results, as does users' use of the back button to return to Google's results (e.g., if they don't back out, did they find what they wanted?).

c3oc3o

2:09 pm on Sep 10, 2002 (gmt 0)

10+ Year Member



Geographical search, like the project that won their programming contest, sounds like a great idea.

Some kind of analysis of the type of a page (tutorial, online shop, research paper, newspaper article, review, blog), the type of content (definition, summary, easy-to-understand explanation, advanced discussion, raw data, marketing) and the bias (opinion, disputed facts, or definite facts?) would be helpful as well :).
I sometimes miss an option to search for "All about" my keyword or an "Introduction to" the topic, instead of, say, news or opinions about it.

Giacomo

2:10 pm on Sep 10, 2002 (gmt 0)

10+ Year Member Top Contributors Of The Month



Linkless citation
I bet Google's engineers are looking at even more sophisticated semantic technologies for the near future...

The specificality [sic] of directories or hubs
This is already happening in my opinion.

Deep spidering - if deep linking
Nah. If you want to be "deep-spidered", mod_rewrite your URLs and make them spider-friendly.

Bookrank
Impossible to implement without some serious spyware, I'm afraid.

europeforvisitors

4:38 pm on Sep 10, 2002 (gmt 0)



To me, the one major change that would move Google SERP relevance to the next stage is congruent/related theming of external hyperlinks. It is strange that you get PR (points) for links from totally off-topic sites; it goes against the whole idea of hyperlinking as the engine of the Web, which PR is meant to exploit.

You're correct in theory; it's the "in practice" that makes things tricky.

Take my site: I have a European travel site, and I'm linked from a number of university libraries, public libraries, and other reference sites. Their links are relevant (since they're designed for library users who are looking for information on Europe or European travel) even though the sites themselves aren't related to my topics except on one page or one section of a page.

For that matter, the DMOZ isn't relevant to my theme, although a handful of pages in the DMOZ (including the one that links to my site) may be. Does that mean a link from the DMOZ's European travel-guides page should be worth less than a link from, say, the Rick Steves Web site or the European Travel Commission even if all have the same Google PageRank?

I like the idea of weighting relevant links more than irrelevant courtesy links or mutual-backscratching links, but I'd hate to be the person who has to make "theming" work.

Giacomo

7:40 pm on Sep 10, 2002 (gmt 0)

10+ Year Member Top Contributors Of The Month



europeforvisitors: Thematic relation is obviously established at the page level.

I have empirical evidence of reciprocal deep linking between thematically related pages (where the main sites may or may not be on theme with each other) influencing both the PR and the ranking for specific keyphrases dramatically.

Example: your site, which sells "X", gets a link from page {P} of site [WD], which is a web directory. Page {P} is the topic-specific subsection of [WD], listing a number of other sites selling "X"; {P}'s title happens to be "Online X Shops". The link from {P} points to your web site's home page {H}. You put a link back to {P} from {H}, and label it (via alt tag or anchor text) "Online X Shops".

As a result, if the initial PR of {P} and {H} is high enough, both {P} and {H} will benefit from the PR feedback loop you have just established through relevant deep linking. Not only that, but both {H}'s and {P}'s ranking will improve for the keyphrase "online x shops" <added>especially if {P} linked to {H} using something like "Yet another online X shop" as anchor text</added>.

This obviously makes sense if and only if both pages are actually on topic with the alt/anchor text you use for the link, so don't try to turn this into a quick-and-dirty trick to increase your ranking for strategic keywords by linking to off-topic pages using unrelated anchor text: it just won't work. ;)
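The reciprocal-linking effect described above can be played with using the published PageRank formula on a toy graph. Everything here (the four pages, the 0.85 damping factor, the uniform handling of dangling pages) is an illustrative assumption, not Google's actual setup.

```python
# Toy check of the "PR feedback loop" idea: does a reciprocal pair of
# pages retain more rank than a pair that passes its rank onward?

def pagerank(links, alpha=0.85, iters=100):
    """links: {page: [outlinks]}; returns {page: rank}, ranks sum to 1."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - alpha) / n for p in pages}
        for p, outs in links.items():
            targets = outs or pages           # dangling page: spread evenly
            share = alpha * rank[p] / len(targets)
            for q in targets:
                new[q] += share
        rank = new
    return rank

# {P} is the directory page, {H} the shop's home page, W the rest of the web.
# Case 1: {H} links onward to some other page X.
chain = {"W": ["P"], "P": ["H"], "H": ["X"], "X": ["W"]}
# Case 2: {H} links back to {P} instead (the reciprocal deep link).
loop = {"W": ["P"], "P": ["H"], "H": ["P"], "X": []}

pr_chain = pagerank(chain)
pr_loop = pagerank(loop)
print(pr_chain["P"], pr_chain["H"])
print(pr_loop["P"], pr_loop["H"])   # both {P} and {H} end up higher
```

On the real web graph the effect is heavily diluted by the sheer number of pages, so this only illustrates the mechanism, not its magnitude.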

Slade

8:30 pm on Sep 10, 2002 (gmt 0)

10+ Year Member



What about traffic? It's already tracked by the Toolbar, and it could easily be incorporated into PageRank.

You have to remember that you're a power user... You've had the toolbar since you read about it here or actually on google.com, because you want to know your PR.

Joe SearchUser doesn't give a flying flip about PR; he's just looking for something. Also, you leave out the people who don't use IE on Windows. Yes, that might be a large group you're leaving out, but think about the type of traffic you'd be basing it on...

90% SEO traffic
10% Power users
0% I don't know much about computers, but I'm trying to buy your product

vitaplease

7:13 am on Sep 11, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You have to remember that you're a power user..10% Power users..

Slade,

you are right, this thread is meant to be hypothetical:

Which un- or underdiscussed factors would you think need inclusion in the Google Algo?
(never mind how difficult to implement)

I wonder how many "users" actually have the toolbar installed. In France the percentage of searches (not users) was recently 3% [webmasterworld.com]. Do you have any other data?

Would you share your favorites with Google? If so, how many bookmarks would SEOs have?

martin,

It would only work with a Toolbar-type spy thingy around. With the current Toolbar you are already sharing a lot of info with Google.
Google could, for example, only take the top 15 favourites/bookmarked pages into account.

As in: Established 1875 to offer credibility

ukgimp,

that could actually be a new advanced search criterion: show me sites that have been around for more than 5 years.

But I can't see how linkless citation or bookrank could be achieved technically

chiyo,

linkless citation, yes, that would be a major beta. Many scientists, however, do have their own home page. In a way this is already happening, though, as Google indexes all text. If I want to check my ego as a scientist, I can type in my name and see how many Google results turn up. It's just not PageRank, more like NameRank ;).

What about traffic? It's already tracked by the Toolbar, and it could easily be incorporated into PageRank. From the PageRank patent specification

Markus,

Yes, I can never decide for myself whether I would weight traffic as quality. A sudden increase in traffic should trigger the refresh bot, though. In a way, bookmarking, if it were possible, would lean more towards "usefulness" or "specialness" than popularity?

"people who like the pages that you like also like..."

Ciml,

I agree; it could come in place of, or next to, the current "similar pages" feature, which I find only works well for very high-PageRank pages.

(.eg. - if they don't back out did they find what they wanted)?

fathom,

If I understand you correctly, that's a nice idea; it could be a very strong factor, maybe already researched by Google?
Extra points for stickiness!
Your site does not load quickly? Searchers leave (click "back" or "stop" within 5 seconds)? Too bad, your site loses stickiness points.

Deep spidering - if deep linking
Nah. If you want to be "deep-spidered", mod_rewrite your URLs and make them spider-friendly

Giacomo,

I am not sure that always works, check this thread: [webmasterworld.com...]

Their links are relevant (since they're designed for library users who are looking for information on Europe or European travel) even though the sites themselves aren't related to my topics except on one page or one section of a page

europeforvisitors,

Surely the surrounding text would be relevant enough to give topical points?
And if the title/description of that page also matches the theme of your page, you just get extra points.

"X", {P} [WD] {P} {H}'s

Giacomo,

I cannot completely follow your lines ;). Are you referring to a similar effect as with the linking structure that identifies "similar pages"?

Giacomo

9:09 am on Sep 11, 2002 (gmt 0)

10+ Year Member Top Contributors Of The Month



vitaplease, I've checked the thread you mentioned only to find empirical confirmation that "static-looking" URLs can help in getting all pages indexed.

My example was about the effect of reciprocal "deep linking" between related pages to show that:
1. PR feedback loops work.
2. Theme sharing seems to have great importance in Google's ranking algos.
3. Carefully choosing your anchor text for outbound and (if possible) inbound links can improve your site's ranking dramatically.

danny

11:03 am on Sep 11, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



PR feedback loops work

Not in the PageRank algorithm as published they don't (except in the trivial case, going from one page to two).

vitaplease

12:19 pm on Sep 11, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



confirmation that "static-looking" URLs can help

Grumpus, I agree, but it is not enough to guarantee full crawling and indexing.
It seems to me that WebmasterWorld's URLs are static, yet sadly only a part of the site is indexed.

rubik

1:48 pm on Sep 11, 2002 (gmt 0)

10+ Year Member



I would offer a choice of 3 different filtering options for searching. The default setting would be "informational pages". The next option would be "shopping pages". Finally, I would offer "regional pages" based on entered zip code (or city/state/country).

This goes to the root of any searcher's query. If they are looking for information or content about a subject, they search for "information". If they are window shopping or intend to buy something, the "information" pages can get in the way and take up the top spots. If they are looking for a regional service or information, they can get flooded with pages that are national or worldwide, diluting the accuracy of the results.

The standard "search" button is too broad in what is returned, forcing users to try to refine their query to return the results they are looking for. Unfortunately, users don't always know the best way to refine their search, and will only further frustrate themselves.

vitaplease

2:19 pm on Sep 11, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



c3oc3o [webmasterworld.com] and rubik [webmasterworld.com] wished one could choose within Google for:

Tutorial, Online shop, Research paper, Newspaper article, review, blog

and

"informational pages". The next option would be "shopping pages". Finally, I would offer "regional pages"

Once Google finally offers categorisation/topic clustering similar to its top related phrases [labs.google.com], those could be additional options.

I was just wondering: if Google introduced its own meta tags, it could offer the easier classification that c3oc3o and rubik were wanting.

It could be relatively spam-free, as Google could stipulate that, per page, it only reads and indexes the first listed classification. As webmaster, you just have to choose.
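A hypothetical classification meta tag along these lines might look as follows; the tag name and the values are invented for illustration, not anything Google has announced.

```html
<!-- Hypothetical page-classification meta tag; name and values invented.
     Under the spam-resistance idea above, only the first tag listed
     would be read and indexed. -->
<meta name="google-classification" content="shop">
<meta name="google-classification" content="informational">  <!-- ignored -->
```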

Giacomo

2:48 pm on Sep 11, 2002 (gmt 0)

10+ Year Member Top Contributors Of The Month



danny,

If Google published every aspect of their ranking algo, then it would be trivial to crack it. ;)

europeforvisitors

6:38 pm on Sep 11, 2002 (gmt 0)



I'd ban any site that attempts to install Gator scumware (as just happened when I clicked on a page 1 Google search result).

savvy1

6:58 pm on Sep 11, 2002 (gmt 0)

10+ Year Member



ahhh that would be nice :) Gator :( scumware :(

vitaplease

6:10 am on Sep 12, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Should Usability give extra points?

- Should any site above 100 pages without a search function get minus points?
- Should a non-functioning back button also earn red points?
- Will quick loading be an extra?
- Points for 100% validation?
- Contrast-rich text/background another plus?

Maybe this is already part of the Google game with Jakob Nielsen on the Technical Advisory Council [google.com].
Jakobility points.

chiyo

6:46 am on Sep 12, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Definitely not! Google should find good content, and the rest is up to the user. Often that content is not as usable as it should be, but that is no business of Google's. Usability can also vary according to the type of info and the complexity of it.

I sure hope and think Google will stick to the knitting. We don't want it making judgements on websites other than those relating to how well it can traverse them for indexing, relevancy, and how "popular" they are.

vitaplease

5:11 am on Sep 13, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



chiyo,

I was speculating that that would not be the most popular idea :)

How about:

Counting external javascript links as full links?

Many directories have decided to implement all external links as JavaScript redirects. How should one value such a vote? As a ballot that the president (Google) disregards?

For internal links I think it is good that Google disregards JavaScript links, as they are often navigational (hierarchical) menus repeated on every page. Including those links would flatten out the differences or highlights of a site (or send all the PageRank to the contact-us page). Also, it would give Googlebot too much checking work to do.

BikeMan

5:51 am on Sep 13, 2002 (gmt 0)

10+ Year Member



A major problem I find while using search engines is out-of-date pages with no-longer-relevant content.

How about some sort of points for more current data?

Giacomo

7:17 pm on Sep 14, 2002 (gmt 0)

10+ Year Member Top Contributors Of The Month



Jakobility points

I don't think they will ever get that fussy.

Very funny, though (plus it's going to make for a really cool whack [google.com]). :)

vitaplease

8:49 am on Sep 17, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Giacomo,

would your Google whack be a sign of Google following links from frequently spidered/cached pages (e.g. moderators' profile pages)?

Giacomo

9:13 am on Sep 17, 2002 (gmt 0)

10+ Year Member Top Contributors Of The Month



It's just a sign that WebmasterWorld threads get spidered and indexed by Google on a daily basis; this is primarily due to their "static-looking" URLs IMO.

..Oh, and overall quality of content of course. ;)