Forum Moderators: open
1) Devalue links from pages that have not been updated in the past year.
-- Many older sites have an unfair advantage because they have links from pages that have not been updated in five years. This sort of ancient "vote" should carry less weight because it is not a *current* vote, and because newer sites are unable to get such a link from the dormant site.
2) Somewhat devalue internal links.
-- There is no earthly way a 100-page site with 210 (similar) backward links will ever get the PageRank of a site with 1000 pages that has 195 (similar) backward links. The 1000 links back to the main page matter too much. While a 1000-page site implies more content than a 100-page one, this is certainly not *always* true. The playing field is currently so uneven that the 100-page site is always at a huge disadvantage (as opposed to a small disadvantage).
3) Attempt to do a better job with relevancy.
-- A site with an apparently on-topic title that has zero inbound links from the top 100 sites for a keyword should be devalued significantly for that keyword.
4) Devalue domains that have been entirely unchanged for a year.
-- Some sites (especially GeoCities-type ones) have been dormant for several years and will just sit there forever. Their content may have been relevant a few years ago (like announcing some event that took place years ago), and they may have attracted relevant linking back then, but now they are just purely wasted space.
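Taken together, suggestions 1 and 4 amount to weighting a link's "vote" by the freshness of the page casting it. A minimal sketch of what that decay might look like, assuming an exponential half-life (the 12-month half-life is my illustration, not anything Google has published):

```python
def link_weight(months_since_update, half_life_months=12.0):
    """Decay a link's vote by the age of the page that casts it.

    A page updated today casts a full vote (1.0); one untouched for
    `half_life_months` casts half a vote, and so on exponentially.
    """
    return 0.5 ** (months_since_update / half_life_months)

# A fresh page vs. a page dormant for five years:
fresh = link_weight(0)      # full vote: 1.0
dormant = link_weight(60)   # 0.5 ** 5 = 0.03125, a heavily discounted vote
```

Under this scheme the five-year-dormant "vote" the poster complains about still counts, but only for a small fraction of a current one.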
In other words, other sites are now defining what your content is about. This is getting into the realm of theming, which several people around here have been suggesting was going to happen.
My suggestion is to ease it up, and restore more control to the web site to define itself.
So basically Google needs to know the difference between generic words like store, book, baseball, etc. and brand names like Coca-Cola, Toyota, etc. If the search uses a generic word, then no value should be given to domain keywords. If the search is for a brand-like term, then keywords in domains should rank extremely high.
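The rule the poster proposes can be sketched in a few lines; the generic-word list and the scoring values here are stand-ins of mine, purely to make the idea concrete:

```python
GENERIC_TERMS = {"store", "book", "baseball"}  # stand-in list; a real one would be huge

def domain_keyword_boost(query, domain):
    """Give a domain-keyword boost only when the query term appears in
    the domain AND the term is not a generic dictionary word."""
    term = query.lower()
    if term in GENERIC_TERMS:
        return 0.0   # generic word: a domain match is worth nothing
    if term in domain.lower():
        return 1.0   # brand-like term matching the domain: strong signal
    return 0.0
```

Searching "store" would then give bookstore.com no domain boost at all, while searching "toyota" would still boost toyota.com heavily.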
While they're at it, you should be penalized for having a hyphen in your domain. I mean, who the hell has a business name with a hyphen in it? Something is seriously wrong when I search for a familiar term and get all spam, yet the major sites for that category are buried in the back.
Also, while I'm at it, this crap about other sites determining what our sites are about is not a good direction to go. This nonsense with anchor text needs to be dumped as well; it simply rewards sites with keyword-rich domains (a.k.a. spammers), because sites usually make text links out of the domain itself. Google is really becoming a spam-filled joke.
Now what things should count? How about content and the number of pages in your site. How about the age of the site and how often it is updated. Sites rarely updated would be punished.
Whatever it is, Google needs to do it, and fast. As webmasters we should be concerned with making the best site possible for users, not having to worry about having certain text in certain places and all this nonsense.
I'd like to see concrete suggestions for what could make the Google search results (even more) relevant and fair.
What makes one web site's owner believe their web site is more relevant to a topic than someone else's in the same industry targeting the same market?
It is natural for someone to consider all web sites above their own position in the results to be less relevant, and some of these may even be considered spam.
The simple fact is, regardless of your ranked position, if search engine users find what they are looking for and are happy with that company's information, products, or service, how can this be interpreted as unfair?
If search engine users don't find what they are looking for at #1, they generally click through on deeper listings and even on yours.
The site owners ranked below yours probably consider you as having that same unfair advantage, but should the user click through to your site and not find what they are looking for, they will move on.
I don't believe search engine algos are problematic. Spam is spam and may even be at the top, but it is the search engine users who will define these sites as spam (or just less adequate) for their needs, not us as competitors.
From my SERP experience this month, this has already been done. The main success of my sites was based on this, and they all crashed and burned this month.
I agree with this assessment on Google's part, because it eliminates one more chance for big sites to abuse Google's system. For my own sites, this was an unfortunate side effect.
I personally think that Google should try to stick with a fair majority of 'off-site' factors, as that eliminates the bias all webmasters have that 'my site is the most relevant'.
With this update, the emphasis is coming from "on the page" factors of the pages that are linking to you, and much less from how your site defines itself. – martinibuster
martinibuster, I'm still seeing that on-page and on-site changes do and can make quite a difference. I'm seeing a cleaning out of pages where the only reference to the keyword on the page is in a link text going out. I'm posting on it further in [webmasterworld.com...]
Very simple: totally eliminate keyword value in domain names. Any true business would almost never have a keyword in its business name, because of branding. Because so much emphasis is given to these keywords in domains, you get SEO punks in basements setting up hundreds of keyword-rich domains promoting affiliate programs. – stratocaster
Welcome to WebmasterWorld, stratocaster, and bah. Sorry, but I just don't agree. First off, I don't believe it's Google letting the actual keywords in the domain directly affect the results; it's the fact that when folks use that keyword-rich URL as anchor text, which we know often happens, it then affects the results. Maybe so, maybe not, but I love them, no problem. They even work for branding if you are good at what you do, but really it comes down to getting all you legitimately can from every campaign, including naming. The Internet is a whole new realm in which to market a business. My mother would say don't cut off your nose to spite your face, haha.
Stuntdubl – could it be, as I am seeing, this cleaning up of pages where an internal link may have said the page was about blue widgets, but when you get there you find either nothing about blue widgets or the only reference to blue widgets is in a link pointing to another page? I see these being cleaned out. And those where blue widgets is only mentioned in the CSS heading tag.
I think you missed my point with #3. Suppose we search for "tractor". Google, searchers, and webmasters with tractor websites all want the search results to list tractor-focused websites. Now suppose some company that sells kung fu DVDs decides for some reason to call itself "Tractor Industries". Other DVD sites will link to that site using "tractor" in the link text... but NONE of the other top 100 "tractor" sites will link to it.
I don't want to get too literal, but ideally you would not want a site in the top ten results that gets no links at all from the other top 100. In this case, Tractor Industries can get a top ten listing that isn't relevant or useful to anybody, including themselves. (I'm not exaggerating in this example.)
To oversimplify it, I'd therefore like to see Google require linking from at least ONE site in the top 100 of that keyword for a site to get listed in the top ten for that keyword. This wouldn't purge huge amounts of non-relevant results, but it certainly would tidy up the results.
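The filter proposed above can be sketched directly; the set sizes (top 100, top ten) are the poster's numbers, and the link graph here is of course a made-up illustration:

```python
def filter_top_ten(ranked_sites, links_to):
    """Keep a site in the top ten only if at least ONE of the top 100
    sites for the keyword links to it.

    ranked_sites: list of site ids, best first.
    links_to: dict mapping a site id to the set of site ids it links to.
    """
    top_100 = set(ranked_sites[:100])
    top_ten = []
    for site in ranked_sites:
        endorsed = any(site in links_to.get(other, set())
                       for other in top_100 if other != site)
        if endorsed:
            top_ten.append(site)
        if len(top_ten) == 10:
            break
    return top_ten
```

A "Tractor Industries" page that nothing in the tractor top 100 links to would simply never make the final ten, no matter how much off-topic anchor text it accumulated.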
GoogleGuy, if you read this, I'd like to know: do you REALLY think Dmoz is worthy of the authority you guys give them?
While they're at it, you should be penalized for having a hyphen in your domain. I mean, who the hell has a business name with a hyphen in it?
My most humble apologies to the moderators for this. But I must show these two real examples that immediately popped into my head on this one. And their real business names do contain the hyphen.
[harley-davidson.com...]
[merriam-webster.com...]
I bet there are more than a few legal firms or big accounting firms with hyphenated names.
How about people who have hyphenated last names and want a personal website? Is their last name worth less than yours?
I have a couple, or three, listings in DMOZ, but still I must agree. DMOZ is junk. How hard can it be to delete a few thousand dead links?
The funny thing is, they admit that most editors are SEOs affiliated with sites in categories the SEOs edit.
I edited the ODP, except that when I edited, I knew who my competitors were and so I added them. ( The whole idealistic "Republic of the Web" thing got to me. )
To clarify, my competitors had not submitted their sites: I added them anyway.
However, one deep-pocketed site had multiple listings under different domains. I deleted the spam, leaving just one listing for the spammer in the appropriate cat.
<Email excerpts snipped [webmasterworld.com]>
Now tell me this, how many spam domains would Yahoo! have listed? None.
Yahoo! is a serious directory. It could be cheaper, yes, but at least it doesn't go listing every BS site that happens to submit. And their editors are accountable.
[edited by: ciml at 5:51 pm (utc) on Sep. 30, 2002]
All my domains (6 of 'em) are hyphenated. I feel it's easier to read. Of course, I have the same domains w/o the hyphen redirecting to my sites, but the sites are known by their hyphenated names.
What would my domain look like if it was an information source for American Standard Shole? (Polyurethane Shole) That's quite long, so let's abbreviate: ass****.com
I'm thinking a hyphen would do nicely: as-shole.com
1) Devalue links from pages that have not been updated in the past year.
Steveb
I had a similar staleness comment here [webmasterworld.com].
Stale does not always mean bad:
Is the original PageRank paper of Sergey and Larry less authoritative so many years later?
I would agree with your comment if you added that if those "stale" pages do not get any new incoming links, their outbound links would get a toned-down value.
In the example of Sergey and Larry, that document still gets linked to nowadays, so it's still very topical.
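That refinement — a stale page "stays fresh" as long as it keeps earning new links — could be expressed by counting staleness from whichever sign of life is most recent, the last edit or the last new inbound link. A sketch of my own framing of the suggestion:

```python
def effective_age_months(months_since_edit, months_since_new_inlink):
    """A page's staleness is counted from its most recent sign of life:
    either its last edit or the last time a new page linked to it."""
    return min(months_since_edit, months_since_new_inlink)

# The original PageRank paper: unedited for years, but still earning
# new links today, so its effective age stays near zero.
paper_age = effective_age_months(60, 1)   # treated as 1 month old, not 60
```

Any freshness decay would then be applied to this effective age, so a timeless reference that keeps attracting links never loses its authority.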
2) Somewhat devalue internal links.
-- There is no earthly way a 100-page site with 210 (similar) backward links will ever get the PageRank of a site with 1000 pages that has 195 (similar) backward links. The 1000 links back to the main page matter too much. While a 1000-page site implies more content than a 100-page one, this is certainly not *always* true. The playing field is currently so uneven that the 100-page site is always at a huge disadvantage (as opposed to a small disadvantage).
Purely PageRank-wise I would disagree. Unless you mean that 1000 pages are more likely to earn their own external incoming links (and thus PageRank) and can therefore carry that forward to the other internal pages.
From what I can guess at the moment, internal links do count a little less in this update (not as much as with FAST, though).
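One way to read "internal links count a little less" is as an extra damping factor on same-site edges in the PageRank iteration. A toy power-iteration sketch under that assumption — the 0.5 internal multiplier is purely illustrative, and note that a page whose only out-link is internal still passes full weight, since weights are normalized per page:

```python
def pagerank(pages, links, site_of, d=0.85, internal_discount=0.5, iters=50):
    """Power-iteration PageRank where a link within the same site passes
    only `internal_discount` of the weight an external link would.

    pages: list of page ids; links: page id -> list of out-link targets;
    site_of: page id -> site id.
    """
    pr = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / len(pages) for p in pages}
        for src, outs in links.items():
            if not outs:
                continue
            # weight each out-link, discounting same-site edges
            weights = [internal_discount if site_of[src] == site_of[dst] else 1.0
                       for dst in outs]
            total = sum(weights)
            for dst, w in zip(outs, weights):
                new[dst] += d * pr[src] * (w / total)
        pr = new
    return pr
```

On a small graph mixing internal and external links, an internal-only page ends up with measurably less PageRank than it would under the classic uniform split, which matches the effect posters in this thread think they are seeing.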
Yes, some content is timeless, e.g. Project Gutenberg content, works of art, books of religion, works of fiction; some is very useful for a year or two (we can look back at Dvorak's columns and find out how many times he got it wrong!), while CNN news is out of date in 6 hours.
Freshness is harder to spider than one would think, but I think vita's comment is a good start. Recency is not always good; it hasn't stood the test of review and time. In the case of news sites it's imperative, which is probably why Google should banish all news sites except for front pages (and that includes ours) to news.google.
OK, I'm being provocative... sorry :)
Also, I think I made a better point with #4... if whole domains are not updated at *all* for one year or two years or whatever, their value should diminish over time. The practical value of the Magna Carta diminishes even if it will be important forever.