I think this shows us the power of pre .com keywords in the algo.
Assuming that Blue Widgets is my keyword string, should I just join up and register smartbluewidgets.com? Some of these keyword-laden URLs look so dumb, you'd think it's time for an algo tweak?
Easily brandable. You want "google.com" and not "mykeyword.com". Keyword domains are out - branding and name recognition are in - big time in. The value of keywords in a domain name has never been less to SEs. Learn the lesson of "goto.com" becoming "Overture.com" and why they did it.
I'd say Brett is giving bad advice here in some cases. This seems to make sense if you are running a commercial site selling multiple products. However, if I happen to have a website where I am only selling green widgets, then what is so bad about the domain green-widgets.com? Not only does it get a boost because the keywords are in the domain name, but when people *see* the URL it immediately stands out as a site that may have exactly what they want to buy.
Also, this doesn't seem to apply to a lot of information sites. If someone has a site about breeding dinosaurs, dinosaur-breeding.org sounds like a good domain name to me.
And if there are any doubts about the power of inbound links, check out the cache of result number 3, the Smoky Mountain one. This guy's a titan!
And, things like the fact that if I ran keyword1keyword2.com, I sure as heck would have optimized the on-page text with lots of instances of keyword1 and keyword2. I suspect other webmasters would too. I also totally disbelieve that Google is actually trying to parse out keyword1keyword2.com as if it were the same as keyword1-keyword2.com. Take a look at this very domain, webmasterworld.com. Parsing that URL without hyphens, is this a site for webmasters of the world, or is it a site for those seeking to achieve world domination via the Internet? ;) If Google parsed it as web-master-world, this site would rank a lot lower than if it parsed it as webmaster-world. Because of this danger, not trying to parse URLs makes sense in an SE algo.
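The ambiguity being described is easy to demonstrate. Here's a toy dictionary-based segmenter (the `WORDS` dictionary and the algorithm are invented purely for illustration; nothing in this thread says Google actually does anything like this):

```python
# Toy illustration: splitting an unhyphenated domain label into
# keywords is ambiguous. WORDS is a tiny hypothetical dictionary.
WORDS = {"web", "webmaster", "master", "world"}

def segmentations(label, prefix=()):
    """Return every way to split `label` into dictionary words."""
    if not label:
        return [list(prefix)]
    results = []
    for i in range(1, len(label) + 1):
        head = label[:i]
        if head in WORDS:
            results.extend(segmentations(label[i:], prefix + (head,)))
    return results

print(segmentations("webmasterworld"))
# -> [['web', 'master', 'world'], ['webmaster', 'world']]
```

Both splits come out, and a search engine has no way to know which set of keywords the site owner intended, which is exactly why trying to parse unhyphenated URLs is risky.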
The problem is that this is too easy. All it takes is buying a domain name and getting links back. No great content needed, just the term in the body or tags somewhere. I am sure Google will soon find a way to make sure that the title of the site is well related to the content.
This DOES create a potential for spamming just because it is so easy. It IS possible to get great rankings with a very poor site.
My feeling is that the Google ranking algorithms are always changing, and as optimization techniques catch up with them, they will change.
This may include taking the content of the whole site into account as well as the page only, and maybe theming in incoming sites.
I'm not too worried. The only options are 1) to wait a while for Google to improve, or 2) to join the crowd using their knowledge which works at present. (We have done both for different sites.)
Long term, the only option is content content content and target target target, and using as many of Google's stated 100 criteria (that you can guess) to make your sites more attractive to them. Spread your optimization efforts over several techniques. That's long term.
I disagree in that they seem to be relatively large. However, things like the keyword in the anchor text of links, keyword in page title, on-page text, and PageRank are also very important. A LOT of this will have to do with how competitive the search keywords are. For very non-competitive keywords, just having them in the URL, with a little on-page optimization, may be enough to get close to the top. If there are a LOT of pages that match the keywords you are aiming at, not so. For example, I could probably come up #1 for a search on "capybara pollination" pretty easily with just the domain name capybara-pollination.org and a link from one page with a low PageRank. Dunno that many will ever search much for that though. ;)
Absolutely. The only problem seems to be with those who do not use domain names to reflect the content of their own sites.
If www.blue-widgets-with-stripes.com is on topic for blue widgets with stripes, where's the problem? It's simply telling the punter what the site is about.
Furthermore, I bet 99.9% of sites that use descriptive domain names are actually descriptive of the actual content. Why would we use off-topic domain names?
If Google uses this fact I just don't see a problem. It seems perfectly sensible to me.
And again (I've been here before), names like this are fantastic for real world... billboards, sides of vans, etc. I just can't get my head around why anyone objects (other than perhaps that they don't use them).
Age and wisdom are supposed to tell us that when, time and time again, friends and peers disagree with you continually and consistently, then perhaps you are, just this once, obviously, wrong, and that it's time for a rethink. So, colour me very young or very stupid and stubborn, because I still think that the no-brainer-toss-in-my-keywords domain 'trick' IS overly weighted in the algorithm.
One bad apple and all that aside, in my area (travel, not affiliate stuff, where it's even worse), I too see high-up pages with titles such as new_document.html or default.html, and the sites with hyphenated-keyword domains are, pretty much without exception, Johnny-come-latelys, with little content, little optimisation, little pretty much anything you care to name, except the great big (yet hidden) signpost that screams 'I'm a mirror/doorway/gateway/harvester'.
OK, disclaimer first - our site is number one, or, at very worst, on page one, for pretty much any keyword combo or phrase you could dream of. The last few Google updates have loved us, we're getting daily trawls and pretty much daily Fresh tags. Couldn't be better. So, this is not sour grapes, just a well-tempered and informed-by-long-observation take on things. And that take is? - IT'S JUST TOO EASY TO DO. Sorry for shouting.
No finesse, no time served in terms of content and link-building, no reason whatsoever for that site to be where it is. You'll just have to trust me on this - I do know our market, and I do know the players - it's not that big a pond, and the bulk of these sites really are cookie-cutter types - 'use once and throw away' - that weave a breadcrumb trail home to Daddy. Not in a way that Google will follow, but once an enquiry is made through such a site, then the punter is led, smiling innocently, towards the real portfolio beef.
With hindsight, I too would have gone for a slightly more on-the-money name for our site. But that was four years ago, and now it doesn't matter. I have no problem at all with the principle - aside from the fact that most of these domain names look pretty ugly - but the (supported in this thread) assumption that 'it's called we're-really-great-at-this.com so then, like, hey, they must be' seems to me to be surprisingly naive. Why are you all so willing to implicitly trust a domain name, but 99.9% of you wouldn't trust a keyword metatag if you'd given birth to it yourself?
That's what I find most odd. The fact that someone has gone to the not overly taxing extreme of spending a very few Dollars/Euros and come up with a sledgehammer-to-brain-and-wallet domain name is enough to convince a lot of people that the site will be, by extension, right on the money, no doubt about it deserving of a high ranking. It says so on the label, doesn't it?
Maybe your neighbourhoods are different, but in mine, it looks like a get-ranked-quick scheme, and it does seem to work.
EDITED TO ADD - Napoleon, we've head-butted on this before, I know. It's not the usage that bugs me - sure, it makes sense - no, it's the rewards. That, after all, is how this thread started out, and that is what I have a beef with.
In the light of this, isn't that a perfectly good fact to consider when ranking sites? It's a lot safer than many of the criteria they use.
As for people clicking on it... well... perhaps they too have learned through experience that the domain name usually reflects the content. Also, maybe I am right about the real world - people actually remember the domain name far more readily if it is a proper construct.
The neighborhood may be the problem.
This goes back to my criticism of Brett: "Domain name: Easily brandable. You want "google.com" and not "mykeyword.com"." D'ya think everything on the WWW is somebody shilling something? I personally use e-commerce sites only rarely. I suspect there are many on the Internet like me. What if some gourmand who liked Italian food wanted to share a lot of recipes he had collected and registered pasta-sauce-recipes.org? (And, before the mods jump on this, I checked and there is no domain like this registered.) Informational sites and the like will tend NOT to try to develop a "brand" for their domain, and will choose a domain with keywords. As Napoleon said: "It's simply telling the punter what the site is about."
No algo is perfect. NONE will work well for all searches. However, have you considered that the Google algo may work well in a lot of other instances? Also stop and consider that in a highly competitive and lucrative search area such as travel, there will be a LOT of SEO types doing all that they can to fight to the top, and to beat the Google algo. However, for the info sites like the hypothetical Italian food one I mentioned, the webmaster is more likely to pick a name that tells the punters exactly what they will get.
One question to mat: Have you checked exactly what PageRank these "Johnny-come-latelys, with little content" have? If they are doing this for money, I wouldn't be surprised if they have managed to use some other Google SEO tricks besides just a keyword1-keyword2 domain name.
similar thread in parallel here at the moment: [webmasterworld.com...]
The basic question is:
All other things being equal, should Johnson.com selling blue widgets rank lower than Blue-widgets.com, just because of the different url name?
The answer is of course NO! The added effect should be nullified.
The sad reality for the moment and in the past is YES! (amongst others within Google, because of the anchortext effect).
It has been mentioned earlier within a Google suggestion/discussion thread here [webmasterworld.com] and there [webmasterworld.com], and with a bit of luck it has also been plugged in as a todo! :)
I would think it best to just disregard the separated words in anchor texts should they be part of, or equal to, the separated words of the linked-to core URL, and instead to use other factors such as the nearest surrounding text etc.
The link, and its PageRank gift, and all the other secret benefits should be credited, but not this see-through effect, in my mind.
[edited by: vitaplease at 7:45 am (utc) on Sep. 6, 2002]
Has anyone actually seen any statistics about what percentage of domains in actual use are commercial sites, rather than non-profit info sites? There is absolutely no logic for a non-commercial site to use keyword-stuffed domains that don't reflect the content. And, even in the case of commercial sites, I tend to think that in most cases a green-widgets.com actually does sell green widgets.
>The answer is of course NO! The added effect should be nullified.
But would the person running the pasta-sauce-recipes.org I mentioned choose something like brandname.org? NO. What doesn't work well as an algo for johnson.com does rather well for pasta-sauce-recipes.org. And, if you throw the keyword-in-domain benefit out, then if johnson.com is selling blue widgets, maybe I can succeed by stuffing lots of instances of "blue widgets" into my page text rather than the domain name.
Have you checked their backward links? They might all be owned by the same people.....site rings, etc etc etc
One tip here. I have seen multi-keyword url sites get high rankings in fairly uncompetitive terms with no back-links showing in Google for the keyword urls in anchortext.
If you however check with alltheweb's backlink function you can see plenty of links (all below PR4, but in quantities together they do add effect), all using the keywords-in-URL as anchor text. This occurs more often on non-English-language pages, which naturally have much lower PRs.
Don't get me wrong. Anyone should use pasta-sauce-recipes.org if that suits them and their clients/visitors. But poor old Johnson.com being in the pasta sauce recipe business long ago, before Google's algo kicked in, should have just as much chance in high rankings as pasta-sauce-recipes.org.
>And, if you throw the keyword in domain benefit out, then if johnson.com is selling blue widgets then maybe I can succeed with stuffing lots of instances of "blue widgets" on my page and stuff the page text with keywords rather than the domain name.
The point here is that Google has done an excellent job in cleaning out keyword stuffing and other on-page SEO over-emphasis in their algo ranking.
But my tendency is to think that for the most part sites with keywords in domains *do* tend to reflect the content. Stop thinking commercial for a second (which seems to be the area of interest of a lot of people posting here) and think of websites run by the little non-commercial sites and such. The Google algo tends to do well in most of those cases. By lowering the importance of the keyword in the domain, Google might well produce better SERPs on searches for terms that happen to be the sort people SEO for, but lower the quality on searches where few if any in the SEO world are gunning after the terms. Searches on travel will improve, while searches in other areas will get worse.
>But my tendency is to think that for the most part sites with keywords in domains *do* tend to reflect the content.
I agree as it's only one part of the whole "ranking" process. If you had big-yellow-widgets.com and actually sold cars on the site - it wouldn't rank at all well due to the lack of Widget content and no links from Widget-related sites. But if you sold Big Yellow Widgets and had Widget pages and had good quality Widget links - it would!
Man, I need to start selling Widgets....
>think of websites run by the little non-commercial sites and such
As much as I sympathise, it's an ugly commercial world out there. How to have two standards, how to choose between big-bad-commercial and mom-pop-non-profit? And who says the latter should be given a ranking boost?
Also the little non-commercial sites will rank for what they should rank, because there are still enough other ranking factors.
If I had big-yellow-widgets.com as the domain and sold cars on the site I could get it to rank well for 'purple bunnies' if I had enough quality inbound links. That's where I see the flaw in the algo...
Possible, but very difficult the more competitive it gets, and not very likely to be sought after by professional SEO spammers, as they would add 'purple bunnies' on-page to rank even better.
>That's where I see the flaw in the algo..
Actually, that's where I see one of the strengths of the algo.
If blue widgets suddenly are the cure for an awful disease, the site on blue-widgets containing only blue-widgets information can still help searchers looking for a remedy against that awful disease, thanks to the inbound links.
ROFLMAO. Good point, as you are right. One of the inherent problems with *any* search engine algo is that so long as there are some people out there who have a basic idea of how that algo works, those people WILL find a way to score high by using that knowledge. The trick for any search engine is to come up with SERPs that are good *on average* across a wide variety of search terms. When it comes to searches on very competitive phrases in a commercial area, there will be a natural tendency for the quality of any search engine to suffer, because the SEO types have played games to make sites that don't deserve it come up high. Personally, I have seen exceedingly few instances where keyword1-keyword2.TLD domain name sites have lowered the quality much. This likely has to do with the fact that I almost never do searches on terms the SEO types care about. Google works very well for *me* because I don't use the Internet much to look for things to buy. Non-commercial sites don't tend to have domain names that don't fit what they are about.
>Also the little non-commercial sites will rank for what they should rank, because there are still enough other ranking factors.
Are you sure that is true? I ended up going out into Western Samoan namespace for my main site (anyone can check my profile if they care) because domain name speculators have (foolishly) grabbed up one of the 3 main single keywords I had to optimize for. None are in use, because nobody will ever buy them. Of the 4 other sites run by others on a similar theme, 2 of them aren't doing nearly as well. I got one of the 2 out of Google nowheresville by doing some SEO for him. The other ain't in the top 100 in Google on this all-important search word, *even though* that site's home page has a PageRank of 5, equal to mine.
As for it being an ugly commercial world out there, it has been my experience that most people typically use the Internet looking for non-commercial sites, and only rather rarely commercial ones. Google needs to please the searchers out there to get eyeballs. Now, where does Google rank in search engine popularity at the moment? #1. If Google has to choose between big-bad-commercial and mom-pop-non-profit, the obvious priority should be the latter.
Also, have you considered for a moment that Google *makes money* from those "Sponsored Links" that come up in searches? Google actually *benefits* if searches on commercial terms come up with lousy SERPs; those "Sponsored Links" then seem much more appealing to click on. ;) Selling something on the Internet? Sign up and pay for Google AdWords Select. THEN Google will care about your big-bad-commercial site and put it on page 1. Did you forget that Google is a big-bad-commercial website itself? They ain't in business to give businesses good SERPs for the search terms they want. That's why Google AdWords Select exists.
At the moment Google is what it is because it is striving to serve the best most relevant information for free.
The best information could be highly commercial or completely non-profit; I wonder whether you could judge for others where they would want to find it, or which kind of site (commercial or non-profit) generally delivers that quality.
Google's algo using Weighted (PR), Motivated (anchortext), Citation (links) seems to be the best available to differentiate between results and not the single fact of being commercial or non-profit.
Example: I would say the publication "Nature" to be highly informative and generally of high quality, furthermore it has peer-review and is edited for mistakes and sources. Is it commercial? Yes.
The point of this thread's discussion is that, on some occasions, there seems to be an over-weighting of keyword-rich separated URLs, disturbing the above-mentioned rankings of the best information.
The hope is that these discussions lead to improvement, perfecting results rather than just averaging them out.
>The discussions hope to lead to improvement, with perfecting and not necessarily averaging out results.
I'll accept that this happens on some occasions. I've never actually seen it be a material concern, and I normally use Google. Thus I would disagree that keyword-rich separated URLs are overweighted. That this averages out seems very acceptable to me at the moment.
Can you come up with a specific protocol that could be added to the Google algo, one that could distinguish the keyword1-keyword2 URLs that deserve their weighting because they lead to good rankings of the best information from those that lead to poor rankings? Unless you can, the algo seems just right in this regard to me as it is now.
>Can you come up with a specific protocol that could be added to the Google algo that would be able to distinguish the difference between a keyword1-keyword2 URL that will properly weight these sites such that this causes good rankings of best information, and those that lead to poor rankings of information?
No I'm not a programmer. 50 Google PhD's might though :)
But conceptually some ideas?
1. In general, leave the weighting of the anchor text in the algo as it is.
2. If the anchor text equals (or is a main part of) the to-be-linked-to URL, average out the keyword weighting (in anchor text) in the algo with the surrounding text.
3. For high-PageRank pages such as DMOZ/ODP, use the description instead of the title for the anchor text algo value (again, if the title equals the keywords of the URL). DMOZ/ODP often tend to use the URL or company/site name for the title.
4. If there is no surrounding text, just use the link for PageRank and all the other unknown factors. (Would that be fair to Johnson.com?)
5. There is probably an average of how many links in the whole Google index are motivated (descriptive word(s) in anchor text), how many are unmotivated and equal to the URL, and how many are completely non-related (e.g. "click here"). Google could use these averages to neutralise any over-emphasis created by multi-keyword URL effects.
any others - someone ?
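To make idea 2 above concrete, here is a rough sketch. Everything in it is invented for illustration (the helper names, the weights, and the scoring model); it is not Google's actual algorithm, just one way to "average out" an anchor that merely echoes the target URL's keywords against its surrounding text:

```python
import re

def url_keywords(url):
    """Split the domain label of a URL into its hyphen-separated keywords."""
    host = re.sub(r"^https?://(www\.)?", "", url).split("/")[0]
    label = host.rsplit(".", 1)[0]          # drop the TLD
    return set(label.split("-"))

def anchor_weight(anchor, url, surrounding_text, full=1.0):
    """Return a link-text weight in [full/2, full].

    If the anchor text just repeats the keywords in the target domain
    (a 'see-through' link), its weight is averaged with how well the
    text around the link independently supports those keywords.
    """
    words = set(anchor.lower().split())
    if words and words <= url_keywords(url.lower()):
        context = set(surrounding_text.lower().split())
        overlap = len(words & context) / len(words)
        return (full + overlap * full) / 2
    return full  # independently chosen anchor text keeps full weight

# A link saying "blue widgets" to blue-widgets.com, surrounded by
# unrelated text, gets only half credit:
print(anchor_weight("blue widgets", "http://www.blue-widgets.com",
                    "a shop selling garden tools"))   # -> 0.5
```

An anchor like "great widget shop" pointing at the same domain would keep full weight, because it was chosen by the linking author rather than copied from the URL, which is exactly the distinction between "motivated" and "unmotivated" links drawn in point 5.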
Wrong. At least part of it. Google is trying to serve the most relevant results to ensure they gain even more market share which will in turn increase revenue. They aren't in this game to make sure Billy can find pertinent research papers. Profit is the motive. Relevant results are the means to that end.
Google has also created their share of problems. PageRank is for sale, and since that is definitely part of their algo, it means that their SERPs are for sale - to the highest bidder with the most talented SEOs. Just because a site is listed in their free index and didn't purchase an ad directly from Google doesn't mean that the position wasn't paid for. Companies with large ad marketing budgets can afford to buy all the PageRank they require and hire SEOs that know what to do with that PageRank. Even though Google doesn't outright sell positioning, that doesn't mean they can escape the commercial aspect of the medium.
Since Google uses offsite factors in determining position, and they weight anchor text in inbound links, they've opened themselves up to manipulation from these off-the-page factors, and it can be embarrassing for them. Fortunately for Google, commercial sites strive to provide relevant content, because that's how they make sales.
The best information could indeed be commercial or non-profit, but Google is starting to lose its ability to find "the best" information on its own, because it's dollars driving the web, and if Deep-Pockets wants to rank well for alley-oop widgets, Mr. Deep-Pockets will crush Ma and Pa Alley-Oop, even if Ma and Pa Alley-Oop were lucky enough to purchase Alley-oop-widgets.com.
I don't really care if Google tones down the URL component of their algo or not. All I want to know is how to manipulate the algo so that my client's sites rank well. Yes, I still get a thrill out of beating out the competition on a small budget, but for those deep-pocket clients that want results no matter what, it's quite easy to deliver. I don't care if Google becomes a "perfect" search engine, nor do my clients. They want to sell widgets. I want to help them sell widgets. Google just wants relevant widgets. As I see it, it's win/win.
>>keyword rich urls are beating good pr and well optimized pages!
Google doesn't care about that either. They want happy, mouse-clicking surfers that only have to click once or twice and leave. The thrust of the discussion seems to be that keyword-rich domains are favored by Google, with the theory (a good one at that) :) that it's because anchor text in inbounds reinforces the keywords in the domain, title, URL, headers and body text. That makes the end-game pretty easy. Buy a keyword-rich domain, or find a way to level the playing field. Competition has widgets.com, eh? Well, get a good number of PR7 inbounds and a couple of PR8s and see how the competition holds up. Better yet, supply your own anchor text for those PR7 and PR8 inbounds. Then the competition, which is now below you in the SERPs, will be here in the forums debating the merits of Google toning down the PageRank component of their algo. ;)
>Wrong. At least part of it. Google is trying to serve the most relevant results ...and... Relevant results are the means to that end.
Exactly, you are pushing for the same :)
Is the algo perfect? No.
But there are so many ways to counter possible short-comings in the future:
(at least if Google remembers where it came from)
- Against buying pagerank [webmasterworld.com]
- Against keyword rich urls [webmasterworld.com]
- Against heavy interlinking of separate sites [webmasterworld.com]
and that's just some quick & dirty from one mind ;)
Will the cash-rich still have an extra edge? Yes. Always. It's an imperfect world. Not only on the web.
But publish a brilliant idea, concept or overview, and eventually you can surpass them all, with near to no budget, far more easily than in the real, non-web world.
(there are still mom&pop Pagerank 8 sites around for example).
Only partially right. Google is definitely interested in gaining even more market share. However, if Google does a good job of getting Billy pertinent information for his research paper, Google may become the search engine Billy uses for everything. The Google algo likely will do Billy well, because the sort of people with websites that have pertinent information for research papers likely aren't competing against commercial sites trying to get top SERPs for the sort of things people writing research papers use as search terms.
Now, if Google can get Billy to be a regular user, do they care that much if their SERPs are great when he searches for "widget sales"? Or would Google perhaps prefer lousy SERPs, and hope Billy will click on the Google AdWord to the right that says "Discount Widget Emporium"? I noticed that on a search for "peloponnesian war" there were no AdWords, so there Google has no benefit from lousy SERPs. ;)
>- Against heavy interlinking of separate sites
Both of which would hurt our perfectly legitimate site... we have our key phrase in the URL on our "directory" site, we have the main keyword in the URL of our multilingual site, and both sites are linked from every page of our information site.
The multilingual site has to be heavily cross-linked... you can't tell where a visitor will arrive, and you can't guarantee to get them to the correct language by content negotiation alone... so penalising interlinking is penalising well-organised multilingual sites.
All you whiners should wake up and realize that Google doesn't give a damn about what you think they should do. They are a "for-profit" business and they will (and should) do whatever they think is in THEIR best interests. Welcome to free enterprise.
IMO most people here would be better off focusing on studying what they can control to improve THEIR rankings rather than complain about things they have no control over. Feel free to disagree.