It's been my experience that keywords in the domain name counted substantially for positioning...once upon a time. I don't believe that is the case anymore.
I have sites that are extremely well ranked for highly competitive search phrases that don't have any keywords in the domain. It appears to me Google greatly diminished the importance of keywords in the domain a long time ago.
A few years ago, if you searched Infoseek or AltaVista for "Paris hotels," you were likely to find a lot of Web pages for individual Paris hotels. Today, those pages are lost in a cornucopia of hotel booking sites that, in many cases, are virtually identical. Those sites may not be "spam" from a search engine's perspective (or from a user's, for that matter), but they get in the way when the user is looking for information that he hasn't already seen a dozen times.
Even better, I'd like to see a way for users to filter search results by criteria such as:
- All sites
I realize that this latter suggestion isn't practical, since (a) it's probably impossible for an algorithm to distinguish between an editorial site and an e-commerce site, and (b) the line between editorial and e-commerce is often blurred. Still, maybe Google could experiment with a filter that's designed to prioritize (rather than to exclude) by type of content. That way, a reader who's trying to learn about (for example) Nikon digital cameras could choose to have articles, reviews, etc. listed before pages that fit an "e-commerce" profile because of their layout, quantity of text, links to shopping-cart software, etc. And the reader who wants to order a Nikon digital camera could choose to have e-commerce pages listed first.
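The "prioritize rather than exclude" idea could work as a stable re-sort of the result list. A minimal sketch, assuming a hypothetical classifier that assigns each page a made-up "commerce_score" (0.0 = editorial, 1.0 = e-commerce) from layout, quantity of text, shopping-cart links, and so on:

```python
# Hypothetical sketch: re-rank results by content-type preference without
# dropping anything. The "commerce_score" field is an assumption standing
# in for whatever a real editorial/e-commerce classifier would produce.

def prioritize(results, prefer="editorial"):
    """Stable sort: preferred content type first, original ranking order
    preserved within each group, and no result is ever excluded."""
    if prefer == "editorial":
        key = lambda r: r["commerce_score"]
    else:
        key = lambda r: -r["commerce_score"]
    return sorted(results, key=key)  # Python's sorted() is stable

results = [
    {"url": "shop-a.example", "commerce_score": 0.9},
    {"url": "nikon-review.example", "commerce_score": 0.1},
    {"url": "shop-b.example", "commerce_score": 0.8},
]

editorial_first = prioritize(results, prefer="editorial")
commerce_first = prioritize(results, prefer="e-commerce")
```

Because the sort is stable and nothing is filtered out, the reader who wants reviews and the reader who wants to buy both see all the same pages, just in a different order.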
Also, for those complaining about keyword rich URLs, did you ever consider that it's the keyword rich TITLE that gets these sites high placement and not the domain?
Yes, non-keyword domains can rank well; we have the same experience. But preference in the serps is given to 'index pages'. The content of that page will compete with other index pages, and good PR etc. will sometimes win through. But there's a limit to how many keywords one index page can target, and even in this scenario a keyword domain helps.
We are talking about all the other keywords for which the second-layer pages should do well in the serps. But they don't if they are competing against 'index page keyword domains'.
So if you have one site with twenty pages, only the index page can seriously compete on competitive phrases (a generalisation, I know, but a reasonable one). So the solution at present is to have 20 index pages, hence 20 sites. This also gives the dmoz entry keyword relevance.
2 months ago this technique was used in the UK employment sector with devastating effect. In their (not me!!) first time in the serps they had no links in and top 10 positions for the keywords targeted. They now have a few links in and are still there. The sites have little content as it is a start-up company. They targeted very competitive phrases with ALL the domains entered into Yahoo... who have made a special category just for them! (big investment, so Yahoo were not going to turn them down). The domains are VERY clever, which gives them high ranking in alphabetical directories as well. The pages themselves are not very optimised... but the domain and title win through.
And you reckon it is a thing of the past.....! Look at the serps, they are everywhere, with rubbish content and few links in.
it looks a whole lot more meaningful to the casual surfer.
It absolutely can, but Google does not value keywords or keyphrases in the URL for their own sake (and if it does, the effect is negligible).
Google does value the anchortext from external incoming links (a lot).
Company name: Rank higher Solutions
Home/index page title: Rank higher Solutions
DMOZ listing (very often):
Title: Rank higher solutions (as anchor-text link) towards www.rank-higher-solutions.com
For low/medium competitive search phrases, it works, as one of the many criteria for ranking higher on e.g. the search query "rank higher". At least a lot better than for the competitor (called "Plasma") with a company name, URL and received anchor text containing and equalling the non-relationally-descriptive "Plasma".
If you were a (basic) Google and you had to choose between the value of these two different incoming links (no other incoming links around), which site would you let rank higher?
For competitive search phrases, the real quality of the "Rank Higher" site will play a much more important role, through the number, the anchor text and the PageRank of its incoming links and the key-identifying texts on the pages where those links are placed - and there the Google algo works OK.
[edited by: vitaplease at 9:07 pm (utc) on July 18, 2002]
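The "Rank higher Solutions" vs. "Plasma" comparison above can be made concrete with a toy scorer. This is only an illustration of the anchor-text idea, not Google's actual algorithm; the function and numbers are made up for the example:

```python
# Toy illustration (NOT Google's real algorithm): rank two sites for the
# query "rank higher" purely on the anchor text of their inbound links.

def anchor_score(query, inbound_anchors):
    """Count how many query terms appear in each inbound anchor text."""
    terms = query.lower().split()
    score = 0
    for anchor in inbound_anchors:
        words = anchor.lower().split()
        score += sum(1 for t in terms if t in words)
    return score

# One inbound link each, as in the example above.
site_a = ["Rank higher Solutions"]   # descriptive company-name anchor
site_b = ["Plasma"]                  # brand-only anchor

score_a = anchor_score("rank higher", site_a)
score_b = anchor_score("rank higher", site_b)
```

With no other signals around, even a very basic engine would prefer the site whose single inbound link already contains both query terms.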
Everything said above is frequently checked by Googlebot and Google's anti-spam algos. Google does a better job than our editors and robots. Our policy is more stringent than Google's TOS; let's say it's Google's quality standards plus our policy regarding content requirements. The editor's mission should be reduced to serious quality control reviewing visible content. Why must the editor check the webmasters' honesty just to ensure we're not penalized for linking to sites we didn't identify as a 'bad neighborhood'?
What I wish to have is a function to eliminate cheaters like
returning a string like
page/domain is penalized
page/domain is banned
page/domain is not (yet) crawled
no known issues
[too bad you won't return the page rank here ;)]
This may sound like too specialized a request, valuable for a handful of sites only, but in fact it's a tool zillions of site owners could use.
There is no other SE rating content as accurately and fairly as Google.
There is no other SE crawling the web as a whole as frequently (FAST tries but does not succeed).
There is no other SE staff being so competent and helpful as Google staff regarding spam issues (thanks to Matt and colleagues for all your help and support! BTW, I won't kiss asses and I'm not cheerleading).
Why not make Google the instance for link and index spam?
Thank you for your time reading my rant
Keyword rich domain... example
Do a search for 'Finance jobs' worldwide.
Number 3 is a frames site with the description 'this page does not support frames'! It has 18 links in and no dmoz. It beats quality sites that have 860 links in and a dmoz listing AND specialise in finance jobs.
The difference? Both have finance in the title, but the winner has it in the domain.
Am I right?
A few weeks ago someone suggested the PR bar turning red for penalized sites and I definitely feel this would be a good thing. Another possibility would be a "Penalized" mark on the info:www.yourdomain.com page...
A few weeks ago someone suggested the PR bar turning red for penalized sites
Yikes! That might scare people away ;) There certainly needs to be some way out of the Google hole. I've not many pennies, but a pay-for-review service or something would be wonderful. Or even pay for specific Google support. Heck, that IS a good idea!
I dunno - there are two other things I would LOVE on Google:
(1) An FTP search like All The Web
(2) A multimedia search for wav, mp3 etc. (hehe, doubt this will happen.........)
1) Preserve preferences across search types.
I have a bookmark to the normal Google search, which automatically sets it to return 100 listings at a time (can't stand being automatically redirected to google.de, and my cookie file is write protected). Now if I have some search results, and click on any alternative search link like related, directory, or image search, then I am automatically reset to 10 returned listings.
This is one of the most annoying "features" that Google currently has. If I have told the engine that I want 100 listings, then I expect it to adhere to that until I tell it otherwise. Of course I realize that there will never be 100 returns per page in an image search, but the preference should still be remembered for when I switch back to normal searches.
2) Use sort priority for subcategory listings in the directory. The current arrangement is a bloody mess, and could be made a lot more user-friendly by just applying the information already present in the RDF dumps.
3) Allow a more flexible configuration about which file types to search for. Right now, I can search either for all types, or just for one. But I might (and often do) want to search for both .html and .pdf documents, but not .doc, or for everything except .pdf, etc. That would require a simple list of file types, each with an "include" and an "exclude" radio button. It would probably also be a good idea to have "non-web" formats like .doc set to "exclude" as the default.
4) The AdWords Select minimum bids must reflect the geographic target area. A click that is only accessible from Elbonia can't possibly justify the same minimum cost as one available in the US (or worldwide).
5) Make all pages on google.com validate against whatever version of HTML you choose.
6) Provide a Toolbar for Mozilla on Linux and the Mac.
Oh, and don't change too many other things... ;)
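Suggestion 3 above (per-filetype include/exclude) could be sketched as a simple settings table. Every extension gets its own state instead of the current all-or-one choice; the type names and defaults here are just assumptions, with "non-web" .doc excluded by default as proposed:

```python
# Sketch of a per-filetype include/exclude search preference. Each file
# type has its own "radio button"; these defaults are assumptions for
# illustration, not Google's actual preference model.

DEFAULTS = {"html": "include", "pdf": "include", "doc": "exclude"}

def allowed_types(overrides=None):
    """Return the set of extensions to search after applying user overrides."""
    settings = dict(DEFAULTS)
    if overrides:
        settings.update(overrides)
    return {ext for ext, state in settings.items() if state == "include"}

default_search = allowed_types()                                  # html + pdf
html_and_doc = allowed_types({"pdf": "exclude", "doc": "include"})
```

This covers both cases from the post: "both .html and .pdf but not .doc" falls out of the defaults, and "everything except .pdf" is a single override.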
And PR of substantial value might well be flowing through Yahoo to those domains that UK employment firm created in that special category Yahoo set up for them. And if the site title of each contains highly appropriate sets of keywords, then probably that is how those sites are now responding in SERPs to search queries, since that site title translates directly into anchor text for the link at Yahoo.
I suspect that is the case. I doubt very much if keywords in the domain, alone, are responsible. For what it's worth, I can show you a lot of sub-pages that rank higher than index pages in highly competitive categories. What does it take to do that? Inbound links to those sub-pages with suitable keywords in the anchor text, good optimization of those sub-pages, and some quality PR from inbound link sources.
Ode to Google
or "WYSIWYG" (when you skip it with your Googlebot)
Google, oh Google, how much suffering you have wrought,
When you give my site a PR0, when you skip it with your Googlebot.
My life is not worth living, I'm in anguish! I'm in pain!
How I long, for your attention, but you ignore me, to my shame.
The competition laughs at my misery, my client cries out in a rage,
Why, oh why, have you forsaken me? What if I make you a new page?
Google, oh Google, how I long for your PR10.
Oh, that you would please, just grant me this, (and a number one SERP, for each keyword within).
But I know somewhere in a server, oh so very far away,
There is a search engine that will give you, number one, if you pay.
So I will hold my chin high, my site's referrals, are all shot,
And suffer through another month, when you skip it with your Googlebot.
I doubt very much if keywords in the domain, alone, are responsible
I hope you are right, but for the moment I give the benefit of the doubt to my own observations. You are right about the importance of titles, but that does not alter the fact that many webmasters/directories simply take the central URL text, without the "www" and "com", as the link text towards that site.
The nice thing about this forum is that, given the specific interactive audience of e.g. this thread, the simple fact of mentioning this possible general anomaly or "short cut" might guarantee a very exciting second half of 2002, without having to divert attention from one's own site to several keyword-rich URL sites, just for SEO's sake.
Axacta, that's the way I like it..
number one: An all-white PageRank bar just means your site sucks and needs a lot of work.
number two: An all-yellow PageRank bar means that you got a penalty for something; if you change that something by the time Googlebot comes back to town, you will be golden again.
number three: An all-red bar for the Google death penalty. Site is toast, server is toast, and anyone dumb enough to link to you is toast (domain-wise).
I think that will do a great deal of good, and people will know whether they should get a different domain and start over, or go back over what they did and change something to get things going again.
I'm done yappin.
Googleguy, please let your Google Directory toolbar geometrist check his eyesight or his mathematics. Just let the Directory toolbar's distribution coincide and be consistent with the normal Search toolbar's. Certainly if they are both updated simultaneously. That alone will reduce the number of Pagerank postings here by 10%.
(sorry Chris_R if this will give you less visitors ;))
I agree, but they have got their position by money and not merit. What about the previous example I posted? No yahoo link there. That has got to be the domain name kicking in.
Sub-pages can compete, but that requires huge amounts of work to get the links in, the PR and the optimisation. It is not a level playing field when comprehensive sites are pushed down in favour of a $30 domain name.
It seems to me that the issue is 'who actually owns a site'. If google could detect that a whole bunch of sites belongs to the same company, then it can limit their presence in the serps. The question is how? Incentive has to be given to creating big comprehensive sites, in that way the serps improve, with more variety in the top ten around a theme, rather than 9 "keyword.com" rubbish all selling the same topic.
I'm still wary about link popularity.
Large numbers of reciprocal links can exist because:
1) The webmaster spends a lot of time getting them and he's good at it.
2) The webmaster handles a lot of sites
3) The company has a lot of sites interlinked.
4) The site belongs to a big network, e.g. universities or schools each with their own site.
All these only prove that they have friends and influence, not content. Unreciprocated links may be a better indicator of quality, as they are an unconditional recommendation, and less easy to trade. Perhaps these should be given more points. The user does not care whether the site is well linked; they want visible content which is relevant. As far as I can see, reciprocal links and quality have nothing in common.
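The "weight unreciprocated links more" idea can be sketched on a tiny directed link graph. The 2-vs-1 weights are arbitrary, just to make the bias concrete; nothing here is how Google actually scores links:

```python
# Sketch: score a site's inbound links, counting a one-way link as worth
# more than a traded (reciprocated) one. Weights are arbitrary assumptions.

def link_score(site, links):
    """links is a set of (source, target) pairs representing the link graph."""
    score = 0
    for src, dst in links:
        if dst != site:
            continue  # only inbound links to `site` count
        reciprocated = (site, src) in links  # does `site` link back?
        score += 1 if reciprocated else 2
    return score

links = {
    ("a", "widget"), ("widget", "a"),   # traded link pair: counts 1
    ("b", "widget"),                     # one-way recommendation: counts 2
}
```

Under this weighting, a site living off traded links needs twice as many of them to match a site earning genuine one-way recommendations.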
Here are some suggestions I'd like to see:
1. Roll out a visual search option. Navigating in a 3d space is fast, intuitive and a lot of fun. Perhaps you should take some pointers from the Touchgraph Googlebrowser???
2. Why not aggregate Toolbar users' movements across the web and make them available in a visual format again? This would be far more useful and fun than the flat hyperlinks seen at Amazon.
3. Get the distributed computing thing happening.
4. Don't change too much too quick. You've got the big flywheel spinning fast, so it just needs a little push and shove to keep it going.
And for all these great suggestions, why not give this community access to pre-IPO stock. Now there's an idea!
<<Do a search for 'Finance jobs' worldwide. Number 3 is a frames site with the description 'this page does not support frames'! It has 18 links in and no dmoz. It beats quality sites with 860 links in and a dmoz listing AND specialise in finance jobs.>>
You and I are seeing different results. I do not find that page. Might it be one of those phantom "new since last update" pages?
Incentive has to be given to creating big comprehensive sites, in that way the serps improve, with more variety in the top ten around a theme, rather than 9 "keyword.com" rubbish all selling the same topic.
you are heading towards a good point.
An example of a generic posting here:(could have been myself;))
1.My affiliate company sells blue widgets, yellow widgets and orange widgets.
2.I have created three unique sites; blue-widget.com, yellow-widget.com ....
But I have cleverly used different IPs, whois records, and formatting..
3. How much interlinking can I do without being punished, while still increasing both my Pagerank and my ranking above my supplier, widget.com? I have been very cautious in taking in some other independent external links for each of the three sites and have also linked outwards.
4. I have something special in informational content to offer my visitors by splitting this information into three separate sites, because blue-widgets only offers information on blue widgets.. and orange-widgets only..
5. Will DMOZ/ODP list all my highly informational sites several times, within one week just before the next Google update?
A site having e.g. 50 of its 100 internal pages each earning their own external inbound links on the same theme or "set of words", but all from independent sites, should get a site-wide Pagerank or ranking boost over a less comprehensive site of 10 pages of which only the index gets external inbound links of some "authority" with the obvious link-equals-url/title.
I perceive that keywords in domain names seem to give a boost to rankings, and I'd like Google to diminish the importance of that factor... When I search, I prefer to find the detailed internal content page, not the index page of a small site with keywords in the domain name.
For improving search quality, I'm sick 'n tired of pop-ups and pop-unders. I don't want to visit any pages that do that. Is there a way to let the toolbar voting buttons be used to report pages that do popup behavior, coupled with a user preference to exclude pages from SERPs if enough negative votes accumulate? I realize that this mechanism alone could be abused by competitors, so maybe it needs human review to confirm it.
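The popup-reporting idea above could combine a vote threshold with the human-review step the post calls for. A minimal sketch under stated assumptions: the threshold, the vote store, and the "confirmed" review set are all hypothetical, not any real Google mechanism:

```python
# Sketch of popup reporting via toolbar votes: pages accumulate negative
# votes, and a user preference hides a page only once the vote count
# crosses a threshold AND a human reviewer has confirmed the behavior
# (guarding against abuse by competitors). All names are hypothetical.

THRESHOLD = 50  # assumed minimum negative votes before a page is eligible

def filter_serps(results, votes, confirmed, hide_popups=True):
    """Drop pages with enough confirmed popup reports, if the user opted in."""
    if not hide_popups:
        return results
    return [url for url in results
            if not (votes.get(url, 0) >= THRESHOLD and url in confirmed)]

results = ["good.example", "popup.example", "reported.example"]
votes = {"popup.example": 120, "reported.example": 80}
confirmed = {"popup.example"}   # reviewed and verified by a human editor

filtered = filter_serps(results, votes, confirmed)
```

Note that "reported.example" survives despite 80 votes, because no human has confirmed it yet; that is the anti-abuse safeguard.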
My biggest request: If for some reason a site is down when Googlebot comes calling, please don't give up too easily. No site has 100% uptime, and being down for the wrong five minutes can have horrendous consequences if Googlebot knocks once and doesn't come back.
The more power is concentrated in Google, the wilder the ride becomes for site owners if something happens besides normal ups and downs. Please try to be as generous as possible about re-checking when it's appropriate.
1) Ability to write reviews (like Alexa)
2) Increase the scale from 0-10 to 0-100. It seems a little weird that rather innocuous sites can be a 9, the same as a site as important as the BBC News.
3) Use colour coding in the toolbar, if a site is being monitored for some kind of penalty use colours that the webmaster can use to check what type of penalty it is and then he can try and find the problem and fix it.
For the search itself, I would like Google to keep it simple, and that really means keeping it the way it is and just refining search results somehow.
Constant updating for all sites or pages that change daily.
I am sure I will think of some more later.
3) Allow a more flexible configuration about which file types to search for. Right now, I can search either for all types, or just for one. But I might (and often do) want to search for both .html and .pdf documents, but not .doc, or for everything except .pdf, etc.
?? You can already do this, Bird. Or do you mean via the advanced search form? The advanced search form on Google is good, but it does limit you on what you can do. But if you wanted to search for both DOC and PDF, for example, this would work:
cow ( filetype:doc | filetype:pdf )
Of course, you'd have to use this on the front page search form.
...sorry if I misunderstood you and this isn't what you meant.