Google SEO News and Discussion Forum

This 52-message thread spans 2 pages (this is page 2).
New Penalty against Hyphenated Domains?
F_Rose
5+ Year Member
Msg#: 34384 posted 4:53 pm on May 18, 2006 (gmt 0)

Do you own a hyphenated domain?

Is your site affected?

We own a hyphenated domain name, and our site is down to 24 pages. Just wondering: is anyone else with a hyphenated domain in the same boat?

Just trying to figure out whether this might be why Google isn't fully indexing the site.

 

ogletree
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 34384 posted 3:02 pm on May 19, 2006 (gmt 0)

Every time Google makes a change and amateur SEOs' websites fall, they come here screaming that some random SEO technique is being penalized. There is no proof of this. I know of a site that has two dashes and ranks for a very competitive term. If your site fell, it was for some other reason. Google is all over the place right now; I have sites that bounce between pages 1, 2, and 3 every other day. Calm down and get more backlinks. It is hard to maintain a good ranking; welcome to life.

randle
WebmasterWorld Senior Member, 10+ Year Member
Msg#: 34384 posted 3:09 pm on May 19, 2006 (gmt 0)

Two sites we have with hyphenated domains continue to rank well for their main keywords; no drop at all. However, both have lost thousands of previously indexed pages that came from discussion boards. Hard to believe it's an intended penalty, but at the end of the day, what's the difference? The pages are gone.

Maria444
10+ Year Member
Msg#: 34384 posted 3:27 pm on May 19, 2006 (gmt 0)

Like g1smd, I "don't buy any theory of one or two hyphens in the domain leading to sites being penalised".

There are two hyphens in my domain, and more in page file names, with no problems except some 3-level pages (rightfully) missing from the index as of now. The home page is on the first page of the serps for its main keyword, in position 2 (first page) for a secondary keyword, and inner pages hold several first positions for other keywords. For instance, a page with 3 hyphens in its file name (plus the 2 hyphens in the domain name) ranks #6 out of 18,300,000 for a keyword found on that page.

BUT: the content of the page is ABSOLUTELY ORIGINAL and UNIQUE, and the page is well linked from several pages of the site.

As far as my site is concerned, the missing pages are either located 3 or 4 clicks away from the home page or contain minimal, non-unique content.

tedster
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 34384 posted 4:24 pm on May 19, 2006 (gmt 0)

...my missing pages are either located 3 or 4 clicks away from the home page.

Bingo. I just studied five domains yesterday that are suffering from dropped pages. There were hyphens in two of the domains, and not in the other three. But in all cases, the dropped pages were 3 or 4 clicks from Home and hadn't been spidered in a long time.

1script
WebmasterWorld Senior Member, 5+ Year Member
Msg#: 34384 posted 5:34 pm on May 19, 2006 (gmt 0)

But in all cases, the dropped pages were 3 or 4 clicks from Home and hadn't been spidered in a long time.

Pretty depressing result there, tedster.

There are not many pages you can link to while staying within 2 clicks of the homepage, unless you want to risk flooding your pages with thousands of links. I'd say one hundred links per page is already too many. Even if you cram every first-layer page with that many, you can only link to 10,000 pages. It becomes even more pitiful when you consider that no one in their right mind would stuff their homepage with 100 links, so you are down to, what, 1,000 pages as the top number you can hope to get listed?

That's a real shame, but it fits the recent "Google ran out of storage space" sentiment.
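For what it's worth, the arithmetic here is easy to sanity-check. A minimal sketch of the back-of-envelope math; the 100-link and 10-link budgets are this post's hypothetical figures, not measurements:

```python
# Toy arithmetic for the point above: how many pages can sit within
# two clicks of the homepage, given a cap on links per page?
# This is a sketch of the thread's back-of-envelope math, not a crawler model.

def pages_within_two_clicks(home_links: int, links_per_hub: int) -> int:
    """Pages reachable in exactly two clicks: each homepage link goes to a
    hub page, and each hub links out to content pages."""
    return home_links * links_per_hub

# 100 links on the homepage and 100 on each hub -> 10,000 pages.
print(pages_within_two_clicks(100, 100))   # 10000

# A more realistic 10-link homepage drops that ceiling to 1,000.
print(pages_within_two_clicks(10, 100))    # 1000
```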

BigDave
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 34384 posted 6:47 pm on May 19, 2006 (gmt 0)

Who cares how many clicks a page is away from the home page? A much more useful thing to be concerned with is how many clicks it is away from a deep incoming link. Distance from the home page is only an issue if all your inbound links go to that page or your internal navigation really sucks.

If you have content worth linking to and cookie crumbs on your pages, you should not have much problem getting your site fully indexed.

An example of this is on a review site that I run. Every manufacturer gets a page that links only to the reviews of their equipment. At least half the manufacturers display the link to that page prominently on their website.

All the reviews of their gear may be 3 or 4 clicks away from the home page, but they are one click away from their "manufacturer's page".

Every reviewer gets a similar page that lists all their reviews. Many of them have websites where they link to the list of all their reviews.

Both these pages also have links to every navigation page working its way down to each review, and all pages have cookie crumbs. Many reviews also get direct deep links.

I would be surprised if any of the 10k reviews on the site is more than 2 clicks from a deep link. The same will hold true if we reach 100k reviews.

The only place where there are pages that are much more than 2 clicks from a deep link would be in our news blog, and I don't really care about that.
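BigDave's metric is straightforward to compute if you treat the site as a link graph and run a multi-source breadth-first search from every page that receives an external deep link. A sketch, with a hypothetical graph loosely shaped like the review site he describes:

```python
from collections import deque

# Sketch of the "clicks from a deep link" metric: instead of measuring
# distance from the homepage, measure each page's distance from its
# nearest externally deep-linked page. The site graph and deep-link set
# below are hypothetical.

def clicks_from_deep_link(site_graph, deep_linked):
    """Multi-source BFS: shortest click distance from any deep-linked page."""
    dist = {page: 0 for page in deep_linked}
    queue = deque(deep_linked)
    while queue:
        page = queue.popleft()
        for nxt in site_graph.get(page, []):
            if nxt not in dist:
                dist[nxt] = dist[page] + 1
                queue.append(nxt)
    return dist

# Hypothetical review site: hub pages (manufacturer/reviewer pages)
# receive the external links, and each hub links straight to its reviews.
site_graph = {
    "home": ["makers", "reviewers"],
    "makers": ["acme-hub"],
    "acme-hub": ["review-1", "review-2"],
    "reviewers": ["jane-hub"],
    "jane-hub": ["review-2", "review-3"],
}
deep_linked = {"acme-hub", "jane-hub"}   # linked from external sites

print(clicks_from_deep_link(site_graph, deep_linked))
# Every review lands one click from a deep link, even though some are
# three clicks from "home".
```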

tedster
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 34384 posted 6:52 pm on May 19, 2006 (gmt 0)

how many clicks it is away from a deep incoming link.

Right on -- the domains that were brought to my attention did not have any deep links. They were also under 1 year old. I agree that deep links are essential, especially with a larger site.

pageoneresults
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 34384 posted 7:02 pm on May 19, 2006 (gmt 0)

They were also under 1 year old. I agree that deep links are essential, especially with a larger site.

They also occur naturally. I've seen many sites in the past couple of weeks that are working with very shallow site structures: they have everything appearing at the root level, and I'm not too certain that is a viable structure for a large site. It loses out on the whole architecture/pyramid concept.

It's all about channeling/funneling the weight through the site. I call it "power distribution". ;)
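"Power distribution" here sounds like PageRank flowing through a pyramid-shaped link structure. A toy power iteration over an invented three-level site, assuming the conventional 0.85 damping factor; purely illustrative, not a claim about Google's actual computation:

```python
# Toy PageRank power iteration over a three-level pyramid site.
# The link structure is made up for illustration.

def pagerank(graph, damping=0.85, iters=50):
    n = len(graph)
    rank = {page: 1.0 / n for page in graph}
    for _ in range(iters):
        new = {page: (1 - damping) / n for page in graph}
        for page, outlinks in graph.items():
            share = damping * rank[page] / len(outlinks) if outlinks else 0.0
            for out in outlinks:
                new[out] += share
        rank = new
    return rank

pyramid = {
    "home":  ["cat-a", "cat-b"],
    "cat-a": ["leaf-1", "leaf-2", "home"],
    "cat-b": ["leaf-3", "home"],
    "leaf-1": ["cat-a"], "leaf-2": ["cat-a"], "leaf-3": ["cat-b"],
}
for page, r in sorted(pagerank(pyramid).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {r:.3f}")
# The home and category tiers end up with several times the weight of
# the leaf pages -- the funneling effect described above.
```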

Maria444
10+ Year Member
Msg#: 34384 posted 7:09 pm on May 19, 2006 (gmt 0)

No, my pages (msg #33) WERE in the index a couple of weeks ago. In fact, 99% of my pages were indexed, even the ones located 4-5 clicks away from the home page. These "buried and weak" pages even had PR, either PR1 or PR2, and they still do, of course.

As someone mentioned earlier, Matt talked about some "indexing problems" that are supposed to be fixed this week or next.

No, no one is suggesting we stuff our home pages or any other page with 100 links or more (for me, 100 links is TOO many).

But I have long been totally against this mentality of "stuffing" your website with 10,000 or 50,000 pages. WHAT FOR?!? Not even Einstein could create that many pages of ORIGINAL, VALUABLE content. I believe all these gigantic websites were created only for SEO purposes, with crap pages containing a few sentences of crap content. If I were any SE, I would definitely want to get rid of all this junk.

I believe this is exactly what they are trying to achieve, and I'm sure it's NOT easy.

Don't tell me you are promoting 10,000 products and that you need one page for each. Keep those pages for your customers, within your website; they don't need to be directly accessible to the whole world.

I'll also admit that the vast majority of my missing pages are crap. Problem is, 20% is EXCELLENT content written by a top professional in the field, and it ONLY exists on my website.

So I'm hoping for the best, fully aware that, like many others, I am paying the cost of these huge, useless websites containing either dupe or nonexistent content. Remember the time when literally every SEO guru was suggesting we create two pages a day? And what about the "articles" fashion: write a stupid article and submit it everywhere…

Bull… Now the Internet is inundated with bull…

pageoneresults
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 34384 posted 7:12 pm on May 19, 2006 (gmt 0)

Don’t tell me you are promoting 10,000 products and that you need one page for each. Keep those pages for your customers, within your website, they don’t need to be directly accessible by the whole world.

Oh yes I do need one page for each product. I have descriptions, specifications, options, etc. that all need to be presented to the visitor. How else would I present those pages?

And without them being accessible, I lose out on all that page visibility and the opportunity for each page to pull its weight in the overall scheme of things. Why would anyone want to block a spider's access to those pages?

Maria444
10+ Year Member
Msg#: 34384 posted 7:52 pm on May 19, 2006 (gmt 0)

How many product CATEGORIES are you promoting? 100? If one is, say, "furniture", you'll have links to chairs, tables, etc. on that page. Your visitors WILL see the descriptions/options through the links, but G and every other SE DO NOT need one million pages selling chairs (I'm sure there are one million chair manufacturers/retailers in the whole world).

So you'll have 100 pages indexed; that is absolutely enough imo.

Or make another website selling only furniture, and your chairs and tables pages WILL be indexed.

But as I said, the problem is deeper than that. For me, it's about those huge websites containing useless stuff, created ONLY for SEO purposes and occupying a huge amount of space: forums, "articles" and the like. G has stated from day one that they are MAINLY interested in valuable content/information, and they created Froogle for products. The problem is that with so much crap content circulating, it is difficult to spot the valuable and original amid the dupe and crap.

BigDave
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 34384 posted 7:53 pm on May 19, 2006 (gmt 0)

Maria,

You sure are good at judging everyone else's website as huge and useless without viewing their content.

In the case of my large site, that is 10k reviews of products. Not epinions reviews, but reviews that generally take 6 months of testing and usually a couple of days to write.

If people are looking for a review of item X, they want the search engine to know about all 10k reviews. Would we have somewhere in the neighborhood of 20k natural links if our content was not considered valuable?

You simply cannot judge the value of the content by the size of the site. I did a search last night to find emergency dosing information for one of my dog's medications.

The results were very spammy, but I found what I was looking for on a site that almost certainly had more pages than the spam sites, because it had full information on EVERY approved vet drug. By your definition, those pages should not have been available in the search engines. I might have found that information otherwise, but my dog is alive today because of one of those huge sites with all sorts of extra pages in it.

g1smd
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 34384 posted 8:12 pm on May 19, 2006 (gmt 0)

Hmm. I am very interested in the "clicks away from the index" theory, especially when looking at a site that is using breadcrumb navigation.

At the moment I see that the ODP's page count is inflated from the "real" three million to a reported 26 million pages.

More importantly, in a site:domain.org search, the results finish with the "repeat the search with the omitted results included" message after just 500 listings. That, to me, is very telling.

BigDave
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 34384 posted 8:20 pm on May 19, 2006 (gmt 0)

More importantly, in a site:domain.org search, the results finish after just 500 listings.

It tells me that there are issues with the site: command, then. I get a full 1000 on all of my sites that are larger than that.

Give this a try: go to dmoz, drill down a few levels, and do a site: search on the URL for that directory. It will become very obvious that they have far more than 500 entries in the index.

For example, I did [site:dmoz.org/Recreation/Roads_and_Highways/] and it reported 194 results and showed me all 194.

CainIV
WebmasterWorld Senior Member, 10+ Year Member
Msg#: 34384 posted 8:46 pm on May 19, 2006 (gmt 0)

One site with two hyphens in the domain name, unchanged in Google results as of today.

j_h_maccann
10+ Year Member
Msg#: 34384 posted 9:47 pm on May 19, 2006 (gmt 0)

but using a keyword1-keyword2.com type of domain, [...] it's quite easy for Google to tell what two words you are optimising the site for...

Hyphens in a domain name to indicate word boundaries are likely to be of no help to search engines; the search engines probably do an excellent job of finding words in domain names written solid.

All of the search engines have recently spent a lot of time working on Chinese. In Chinese (and some other languages), writing does not normally include word separations. So the technique used to find words is to progress along character by character, looking at a dictionary which has word-frequency statistics and some other information, finding the most-likely words. (This procedure can be carried out with good efficiency.)

This same procedure can be used on solid domain names in any language (using word frequency statistics from the same language) to find the intended words (and occasionally unintended ones).

One well-known kind of example where this may not work is "Example Therapist LLC" using the domain www.exampletherapist.com, which segments either as "example therapist" or as "example the rapist". But actually, preferring longer dictionary matches might well give the right result; and at the scale of the search engines, given the huge corpora from which to mine word frequencies, such a procedure would almost always work very well.
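A minimal sketch of the segmentation procedure described above: a unigram dynamic program that picks the most probable split of a solid string into words. The frequency table here is tiny and invented, standing in for counts a search engine would mine from a huge corpus:

```python
from functools import lru_cache
import math

# Hypothetical unigram frequency table; real engines would mine these
# counts from billion-word corpora.
FREQ = {"example": 900, "the": 50000, "therapist": 400, "rapist": 300,
        "exam": 700, "her": 20000, "apis": 100, "t": 10}
TOTAL = sum(FREQ.values())

def word_logprob(word: str) -> float:
    # Unknown strings get a penalty that grows steeply with length,
    # so the search prefers genuine dictionary words.
    count = FREQ.get(word)
    if count is None:
        return math.log(1.0 / (TOTAL * 10 ** len(word)))
    return math.log(count / TOTAL)

@lru_cache(maxsize=None)
def segment(text: str) -> tuple[float, tuple[str, ...]]:
    """Return (log probability, words) for the most probable split."""
    if not text:
        return 0.0, ()
    candidates = []
    for i in range(1, len(text) + 1):
        head, tail = text[:i], text[i:]
        tail_lp, tail_words = segment(tail)
        candidates.append((word_logprob(head) + tail_lp, (head,) + tail_words))
    return max(candidates)

print(segment("exampletherapist")[1])   # ('example', 'therapist')
```

With these frequencies, "example therapist" beats "example the rapist" because two moderately common words outscore three; exactly the longer-match preference the post mentions.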

Since most people do NOT use hyphens now, the search engines already need to identify probable word boundaries, using a technique like the one just described. So they may as well use that method for all domain names, even those with hyphens, and treat the hyphens with suspicion, mostly as indicators of low quality.

FWIW, it seems to me that Google is doing better now than it used to at finding domain names containing search words embedded solid within a longer string of characters (in English).

(I've given up all my hyphenated domains, and where I own both the hyphenated and solid versions I've moved to the solid version only. It seems to me that ordinary searchers don't like or trust hyphens, and people are now used to typing in domain names solid. Whether or not this is true, using hyphens seems to be unnecessary.)

coosblues
10+ Year Member
Msg#: 34384 posted 10:03 pm on May 19, 2006 (gmt 0)

(I've given up all my hyphenated domains, and where I own both the hyphenated and solid versions I've moved to the solid version only. It seems to me that ordinary searchers don't like or trust hyphens, and people are now used to typing in domain names solid. Whether or not this is true, using hyphens seems to be unnecessary.)

Hyphens (at least one) are quite necessary when the domain name you want is taken. My site of 4 years has one hyphen and has always done well. I'm sure G realizes why "some" people went to a hyphen, but I do agree that once you push past perhaps X number of hyphens it may raise a spam flag.

I really don't think people searching care a hoot about your domain name as long as they find what they want on your site. Think about it: someone uses G to search, a hyphenated domain comes up first, and they just click the link, paying no attention to the domain name. Hopefully you have what they want, and then they'll just bookmark your site.

Original, useful content is all searchers care about. Perhaps not webmasters, but the majority of the internet is not surfed by domain owners.

JuniorOptimizer
WebmasterWorld Senior Member, 10+ Year Member
Msg#: 34384 posted 10:11 pm on May 19, 2006 (gmt 0)

I'm not selling any of my hyphenated domains, regardless of what the Morality Police at Google suddenly think of it.

steveb
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 34384 posted 10:13 pm on May 19, 2006 (gmt 0)

A lot of time is being spent on just understanding that pagerank is important, and that PR1 and PR2 pages are a lower priority than they were before. Something four clicks from the dmoz homepage is indexed just fine. Something four clicks from a PR3 homepage is going to have a hard time ever getting the new sicklybot that far.

The shallowing of the web shouldn't be too much of a surprise considering the number of utterly useless 100,000+ page "sites" out there. A domain like dmoz now has the PR, authority and linking to merit deep crawling, while some million-page amazon feed does not.

Pagerank is more important. The lesson may be to forget about your site as a whole and focus only on pages PR3 or higher.

martinibuster
WebmasterWorld Administrator, Top Contributor of All Time, 10+ Year Member, Top Contributor of the Month
Msg#: 34384 posted 10:22 pm on May 19, 2006 (gmt 0)

The whole notion of hyphenated domains getting penalties is off base, imo. There is no logic behind doing anything like that, and there are far more reliable signals to tune into as a quality metric.


Jane_Doe
WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member
Msg#: 34384 posted 10:31 pm on May 19, 2006 (gmt 0)

No logic behind doing anything like that.

The logic is that if your domain is www.really-cool-widget-posters.com, more people are going to link to it using the exact words "really cool widget posters", which will trigger a spam penalty sooner than a domain that reads like a license plate. For today, anyway, that filter seems to have been loosened back up again; but when it gets tightened, keyword domains tend to be more vulnerable.
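That theory boils down to a ratio test on inbound anchor text. A sketch of what such a filter might look at; the link data and the 60% threshold are invented for illustration:

```python
# Sketch of an exact-match anchor-text ratio check. A keyword domain
# naturally attracts anchors that repeat its keywords, which is the
# vulnerability described above. Data and threshold are hypothetical.

def exact_match_ratio(anchors: list[str], domain_keywords: str) -> float:
    """Fraction of inbound anchors that exactly match the domain's keywords."""
    matches = sum(1 for a in anchors if a.lower() == domain_keywords)
    return matches / len(anchors) if anchors else 0.0

# Invented inbound-link profile for a keyword domain.
anchors = ["really cool widget posters"] * 8 + ["click here", "this site"]
ratio = exact_match_ratio(anchors, "really cool widget posters")
print(f"{ratio:.0%} exact-match anchors")   # 80%
if ratio > 0.6:   # hypothetical threshold
    print("anchor profile looks over-optimised to a filter like the one described")
```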

< continued here: [webmasterworld.com...] >

