I've had a 'for fun' website for 5 years. Since it was just for fun, I never cared about search engine ranking. 2 years ago I put together a 'business' website and knew from experience that 'if I upload it, they will come', so I didn't pay much attention then, either. However, I had no idea of the complexities of search engine particulars (SERPs, page rankings, penalties, and the like) until I ran across this site about a month ago.
I have 3 domain names for my 100% unique content business website: abcwidgetcompany.com, widgetcompany.com, and awc.com. I have to say that I am completely astounded that search engines view them as 6 different websites with duplicate content (the www and non-www versions of each name count separately). I get traffic from all three names, so I'm not about to discard any of them, and I have no intention of rewriting the URLs because I find value in knowing where people are coming from.
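(For anyone who finds this thread later: the 'url rewriting' people keep recommending is a permanent 301 redirect from every alias to one canonical name. Here's a toy sketch of the idea in Python, purely for illustration; the canonical host and the port are placeholders I made up, and a real site would do this in the web server's configuration rather than a script.)

```python
# A toy sketch, not my actual setup: bounce every request for an alias
# to one canonical name with a permanent (301) redirect. Host and port
# below are placeholders.
from http.server import BaseHTTPRequestHandler, HTTPServer

CANONICAL_HOST = "www.abcwidgetcompany.com"  # hypothetical pick of the 'one door'

class RedirectAll(BaseHTTPRequestHandler):
    def do_GET(self):
        # Preserve the requested path so deep links keep working.
        self.send_response(301)
        self.send_header("Location", f"http://{CANONICAL_HOST}{self.path}")
        self.end_headers()

if __name__ == "__main__":
    # Serve this on the alias names; every hit lands on the canonical one.
    HTTPServer(("", 8080), RedirectAll).serve_forever()
```

That's the whole trick: one name answers, the rest forward. Which is exactly what I'm objecting to having to do.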
I am highly curious as to why:
- the search engines don't look at the IP address: if the IP is the same and the content is the same, it's one website; if the IP is the same and the content is different, it's different websites on shared hosting (see the sketch after this list). I've been a professional programmer for over 20 years; it's just not that hard, even when you factor in colocation.
- they don't have spots in their webmaster areas for us to list all of our mirrors, parked domains, and affiliates, with a field in the same spot for the one single address by which we want to be indexed. One of them already has a place for me to specify whether I prefer the www or non-www version, but I can't list any additional names. Why in the bloody heck not? Did they run out of text boxes?
- the webmaster collective puts up with this nonsense. Why fix their programming shortcomings by rewriting your own URLs on your own website?!? It's your house; you should be able to have as many doors as you want without them coming along and saying 'no, one door or we'll make you sorry'.
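To spell out that first point, here's a minimal sketch of the dedup rule I'm describing, assuming the crawler already knows each page's body and the IP it fetched it from. The function name and the data are mine, purely for illustration; I'm not claiming this is how any engine actually works inside.

```python
# Group hostnames by (IP, content fingerprint):
#   same IP + same content      -> one site with several names (mirrors)
#   same IP + different content -> separate sites on shared hosting
import hashlib
from collections import defaultdict

def group_hosts(pages):
    """pages: iterable of (hostname, ip, body) tuples."""
    groups = defaultdict(set)
    for hostname, ip, body in pages:
        fingerprint = hashlib.sha256(body.encode()).hexdigest()
        groups[(ip, fingerprint)].add(hostname)
    return groups

# Made-up data: two of my names serving identical content, plus an
# unrelated tenant on the same shared-hosting IP.
pages = [
    ("abcwidgetcompany.com", "203.0.113.7", "<html>widgets</html>"),
    ("widgetcompany.com",    "203.0.113.7", "<html>widgets</html>"),
    ("unrelated-tenant.com", "203.0.113.7", "<html>shoes</html>"),
]
for (ip, _), hosts in group_hosts(pages).items():
    print(ip, "->", sorted(hosts))  # mirrors collapse, the other tenant stays separate
```

Colocation just adds a step: fold together the IPs known to belong to the same hosting customer before grouping. The principle doesn't change.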
To me it seems highly analogous to content designers mollycoddling IE to such an extent that MS didn't see the need to be standards compliant until version 7 (if then) of their browser.
I understand that everyone wants high rankings, but when it comes right down to it, the search engines need the websites more than the websites need the search engines. A lot of industries have centralized 'link' repositories listing purveyors of those goods. If webmasters began turning the search engines away at the front door (a two-line robots.txt of 'User-agent: *' and 'Disallow: /' does exactly that for compliant bots), telling them 'when you've got your act straightened out, submit this form and if I approve it I'll let you back in', they'd fix their bots a lot sooner than version 7.
Do I have my head on backward? Am I wrong? I just don't get it, I really don't. It certainly looks to me like the gang right here knows more about the internet than the search engine people do.