I have 3 domain names for my 100% unique content business web site: abcwidgetcompany.com, widgetcompany.com, and awc.com. I have to say I am completely astounded that search engines view them as 6 different web sites with duplicate content (presumably counting the www and non-www versions of each). I get traffic from all three names, so I'm not about to discard any of them, and I have no intention of rewriting the URLs because I find value in knowing where people are coming from.
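For what it's worth, I know the standard advice would be to 301 everything to one canonical host with the usual mod_rewrite sketch below (picking www.widgetcompany.com as the target purely for illustration), but that throws away exactly the per-domain traffic information I want to keep:

RewriteEngine On
# hypothetical sketch: send any other hostname to one canonical name
RewriteCond %{HTTP_HOST} !^www\.widgetcompany\.com$ [NC]
RewriteRule ^(.*)$ http://www.widgetcompany.com/$1 [R=301,L]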
I am highly curious as to why this is.
To me it seems highly analogous to content designers mollycoddling IE to such an extent that MS didn't see the need to be standards-compliant until version 7 of their browser (if then).
I understand that everyone wants high rankings, but when it comes right down to it, the search engines need the web sites more than the web sites need the search engines. A lot of industries have centralized 'link' repositories listing purveyors of those goods. If webmasters began turning the search engines away at the front door and telling them 'when you've got your act straightened out, submit this form and if I approve it I'll let you back in', they'd fix their bots a lot sooner than version 7.
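Turning them away at the front door would take all of two lines in robots.txt (shown here refusing every crawler, just as an illustration):

# no crawlers admitted until the duplicate-content handling is sorted out
User-agent: *
Disallow: /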
Do I have my head on backward? Am I wrong? I just don't get it, I really don't. It certainly looks to me like the gang right here knows more about the internet than the search engine people do.