| 10:34 am on Apr 24, 2003 (gmt 0)|
I think it looks OK - no reason why 19 sites with different content shouldn't point to a single site. And it's perfectly reasonable that the single site should point back to all 19 to return the compliment, so to speak.
A robot doesn't know you own all the sites.
| 10:43 am on Apr 24, 2003 (gmt 0)|
>A robot doesn't know you own all the sites
I'm not so sure Google hasn't found a way to automate advanced whois data lookups by now.
Second, judging from what we see here in the travel industry, reporting spam to Google is business as usual.
So if it were me, I would not even try to hide the fact that they were my sites. I would, however, make all the sites so that they can stand on their own two feet (how many feet do websites have, actually?), meaning get them all independent links. What I would try to avoid is setting up a closed circle.
| 11:54 am on Apr 24, 2003 (gmt 0)|
|I'm not so sure Google hasn't found a way to automate advanced whois data lookups by now.|
Wouldn't that be really time-consuming?
| 12:22 pm on Apr 24, 2003 (gmt 0)|
Personally, I wouldn't link to my own sites in that way until I had established links to sites 2-20 from other websites.
The paranoia of an ex-PR0 sufferer. :)
| 1:00 pm on Apr 24, 2003 (gmt 0)|
<I think it looks OK - no reason why 19 sites with different content shouldn't point to a single site. And perfectly reasonable that the single site should point back to all 19 to return the compliment so to speak.>
<Personally, I wouldn't link to my own sites in that way until I had established links to sites 2-20 from other websites. The paranoia of an ex-PR0 sufferer. :) >
You know, these two comments highlight my frustration with Google not publishing more clear cut guidelines in some cases. I keep advocating the need for this as a way to help serious Webmasters stay "clean"...but others say it would just help spammers understand better how to break the rules.
One of Google's over-arching principles seems to be that if you could explain what you did with a straight face to a competitor, and if it's also good for the user, then go with it.
chrisandsarah, seems to me that if your new site is a sort of hub, or a more general site, and sites 2-20 are dedicated sites with unique and useful content that in each case covers a subset of, or a topic related to, the new one, then your plan seems fine. We run a similar setup because it's intuitive for users. We make no effort to hide it because we believe in it, and so far we don't appear to have received any penalties. No one would look at it and think anything was fishy... except webmasters who know the potential for linking penalties is out there... somewhere.
My personal attitude is that you could even link between some of the new sites, IF there is good reason from a consumer point of view.
I'd love to hear other comments on this. GoogleGuy...you out there? I know, I know, you won't comment on things like this. ;-)
| 1:10 pm on Apr 24, 2003 (gmt 0)|
Does that mean that I can safely interlink a number of my own sites provided that I also have lots of other links?
| 1:16 pm on Apr 24, 2003 (gmt 0)|
I'm with you on this point.
I'm sick of fumbling in the dark trying to establish the dos and don'ts. It all seems very amateur when even the professionals don't know. Google's aspirations seem to far outweigh their technology, and the result is that spam chaos reigns on a playing field that is anything but level.
About time you got your multimillion-dollar act together, Google.
| 1:37 pm on Apr 24, 2003 (gmt 0)|
I completely agree with the need for Google to define SPAM (webmasters manipulating their own Google PR or results).
In my industry, we have two webmasters from the same family (same last name, same address) who own 16 sites. They dominate the SERPs for many valuable key phrases. In some cases, they take 6 out of the top 10 positions, including the #1 position. All sites are affiliate marketing sites, selling some combination of the same small selection of products. Each site has a little bit of "content", including tutorials and glossary definitions. On each site, the wording of each page has been changed enough to avoid any Google SPAM filters. All sites link to the other sites. They used to link to the other sites from every page, but now they only link from the links page.
This is classic SPAM, is it not? Is there anyone here who would make a reasonable argument that this is not SPAM? These sites do not contain new and interesting content. They were launched with the sole purpose of increasing PR on the other sites, and taking more SERP positions for the same key phrases.
Googleguy - I reported this yesterday, with my WW nick.
How do you catch these guys? It is difficult to catch these kinds of sites with an automated algorithm, except when you notice that so many top SERP positions for the same key phrases are taken by the same webmaster (by checking domain registrations).
If these guys had all but one of these domains resolve to the same IP address (the same web site), I would not consider it SPAM... would you?
| 1:39 pm on Apr 24, 2003 (gmt 0)|
Same IP address = potential problem.
| 1:41 pm on Apr 24, 2003 (gmt 0)|
In regard to a webmaster with many duplicate sites, does anyone know of a whois system that can look up domain registrations by the owner's name or address? By checking the whois on each site linked in the above example, I found 16 total substantially duplicated sites. I only wonder how many more there are from this same webmaster duo.
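There's no public reverse-whois index to search by owner name, but once you've pulled the whois records for a batch of suspect domains, grouping them by registrant is mechanical. A minimal sketch in Python, assuming the records have already been fetched as plain text; whois field names vary wildly by registrar, so the regex and the sample data here are purely illustrative:

```python
import re
from collections import defaultdict

def registrant_name(whois_text):
    """Pull a registrant name out of raw whois output.

    Only handles the common "Registrant Name: ..." style;
    real registrars use many other layouts.
    """
    match = re.search(r"Registrant(?: Name)?:\s*(.+)", whois_text)
    return match.group(1).strip().lower() if match else None

def group_by_registrant(records):
    """Map registrant name -> list of domains, given {domain: whois_text}."""
    groups = defaultdict(list)
    for domain, text in records.items():
        name = registrant_name(text)
        if name:
            groups[name].append(domain)
    return dict(groups)

# Hypothetical domains and whois snippets:
records = {
    "hotels-example-one.com": "Registrant Name: J. Smith\nCity: Springfield",
    "hotels-example-two.com": "Registrant Name: J. Smith\nCity: Springfield",
    "unrelated-site.com": "Registrant Name: A. Jones\nCity: Shelbyville",
}
print(group_by_registrant(records))
```

Any registrant that comes back with more than one domain is a candidate for closer inspection; it won't catch owners who use different registration details per domain.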
| 1:59 pm on Apr 24, 2003 (gmt 0)|
There are three very good reasons why Google doesn't publish the kind of laundry list you'd like to see:
1) Such a list would just make life easier for spammers;
2) Such a list would have Webmasters designing for Google, not for users;
3) Because Google constantly tweaks its algorithms to improve relevancy and fight spam, the boundaries of Google "rights" and "wrongs" would be outdated almost as soon as they were published.
The topic of this thread--"Is this safe to do?"--illustrates what's wrong with so much SEO: Instead of helping search engines find "spider food" and identify relevant content, it tries to use artificial means (in this case, 20 or so different hotel-booking domains) to influence the SE's results. Would a booking service have 20 domains if Google didn't exist? Does having 20 domains make the site (in this case, sites) more convenient for the user? I don't know what Google would say about this, but hey--it's Google's search engine, and if Google were to decide that an artificial linking pattern involving 20 domains was worth a downward nudge in the search rankings, I'd say that was Google's decision to make.
| 2:07 pm on Apr 24, 2003 (gmt 0)|
There is more than one way for Google to identify same owned site linkages. Similar linking styles, similar formats or templates, pulling from the same database, IP's, hosts, etc etc.
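Any one of those signals can seed a clustering pass: treat each signal value (an IP, a template fingerprint, a registrant) as a node and merge every domain that touches it. A hedged sketch with union-find, using entirely hypothetical domains and signal values:

```python
def cluster_domains(signals):
    """Union-find: merge domains that share any signal value.

    `signals` maps domain -> list of signal strings (IP, template
    hash, registrant, ...). Returns clusters, largest first.
    """
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for domain, values in signals.items():
        for value in values:
            union(domain, "sig:" + value)  # link domain to each signal node

    clusters = {}
    for domain in signals:
        clusters.setdefault(find(domain), []).append(domain)
    return sorted(clusters.values(), key=len, reverse=True)

signals = {
    "site-a.com": ["ip:10.0.0.1", "tpl:abc"],
    "site-b.com": ["ip:10.0.0.1"],
    "site-c.com": ["tpl:abc"],
    "site-d.com": ["ip:10.0.0.9"],
}
print(cluster_domains(signals))
```

Here site-a, site-b, and site-c end up in one cluster (shared IP and shared template, transitively), which is exactly the "same owner" shape that would invite a closer look.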
Agree with Heini: don't cover up that the domains are all owned by you, but make sure each has a distinct theme/target, very small amounts of code, content and graphics in common, and independent links. Just don't do anything that could be construed as cross-linking to give each site search engine exposure it wouldn't deserve on its own.
Also ask yourself, why have 20 sites? Do they help the browser more than one site with 20 sections?
I think as time moves on, cross-linking penalties per se will become less significant as Google improves its technology: one small way would be to become even more page-centric than they are now, compared to site-centric; another would be to continually refine PR to reduce the weighting of reciprocal links; and there are many others.
Google's first attempt at significant cross-linking "penalties" was a failure; even GoogleGuy said it penalised too many mom-and-pops and "innocents". Next time they will be more intelligent. I think they will look for other ways to minimise the SEO advantage of cross-linking between co-owned sites.
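The reciprocal-link idea above is simple to sketch: a link whose reverse also exists gets a reduced weight before anything PR-like is computed. The 0.25 factor and the tiny edge set below are arbitrary illustrations, not anything Google has published:

```python
def reciprocal_downweight(edges, factor=0.25):
    """Assign lower weight to reciprocal (mutual) links.

    `edges` is a set of (source, target) pairs. A link whose reverse
    also exists gets `factor` weight; one-way links keep weight 1.0.
    """
    return {
        (src, dst): factor if (dst, src) in edges else 1.0
        for src, dst in edges
    }

edges = {("a", "b"), ("b", "a"), ("a", "c")}
weights = reciprocal_downweight(edges)
print(weights[("a", "b")], weights[("a", "c")])  # 0.25 1.0
```

Both directions of the a/b swap are down-weighted while the one-way link to c keeps full weight, which is the sense in which reciprocal linking would confer less SEO benefit.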
| 2:10 pm on Apr 24, 2003 (gmt 0)|
I would agree with you but for one thing: the SERPs are currently cram packed with spam. Whatever the system is now, it's not working as far as spammers are concerned. The position could hardly be worsened by releasing a comprehensive set of fluid rules.
| 2:15 pm on Apr 24, 2003 (gmt 0)|
>>2) Such a list would have Webmasters designing for Google, not for users;<<
Europe is absolutely correct. What possible advantage could publishing such a list have, except for people "optimizing" sites? Google wants to rank sites by itself, thank you, and SEOs working out its methods only make that extremely difficult!
And his point about the algo changing rapidly is extremely pertinent too.
Really, 99% of the time people put into SEO is wasted, though it is true that 1% of it works long term, and spamming can sometimes work for a while in the short term.
The only SEO principles that work long term are the same ones that apply to all documents, off and on line. The Web was NOT designed at its roots for advertising, but for the sharing and connecting of information, so you really do have to think of web pages as informational documents for the correct karma! Good structure and organization, good descriptive titles and headings, a unique useful message, and genuine citations.
Judge Jefferies wrote:
>>the SERPS are currently cram packed with spam<<
Which SERPs? I don't see any cram packed with spam that I would use in my normal life. And I use Google as a normal user very frequently.
I tend to think the SERPs that are crammed with spam tend to be highly commercial queries, and slowly but surely both advertisers and browsers will learn to click on the AdWords instead. Google was never meant to be a commercial search engine; its information-based algo was not designed to handle it. Indeed, bidding and PPC systems like Overture and AdWords work much better for these categories.
Worrying about these SERPs is a waste of time long term, as eventually commercial product and service pages will be available in a more relevant way (to the user) in AdWords, and people will just not bother with the main listings.
| 4:15 pm on Apr 24, 2003 (gmt 0)|
Thanks for all your info everyone.
It looks like we will just link to sites 2-20 from site 1 (our established site) and maybe drop the links back to site 1 until we have other links from sites not owned by us.
We didn't set up 20 sites for artificial PR and linking purposes at all. There is one reason why we didn't make one site with 20 different sections, and that is keyword domains. Our established site (site 1) is jam-packed with content and gets great reviews. Listing accommodation on it is how the site pays for itself, but then we started to lose positions to sites with keyword-packed domains. Keyword domain names seem to have an advantage, so we decided it might be beneficial to build some smaller, more relevant sites with keyword domains to stay in with the competition. So we have.
These keyword sites we have produced are not junk, but useful well designed sites with unique content.