Forum Moderators: open
1 of my sites is on an IP that has over 600 other sites.
The more sites on a single IP, the higher the likelihood that site quality will be lower.
Or to put it another way:
The number of sites on a single IP is inversely proportional to the quality of those sites.
Quality sites tend to be on stand-alone IPs.
Stand-alone IPs generally host massive sites with endless pointless content. These are the guys who can also afford SEOs, who then make these sites rank for everything with their shotgun approach.
It could be argued that many great small businesses manage to afford a presence on the web via cheap hosting. If Google's mission is to give these sites a public platform, supply choice for the user, and avoid big money dominating the SERPs, then weighting the SERPs towards shared hosts would be the better tactic overall. The smaller sites with lower budgets cannot employ spam or SEOs, or indulge in thousands of pointless pages... their host will stop them. They are more often small, focused sites with good content. They cannot indulge in complicated cloaking tactics, etc.
Where did you get this information? I would conclude that ranking sites on relevance is the only sound approach, whatever the hosting. There is plenty of evidence of top positions in highly competitive SERPs being achieved by small sites on shared hosting... thankfully Google has not made the same incorrect assumption.
p.s. Bang goes my senior status chances :)
The more sites on a single IP, the higher the likelihood that site quality will be lower.
Where on earth is the data to prove this 'fact'? And what constitutes a quality site? It's all a bit presumptive if you ask me.
Are quality sites big sites? Are quality sites run by people with money to spend on single IPs and servers?
I am sure Google DOES use IPs to detect cross-linking, but to use them based on the above assumptions would be wrong.
Often, perhaps. But it's like saying all quality cars are expensive ones...
money does not = quality.
content = quality
I can understand Google having to make sweeping generalisations, but this is not a logical one if true... which I doubt. However, it's a scary thought.
It's probably tempting to boost the ranking of single-IP domains, but you can't tell for sure whether a site is bad or not. If I ran this search engine I definitely wouldn't.
Cross-linking is another thing... there I definitely would make a distinction!
If you ran a search engine, could easily deduce how many sites are on an IP, and were fighting major battles with quality-control problems, wouldn't you make that one of the criteria?
The more sites on a single IP, the higher the likelihood that site quality will be lower.
It's hard for me to believe that counts for anything... but maybe you have inside information that I don't.
In any case, the sites at the top of the SERPs I watch are certainly on a healthy combination of both - probably heavily weighted towards shared IPs.
Take all the blogs on Blogger for instance. Are they as 'quality' as Microsoft.com or Gap.com? Certainly not. But for any given search phrase, I think it is more likely you would find relevant content on the blogs - simply because they have more content, the content tends to be textual/simple html, and the self-ascribed metadata is usually more accurate. IMHO the SERPs reflect this.
The more sites on a single IP, the higher the likelihood that site quality will be lower.
Wow. I would argue the opposite. The smaller sites are more likely to cover specialised topics. If I am searching for the answer to a technical question and the results include WebmasterWorld and a domain that is completely devoted to my topic of interest, chances are my question will be answered on the small site.
Money does not impact the quality of a single page of information. And people vote for that quality by linking to it.
Something that some guy writes on a forum on a large site in answer to a programming question is usually less relevant than a page that someone has created specifically to answer that question.
The PageRank system is a much better indicator of quality.
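For anyone unfamiliar with how PageRank judges quality by links rather than money, the idea can be sketched in a few lines. This is a toy power-iteration sketch with an invented three-page graph, not Google's actual implementation; the damping factor 0.85 is the value from the original paper:

```python
def pagerank(links, iterations=50, d=0.85):
    """links: dict mapping page -> list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a baseline (1 - d) share, then receives
        # a d-weighted share of rank from each page linking to it.
        new = {p: (1 - d) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = d * rank[page] / len(targets)
                for t in targets:
                    new[t] += share
            else:  # dangling page: spread its rank evenly
                for p in pages:
                    new[p] += d * rank[page] / len(pages)
        rank = new
    return rank

# A small focused page that others link to outranks the page nobody cites:
ranks = pagerank({"forum": ["small"], "blog": ["small"], "small": ["forum"]})
```

Here "small" ends up with the highest rank because two pages vote for it with links, which is exactly the "people vote for quality by linking to it" point above.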
Can you explain how you justify a broad statement like that?
First, a static IP is so cheap (about $1/month) that if it benefited ranking, it would be a very cheap investment. (But I don't believe it does, so I haven't invested, even though it is cheap.)
Shared (name-based) IP hosting was invented to conserve IP addresses; if each domain required its own static IP, the current allocation of IPs would not last more than a few years.
> Quality sites tend to be on stand-alone IPs.
"Established OLD sites" tend to be on stand-alone IPs because they never changed hosts. Today's quality sites, IMHO, tend to be on shared IPs, because most hosts use HTTP/1.1, which enables name-based (shared-IP) virtual hosting.
Pimpernel: But I am CERTAIN that if sites sitting on the same IP cross-link, trouble will lie ahead.
Agree totally with this statement. I would add that if someone purchased 10 static IPs from the same host (a C block, mask 255.255.255.0) for 10 sites and interlinked them, the same thing would happen. Guys, study LOCALRANK!
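The "C block" check mentioned above is just a /24 comparison, which a search engine could do trivially. A minimal sketch using only the standard library (the addresses are documentation examples, not real hosts):

```python
import ipaddress

def same_c_block(ip_a, ip_b):
    """True if both IPs fall inside the same /24 (mask 255.255.255.0)."""
    # strict=False lets us build the /24 network from a host address.
    net = ipaddress.ip_network(f"{ip_a}/24", strict=False)
    return ipaddress.ip_address(ip_b) in net

same_c_block("203.0.113.10", "203.0.113.200")  # True: same C block
same_c_block("203.0.113.10", "203.0.114.10")   # False: different C block
```

Links between sites that pass this check could plausibly be discounted for cross-linking purposes, which is all the statement above requires.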
The more sites on a single IP, the higher the likelihood that site quality will be lower.
I don't think anybody is suggesting for a second that a site on an IP with multiple sites is definitely going to be of low quality. I think that, if a search engine wanted to do a quick and dirty determination of site quality as one part of its algorithm, counting the number of sites on one IP would be an easy win.
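That "quick and dirty" heuristic is cheap to implement once a crawler already knows each domain's IP. A sketch with an invented domain-to-IP mapping and an arbitrary threshold, just to show the shape of the idea:

```python
from collections import Counter

def crowded_ips(domain_to_ip, threshold=100):
    """Return IPs hosting at least `threshold` known sites."""
    counts = Counter(domain_to_ip.values())
    return {ip: n for ip, n in counts.items() if n >= threshold}

# 600 sites crammed onto one shared IP, plus one stand-alone site:
mapping = {f"site{i}.example": "198.51.100.7" for i in range(600)}
mapping["quality.example"] = "192.0.2.1"

crowded_ips(mapping)  # -> {"198.51.100.7": 600}
```

Note the sketch only flags crowded IPs; it says nothing about any individual site on them, which is exactly the objection raised in this thread.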
If you're looking for a good indication that this technique is warranted - just look at Geocities...
The number of sites on a single IP is inversely proportional to the quality of those sites.
The majority of Google's database is informational sites, which are generally free and therefore can't afford dedicated hosting (or it is not in their interest to purchase it).
For cross-linking/anchor text value/pagerank purposes I can see why links from one website on a shared IP to another with the same IP could be devalued, but never penalised.
I don't think anybody is suggesting for a second that a site on an IP with multiple sites is definitely going to be of low quality.
I'm afraid that's exactly what Brett said.
p.s. Bang goes my senior status chances :)
if a search engine wanted to do a quick and dirty determination of site quality as one part of its algorithm, seeing the number of sites on one IP would be an easy win.
I would disagree with this. Too many other factors undermine that assumption.
1. Site size is not a reflection of quality
2. Spammers are very much aware of the benefits of unique IP addresses
3. Many small-to-medium business and personal sites are hosted in-house on their own servers and home PCs. I would guess that there are more of these sites than ones large enough to justify a unique IP on a dedicated server.
4. A strong correlation between traffic and unique IP addresses is more likely, but traffic cannot be an indication of quality either.
Beware: a statistical correlation between variables does not mean the relationship holds in the real world.
By the way, I can't understand Brett's argument here. Example: one of my friends is an engineer and has a small (~500-page) homepage on which he describes engineering, mathematical, and physical problems and solutions. This site is of course on a shared host (as most German sites are). I can't imagine why this site should be of lower quality than another site on a dedicated IP.
greg
So how would you determine the quality of a site?
Well certainly not by how much money they have to spend on hosting, that's for sure.
1. Content. Right now Google will rank anything with more than three words on the page if it has enough inbound links etc. I would require a minimum of 1000 characters of 'original' content to get ranked. Can anyone think of a page that should be ranked well that has fewer than 1000 characters?
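The content floor proposed above is easy to sketch: strip the markup, collapse whitespace, and count what's left. The regex-based tag stripping here is deliberately crude and the 1000-character floor is the poster's number, not anything Google documents:

```python
import re

def visible_chars(html):
    """Rough count of visible text characters in an HTML page."""
    text = re.sub(r"<[^>]*>", " ", html)       # drop tags (crudely)
    text = re.sub(r"\s+", " ", text).strip()   # collapse whitespace
    return len(text)

def passes_content_floor(html, floor=1000):
    return visible_chars(html) >= floor

passes_content_floor("<html><body>" + "word " * 300 + "</body></html>")  # True
passes_content_floor("<p>too short</p>")                                 # False
```

A real engine would also have to judge whether the text is 'original', which this sketch makes no attempt at.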
2. Chuck out framed pages again. In Google's eagerness to become the 'cleverest' search engine, it started ranking the frameset page according to anchor text, the noframes tag, etc. All this has done is create a spammer's dream and excellent black-hat SEO opportunities. You can still rank the actual frames as individual pages because they have actual content.
3. Ban 'search engine results' pages. These are clogging up Google's results for both commercial and informational sites. All these mini-espotting affiliates create a site and link to all their search-results pages so that Google indexes them. Google can spot links pages; surely it can spot search-engine results pages?
4. I realise that Google cannot see server-side operations, so some spam is undetectable unless the site is actually visited by a human, but a lot of it is completely detectable (my basic spider can sort it out, anyway). Meta refreshes and JavaScript refreshes are cloaking and should be banned. There are legitimate uses for cloaking, but normally only on main pages that major sites do not intend to have ranked (since when have you seen amazon.com optimizing?), so just remove all pages that redirect; eventually Google will index the real page on these major sites through the toolbar.
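The "completely detectable" redirects mentioned above really are easy to spot in raw HTML. A sketch that flags meta refreshes and the most common JavaScript location rewrites; the patterns are illustrative and would miss obfuscated variants:

```python
import re

# <meta http-equiv="refresh" ...> in any capitalisation/quoting style.
META_REFRESH = re.compile(
    r'<meta[^>]+http-equiv\s*=\s*["\']?refresh', re.IGNORECASE)

# The two most common JS redirect idioms.
JS_REDIRECT = re.compile(
    r'(location\.href\s*=|location\.replace\s*\()', re.IGNORECASE)

def looks_like_redirect_page(html):
    return bool(META_REFRESH.search(html) or JS_REDIRECT.search(html))

looks_like_redirect_page(
    '<meta http-equiv="refresh" content="0;url=http://example.com/">')  # True
looks_like_redirect_page('<p>ordinary content</p>')                     # False
```

This is exactly the kind of check a "basic spider" can run; it says nothing about server-side cloaking, which, as noted, a crawler cannot see.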
Well, that's a start. Obviously I can't list some of the more 'secret' black hat techniques that are raging on Google as I'd probably end up dead - (tough game this marketing!) - but these are a few basics and if Google would like to explain why these are so hard to do then I'll gladly eat my words. Anyone else?