Someone told me that <snip> could be one, but I am not sure. Does anyone have any feedback on that?
I want to find out because they link to our site, and I need to know whether I should ask them to take the link down or whether it is nothing to worry about.
[edited by: Marcia at 6:31 am (utc) on Mar. 22, 2004]
[edit reason] No specifics, please. [/edit]
It opens up too much room for frauds and scammers.
For example, I run website A which is highly popular, and website B runs the same kind of content and wants to be as popular as website A.
Therefore, website B could very easily add links from "bad neighbourhoods" to website A, thus getting it banned or have it lose PR etc.
I think Google would surely have thought of this, and I don't see why they would allow such opportunities for fraud. So there are my two cents: I think it is rubbish!
I'm curious: is it advisable or not?
The really funny thing is that inside the Google directory there are tons of links to what I would call bad neighborhoods: spammy and useless sites. They freely link to sites that are nothing but repositories for affiliate links. And many good sites can't get in because DMOZ is a dinosaur stepping into the tar pits. If I applied the Google criteria OBJECTIVELY, I would be sure not to link to Google: lots of spammy outbound links, lots of garbage.
I recently created 10 sites. 4 of these consist exclusively of links to the top 10 results in Google for certain keywords, affiliate links and some plain vanilla but SEO'd text. The other 6 sites have actual content. Guess which sites do well in Google? Guess which sites were reviewed and accepted quickly by DMOZ? That's right: the spammy garbage sites are all in DMOZ and doing great in Google. The real content sites do very poorly.
Either create content or index it, but don't try to control what you are indexing if you aren't creating content yourself. There is something wrong with the equation. But that's OK, time will sort it out.
You probably need a separate thread, but here's a hint from the SERPs: Google seems to *love* directories these days - even bad ones. So rather than having everyone linking to everyone, you might try moving toward a more hierarchical linking strategy, and reserve inter-linking for similarly-themed and semantically-similar pages... and I did say "pages," not "sites."
It's got nothing to do with whether people today do or do not take personal responsibility. It has everything to do with who the self-appointed legislator, judge and jury is. My most important point is that if the criteria were applied objectively, Google itself might be deemed a bad neighborhood and anyone linking to it would be penalized.
But like I said, it is far easier to get spam into Google today than it is to get quality sites. As a result, there is a ton more spam on the web today than there was 3 or 4 years ago. Far from correcting the situation, Google is creating it. Maybe they should take personal responsibility for that!
Whether one gets spam with/for/in spite of Google also seems to me to be a direct result of "rankhounding". Since the commercial sites I run don't "advertise" on or require referrals from search engines, none of the mad scrambling for rank etc. makes any difference to us.
Live by rank, die by rank?
My site was a member of a "webring" that had been formed by about ten sites that were involved in an engineering discipline but not in competition with each other. We linked to each other before I knew that Google did not like this. (Why is this bad?) Apart from the potential for link popularity improvements our being in aligned businesses created traffic between the sites.
A few weeks ago my site suddenly disappeared, lost its PR5, all titles and cache info. When I raised this on the forum I was told by GoogleGuy that it was probably caused by my linking to one of the sites in the webring because it was being classed as a keyword stuffer.
Now classing this as a bad neighbourhood seems to me to be very wrong. The sites in question were all legitimately trying to generate more traffic through their links and I cannot for the life of me see why this should not be allowed. If sites are offending then by all means penalise them but don't penalise those who link to them.
Obviously we closed the webring when I heard from GoogleGuy, and as a result I have lost the traffic I was receiving from it. I have also lost ALL my Google traffic. This is not fair in my opinion, and it is also a lesson to us all that Google's definition of a bad neighbourhood can be quite "flexible". Be warned!
Understand, it has nothing to do with the actual quality of the site you want to link to, nor does it have anything to do with how useful that link might be to your visitors. "Bad neighborhoods" is all about Google, co-opting the language to look after its own commercial self-interest.
If sites are offending then by all means penalise them but don't penalise those who link to them.
I agree. Google loves to toot the "natural linking" horn, but this goes against that. Imagine the millions of average people out there who have never heard of SEO, and they put up links to sites that they believe their users would be interested in. They have never heard of "bad neighborhoods", so they inadvertently link to some. These quite innocent "natural links" cause the poor webmasters harm. Seems wrong. (We as people familiar with SEO aren't even positive what a bad neighborhood is, although we at least have a good idea. How can they know?)
Please elaborate on your comment 'look after its own self-interest'. Google obviously wants surfers to get the most relevant information. Hence the theming issue. But they also seem to be moving towards geographic searches too.
In my view LOCALITY IS A THEME. A group of people in a town have a common interest with other, unrelated local businesses, and by linking together they can create a multiplier effect. There is NO excuse for Google to penalise this type of link association. It would be easy, in my view, to adjust the algo to take account of postcodes and postcode proximity to others.
Perhaps Googleguy would like to comment?
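For what it's worth, the postcode idea is easy to prototype. A hedged sketch, assuming the engine had a centroid (latitude/longitude) for each postcode, so that great-circle distance could serve as a locality signal; the postcode centroids and the 25 km radius below are invented for illustration:

```python
# Illustrative sketch only: treat links between nearby postcodes as a
# "locality theme" rather than a link scheme. Centroids are made up.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # Earth radius ~6371 km

def same_locality(centroids, code_a, code_b, radius_km=25):
    """True when two postcode centroids lie within radius_km of each other."""
    return haversine_km(*centroids[code_a], *centroids[code_b]) <= radius_km
```

With a lookup like this, two businesses in the same town would pass `same_locality` and their mutual links could be discounted from any "link scheme" scoring.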
Background: Google felt there was a threat to their PageRank system. So they started handing out penalties to webmasters based upon whom they linked to. The most famous of these was the "PR0" penalty from a few years back. Link to a site with a PR0 penalty and you too get penalized. Share the pain.
Nowhere in all that is Google looking out for anyone but Google, hence they are looking out for their own self interest.
As a practical matter, few webmasters can afford to defy the dictates of Google when it comes to linking so, be warned.
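For anyone who has never looked under the hood, the PageRank being protected here is just a power iteration over the link graph, which is why "recirculating" links matter so much. A minimal sketch using the classic formulation with a 0.85 damping factor; the graph and page names are invented:

```python
# Minimal PageRank power iteration (classic formulation, damping d=0.85).
# graph: dict mapping page -> list of pages it links to.

def pagerank(graph, d=0.85, iterations=50):
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # every page keeps the (1-d)/n "random surfer" share
        new = {p: (1 - d) / n for p in pages}
        for src, targets in graph.items():
            if targets:
                share = d * rank[src] / len(targets)
                for dst in targets:
                    new[dst] += share
            else:
                # dangling page: spread its rank evenly over all pages
                for p in pages:
                    new[p] += d * rank[src] / n
        rank = new
    return rank
```

Run it on a toy graph where "a" and "b" link to each other while "c" only links out, and the recirculating pair ends up with nearly all the rank while "c" is left with the bare minimum. That concentration effect is exactly what link rings exploit, and what the penalties were meant to defend.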
They have never heard of "bad neighborhoods" so they inadvertently link to some.
That's the exact point I was making. I am an engineer and less than three years ago I knew nothing, (and I mean absolutely nothing!) about web design or SEO. When I started to become aware of it everywhere I looked someone was talking up link popularity. Linking together seemed like a good idea at the time and I still cannot see why people in associated businesses should not be allowed to do this.
Presumably you exchanged links with the 'keyword stuffer'? You didn't just have an outward link?
I most certainly did exchange links with the keyword stuffer but the whole point was that I did not see any problem with his site. Apologies to those who have read about this in another thread but this particular site deals in hard to find mechanical equipment. A lot of it is foreign equipment and they had listed many of the company names at the bottom of the home page. Each of these linked to another page with more information about what was available.
It did look like keyword stuffing, but this guy was also providing a valuable service. Had I been searching for a part and found his site this way, I would have been delighted. Unfortunately the algo designers have probably never done anything more mechanical than remove the fluff from their mouse, so it was flagged as keyword stuffing.
Is it safe to link to subwebs?
Just where and how were the keywords being stuffed?
You mention a list of mfgs at the bottom of the front page with links to inside pages. Nothing wrong with this, unless you have something like mfg1 keyphrase1 mfg2 keyphrase1 mfg3 keyphrase1 and so on. He would probably be 'had' on repetition and proximity. But is this what happened?
He has a list of links at the bottom of his index page. It is obvious that he is using them to get found but they are valid links. I think if he had created a standard navigation menu at the side of his page with the same content it would have looked OK. I may be wrong so I will sticky you with the URL and you can let me know what you think.
Back then the whole "link exchange" concept was yet to be. And it appears Google is turning it into a page in the history books. Link spam has become ineffective.
Isn't it time to move on?
When asked for clarification about bad neighborhoods, GoogleGuy said, "Don't link to spam." Not sure what spam is? Read Google's webmaster quality guidelines [google.com].
Certainly: they are distinguishable concepts (like assault and battery, or breaking and entering). But Google engineers noticed that spammers, faced with the problem of obtaining link pop for pages that couldn't ever get it naturally, started building bad neighborhoods -- so there's a very high correlation.
I don't think the definition of "bad neighborhood" is at all difficult. There are two aspects:
(1) "Neighborhood": when all the affiliated pages are put together, the amount of "recirculating PageRank links" is substantially [i.e. several orders of magnitude] larger than either the outgoing links or the incoming links. [I place no credence whatsoever in theories about "same C-class IP addresses" or "domain registrant" or the like: for a competent mathematician, the trivial approach is simply to try to partition the transition matrix. Each partition is a neighborhood. (The tricky part, of course, is doing this efficiently. That's what Google hires the PhDs (not engineers!) for.)]
(2) "Bad": presence of blatant, egregious keyword spamming in at least one page of the neighborhood. Googlebot presumably constantly gets better at finding that, so one should expect techniques that _used_ to work, suddenly being detected, and "there goes the neighborhood."
If you're creating artificial link structures AND artificial keyword stuffing, then you need to worry. Otherwise, you should be safe.
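The "neighborhood" half of that test is easy to sketch mechanically. A minimal Python illustration, assuming the link graph is just a dict of page -> set of outbound links; all page names and the order-of-magnitude ratio are invented for illustration:

```python
# Sketch of aspect (1): for a candidate partition of the link graph,
# compare recirculating (internal) links against links crossing the
# partition boundary in either direction.

def link_counts(graph, partition):
    """graph: dict mapping page -> set of pages it links to.
    partition: set of pages under test.
    Returns (internal, outgoing, incoming) link counts."""
    internal = outgoing = incoming = 0
    for src, targets in graph.items():
        for dst in targets:
            if src in partition and dst in partition:
                internal += 1
            elif src in partition:
                outgoing += 1
            elif dst in partition:
                incoming += 1
    return internal, outgoing, incoming

def looks_like_neighborhood(graph, partition, ratio=100):
    """True when internal links dwarf both incoming and outgoing links.
    The post says 'several orders of magnitude'; ratio is adjustable."""
    internal, outgoing, incoming = link_counts(graph, partition)
    return internal >= ratio * max(outgoing, incoming, 1)
```

The hard part the post alludes to is not this counting but *finding* the partitions efficiently across billions of pages; this sketch only checks a partition you already suspect.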
... for a competent mathematician, the trivial approach is simply to try to partition the transition matrix. Each partition is a neighborhood.
DOH! Sorry, I haven't got a clue what you are talking about!
What I will say again is that I see nothing wrong in a few sites that are in similar industries, but obviously not in competition, creating links to each other to generate additional traffic. This was working for me. I used to get inquiries saying, "I saw your link on www.anotherengineeringcompany.com and I would like some information about" ... whatever. I really don't see how this can be faulted but once again the algo being an algo is not objective in this and many innocent sites suffer in the collateral damage.
Anyway, Google must have seen the potential for exploitation of page rank when they were starting out. It was a while ago but I would have thought that even then SEO must have been a fact of life of which they were aware? They created a whole new agricultural industry out of link farming - linko-culture.
Google, I'm sure, wants to be first with this.
Don't forget to sticky me with the alleged keyword stuffer.