1. It's a known fact that backlinks help a site's position in Google, and a listing in DMOZ is a backlink.
2. DMOZ feeds thousands of clone directory sites with its data, so a link in DMOZ means thousands of backlinks get created to your site.
3. A clone DMOZ directory is duplicate content anyway and surplus to requirements, yet they feature on the net in their droves, and frankly most of them need removing. DMOZ assists in getting more junk onto the internet by feeding these useless directory sites with its data, and at the same time they give out more backlinks to the sites featured. We only need one directory, and that's Yahoo as far as I'm concerned.
4. DMOZ has a percentage of corrupt editors out for their own ends; where you have big-money keywords, you get corruption. Google AdWords are very expensive in some sectors, so help moving a website up the Google SERPs for certain keywords by gaining more anchor-text backlinks is an advantage of significant financial value.
5. As Google is now a public company, it should not be connected in any way to any site that is possibly corrupt or has a percentage of corrupt staff. It should now break away from using DMOZ data.
6. DMOZ, by its own admission, can't cope with the editing job anyway, and has thousands of categories that are out of date, with dead links, incorrect descriptions that have not been updated, and often a bias toward the editor's own or friends' sites.
7. DMOZ is not in any way regulated; it has no publicly known management structure and is not accountable in any way for what it does.
Now, as I see it, because of the weight Google gives to backlinks, an editor is in some cases indirectly getting a nice payoff, as they can decide who they do or don't want in the category they edit. If a competitor comes to get listed, they can simply ignore it; they can waffle around this point as much as they like, but at the end of the day they decide.
Also, backlinks are highly valuable to a site, especially in commercial areas. For a site to obtain as many "one-way" backlinks as DMOZ and its clones provide, without a DMOZ listing of its own, takes a long time and costs money, as not all sites will give one-way backlinks for free the way DMOZ and its clones can.
As I see it, if you build a great site, you don't want some idiot copying it, do you? Yet in the case of DMOZ, they not only let you copy the data, they actively encourage it. Why? Backlinks, of course!
Having now worked on various sites over the last five years, and having seen some get listed, some ignored, and some taken out, I have come to the conclusion that DMOZ has certain corrupt editors within it, and that DMOZ can no longer cope with the size of the internet and the volume of submissions it receives as a result.
Google and DMOZ should now take the following action:
1. Google should delete PageRank on DMOZ altogether, just grey-barring it out on every page, so that sites are not getting unfairly PageRanked backlinks.
2. All clone DMOZ or related clone directory sites should be removed by Google from its index on the basis that they are duplicate content.
3. DMOZ should adopt a policy of nofollow code on the sites it lists and insist that no other site copy its data.
4. DMOZ should deal with all listing requests in the order they were submitted, i.e., a site submitted on 1st Jan 2004 should be looked at before a site submitted on 1st Jan 2005.
5. If an editor finds an unlisted site they think should be listed, it should be added to the same queue, to be reviewed in date order after the earlier submissions have been dealt with.
6. If an editor turns a site down, they should give reasons rather than keep webmasters dangling, and advise what the webmaster should do to improve their site in order to get listed.
7. DMOZ should be regulated and have a proper management structure, and the public should know who they are. They should be properly named and fully accountable for their actions.
8. All data in DMOZ, including editors' notes, should be made wide open to the general public. Under the Data Protection Act, individuals have a right to know exactly what data and notes companies hold about them. DMOZ is clearly in breach of the Data Protection Act by being able to secretly write notes about sites which it keeps from public view.
If DMOZ wants to gain any credibility it should adopt the above measures.
Meanwhile, if Google as a public company continues to use the DMOZ data and does not, imo, act to clean up this relationship and the way it uses that data, it will only be a matter of time until some serious corruption allegations come to the surface, which may ultimately damage Google's reputation and its stock market value.
It's time to act now...
Using nofollow tags on pages on which YOU YOURSELF HAVE PLACED THE LINKS is cheating.
You are intentionally manipulating Google and other search engines.
Surely it's the people who place links in the hope that SEs will give extra weight to the targets who are the manipulators? Using nofollow does not actually stop search engines from following the link; the name is misleading, as it is effectively a hint not to rank that link.
And it's not cheating to mark any link with nofollow. For example, I personally hold the view that all directories, or any site that carries a wide collection of different URLs on one page, should use nofollow, even though it's pretty obvious from page analysis that these links are not natural and that their PR should be discounted anyway.
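In mechanical terms, the policy being argued for here amounts to a simple markup rewrite. The sketch below (the function name and sample markup are my own, purely for illustration; real directory software would use a proper HTML parser rather than a regex) shows what adding rel="nofollow" to every outbound link on a directory page would look like:

```python
import re

def add_nofollow(html: str) -> str:
    """Add rel="nofollow" to every <a> tag that lacks a rel attribute.

    A minimal sketch for well-formed markup; production code should use
    an HTML parser instead of a regex.
    """
    def patch(match):
        tag = match.group(0)
        if "rel=" in tag:  # leave any existing rel attribute alone
            return tag
        # insert the attribute just before the closing '>'
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\s[^>]*>", patch, html)

page = '<li><a href="http://example.com">Example</a></li>'
print(add_nofollow(page))
# → <li><a href="http://example.com" rel="nofollow">Example</a></li>
```

The effect is exactly the "hint not to rank" described above: the links still work for human visitors, but search engines are asked to discount them.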
And it shouldn't be used for links you have placed yourself. If you intentionally placed a link because you thought the link was valuable to your site, then whether Google ranks that link or not is not your business. If search engines really thought every link was placed for the purpose of increasing another page's rank and webmasters should be the ones to decide when that happens and when it doesn't, they would be managing things much differently. Like, for example, making "nofollow" the default and making webmasters use a "follow" tag when they wanted to pass their precious PR on to somebody else. Notice that isn't how they do things.
It's obvious enough what the tag is really there for. It's a deterrent to OTHER people posting links you don't want them to post. No one wants strangers posting "Visit Joe's porn shop!" on their blogs and messageboards. It's not there for me to say "Oh hey, I could increase my own PR if I put a nofollow tag on all my external links. Then I can effectively hint that search engines shouldn't rank the very sites that I chose to link to from my own site! How about that!"
If you really see no difference between the two, I certainly am going to be viewing any "site design suggestions" you have to make about the ODP, my website, or my cousin's pet cat's website with the appropriate amount of suspicion.
If you intentionally placed a link because you thought the link was valuable to your site, then whether Google ranks that link or not is not your business.
I certainly am going to be viewing any "site design suggestions" you have to make about the ODP
It was not my suggestion, but I agree with it: if the ODP is designed for humans, then it should use the "nofollow" attribute on all external links to ensure that its index exists for humans, not to influence search engines. A by-product would be a huge drop in spammers who want to get their links into DMOZ; this would reduce the load on editors, and some allegedly corrupt editors would disappear.
But I think that DMOZ is a different beast. I have seen editors who say they surf the net for good sites to add to their category, and to my mind passing PR is one way to reward those sites. But how many editors work that way, and how many only look after their own ends? If I were to see DMOZ do a big house-cleaning exercise, check for multiple entries and clean those out, then check the editors who were involved in those entries, it would be a start.
It would not be hard to run a small programme to look for multiple entries of root domain names and then investigate those, and to check IP addresses in the same way for nests of sites, and clean up any questionable areas.
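The core of such a programme really is small. A hedged sketch (the function name, data shape and threshold are my assumptions, not anything DMOZ actually runs): group listings by root domain and flag any domain that appears suspiciously often. A real audit would also whitelist legitimately multi-listed sites such as the Smithsonian, and do the analogous grouping on hosting IP addresses to find nests of related domains.

```python
from collections import defaultdict
from urllib.parse import urlparse

def flag_multiple_listings(listings, threshold=3):
    """Group (category, url) pairs by root domain and return the
    domains listed at least `threshold` times, with their categories.

    Illustrative only: a production audit would normalise domains more
    carefully and exclude known legitimately multi-listed sites.
    """
    by_domain = defaultdict(list)
    for category, url in listings:
        domain = urlparse(url).netloc.lower()
        if domain.startswith("www."):  # treat www.foo.com as foo.com
            domain = domain[4:]
        by_domain[domain].append(category)
    return {d: cats for d, cats in by_domain.items() if len(cats) >= threshold}

listings = [
    ("Shopping/A", "http://spam.example/page1"),
    ("Shopping/B", "http://www.spam.example/page2"),
    ("Recreation/C", "http://spam.example/page3"),
    ("Arts/D", "http://ok.example/"),
]
print(flag_multiple_listings(listings))
```

Anything flagged would then go to a human for review rather than being removed automatically, since multiple listings are sometimes entirely legitimate.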
We know G does investigate some of these areas, and with some success, so why not DMOZ?
If DMOZ wants the respect it deserves for a truly massive directory project, it needs to show the world of webmasters that it wants to be seen to be fair to all.
If DMOZ is not seen to handle corruption and to try to be as clean as possible, the big G could drop DMOZ, and it would truly be a shame if something so wide and varied could not be found by Joe Surfer.
If you're going to do a witch hunt on multiple entries then at least think about it a bit.
reveal so very very much, though not necessarily about dmoz or the vast majority of those who volunteer time there.
Oh well, I never was much of a fisherman, and I've probably nabbed more than my share of the hate bait, so I think I'll simply move on to the next thread instead.
Well, any answer to that would not only be uninformed; it would also be useless. If 30% of the editors were corrupt, could the problem be solved by expelling 30% of the editors by lot?
Or, perhaps, could it be approached better by identifying listings that shouldn't be there, then using that data to see what underlying problem might exist. (Is it editor ignorance, webmaster abuse, editor abuse, or something else?) And then, when the underlying problem is identified (rather than insinuated), it will be time to make informed decisions about possible solutions.
At this point the people who are flogging the solution haven't pointed to any actual data even suggesting any problem. As the xerophytic lizard points out, multiple listings by themselves are absolutely no indication of a problem: some sites (Smithsonian, Project Gutenberg) have multiple listings, and the biggest ODP problem is that they do not have enough listings.
And you'll allow that same privilege to the ODP editing community when?
What do you mean, allow? I don't think they require my permission. The ODP community is free to decide its own policies; I can merely express my opinion, but by no means does this allow or disallow what they can do with THEIR site.
Whatever they do is up to them, but as far as I am concerned it is clear that if a directory that lists lots of external sites does not use NOFOLLOW, while being fully aware that a listing in that directory brings very good PR, then clearly this is done to play the search engines.
Given that it is well known how badly many people want to get into DMOZ, it makes sense to use NOFOLLOW to turn away those who want to be featured there primarily for PR.
For reference - one of my sites is in DMOZ.
Do the dmoz editors here believe this is how the directory should be organised?
PS: this is not Yahoo or an authority, just one of many sites covering a specific niche area.
PS: I had never heard of this site before, and found it quickly by checking a couple of cats with high PR and doing a search on the domain root through the rest of dmoz.
One other thing I will say is that it doesn't appear to have helped the site dominate its area, which did surprise me, so maybe webmasters like me are getting our knickers in a twist for no reason.
And you'll allow that same privilege to the ODP editing community when?
Why should the ODP care to retain this so-called privilege? The ODP is not building a directory for Google, but for people. The ODP need not bother about such things as a 'no follow' attribute or the PR of the sites it lists.
Save perhaps if the attribute could slow down the rate of dubious site submissions. With submissions down to a trickle, editors would be free to 'build categories' rather than be mere 'site submission processors.' The 'no follow' attribute would benefit the ODP, and harm no one.
Unless all this 'category building' is utter nonsense, and unless the editors do in fact rely heavily on site submissions, and unless their prime motivator is a belief that they are special to Google.
But, in the spirit of "too much ado isn't nearly half enough", you can't easily compare what is available NOW online with what was available at the time those listings were added -- you certainly didn't mention whether you did that. Nor did you mention whether that site might not have had a unique relationship with the original source of the information. You didn't mention whether the listed website had some institutional support that might suggest greater stability than other sites, or some personal connection that might give it greater authoritativeness. (We don't list all sites that have the same content, we pick one of them, and if it doesn't matter to SURFERS which one it is, it's not going to matter to the editing community either.) Nor can you easily replicate the editors' "random walk through the matrix" to see why that site was found at the time whereas other similar sites might not have been. There are lots of possible reasons, all of which are absolutely known to occur frequently, without resorting to suspicions of either webmaster or editor abuse.
That's not true in any useful sense. Google is people also: why should they be prevented from using the ODP in any way they wish?
>Why should the ODP care to retain this so-called privilege?
But that's just the point. The change under discussion is not, at root, merely about a privilege being retained by the ODP. It is Google's privilege that is in question: the ODP is offering them a privilege, the same privilege possessed by other groups of people working together for commercial purposes. What each group does with that privilege is up to them.
I personally think using the ODP more effectively would improve Google search results significantly. But it is their choice, and I think that's as it ought to be.
And who are you, who am I, to tell them how they ought to run a search engine? If you think their results are poor, tell them, sure, and let their people work on the solution. If they aren't fast enough or good enough for you, start your own search engine. But Monday-afternoon-quarterbackbiting can't be productive.
Okay, I concede that. However, it's also your business if you want to place, say, hidden links on your site. It's a free Internet. You can do that. I'd be surprised to hear you suggesting that somebody else's website ought to use hidden links out of fairness to other sites, though.
>if ODP is designed for humans, then it should use "nofollow" attribute for all external links
Wait a minute, are you saying that EVERY website designed for humans should use nofollow for all external links? Then what websites SHOULDN'T use nofollow? Just the ones aimed at manipulating search engines? Somehow I... don't think that's exactly what the search engines were envisioning, here. In fact, Google's pagerank system would completely fall apart if everyone complied with that. Luckily it's in no danger of happening, but it's odd to hear it advocated.
>The 'no follow' attribute would benefit the ODP, and harm no one.
I doubt both premises. I think it would diminish the ODP to use a tactic like that, regardless of whether somebody thought it might curb spam or not. I disbelieve that it would deter spammers, and suspect it might actually discourage more legitimate submitters (who came across the site but assumed it was a free-for-all post-your-link thing when they saw the nofollow spiel) than spammers (who are either too dumb to read notices or SEO-savvy enough to realize that Google's clone of the directory won't have the same restrictions.)
I can't see what good it could possibly do. The only way it would "benefit the ODP" would be by raising its PageRank through artificially appearing to Google to have no outbound links. And I think that's a lousy goal, personally.
If you really want to eliminate spam submissions completely, go ahead and advocate closing site submissions. It's been advocated before. I admit to being tempted by it, but usually come down on the side of thinking that if even one editor finds the submissions useful, then it's better to leave them. But it's not like you, or I, or anyone else HAS to use them if the spam is too frustrating. We can completely leave them be and use one of the other tools. That's what's so nice about editing. Any task I perform is useful, so I can choose which one I feel like doing and still be productive. (-:
Wait a minute, are you saying that EVERY website designed for humans should use nofollow for all external links? Then what websites SHOULDN'T use nofollow?
No, not every site, only those whose objective is to compile a list of external sites. If there is an article with a few natural references to some sites, then it's fair play, but when you have a page whose only content is a long list of external sites, then it really should have an implicit nofollow; that way only natural links take part in the algorithm's calculations.
Talking of DMOZ specifically, the addition of nofollow would discourage those people who want to be listed mainly to gain PR, which would cut down the amount of work for editors. It's a no-brainer really, unless of course DMOZ is not primarily meant for humans.
If you really want to eliminate spam submissions completely, go ahead and advocate closing site submissions.
Site submissions are not the issue; the issue is link spam and the current inability of search engines to decide which links are natural and which are not.
The ODP has a simple approach to spammy links. We just remove them. Have you seen any lately?
The issue is not the ability of the ODP to stop spam links getting through its human filters. The issue is that, because the ODP is so desirable to spammers due to the PR of its pages, the ODP must be getting lots and lots of submissions that they have to reject. This means a lot more work than they would otherwise have if it were absolutely clear that all links on the ODP carry the NOFOLLOW attribute and thus won't attract any PR.
Directories are a dying breed in the age of algorithmic SEs.
This follows an assertion that algorithmic SEs can somehow categorize billions of websites. If so, where are they? I have yet to see a billion-site directory.
And, have you seen some of the laughable lists of sites Google puts up when you ask about "related" sites? The times that the related sites are consistently related is when most of them are listed in DMOZ in the same category (or maybe category tree) of the site you started with.
DMOZ brings a "net" of organization, a framework, to a few million listings in order to help SEs actually have some sensible results generated (mostly) externally from analysis of keywords on a page, linking patterns, etc.
But there's another subtlety to the power of randomness. Someone thought "long delay" was our tool against spam. It's not. The tool is RANDOM delay, which is a very different thing.
See, for a spammer, too few submittals are fatal -- no listing, if all of them are rejected. Too rapid submittals are futile -- automatically deleted as duplicates. Too many submittals are dangerous -- attracting attention and building a reputation which can cause ex post facto damage. Too slow submittals waste time -- during which there's a high risk that Google will bin the site because of other spammish practices. It's like firing blindfolded: the less information is available, the more frustrated and more angry the spammer gets.
Legitimate webmasters don't have this problem. Their businesses have legitimate promotional opportunities; their reputation relies on a stable public presence and identity; they are used to purchasing services that they need rather than begging or mugging passersby. So additional free web visibility is like an article in the local paper: nice, worth asking nicely for, nothing to expect as a guaranteed right, nothing to depend on, nothing to badger the reporter about.
If an article on a legitimate business is in tomorrow's paper, that's great; if it's in next March's magazine, that's good; if neither comes through, he's not going to badger the journalist because some article in some other paper mentioned a competitor. If his business starts just after the Yellow Pages goes to press, he won't call for a boycott of the phone company until his ad goes in. Such behavior would be insane!
Aren't they? It seems they're Helleborine's main concern, anyway, and then you keep saying things like "ODP must be getting lots and lots of submissions that they have to reject, this means a lot more work for editors." If the issue is that site submissions are too spammy and are causing too much work for editors, the right thing to do would be to move to eliminate site submissions, not use underhanded techniques to stymie search engines and hope that indirectly makes spam go away. It wouldn't. I can completely guarantee that, in fact--unless you think you're also going to be able to convince Google to nofollow its own directory pages, too. Which they'll never ever do--they'll either keep their copy of the directory because they think it helps them organize the web, or they'll delete it because they think it doesn't. They wouldn't keep it there but block their own search engine from using it to theme and rank links. That would be utterly pointless of them.
And it would be equally pointless of the ODP to nofollow all its pages. This isn't link spam we're talking about here, they're manually placed links to sources that the linker deemed relevant. If search engines wanted to devalue ODP links, they wouldn't need anyone's help in doing so--in fact, for all I know some of them HAVE. Or other search engines may have given those links a value boost. That's up to them to decide. For the ODP to intentionally try to mess up whatever algorithms and balances they've decided suit their engines best is no more ethical than for me to do it on my homepage would be.
Really, why go out of our way to juke Google? Nofollow is for me to use to discourage people from posting links I don't want on my messageboard. I'm really *not* supposed to be using it to try to trick Google into thinking sites I choose to link to are not as important as Google wants to believe they are, or trick them into thinking that my site is more authoritative than Google thinks it is. You can't convince me that's what Google wants us to do with that tag. If it is, then why haven't they used it on their own directory?
As far as using it as a resource goes, I go there first every time I need multiple sites on one topic.
Quick question: Do you still get any kind of confirmation when your site is rejected or accepted?
Autumn turns to winter,
And winter turns to spring.
It doesn't go just for seasons you know,
It goes for everything.
The same is true for voices,
When boys begin to grow.
You gotta take a lesson from Mother Nature,
And if you do you'll know.
When it's time to change (when it's time to change),
Don't fight the tide, go along for the ride,
Don't ya see.
When it's time to change, you've got to rearrange,
Who you are and what you're gonna be.
Sha na na na na na na na na
Sha na na na na.
Day by day you're facing the changes you've been through,
A little bit of living, a little bit of growing all adds up to you.
Every boy's a man inside,
A girl a woman too.
And if you want to reach your destiny,
Then here's what you can do...
I loved the part where the 'haphazardness' of site reviews is made to look as good as if it were by design, and the greatest invention since sliced bread.
Ah! Friendly editors, valiant defenders, breathless debaters, are you getting caught up in the heat and flourish of your arguments, and missing the point entirely?