After trying to get my site listed in DMOZ for the last six months, I have finally managed to get some feedback!
The feedback is that my site was deleted due to deep linking. Now I'm not sure where to post or whom to ask, but I need to clarify what they mean by deep-linking.
a) I need to create top level pages and only have the deeper links behind these top level pages,
b) I have linked to someone that wants me to link to their home page rather than an article I have found.
c) I have too many outbound links on my site
or d) something completely different?
Any feedback and help would be very much appreciated.
It is a bad start to the new year, but every cloud has a silver lining - and I am now technically further forward than I was an hour ago :)
An editor may have gratuitously added other links to your site without your asking.
Go to DMOZ and search for your site using the URL in the form "mysite.com".
The results will show where "mysite.com" is indexed in DMOZ; you may find your top level is still there.
For instance, if you had a root URL and three subpages all listed, the last three would be considered deeplinks of the first, whether you choose to link them or not.
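As a rough sketch of that classification - the example.com URLs here are hypothetical and this is nothing DMOZ actually runs - any listed URL on the same host as the root, with a path below the root page itself, counts as a deeplink:

```python
from urllib.parse import urlparse

def deeplinks_of(root_url, listed_urls):
    """Return the listed URLs that count as deeplinks of root_url:
    same host, but pointing below the root page itself."""
    root = urlparse(root_url)
    deep = []
    for u in listed_urls:
        p = urlparse(u)
        same_host = p.netloc == root.netloc
        below_root = p.path.strip("/") not in ("", root.path.strip("/"))
        if same_host and below_root:
            deep.append(u)
    return deep

listed = [
    "http://example.com/",                      # the root listing
    "http://example.com/articles/widgets.htm",  # deeplink
    "http://example.com/reviews/",              # deeplink
    "http://example.com/faq.html",              # deeplink
]
print(deeplinks_of("http://example.com/", listed))  # the last three URLs
```

Note that depth in the path makes no difference to this test - one level down or five, it is still a deeplink of the root.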
Without knowing the site or design, it's hard to say.
Another possibility is you spammed the directory so egregiously you managed to get the whole domain removed - unlikely, but it could happen if you repeatedly submitted every page of a site.
To your specifics:
1) No - build a normal site, and expect only the root URL to be listed.
3) Irrelevant, unless they are affiliate links or there is no unique content on the site (is there content left if you delete the outbound links?)
Yep, this is true. And it would have been so terribly difficult for that editor to have pointed that out in the reply you received?
If that is the case you can make your own conclusions about what is going on here!
Actually, no, I can't. I see no correlation between this statement and the preceding lines.
Are you trying to say that an editor or editors managed to add so many deeplinks to a site that it got the site banned?
I'd be interested in knowing the exact wording of the feedback - I can take a better guess at the reasons then.
Of course the only way to know for sure is knowing the URL.
Something else I just thought of - how well do you know the history of the url? archive.org might help you if you have a new (to you) URL.
You appear to be implying abuse - that one editor deep linked and another removed the deep link
All actions are logged at DMOZ, so you can file an abuse report and get it checked if you really think that is what happened.
That doesn't actually imply abuse, does it? What if the new editor didn't know better and a higher editor or meta fixed the directory? Also, many categories are in disarray and could simply be being cleaned out because the content doesn't really fit the category anymore.
I have seen that DMOZ is trying to create hard and fast guidelines. They seem to be trying to remove the gray areas they used to have. Nothing wrong with that - it actually helps.
Your response seems odd if you submitted the top-level domain, unless, as said earlier, the domain is a subsection of another site. Also, sometimes editors make mistakes; nobody is perfect, and they could have confused your site with another when writing the log. It could be any number of issues that DMOZ probably won't expand on anyway. Do you have any of the issues raised in this thread?
But based on that one word "deeplink" and your proposed solutions, it looks to me like you were creating doorway domains for some business (whether yours or someone else's, doesn't matter here), and the editor noticed.
When thinking about submitting business websites to the ODP, I'd suggest this reality check:
1) Is this really a business? Does it already exist, operate, and engage in business activities that provide goods and services -- even without the website? Does it have a unique management structure, unique place of operations, does it actually provide unique goods and services?
If the answer to any of these is "no", stop. There is, fundamentally, NOTHING for your website to have content ABOUT, hence there's no way to create a website with unique content.
2) Does that business already have a website describing a part of their goods and services?
If the answer to this question is "yes", stop. Our expectation is that the business can provide web-navigation to all of its content -- and if THEY, the most interested party in the world, don't think it's worth linking to, then obviously WE shouldn't either. If they DO link to it, then it's not necessary for us to.
But, you object, there are sites in the ODP now that don't meet those criteria. That is probably true: please report them: we'll recheck.
At this point, quality assurance problems receive MUCH quicker review (hours to days) than site submittals, perhaps partly because (so far) they've proven so much more likely to be valid [>90% versus <50%], partly because they are usually quick reviews [yes, that Teletubbies site is now child porn..], partly because, IN THE EYES OF OUR USERS, they are a MUCH more serious problem than unlisted sites.
The last is worth emphasizing. If Aunt Sally can use the directory to find 30 sites offering crocheting hooks, it's not going to bother HER in the least that there are 600 more sites waiting review. Dead links, irrelevant links, etc., WILL bother her.
I understand that the 600 knitting shop owners will feel differently. That's an inherent difference of interest. Many organizations stand ready and willing to assist the shop owners; the ODP has the unique mission of serving the surfers' needs. Any help or harm it might give shop owners is (from our point of view) accidental, inconsequential, and irrelevant.
Deeplinking is rarely appropriate for commercial sites (i.e. those that sell goods or services). For reference/noncommercial sites deeplinking may be appropriate where there is sufficient content to justify its listing in a category.
However a "deeplink" can be a domain in itself. Imagine that widgetcorp.com has a site that sells all colours of widgets, but for the purposes of SEO it's split the different colour widgets into individual sites, such as bluewidgets.com and redwidgets.com - these sites are deeplinks of widgetcorp.com (i.e. the root domain) because there's a logical structure with the root at the top.
The depth of a deeplink is irrelevant. It doesn't matter if it's www.referencesite.com/wombats.htm or www.referencesite.com/living_things/animals/mammals/marsupials/wombats or whatever. The only thing I would suggest is that a deeplink is more likely to be accepted if it's a static URL rather than one of those insanely long URLs generated by content management systems that invariably break.
Yes. In such a case, repeated ODP staff edicts are clear and specific. They are "individual product line listings" and may not be listed separately in the ODP.
This doesn't mean the site shouldn't be laid out like that. You can't manage a large site without imposing a logical, hierarchical structure on it. It just means that you create some pages because you and your users need those pages, not because they might get you extra listings in the ODP.
Then someone (probably the same person) went through and removed all of the other sites I had ever touched (even a major Fortune 100 company.)
1) Is this really a business? Does it already exist, operate, and engage in business activities that provide goods and services -- even without the website? Does it have a unique management structure, unique place of operations, does it actually provide unique goods and services?

I'm not sure how imdb gets past this, but they are now up to 8000 DMOZ links.
However, if the business' website also presents subpages with substantial amounts of valuable informational content, not copied from elsewhere on the web, that would enrich ODP categories, then those may also get listings. Hutcheson made a very good post in one of these threads comparing such a tactic to a business sponsoring public television or a local opera. As far as I personally am concerned, if I find a webpage that has lots of unique content on an informational topic, I'll list it in that topic's category, regardless of whether it was sponsored by a business or not. If that business gets enriched by the link to itself from the informational page it provided, then that's no different than getting a "This program supported by..." mention during a public event they sponsored. Websites with lots of valuable, unique informational content about many topics, such as museum sites, media sites, and online databases, often get multiple links for this reason. If Joe Webmaster provided subpages with lots of valuable, unique informational content about many topics distinct from his business, his site might be eligible for deeplinking too.
Needless to say, information about each product sold by a company does not constitute "valuable unique informational content" and is all very well represented by the company's main URL. I don't really know what purpose anyone thinks it would serve to have 500,000 Toys-R-Us listings in the ODP. (-:
I don't really know what purpose anyone thinks it would serve to have 500,000 Toys-R-Us listings in the ODP.

That would make sense if (as in the real world) the Toys "R" Us toys are the same toys (Barbie, etc.) that you can find at all the stores. But if Toys "R" Us were to sell those toys with Toys "R" Us packaging and a unique return policy, you would essentially have the same product page that imdb has. And to be clear: I don't really have a problem with imdb, or its listings. Only when I see someone mention that their homepage's listing may have been deleted because an editor found they already had one listing does it bother me. Deeplinking and multiple listings are extremely common, not the rare exception we are led to believe.
Only when I see someone mention that their homepage's listing may have been deleted because an editor found they already had one listing does it bother me.
powdork, I very carefully read back and don't see anyone suggesting that the root URL would be/has been removed because a deeplink is already listed.
It's possible that could happen by accident if the site's homepage was not the root - i.e. if your homepage is website.com/myhomepage, but a different valid page on website.com gets listed. I regularly shorten URLs to the root, as that usually provides a more stable URL - it still works when someone changes from index.htm to a.asp.
There is the remote possibility that the submitter submitted hundreds of deeplinks in violation of our guidelines and managed to get their site banned in its entirety, but that's a very rare occurrence.
Taken purely at face value, I'd have to agree with you though - a site's main page should not be removed because a deeplink is already listed.
Not in the least. The packaging and return policy are not worth mentioning in this context. If Toys-R-US had original, quality pages on the history of the Barbie doll, then that content could be listed in a Barbie category.
"Products" is a business term. A web directory shouldn't give a crap about a product. A web directory should care about unique, useful informational content on a subject. The IMDb pages are an excellent example. Those pages are listed for their informational content which has zilch to do with products.
When people talk about having watched a television program, its not the commercials being discussed. The entertainment program content overwhelms the commercials. When the commercials overwhelm the entertainment, those are infomercials. The ODP lists "programs"/webpages that do have some commercials -- but the program overwhelms the commercials. They don't list the multiple infomercials for products.
So that would explain why all those pages on Geocities got shortened to www.geocities.com and all the ATT pages got shortened to home.att.net and all those yahoo groups got shortened to groups.yahoo.com
It's easy to justify removing them if they are all at the root level.
Pretty much. And, like imdb's product pages, it wouldn't get listed at all. That's why imdb DOESN'T and WON'T have 6,000 links in SHOPPING.
Deeplinking is more common in Arts information-based categories than in most other areas of the directory, because there are more pop-culture-related celebrities that are the topic of websites than there are, say, religious or political leaders. And it shouldn't be ignored that by definition, artists create unique content, so as a group they tend to have something to put on a website. The combination of available subcategorization and unique content is more common in arts. IIRC, both John Wesley and Martin Luther first got their own personal categories in Arts (although some might say that wasn't what they were best known for.)
The community is always discussing the amount of deeplinking that best serves the user: and we are nearly always raising the bar. There's a current discussion on deeplinking in Movies categories, and as a result many links will be removed. I don't think imdb will get hit -- hard -- this time around, but as more "celebrity" websites go online, it will be rarer to find unique content on new ones. Editors' emphasis is usually on building categories for wannabe-celebrities or new-celebrities, new or obscure movies. imdb has, over a number of years, carefully built a reputation for getting information on those, hence editors will often look there first when trying to build a new Movie or starlet category.
i.e. where there is a page name on the end of the address, I will strip the page from the address, check the site still works, and list the root.
The root url for a site is not always the TLD that the site is hosted on, as you pointed out with geocities et al.
If someone submitted webmasterworld.com/index.html my first step would be to strip off the index.html and see if the site worked.
If someone submitted members.tripod.com/~antigeoring/index.html (random example), similarly I'd remove the index.html - since the site works perfectly well without it, it is superfluous.
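Purely as an illustration of that editing habit - the list of default page names is my own guess, and a real editor would of course load the stripped URL to confirm the site still works - it could be sketched like this:

```python
from urllib.parse import urlparse, urlunparse

# Common default page names; an assumed list, not an official one.
DEFAULT_PAGES = ("index.html", "index.htm", "index.php", "default.asp")

def strip_default_page(url):
    """If the URL ends in a common default page name, drop it and
    return the shorter, more stable directory/root URL."""
    parts = urlparse(url)
    path = parts.path
    for page in DEFAULT_PAGES:
        if path.lower().endswith("/" + page):
            path = path[: -len(page)]  # keep the trailing slash
            break
    return urlunparse(parts._replace(path=path))

print(strip_default_page("http://www.webmasterworld.com/index.html"))
# http://www.webmasterworld.com/
print(strip_default_page("http://members.tripod.com/~antigeoring/index.html"))
# http://members.tripod.com/~antigeoring/
```

A URL that doesn't end in one of those page names is returned unchanged, which matches the point below: some sites genuinely need the named page, so the stripping can't be blindly automated.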
Yes there are a lot of sites listed with pages as part of their addresses. Why? Because things change over time, because not everyone remembers or is aware that it's better to list the sites root, and because some sites will not function without the named page.
Is it going to be 'fixed'? Nope, it's inconsequential as far as changing listed sites - after all they work as is, although if the site owner changes format the link will not work, and will be flagged for investigation by our robot after the link dies. It can't be automated because of those that do require the page name.
I hope you actually understand this, rather than deliberately misunderstanding my previous post.
Getting back to the original post - deeplinking is not a no no, but it is not common, and certainly should never be expected.
Thanks, I'll happily take credit for that one :)
The ODP guidelines have serious flaws, look some more if you can't see them.
Too many rules......lack of commonsense in them ;)
The ODP makes itself open to abuse. No simple fix for that, other than it becoming totally irrelevant.....which will eventually happen.
Well, it's refreshing to see a non-editor on the other extreme of that continuum -- the usual outsider complaint is "not enough rules".
Of course, people who haven't seen the "hidden abuses" are likely not to see the purpose for some of the rules that are there. That's why I recommend a new editor read the guidelines before starting to edit. Then edit a week and read the guidelines again -- some of them might even make sense this time. Then edit for a couple of months, and read them again. Many of them will make sense this time. After another few months, nearly all of them will make sense; you'll understand why many missing rules _aren't_ there, and you might be ready to start on the top-level-category-specific guidelines.
I taught some beginning programming courses. One of the things I would do, to pass on the benefit of my painful experiences, was point out things that a programmer should _never_ do.
I could divide a class into three groups.
The first group would immediately go do that thing accidentally, then come asking why their programs wouldn't work. I'd explain to them where they had done The Thing That Must Not Be Done. They'd stumble out and make fumbling random changes in their program till the TTMNBD ceased to function.
The better students would immediately go do the TTMNBD accidentally, then come asking why their programs wouldn't work. I'd show them, and they'd go fix it.
The good students would, within a few months, do the TTMNBD accidentally, then come back saying, "now I know why you told us not to do this!"
The rule was good. It was plain simple common-sense, if you understood enough of the art. And yet ... without some understanding, it was worthless.
The ODP guidelines, worked out as they were by an amateur community building experience with taxonomic and linguistic issues at a scale much larger than most people will ever face, partake of the same nature.