
New To Web Development Forum

    
First upload of our new website
what have we forgotten?
noblehouse
11:49 am on Mar 31, 2005 (gmt 0)


Not sure if this is the best forum category to place this in, but I think this is the beginners' room, right?

Ok, a friend and I have just built our first major website (an ezine) and so far we have taken care of the following things:

1. Lots of content: over 100 pages written, not only about our ezine subject but with relevant target pages as well, all keyworded correctly we hope, and we are aiming for 10%-15% keyword density (a quick worked figure follows this list). All spell-checked, read and re-read.

2. Site design made for easy navigation and checked.

3. Site map made, and pages indexed correctly for search engine optimisation.

4. Page code trimmed as much as possible so the site downloads quickly when people visit us.

5. List of directories, link exchange and incoming link sources to contact immediately once site is uploaded.

6. Press release prepared for when the site is uploaded, or should it wait a few months after launch? Hmm, advice on this one please.

7. Articles ready to be submitted to places like article city to hopefully receive some good incoming links.
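(For the density figure in point 1, we're working it out as keyword mentions divided by total words on the page, so for example a 500-word page with the phrase used 50 times comes out at 10%. The 500-word page is just an illustration, not one of our actual pages.)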

So we have covered what we think are the most important points, after spending hours searching through this great forum with all its helpful advice. I'm posting now to ask if anybody can see anything we may have missed, or may be doing incorrectly, that will negatively affect our ranking when we first get listed in the search engines.

So apart from crossing our fingers, praying and being patient whilst working on link development, what else can we do?

Thanks in advance for any help.

 

noblehouse
11:52 am on Mar 31, 2005 (gmt 0)

Do we really need an SEO submission tool, or should we just do it manually through the big players: Google, Yahoo, MSN, etc.?

Goober
11:55 am on Mar 31, 2005 (gmt 0)

Hi Noblehouse,

Congratulations on your site. One thing I have learned here is that once you publish, you will be constantly tuning your site to attain a better ranking. It looks as if you've been very thorough so far.

Press release? Make sure the site is running well, with no missing or broken links, etc. THEN submit the press releases. Check your server logs each day, see what is attracting people to your site, and then re-tune to keep them longer.
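If your host gives you shell access and the usual combined-format Apache access log, a rough one-liner like this will show the top referrers (the log path is just a placeholder; adjust it for your own setup):

    # Referer is the 4th double-quoted field in the combined log format
    awk -F'"' '{print $4}' /path/to/access_log | sort | uniq -c | sort -rn | head -20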

Submission tool? I'd hand submit. These tools can attract more spam than you can shake a stick at. Just be ready for all sorts of weird offers if you do.

Oh, and try to find a way to protect your email addresses from bots and harvesters.
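One simple (if imperfect) trick, if you don't want a whole program for it: write the address with HTML character entities so it still displays and works as a mailto link for people, but is less obvious to the dumber harvesters. A sketch, using a made-up address:

    <!-- "info@example.com" written out as numeric character references -->
    <a href="mailto:&#105;&#110;&#102;&#111;&#64;&#101;&#120;&#97;&#109;&#112;&#108;&#101;&#46;&#99;&#111;&#109;">
    &#105;&#110;&#102;&#111;&#64;&#101;&#120;&#97;&#109;&#112;&#108;&#101;&#46;&#99;&#111;&#109;</a>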

Good luck!

Goober

EBear
12:20 pm on Mar 31, 2005 (gmt 0)

Welcome to Webmasterworld, noblehouse.

At this stage in development/launch I usually find that some of the following have yet to be done:

  • robots.txt
  • custom 404 and 403 pages (see the sketch below)
  • check your server config -
    Does domain.com resolve to www.domain.com?
    Is directory browsing disabled?
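To make the error-page and directory-browsing items concrete: on an Apache host that lets you use .htaccess overrides, something along these lines is a reasonable starting point (just a sketch; the error page file names are placeholders and your host's setup may differ):

    # turn off directory browsing
    Options -Indexes

    # serve your own pages for "forbidden" and "not found"
    ErrorDocument 403 /403.html
    ErrorDocument 404 /404.html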

Finally, my topping-off ceremony for any new site is to create a favicon.
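If you also want browsers that don't automatically look for /favicon.ico to pick it up, a link tag in the head of each page does it (assuming you've saved the icon at the site root):

    <link rel="shortcut icon" href="/favicon.ico">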

These are all more to do with the user experience than SEO. For that, follow Goober's advice above. You only need to submit to directories; get good links and the search engines will take care of themselves.

Good luck.

limbo
12:41 pm on Mar 31, 2005 (gmt 0)

Try to get a spot on the DMOZ
Start a word of mouth campaign
Proofread, proofread, proofread...
Accessibility/User agent testing

noblehouse
1:49 pm on Mar 31, 2005 (gmt 0)

OK, thanks for the quick replies; I'm constantly impressed by this forum.

My partner will probably understand this more, as he's the webmaster and I'm the content-writing guy (although we of course collaborate on everything), but what is robots.txt, and how do we customise our 403 and 404 pages?

Word-of-mouth campaign started already, and proofreading is being done over and over again to make sure everything's perfect.

So what else?

I think the press release will be made at a later date.

Another thing regarding directories. The website will not be totally complete for a long time. Say we are discussing blue widgets, yellow widgets, red widgets, green widgets and black widgets, and each widget section is a couple of hundred pages: we have the one main section complete and the main parts of the others, but of course it will be a long time before the full site is properly complete with full content for each section. If we submit to the larger directories without all sections completed in full, will we be rejected?

sifredi
1:55 pm on Mar 31, 2005 (gmt 0)

Clean URLs.

topr8
1:58 pm on Mar 31, 2005 (gmt 0)

>>but what is robots.txt, and how do we customise our 403 and 404 pages?

They are just polish.

robots.txt - not needed; in fact, if you don't know what it is, it's best not to use it. There is a whole forum here about it if you want to try.

how do we customise our 403 and 404 pages?

Your web host may allow you to customise these pages (404: page not found, 403: forbidden); check your web host's control panel for how.

quiet_man
2:47 pm on Mar 31, 2005 (gmt 0)

Welcome Noblehouse

Some good advice above; I'd second EBear's suggestion to check your server config. The www versus non-www domain issue is very hot right now, and it's good to get this right at the start. Also, you might consider adding a <base href="..."> tag to all your pages; this may help combat any problems with redirects and 'page jacking' (also a hot topic right now).
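For what it's worth, the way I've seen the base tag suggested is to point it at the canonical www address of the page itself, so relative links always resolve against your own domain. For a page at /widgets/blue.html (example.com standing in for your domain):

    <base href="http://www.example.com/widgets/blue.html">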

how do we customise our 403 and 404 pages?

A lot of folks like to create custom error pages that match the design of the site. That much is 'polishing' and may depend on how style/brand conscious you are. But it also helps to include links back to your home page, site map, search function, etc. on these error pages; that way you have more chance of retaining users if they hit a bad link.

Finally, you may want to have a look at this excellent thread and check each item against your own site:

Building the Perfect Page - Part I - The Basics [webmasterworld.com]

noblehouse
2:59 pm on Mar 31, 2005 (gmt 0)

I've had that page bookmarked for quite a while!

I will get the webmaster to check those key points you have mentioned. I'm not too clear on the server configs you are talking about; can somebody explain it in layman's language, please?

quiet_man
3:40 pm on Mar 31, 2005 (gmt 0)

in layman's language, please

I am a layman myself when it comes to most server-side stuff! But what I understand by the issue is that if you are not set up correctly, it may be possible for the same page on your site to have two URLs:

[yourdomain.com...]

and

[yourdomain.com...]

The problem is that if a search engine spider happens across both URLs it may punish one version as duplicate content. And the version it penalises (or simply excludes) may be the one that most of your links point to, and which has the higher PageRank in Google.
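From what I've read here, if the site is on Apache with mod_rewrite available, the usual fix is a single 301 (permanent) redirect in .htaccess so the non-www form always ends up at the www form (example.com standing in for your own domain):

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]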

This old thread tried to look at this issue (though the subject strayed a bit later on):

www.domain.com vs domain.com [webmasterworld.com]

One other item:
If we submit to the larger directories without all sections completed in full, will we be rejected?

If you mean that you have a load of empty template pages, live, awaiting content, then yes that may count against you (depending on how strict the editor is, and how much content you have on other sections). It may also trip a duplicate content filter from the SEs if the templates really are empty of any content. Much better to add subsections as you go, once there is something worthwhile in each.

EBear
11:12 am on Apr 1, 2005 (gmt 0)

they are just polish

Exactly! And that was my point. When you've done everything you know of to make your site search-engine friendly and filled it with your first batch of content, that is a good time to go back and think about your users again. It's like when you've finished building that extension to your house: you hoover!

A custom 404 page is easy to do and, as quiet_man says, you can use it to keep users at your site. (I use it too, to alert me to missing pages.) It stops the site from having an unfinished look. Likewise with favicons: they're not always appropriate, but they're a nice touch.

quiet_man makes some good points about resolving a non-www URL, but again I was actually thinking about your users. I rarely type the www when entering an address into a browser, and it pisses me off if the server can't make sense of it.

Brett has some good info on robots.txt here [searchengineworld.com]. Read the tutorial and just copy/adapt one of the examples.
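If all you want for now is a file that lets every spider in (a safe default while you learn the syntax), the minimal version is just two lines in a robots.txt at the root of the site:

    User-agent: *
    Disallow: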
