Forum Moderators: DixonJones


404's - Page not found - Why?

What's causing this?


zoobie

9:31 pm on Sep 17, 2003 (gmt 0)

10+ Year Member



My web stats say 9% of requests get 404's...What could possibly be causing this in such a small site?
Thanks

BaseVinyl

9:34 pm on Sep 17, 2003 (gmt 0)

10+ Year Member



Probably just robots looking for a robots.txt file and not finding one, which generates a 404.

defanjos

9:35 pm on Sep 17, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Look at your stats and try to find out which file or files are causing the 404's; maybe you have a bad internal link, or an outside link to a page that no longer exists.

zoobie

10:28 pm on Sep 17, 2003 (gmt 0)

10+ Year Member



It's just a 2 page site...All links are good...How would I stop the errors with robots.txt?
Thanks

defanjos

10:44 pm on Sep 17, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



How would I stop the errors with robots.txt?

Create one (robots.txt) and place it in the root of the site (the same folder as your index or default page)

Tutorial [searchengineworld.com]
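For a two-page site with nothing to block, a minimal robots.txt that allows all crawlers is enough to stop those particular 404's; this is just the standard allow-everything form:

```
User-agent: *
Disallow:
```

An empty Disallow line means nothing is disallowed, so all robots may crawl the whole site; the file's mere presence means crawlers get a 200 instead of a 404 when they request /robots.txt.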

jdMorgan

12:21 am on Sep 18, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Files which are often assumed to be present on all sites:

/robots.txt Robots control file 
/favicon.ico Windows Favorites Icon
/w3c/p3p.xml P3P Privacy policy

..and maybe more.

User agents may attempt to load any of those, and if you don't have them, you'll see 404 errors.

Jim

amoore

12:28 am on Sep 18, 2003 (gmt 0)

10+ Year Member



And then there are a bunch of worms and hackers that try to hit common scripts with security problems. For instance: formmail.cgi, cmd.exe, default.ida, and so on.

You ought to have a way to find out which filenames are bringing up the 404's. It may not actually be a problem. Don't try to get rid of 404 errors just for the sake of it. They're not hurting anything.
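If your host gives you access to a raw access log in the usual Apache/NCSA combined format (the filename access.log here is just an assumption; yours may differ), a one-liner like this will list the paths producing 404's, most frequent first:

```shell
# Field 9 of a combined-format log line is the status code,
# field 7 is the requested path. Count the 404'd paths.
awk '$9 == 404 { print $7 }' access.log | sort | uniq -c | sort -rn
```

Seeing /robots.txt, /favicon.ico, or worm probes like /formmail.cgi at the top of that list tells you the errors are harmless requests from bots, not broken links on your pages.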