I allow all content on my site to be crawled. Given that, should I simply skip robots.txt altogether, or have one with the following lines:
User-agent: *
Allow: /
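(For comparison, the allow-all form I've seen cited from the original robots.txt standard uses an empty Disallow instead:

User-agent: *
Disallow:

though I don't know whether crawlers treat the two any differently.)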
Would having the file, rather than omitting it, improve the number of pages the spiders crawl?
-panic