|troels nybo nielsen|
| 9:02 am on Nov 14, 2003 (gmt 0)|
Welcome to WebmasterWorld, Laxters
My own experience with dynamic pages is limited, but IMO it definitely gives you an advantage in the search engines to keep your URLs as uncomplicated as possible.
Others may disagree, though.
| 10:45 pm on Nov 17, 2003 (gmt 0)|
Thanks for the info =)
Anyone else have other opinions/ideas/experiences? I guess I'm still wondering about which scripting languages get indexed best.
[edited by: Laxters at 11:15 pm (utc) on Nov. 17, 2003]
| 10:49 pm on Nov 17, 2003 (gmt 0)|
My understanding is that any page name and type at all is fine. It's the parameters that make it difficult or impossible to crawl.
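For example, a minimal sketch of how you can hide the parameters from spiders entirely (assuming Apache with mod_rewrite enabled; the script name and parameter are hypothetical):

```apache
# Hypothetical .htaccess sketch, assuming Apache with mod_rewrite.
# Internally maps a "clean" URL like /products/5 to the real
# parameterized script, so spiders only ever see the plain path.
RewriteEngine On
RewriteRule ^products/([0-9]+)$ /products.asp?id=$1 [L]
```

The spider crawls /products/5 as if it were a static page, while the server quietly runs the same dynamic script underneath.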
| 3:36 pm on Nov 19, 2003 (gmt 0)|
Some of my dynamic ASP pages were difficult for the search engines to index. I also have a lot of static pages for the robots to chew on.
SEs don't like to read asp code, or jsp either!
| 4:50 pm on Nov 19, 2003 (gmt 0)|
|SEs don't like to read asp code, or jsp either |
Untrue. There is no difference as far as the browser is concerned between ASP and HTML. ASP is done at the server.
| 5:02 pm on Nov 19, 2003 (gmt 0)|
So basically, SEs don't care what scripting language is used on a site, but they have a hard time following links with parameters.
Good to know. I'm leaning towards CGI at this point.
Now, does anyone know if spiders have problems following off-web root links? What about following into the /cgi-bin?
| 5:35 pm on Nov 19, 2003 (gmt 0)|
It doesn't matter. You can disguise any content type to look like html or shtml. One line in .htaccess will parse your html files as if they were shtml or even cgi, and vice versa.
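A sketch of the .htaccess trick mentioned above (assuming Apache and that the server allows Includes for your directory):

```apache
# Hypothetical .htaccess sketch for Apache, assuming Options
# overrides and server-side includes are permitted.
# Parses plain .html files as if they were .shtml:
Options +Includes
AddHandler server-parsed .html
```

To the spider the page is just example.html with ordinary HTML in it; it has no way of knowing the server processed includes before sending it.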