Which content types are taboo for search engines? Is .html that much better than .cgi, .php, or .shtml? Laxters
Excellent site - I've been reading through your posts and I've learned a ton about search engine placement and ideas.
Here's my question: How well are dynamic pages being spidered and indexed these days?
I'm currently building out a new site. It's going to be a low-traffic, small-content "brochure" site. I'd love to use .cgi or even .shtml to more easily add some dynamic content or random images/colors.
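For example, the kind of thing I mean with .shtml is a server-side include that pulls in a rotating image. This is just a sketch - the /cgi-bin/random-image.cgi script is a made-up name for whatever script would pick the image:

```html
<!-- Hypothetical SSI directive in an .shtml page: the server runs the
     named CGI script and splices its output into the page before it is
     sent, so the spider only ever sees plain HTML. -->
<!--#include virtual="/cgi-bin/random-image.cgi" -->
```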
So, does this hurt my search engine placement by doing so? Is it harder to crawl a site if the pages are .shtml or .cgi (with or without parameters)?
Thanks in advance,
troels nybo nielsen
Welcome to WebmasterWorld, Laxters
My own experience with dynamic pages is limited, but IMO it definitely gives an advantage in search engines to keep your URLs as uncomplicated as possible.
Others may disagree, though.
Thanks for the info =)
Anyone else have other opinions/ideas/experiences? I guess I'm still wondering about which scripting languages get indexed best.
[ edited by: Laxters at 11:15 pm (utc) on Nov. 17, 2003] richlowe
My understanding is that any page name and type at all is fine. It's the parameters that make it difficult or impossible to crawl. tomparis
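One common way around the parameter problem - a sketch, assuming Apache with mod_rewrite enabled; the file and parameter names here are made up for illustration - is to present clean, extension-style URLs to spiders while the script still receives its parameters:

```apache
# Rewrite a clean URL like /products/42.html to the real
# parameterized script products.php?id=42. The spider follows
# the static-looking link; the server does the translation.
RewriteEngine On
RewriteRule ^products/([0-9]+)\.html$ products.php?id=$1 [L]
```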
Some of my dynamic ASP pages were difficult for the search engines to index, so I also keep a lot of static pages for the robots to chew on.
SEs don't like to read asp code, or jsp either!
> SEs don't like to read asp code, or jsp either
Untrue. There is no difference as far as the browser is concerned between ASP and HTML. ASP is done at the server.
So basically, SEs don't care what scripting language is used on a site, but they have a hard time following links with parameters.
Good to know. I'm leaning towards CGI at this point now.
Now, does anyone know if spiders have problems following off-web root links? What about following into the /cgi-bin?
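One thing worth checking either way: many sites ship a robots.txt that blocks /cgi-bin outright, which would keep well-behaved spiders out of it regardless of how crawlable the links are. A typical entry looks like:

```
User-agent: *
Disallow: /cgi-bin/
```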
It doesn't matter. You can disguise any content type to look like .html or .shtml. One line in .htaccess will parse your .html files as if they were .shtml or even .cgi, and vice versa.
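The one-liner described above, sketched for Apache (assumes mod_include is available and .htaccess overrides are allowed):

```apache
# Run server-side includes on plain .html files, so pages keep the
# "static" extension while still getting dynamic behavior.
Options +Includes
AddHandler server-parsed .html
```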