|Are database-driven sites search engine friendly?|
I am developing a website where most of the content is constantly changing: items are being added, removed, and rated. The content is stored in a database.
Will this website be search engine friendly? I don't see why not, since the content still appears on the website. But since the site is so dynamic, will search engines have trouble crawling the ever-changing content?
You can write your script to generate search-engine-friendly URLs.
The SEs don't seem to have any problems indexing and ranking my dynamic content pages these days. In the past there were some difficulties, and URLs were adjusted to look like static pages as a countermeasure. I'm not sure that trick is needed any more.
My URLs are fairly simple. Perhaps long query strings are still a problem. Anyone?
quicksilver1024, database driven sites are not SEO friendly by nature, but one effective technique we use is 'cloaking'. Have you also heard about Google Sitemaps? Try that one too.
Search engines are getting smarter, and as a higher percentage of websites switch to dynamic rather than static pages, it is in their interest to index as much of this content as possible.
Making URLs as search engine friendly as possible is still a smart move; you can generally do this by rewriting your URLs.
In time I believe this will all be irrelevant as the SEs get smarter, but for now we have to give them a helping hand.
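To make the URL-rewriting suggestion concrete, here is a minimal, hypothetical sketch assuming Apache with mod_rewrite enabled (the file names and parameters are made up for illustration, not taken from any particular site):

```apache
# Hypothetical .htaccess sketch (assumes Apache with mod_rewrite enabled).
# Maps the static-looking URL  /product/red/vanilla.html
# to the real dynamic script   /product.php?color=red&flavor=vanilla
RewriteEngine On
RewriteRule ^product/([^/]+)/([^/]+)\.html$ /product.php?color=$1&flavor=$2 [L,QSA]
```

Visitors and search engines only ever see the static-looking URL; the query string stays internal to the server.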
>database driven sites are not SEO friendly by nature
With respect, it is not the fact that a site might be "database driven" that determines its search engine friendliness, but rather how the site is implemented. Cloaking is one option, albeit a risky one, but as others have stated, working with the URLs, having a clear navigation system, and so on, can help a lot.
On the Dynamic URL side - KISS! Keep It Short and Simple. Short as in URL length, Simple as in fewer variables, preferably only one.
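To illustrate the "fewer variables" point, here is a small Python sketch (both URLs are hypothetical, invented for illustration) comparing a verbose dynamic URL with a KISS-style one:

```python
from urllib.parse import urlparse, parse_qs

# A verbose dynamic URL: long, with several query variables (hypothetical).
verbose = "http://example.com/view.php?cat=12&subcat=7&item=993&sort=asc&ref=home"

# A KISS-style URL: short, with a single variable (hypothetical).
simple = "http://example.com/item.php?id=993"

print(len(parse_qs(urlparse(verbose).query)))  # five variables
print(len(parse_qs(urlparse(simple).query)))   # one variable
```

The fewer variables a crawler has to cope with, the less likely it is to treat the URL as an infinite parameter space and skip it.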
Oops - I just found some threads relating to my question
If I read correctly, the actual data contained in the database cannot be accessed directly by SEs; however, once your content is published on the website, it can be read by them.
Also, would an SE-friendly URL contain keywords? Can you give me an example of an SE-friendly URL?
And I won't be cloaking or doing any of that black hat stuff. I'd rather play it safe in this area.
I do know about sitemaps, but I don't think implementing one on my website would be of any help, unless I can simply sitemap the major categories of my website, since the individual items under each category number over 100.
I'm a fan of making the URLs look like static content. There are different ways of doing this depending on which technologies you are using on the backend.
The best way from a search engine perspective is to not include the argument (field) names, but rather a "keyword" that specifies what the rest of the variables are going to be.
An example would be a URL like /product/widget/red/vanilla.html.
"Product" would signify that the variables to follow are the product, color, and flavor. Tack a .html on the end and fudge the Last-Modified header so that it looks like static HTML. Not sure all those things help anymore, but they certainly don't hurt.
Be careful about duplicate content. Make sure each page can only show up at one URL, and write code to enforce the correct URL.
At this point, the search engine (and users) don't really even know it's not static html.
I think there is a trend to drop the .html. It really serves no purpose: it conveys nothing to the user and, in most cases, nothing to the web server.
I think it should be dropped when possible, for the same reason that I drop www. (I do redirect to non-www.) It's superfluous.
This is the default in Ruby on Rails, BTW. The path is just a name for a page. What useful information does ".html" convey? It would be like tacking "person" onto the end of everyone's name. John Doe Person. (Or $50,000 money...)