Changing static site to dynamic
Does changing the URLs affect the SERPs?
I have a static site; all files are .htm. Now I want to change it to dynamic. I will rewrite the URLs, so some URLs will stay the same and some will be new, but all of them will be .aspx files.
I wonder whether Google will treat the URLs as new and whether that will negatively affect my SERP positions.
Any help will be appreciated.
Although we casually speak about search engines indexing and ranking "pages", the reality is they index and rank "urls" -- there's no technical definition for the word "page".
So if you change the url, it is a new url and it goes through the regular hazing process that Google imposes on new things. Where you must change a url for already ranking content, make sure that "old" urls no longer resolve with 200 OK. A 301 permanent redirect or a 404 is fine.
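On Apache, for instance, such a 301 for a renamed URL can be a one-liner in .htaccess (the paths here are only illustrative, and ISAPI Rewrite on IIS offers an equivalent rule):

```apache
# .htaccess -- send the old static URL permanently to its new home
RedirectPermanent /about.htm /about.aspx
```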
Don't change the URLs of your resources, or the links to them. If you don't change the URLs or links, then Google won't do anything -- they don't care whether your site is dynamic or static; all they care about is URLs. If you change the URLs, then Google and the other SEs will have to re-index and re-evaluate your site -- and that might cause up to a 3-month drop in your rankings and traffic.
The standard procedure is to *internally rewrite* your old static URLs to the form needed by your new (dynamic) page-generation script. This can be done using ISAPI Rewrite on IIS, or with mod_rewrite on Apache [webmasterworld.com] server.
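As a rough sketch of such an internal rewrite on Apache -- the file and parameter names are made up for illustration, and ISAPI Rewrite on IIS uses a very similar rule syntax:

```apache
# .htaccess -- keep the public .htm URL, serve it from the dynamic script
RewriteEngine On
RewriteRule ^([a-z0-9_-]+)\.htm$ /page.aspx?name=$1 [L,QSA]
```

Note there is no [R] flag: this is an internal rewrite, so the browser and Googlebot continue to see only the original .htm URL.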
There is no need to change URLs in this case. As a matter of fact, with a little site-structure planning, there is never any need to change a URL. See this comment from Tim Berners-Lee, the inventor of the Web [w3.org].
Thanks, Tedster and Jim.
I know how important URLs are for the Google SERPs, but I add new content constantly and there are more and more pages. If I change the site to dynamic, it will only need one person to administer it.
Is it possible to change to a dynamic site and keep the rankings?
With a good linking structure, your new urls for your new content will stand just as good a chance of ranking well as your present urls do. Dynamic urls are not necessarily a problem at all, but be sure that there is only one url for any given bit of content. There are a few more pitfalls to be careful with, that's all. Rewriting the new dynamic urls to remove the query string is a very good way to go.
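One way to enforce "only one url for any given bit of content" with mod_rewrite is to pair an external redirect with the internal rewrite, so the raw query-string form can never survive in the index. All names here are hypothetical:

```apache
RewriteEngine On
# If the dynamic URL is requested directly, 301 it to the clean form
# (matching THE_REQUEST, not the rewritten URL, prevents a redirect loop;
#  the trailing "?" strips the old query string from the target)
RewriteCond %{THE_REQUEST} \s/article\.aspx\?id=([0-9]+)\s
RewriteRule ^ /articles/%1.htm? [R=301,L]
# Internally map the clean form back onto the script
RewriteRule ^articles/([0-9]+)\.htm$ /article.aspx?id=$1 [L]
```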
Rewriting the .htm URL to the new .aspx URL is 50% of what you should do. This keeps the information available via the old URL and is the basis to keep your rankings.
The second thing you have to do, however, is make sure that Google gets no direct access to the new .aspx URL. For example:
Your old URL is www.example.com/file.htm
Your new URL is www.example.com/dynamic/file.aspx
When Google manages to spider the /dynamic/file.aspx URL, it will compare it with the file.htm version it already has in its index, and a duplicate content penalty might be the result. I am suffering from this phenomenon on one site myself. On that site, Google has found my wanted www.example.com/file.html versions, but for some strange reason it also managed to access the site with www.example.com/index.php?... URLs. Almost the complete site went supplemental because of this. I still don't know how Google managed to find the /index.php?... series of URLs, but once Google knows they are there, it is almost impossible for me to remove them from the index.
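One defensive measure -- assuming the dynamic script lives under a directory of its own, as in the /dynamic/ example above -- is to keep spiders out of that directory entirely:

```
# robots.txt -- never let spiders fetch the raw dynamic URLs
User-agent: *
Disallow: /dynamic/
```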
[Google has found my wanted www.example.com/file.html versions, but for some strange reason also managed to access the site with www.example.com/index.php?...]
I don't mean to hijack this thread, but I'm having the same problem with Yahoo, and I'm guessing it will soon be a Google problem also. All my URLs are static .html using Apache and mod_rewrite, but somehow Yahoo is managing to index www.mysite/php?=... This has got to be a major problem. I'm thinking there must be a way to solve this with .htaccess, but I haven't quite got it yet. Anyway, it's something to be aware of when changing your site structure.
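Since the exact URL pattern above is unknown, this is only a guessed sketch, but the usual .htaccess cure is to 301 any request that still carries a query string back to the canonical static-looking URL:

```apache
RewriteEngine On
# Hypothetical pattern: /index.php?page=foo -> /foo.html
# (the trailing "?" drops the old query string from the redirect target)
RewriteCond %{QUERY_STRING} ^page=([a-z0-9_-]+)$
RewriteRule ^index\.php$ /%1.html? [R=301,L]
```

If /foo.html is itself internally rewritten back to index.php, match against %{THE_REQUEST} instead of %{QUERY_STRING} to avoid a redirect loop.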
I don't know about ASP, but I similarly switched to dynamic content using PHP. A simple entry in the .htaccess file did the job, so now all HTML files run through the PHP parser. No need to change the URLs -- just think of all the backlinks you might have, which would otherwise now be dead links.
This might cause problems on sites with high traffic, but for me it works perfectly fine. Is there any reason why a similar construction would not work with ASP?
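For reference, the .htaccess entry that routes HTML files through PHP typically looks like the following; the exact handler name depends on how PHP is installed on the host (mod_php is assumed here, and some hosts require AddHandler with a host-specific handler name instead):

```apache
# .htaccess -- parse .htm/.html files with the PHP interpreter
AddType application/x-httpd-php .html .htm
```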
lammert and fjpapaleo: didn't you generate a meta name="robots" content="noindex" in your dynamic files? Alternatively, did you explicitly exclude them in your robots.txt file?
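For completeness, the meta tag in question -- emitted in the head of the dynamic template whenever the page is reached via the unwanted raw URL -- would be:

```html
<!-- keep this version of the page out of the index, but let the spider follow its links -->
<meta name="robots" content="noindex,follow">
```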