Could you guys give me some advice?
We have a travel site with about a thousand pages, which has been in the search engines for about five years. It uses a CMS that outputs plain HTML.
We have never done any real SEO, apart from changing a few tags around over the last year. Even so, it ranks OK for the keywords we have chosen.
We are going to do a total redesign. The proposed new design has a lot of rich media content, still uses a content management system, and will be very graphics-oriented.
If we made a duplicate version in plain HTML for accessibility reasons, exactly like the BBC have done (you can see this under the 'text only' version on the BBC main page - bbc.co.uk/home/today/textonly.shtml), and then put robot exclusion tags on every page of the rich media site, could this work?
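Just to be concrete about what I mean by "robot exclusion tags" - something like a standard robots meta tag in the head of each rich-media page (a sketch, the exact content values open to discussion):

```html
<!-- in the <head> of each rich-media page: keep this page out of the
     index and don't follow its links, so spiders only see the text-only
     version of the site -->
<meta name="robots" content="noindex, nofollow">
```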
So in effect we would have an index splash page with two entry points. Human visitors go to the colourful graphic pages; the exclusion tags on those pages block the search engines, which instead follow the link to the text-only pages. Previous link popularity still flows to the main index page, plus new links deep-linking into the text-only pages.
Is this considered a kind of cloaking, or is it just an alternative way of doing things? We would have to have the accessible version anyway for legal reasons, as it is a brand site - this way we would just be taking advantage of the SEO benefit at the same time.
We would also need the robot exclusion tags because of the duplicate content - that way only one set of copy gets indexed. We would just be choosing the text site to be indexed instead of the rich media site.
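As an alternative to tagging every page individually, I guess we could block the whole rich-media section in robots.txt - assuming the CMS puts those pages under a single directory (the /rich/ path below is just a made-up example, it would depend on our actual site layout):

```
# robots.txt sketch - /rich/ is a hypothetical directory holding the
# rich-media pages; adjust to the real CMS output structure
User-agent: *
Disallow: /rich/
```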
Do you, or do the search engines, consider this a form of cloaking? If they do, how and why do huge sites like the BBC and CNN do it?
Any help, info or ideas would be appreciated. Thanks.