My client has a very user-friendly site with valuable content on a product that is bought and sold online in great volumes.
The only problem is that the CMS running the website churns out long, complex URLs. While users have no trouble finding what they need through the site's search functionality, the search engines cannot reach the deeper content because of these URLs.
One option I'm looking at for them is to build browse trees based on location and categories, using search-engine-friendly URLs that point to the same content users can reach through the search form. The browse trees would also help users find what they're looking for quickly by browsing instead of searching.
My SEO question here is: since the same content is also available by performing a search on the site, albeit at a much more complicated URL, will the browse tree option create duplicate content issues?
Currently the URLs generated by the search forms are not indexed by the search engines due to their complexity, but there is an off chance they could be if the algorithms become more sophisticated, resulting in two URLs being indexed for the same content.
In summary, I would like some feedback on these issues:
1] Is creating this browse option considered "spamming" because it results in two URLs for the same content?
2] What if robots.txt is used to block the complicated URLs from ever being indexed, so that only one URL is indexed per piece of content?
(Although this still means we will have two URLs for the same content, we would be making sure the search engines don't index both.)
3] Our only goal here is to make the site more user friendly and search engine friendly. If this approach is anywhere near spam, it is a no-go for us.
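For question 2, a minimal robots.txt sketch might look like the following, assuming the CMS serves its search-generated pages under a path such as /search (a hypothetical path; substitute whatever pattern your complicated URLs actually share):

```
# Sketch only: block crawling of the CMS's search-result URLs
# (assumes they live under /search and carry query parameters)
User-agent: *
Disallow: /search
```

One caveat worth knowing: robots.txt prevents crawling, not indexing as such, so a blocked URL that is linked from elsewhere can still appear in results without a snippet. For reliably consolidating the two URLs, it may be worth asking about canonical signals as well.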
Thanks in advance