Forum Moderators: Robert Charlton & goodroi
e.g. you can narrow your search down by
'red widget' (click - submit form)
'with screw cap' (click - submit form)
'< £50' (click - submit form)
These filters are passed to the same URL, e.g. domain.com/widgets/, without any parameters in the URL.
Does anyone know if Google's 'form filling' will have an effect on these pages? Or what could I do to make sure the page isn't compromised?
"These filters are passed to the same URL, e.g. domain.com/widgets/, without any parameters in the URL."
I'd suggest changing that technology. Google hopes to uncover previously hidden (but still valuable) content by crawling forms. In theory they will just disregard any near-duplicates that they "find". But as we all know, with automated processes, things can go wrong.
So I prefer some kind of signal in the url that allows me to block the filtered or sorted results sets by using robots.txt or a robots meta tag.
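For instance, once the filtered sets carry a recognisable signal in the URL (the /filter/ segment below is just an illustration, not a recommendation of any particular scheme), keeping crawlers out is a one-liner:

```
# robots.txt -- assumes filtered result URLs share a common segment,
# e.g. domain.com/widgets/filter/...
User-agent: *
Disallow: /widgets/filter/
```

Or, on a per-page basis, a robots meta tag in the page head: <meta name="robots" content="noindex, follow">.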
The key here is that one url should not point to different versions of content. That approach is also not good for human users who may bookmark the page and then come back to see something different.
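A minimal sketch of what that could look like in practice, assuming the filter form is switched from POST to GET (the field names here are invented for illustration):

```
<!-- GET form: each submit produces a distinct, bookmarkable URL,
     e.g. /widgets/?colour=red&cap=screw&maxprice=50 -->
<form action="/widgets/" method="get">
  <select name="colour">
    <option value="red">red</option>
  </select>
  <label><input type="checkbox" name="cap" value="screw"> with screw cap</label>
  <label>max price <input type="number" name="maxprice" value="50"></label>
  <button type="submit">Filter</button>
</form>
```

The parameterised URLs are then easy to block with robots.txt or a robots meta tag, while /widgets/ itself stays the single canonical address that users bookmark.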
e.g. /widgets/red, /widgets/red/small, /widgets/red/small/50 or /widgets/search/, as it's basically reproducing the same products on different pages
Your point about bookmarking is a valid one, but I've got the feeling that having more links/bookmarks to /widgets/ will help me more in the long run?
Just a bit unsure about G indexing all the products and creating duplicates of the page [as the search mechanism uses POST].
Taking your excellent points on board, thanks.