If someone searches on Google for "Cat", our highly ranked "Dog" page is displayed (as it also talks about "Cat")
So the user lands on a page that mainly talks about "Dog" = poor conversion rate.
Could we capture the referrer string containing the keyword and display appropriate results, for example via a custom 404 page?
Remember, this is to improve the user's experience of the site - not for SEO reasons.
Assuming our site search functionality is good, what are the risks with this approach?
If you serve a 404 for search traffic on one keyword, you're saying the original URL does not exist. Google will most likely discover that, and then the URL will stop ranking for ANY keyword.
However, this kind of Google Search traffic can be a help in improving your content. If there is significant traffic coming on a not-too-relevant keyword, I'd develop a new page that is solidly on-topic, and get that page to start ranking. Link to the new "Cat page" from the currently ranking "Dog page", for instance.
Like Amazon, though, our text-heavy pages show "related products", which Google is picking up on a lot.
The pages are properly optimised & highly relevant for users - it's just Google is showing less than ideal pages about 40% of the time.
There are over 30,000 product pages involved, and what we have now is bringing tonnes of traffic, so I'm reluctant to make any drastic changes.
(There is already lots of x-linking to more appropriate pages as you suggest)
The site would easily pass a detailed manual review so do you honestly think we could get a ban/penalty for showing users a results page when Google sees an actual product page?
Thanks for your time
In some cases it can be a straight "lack of PageRank for the content page" problem; other times it has been the result of a duplicate content problem, where the "real" page has multiple alternative URLs and Google has ignored it in favour of the other page.
I know Tedster has made comments about this issue several times in the past, though I can't remember exactly what he said back then.
The fix seemed to be broadening the semantic content of the page that "should have been" ranking - in other words, I abandoned the narrow focus on the literal keywords and allowed the content to range more widely into synonyms and co-occurring terms.
That was worth the effort because the problem was for one of the five top-level category pages - it's probably not a solution for thousands of individual product pages.
This is one of those areas where Google needs to do better, IMO. It frustrates me for the sites I work with and it frustrates me as an ordinary user of search.
It seems to be a result of anchor text as an ON-PAGE factor. A similar problem exists with blogrolls, and Google recently launched a new fix for that; maybe they will eventually apply similar logic to other types of sites, too. I can only hope.