I've been reading some papers about term vectors and the vector space that search engines build through indexing algorithms. A new angle is opening up for me, and I wanted to ask for opinions.
Here's what I'm thinking: Each web page that the algo sees as relevant to a keyword phrase is one point in this big space. These web-page-points are gathered in clusters by the algo, some closer together and some further apart.
What an SEO worker wants to happen in theme optimization is that a group of pages from their site are clustered very tightly ... closer together than pages from other sites.
I visualise the site's structure as a target, with the bullseye as the theme. I then "score" each page as either a bullseye or one of the outer rings, but at no time allow the target to be missed completely.
I include *every* page in this "scoring" method and also try to apply this to both outgoing and incoming links.
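To make the idea concrete, here's a minimal sketch of scoring pages against a theme: each page becomes a term-count vector, and cosine similarity measures how close it sits to the "bullseye". This uses raw counts for simplicity (real engines weight terms, e.g. with TF-IDF, but the geometry is the same), and all page text and thresholds below are made up for illustration.

```python
import math
from collections import Counter

def term_vector(text):
    """Turn a page's text into a bag-of-words term-count vector."""
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    """Angle-based closeness of two term vectors: 1.0 = same direction."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    if nor_a := (norm_a == 0 or norm_b == 0):
        return 0.0
    return dot / (norm_a * norm_b)

# Hypothetical theme and pages for a wildlife-gifts site.
theme = term_vector("wildlife gifts bird feeders nature presents")
pages = {
    "home":    "wildlife gifts and nature presents for bird lovers",
    "feeders": "bird feeders and wildlife gifts",
    "blog":    "our company history and shipping policy",
}

for name, text in pages.items():
    score = cosine_similarity(theme, term_vector(text))
    ring = "bullseye" if score > 0.5 else "outer ring"
    print(f"{name}: {score:.2f} ({ring})")
```

In this toy run the "home" and "feeders" pages land in the bullseye while the off-theme "blog" page scores 0.0; the tighter every page's vector hugs the theme vector, the tighter the cluster in the earlier sense.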
In relation to e-commerce sites, perhaps taking wildlife gifts as an example ;), the key point to establish is how you are marketing the products.
In the quoted example the decision needs to be made as to whether the site is targeted at: 1. gifts, hoping people will want to buy the wildlife products, or 2. wildlife, hoping they will want to buy your gifts. Only when this is established can the site be optimised/themed/marketed.
I have a client who decided we should nuke the number two kw phrase, because it looked like it was bringing in non-buyers and seemed to be the main source of one-page-wonders.
That was 5 weeks ago, and the decision seems to be right. Traffic is down about 20%, but in the last two weeks sales have doubled! Getting the offending phrases off the pages seems to help the target audience feel more comfortable bringing out their credit cards.