I'm taking a class at a university right now that involves a pretty spammy search area. I had some "content review questions" to answer, based on my reading of the (very complicated) textbook. I couldn't find the answers easily in the textbook, so I turned to Google. I typed a few basic 2- and 3-word phrases into Google and found what I was looking for on the first try each time.
Total time spent searching the book (where the material is definitely somewhere to be found): 15 minutes.
Total time spent searching Google: 20 seconds.
I needed to find some course descriptions on a university web site for some classes I took in 1997. I searched around the site for 10 minutes without finding what I was looking for. (Definitely some usability issues there.) I went to Google, typed in the course name and number and the university name, and *boom*, there was the course description.
Total time spent searching the web site (where the material is definitely somewhere to be found): 10 minutes.
Total time spent searching Google: 20 seconds.
Web site: Unsuccessful.
I was looking to buy a Valentine's Day gift for my Sweetie. Talk about your spammy content area. My first Google search came up with pages and pages and pages of affiliates - I wanted a non-affiliate dealer. I tweaked my search keywords by one word and searched again. The site I wanted was at the top.
Time of initial Google search: Five frustrating minutes.
Time of second Google search: Five seconds.
joined:Oct 27, 2001
Glad to hear folks are finding what they're looking for! Pretty sure you can find things on any search engine; is this news, something to get excited about, or just a PR dept. thingy?
It's obviously news to the members of this forum who are convinced that Google can't get anything right. :-)
Good search results on Google have been my experience also, and that continues to be the case. Other search engines have never been able to provide quality as good as Google's. Yahoo! is close, for obvious reasons. Others like MSN are a lost cause, because I'm not interested in wading through hundreds of sponsored and commercial sites looking for the content I need.
joined:May 28, 2002
Pages that are a dead end for a key term will rank high.
Example: Red widget
500 pages linking to siteA.com/page.html with the anchor red widget...
where siteA.com/page.html is further linking to siteB.com/page.html
siteB.com is the dead-end for anchor 'red widget'.
Under the old algo, siteA.com would normally rank higher because of the backlinks and perhaps the on-page factors.
The current algo, however, will not rank siteA.com higher, because siteA.com passes the term 'red widget' on to siteB.com; siteB.com is therefore the absolute authority on red widgets.
The beauty of this algo is that it seeks out all the pages that are dead ends for a specific term, hence the highest relevancy. Think of .edu, .org, and .gov sites, and other purely informational/tutorial sites.
On some very specific queries you end up with very highly relevant SERPs.
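A toy sketch of the dead-end idea above: anchor votes get forwarded along term-anchored out-links until they hit a page that doesn't link out with the term. All the page names and the scoring model itself are made up for illustration; this is not Google's actual algo, just the forwarding mechanism the post describes.

```python
# Hypothetical "dead-end authority" scoring for a single anchor term.
# anchor_votes: page -> number of backlinks using the term as anchor.
# out_links:    page -> the page it links to with that term in the
#               anchor, or None if it is a dead end for the term.

def dead_end_authority(anchor_votes, out_links):
    scores = {page: 0 for page in anchor_votes}
    for page, votes in anchor_votes.items():
        # Follow the chain of term-anchored out-links to the dead end,
        # guarding against link loops.
        current = page
        seen = set()
        while out_links.get(current) is not None and current not in seen:
            seen.add(current)
            current = out_links[current]
        scores[current] = scores.get(current, 0) + votes
    return scores

# The example from the post: 500 pages link to siteA with the anchor
# 'red widget', and siteA links on to siteB with the same anchor.
votes = {"siteA.com/page.html": 500, "siteB.com/page.html": 0}
links = {"siteA.com/page.html": "siteB.com/page.html",
         "siteB.com/page.html": None}
print(dead_end_authority(votes, links))
# → {'siteA.com/page.html': 0, 'siteB.com/page.html': 500}
```

siteB, the dead end, ends up with all 500 votes, while siteA gets nothing even though the backlinks point at it. That's exactly the behavior the criticisms below turn on.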
However, this is only good in theory. If the current algo gives so much weight to anchor theming, it neglects to consider the following:
1. The relevancy of the source of the backlink. The current algo seems to ignore content entirely, or to weigh its value way down. Backlinks from blogs, forums, guest books, and other sources that are not even remotely relevant to the term are given the same weight for their anchors.
So if a general blog or a spammed guest book links out with the anchor 'red widget', the receiving page is deemed highly relevant to red widgets, assuming it doesn't link out with the term red widget itself.
2. Internal backlinks are ignored or weighted down as well. Sites that have multiple pages related to 'red widget' are at the mercy of their external links. Even if the site itself is truly relevant to red widgets, if the backlinks don't reflect that, the site is deemed irrelevant to red widget. The situation is further aggravated by normal navigational cross-linking: if you happen to link out with 'red widget', that page loses authority, as if you had given that authority to some other page. Hence the phenomenon of previously high-ranking pages dropping out of the SERPs, or being buried in them.
3. Portals, link lists, and directories are considered valuable resources for specific terms: they point out the sites that are highly relevant to a term. Most often portals/directories/link lists have high numbers of backlinks, which are mostly copied by other portals/directories/link lists. Under the current algo, P/L/Ds shouldn't rank high, because they link out with the term.
The problem, however, which the algo doesn't take into consideration, is that most small P/L/Ds are not up to date or are poorly maintained, and most often the out-links themselves are dead. When that happens, the portal, link list, or directory becomes the absolute authority for that specific term, because that is where the term ends; hence we sometimes see directories/link lists/portals high up in the SERPs.
4. Lastly - a gift to spammers. Most spam sites are set up so that their backlinks are term-specific and the site itself doesn't link out. Since the theme of the sources of those backlinks carries little or no weight, but their anchors are like gold, the votes for a throwaway domain suddenly become very valuable.
Imagine having 1,000 backlinks from spam sources, all using the same anchor. The throwaway site will rocket to the top.
I've already witnessed this with highly competitive terms where spam is rampant.
joined:May 28, 2002
It clearly shows that the receiving site is the dead end for that term.
But at the risk of being edited by the mods: it's against the TOS to give out specific examples. :)
That's right! just look for "Miserable failure"
It's interesting to do this same search on AV or Vivisimo. The results are not much different from Google.
I'm still happy with the results I get on Google when I do searches, which are mostly for info and not shopping (I mean, who shops every day online? I guess a few people do, but for me it's once every few weeks or months. But I look for information several times daily).
I think Vivisimo could potentially be a big rival because of the clustering of results, which is a big advantage over Google's presentation. But I'd imagine Google could emulate that and maintain their competitive advantage.
As for other searchers -- all I can go by is my stats, which show almost exactly 40 people arriving via Google for every 1 who comes from AV. I see no signs that the average Joe is leaving Google.
joined:May 28, 2002
But not to dampen your enthusiasm: 3-word combos, or even 4- and 5-word combos, are nothing new to those who have been optimizing web sites for search engines. These combos are just considered an added bonus on top of your 'main' target keyword/s. Often these combos are accidental.
What is considered 'main target keyword/s' then?
Keywords that the 'majority' of searchers are using. As a webmaster, these are the keywords you should be thinking about.
I understand the 'searcher' side of your perspective, but as a web publisher, would you be happy to have the majority of users served spammy results, and be content with whatever trickle of traffic you get because a searcher happened to type in a 3- or 4-word combo?
The argument that Google is teaching users 'how to search' is patronizing, IMO.
Nobody teaches users how to search. If they (the users) are not happy with their 1- or 2-word search results, they would rather be doing something else... just ask my wife. Point is, users' tolerance level is very low.
I'm currently #1 out of over 8 million results for a 2-word keyword... Should I be happy about it? On the contrary: that 2-word keyword is very rarely used by actual searchers. It's not my main keyword.
There are tools that can be used to estimate the traffic for a given keyword. If you are not aware of these tools, then the SEO experts are way ahead of you as to which keywords to target. If you are happy just to get a few hits from 3-5-word combos, then you are missing the boat.
If you want to know how not to miss the boat, study the results for the 1- and 2-word keywords related to your site. From there, figure out why your site is not in the top 10.
Then ask yourself: am I doing something wrong with my optimization, or is there something wrong with the algo? Only then can you have a fair shot at ranking; otherwise, just wait for searchers using 3-5-word combos to land on your site.
I can only wish my competitors would just wait for searchers using 3-5 combos.
Answer: Latent Semantic Indexing
I saw one site early on after Florida. It ranked #1 for a very competitive one-word term, but it didn't contain that term, and when we investigated deeper, it wasn't the result of backlinks containing the term either.
The reason that it ranked so highly was because the words on the page were very close semantic matches to the term. In effect the page was a better description of the word than the word itself.
I strongly believe that Google's (Applied Semantics) CIRCA technology is a form of LSI applied to specific terms and to samples produced by the standard Google search. In this way LSI can be applied very efficiently, and what we see looks like an over-optimization penalty. In fact, the term searched for is disregarded in the LSI element of the new algo. The LSI element might even be applied only once a month and stored as a vector for combining with the old part of the algo.
All of you travel and geo-linked real estate site folks: just consider how your pages would be analyzed if you took out the word for your state, city, or region. What meaningful words are left on your pages that give a semantic clue to their location?
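For anyone who hasn't run into LSI before, here's a minimal sketch of the textbook mechanism using numpy's SVD. The documents and term counts are invented, and nothing here reflects CIRCA's internals; it only shows the core LSI idea from the post above: a page can land close to a topic in the reduced space even when it never contains the literal query term.

```python
# Minimal latent semantic indexing: build a term-document count
# matrix, take a truncated SVD, and compare documents in the
# reduced "topic" space. All data below is hypothetical.
import numpy as np

vocab = ["widget", "gadget", "gizmo", "paris", "hotel", "eiffel"]
# Rows = terms, columns = documents. doc0 and doc1 are widget pages;
# doc2 is a Paris travel page. Note doc1 never uses "widget" itself.
A = np.array([
    [3, 0, 0],   # widget
    [2, 2, 0],   # gadget
    [0, 3, 0],   # gizmo
    [0, 0, 4],   # paris
    [0, 0, 3],   # hotel
    [0, 0, 3],   # eiffel
], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2                                # keep the top-k latent topics
docs_k = np.diag(s[:k]) @ Vt[:k]     # documents in the reduced space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# doc1 contains no occurrence of "widget", yet in the latent space it
# sits right next to doc0 (a widget page) and far from the Paris page.
print(cosine(docs_k[:, 0], docs_k[:, 1]))  # close to 1
print(cosine(docs_k[:, 0], docs_k[:, 2]))  # close to 0
```

That's the effect described above: the page is judged by its semantic neighborhood rather than by the literal presence of the searched-for word.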
For many searches adding in this theme authority element to the algo will certainly result in better results.