I'm new to search engines, and I've been looking through these forums and learning a lot...
But my question is: do search engines all use the same technologies to gather a website's keywords (spiders, etc.), and how often do new search methods/techniques arrive, so that the web designer has to change tactics to gain a better position?
Technological changes happen for a variety of reasons: financial factors, competitors, the market in general, and new theories of search that are researched and implemented. No one can give you a hard answer here; the best bet is to study the history of the SEs and get a feel from there.
Check this thread for starters:
- 95-96: The early Database Engines: simple text-matching database search engines. They stripped the HTML out of pages and looked for matches. (WWW Worm, Excite, WebCrawler)
- 96-98: The Pure Net Play Engines: more advanced matching and adjustable algos based on HTML structure and word density. (AltaVista, Infoseek, Excite)
- 98-: The Link Counters: they combined the best of the first two generations and added inbound link counting to the algos. (Google, Inktomi)
- 2001-: The Context/Theme Engines: newer engines use "off the page" criteria and "context" such as directory listings, text of inbound links, and page context in addition to the first three formats. (Teoma, WiseNut)
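To make the first three eras concrete, here's a toy sketch (my own illustration, not any engine's actual code) of how scoring evolved: plain text matching, then keyword density, then density boosted by an inbound-link count. All names and numbers here are made up for the example.

```python
import re

def strip_html(page: str) -> list[str]:
    """Era 1 prep: strip the HTML out of the page and tokenize the text."""
    text = re.sub(r"<[^>]+>", " ", page)
    return re.findall(r"[a-z0-9]+", text.lower())

def simple_match(words: list[str], query: str) -> bool:
    """Era 1: a plain yes/no text match against the stripped page."""
    return query.lower() in words

def density_score(words: list[str], query: str) -> float:
    """Era 2: rank by word density instead of a simple yes/no match."""
    if not words:
        return 0.0
    return words.count(query.lower()) / len(words)

def link_score(words: list[str], query: str, inbound_links: int) -> float:
    """Era 3: combine on-page density with a count of inbound links."""
    return density_score(words, query) * (1 + inbound_links)

page = "<html><body><h1>Widgets</h1><p>Buy widgets here. Widgets!</p></body></html>"
words = strip_html(page)
print(simple_match(words, "widgets"))                  # True
print(round(density_score(words, "widgets"), 2))       # 0.6
print(link_score(words, "widgets", inbound_links=9))   # 6.0
```

The point of the sketch is just the progression: each era keeps the previous signal and layers a new one on top, which matches the timeline above.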
So it has changed quite a bit over the years from the search side of things.
Spider side - pretty much the same today as it was years ago. The only real change has been in crawler intelligence: they've gotten far better at knowing how to walk through a site and at identifying the important pages.
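That "walk through a site" part is basically a graph traversal. Here's a minimal sketch: a breadth-first crawl over a made-up in-memory link map, so it runs standalone (a real spider would fetch pages over HTTP and extract links from the HTML).

```python
from collections import deque

# Hypothetical site: each page maps to the pages it links to.
site = {
    "/": ["/products", "/about"],
    "/products": ["/products/widget", "/"],
    "/products/widget": ["/"],
    "/about": [],
}

def crawl(start: str) -> list[str]:
    """Visit every reachable page exactly once, in discovery order."""
    seen = {start}
    order = []
    queue = deque([start])
    while queue:
        page = queue.popleft()
        order.append(page)
        for link in site.get(page, []):
            if link not in seen:       # skip pages we've already queued
                seen.add(link)
                queue.append(link)
    return order

print(crawl("/"))  # ['/', '/products', '/about', '/products/widget']
```

The "intelligence" improvements the post mentions would sit on top of this loop: deciding which discovered links are worth fetching first, rather than treating every page equally.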
We see something new about once a year.