With MSN shutting down their search operators, I was wondering the following:
Would it be possible for a good coder to create their own tool to spider the web (starting from their own site) in order to find the links that point to their site (or to a specific page, for that matter)?
P.S.: What languages are spiders/crawlers coded in, in the first place? Just curious.
If I understand this correctly, there's no evidence on a page itself that somebody else is linking to it? In other words, only links from pages that have already been spidered can be found?
Would it be incredibly hard to code one's own backlink spidering tool (one that gives you the number of links pointing at your page and their sources)? Or is it just that nobody feels like coding something like that, since you can use Yahoo's search operators in the first place?
If I have a link to your website on my website, unless there is a program running on yours that records the referring site, how would you know that my link existed?
You would have to be something like Google, which indexes virtually all websites, and then pick out from that data which links point to your website.
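For what it's worth, here's a minimal sketch of what such a backlink spider could look like in Python (using the third-party requests and beautifulsoup4 libraries). The seed URLs, target domain, and page limit are just placeholder assumptions; a real crawl would also need politeness rules (robots.txt, rate limiting) and an enormous amount of time, since you can never be sure you've covered every page that might link to you.

# Minimal backlink-spider sketch. TARGET, SEEDS, and MAX_PAGES
# are hypothetical values, not anything from a real tool.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

TARGET = "example.com"            # site whose backlinks we want to find
SEEDS = ["https://example.org/"]  # hypothetical starting points
MAX_PAGES = 100                   # keep the demo crawl small

def crawl():
    queue = deque(SEEDS)
    seen = set()
    backlinks = {}  # linking page -> set of TARGET URLs it points to

    while queue and len(seen) < MAX_PAGES:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue  # skip pages that fail to load
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"])
            host = urlparse(link).netloc
            if host.endswith(TARGET):
                # found a page linking to the target site
                backlinks.setdefault(url, set()).add(link)
            elif link not in seen:
                queue.append(link)  # keep spidering outward
    return backlinks

if __name__ == "__main__":
    for page, links in crawl().items():
        print(page, "->", ", ".join(sorted(links)))

The hard part isn't the code, it's the coverage: this only finds backlinks on pages the crawl happens to reach, which is exactly why the search engines' own indexes are so useful here.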
However, referring websites are easy to record, and many free stats programs do this, though they only catch links that visitors actually click.
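Recording referrers is much simpler, because a browser following a link normally sends an HTTP Referer header with the request. As a rough sketch, here's a tiny server using only Python's standard library that logs that header (the log file path and port are assumptions, just for illustration):

# Minimal referrer-logger sketch. "referrers.log" and port 8000
# are placeholders; a real site would read its web server's logs
# or use a stats package instead.
from http.server import BaseHTTPRequestHandler, HTTPServer

class ReferrerLogger(BaseHTTPRequestHandler):
    def do_GET(self):
        # The Referer header names the page the visitor came from,
        # if the browser chose to send one.
        referer = self.headers.get("Referer", "-")
        with open("referrers.log", "a") as log:
            log.write(f"{self.path}\t{referer}\n")
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"ok")

if __name__ == "__main__":
    HTTPServer(("", 8000), ReferrerLogger).serve_forever()

In practice you wouldn't write this yourself; any web server's access log or free stats program already captures the same information.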