It sounds like you aren't using the program correctly. After checking a URL (make sure you enter the complete address, http://www.example.com rather than just example.com), Xenu generates a report. The first two sections list broken links, ordered by link and by the page (URL) where each broken link is located.
Thank you... However, what are "the first two sections" you are referring to?
In the first column I have the address, and in the second column the status.
Just *one* long list of addresses, which I presume are the broken pages.
There is no column showing which page on the site is pointing to the broken page...?
You need to look in the menu and generate the report. It will make an HTML page with further information (for simplicity, say "no" when it asks about FTP).
I get a popup asking for FTP details, which I know nothing about. I then press Cancel, which takes me to a blank white page in Opera.
Why is it asking for FTP details? What has that got to do with link data from someone else's site?!
Is there any way I can change the browser preference?
It asks for FTP so it can check the server and warn about files that exist but are not linked from anywhere within the site. Skip that step.
The pop-up asking for FTP details appears because an option under 'More options...' called 'Orphan files' is checked. Clear the check box and the pop-up will stop.
After you run the link check on the site, right-click an entry in the listing of reported errors. Each URL will reveal a single page, or a listing of pages, on which the error occurs. You may have to scroll if the list is large.
Ann Smarty has written several good articles about Xenu usage; they are worth looking up. The Xenu Wikipedia page at 'Xenu's_Link_Sleuth' has a References section with many more links too.
The author has a Yahoo! support group at 'tech DOT groups DOT yahoo DOT com/group/linksleuthupdates/'. In that group is a beta version tailored for huge sites. After you have tried the obvious, posting a question there will get an answer; the tool's author usually replies within a day or so.
Besides using it to evaluate a site to identify broken links and the pages where the broken links are found, to find 301 and 302 redirects, and to see which external sites are being linked to (this is often useful for discovering spammers dropping links or hackers having planted links on a site)...
I also use it to get an exhaustive list of URLs for a domain (my client's, a competitor's, etc.), which I then feed into a PHP tool I wrote that requests each URL, extracts key elements from the resulting pages (such as <title>, <h1>, meta description, meta robots, and meta keywords), and writes them to a CSV file that can be loaded into Excel for further processing and evaluation.
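The poster's tool is in PHP and its details aren't shown, but the same workflow can be sketched in Python with only the standard library. This is a minimal illustration, not the poster's actual code: the file names (`urls.txt`, `pages.csv`), the function names, and the choice of `html.parser` are all my assumptions.

```python
import csv
import urllib.request
from html.parser import HTMLParser


class KeyElementParser(HTMLParser):
    """Collects <title>, the first <h1>, and selected <meta> tags from a page.
    (Hypothetical helper; the original poster's PHP tool is not shown.)"""

    META_NAMES = ("description", "robots", "keywords")

    def __init__(self):
        super().__init__()
        self.fields = {"title": "", "h1": "", "description": "",
                       "robots": "", "keywords": ""}
        self._capture = None  # name of the text field we are currently inside

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._capture = "title"
        elif tag == "h1" and not self.fields["h1"]:
            self._capture = "h1"
        elif tag == "meta":
            d = dict(attrs)
            name = (d.get("name") or "").lower()
            if name in self.META_NAMES:
                self.fields[name] = d.get("content", "")

    def handle_endtag(self, tag):
        if tag in ("title", "h1"):
            self._capture = None

    def handle_data(self, data):
        if self._capture:
            self.fields[self._capture] += data


def extract_key_elements(html):
    """Return a dict of the key on-page elements for one fetched page."""
    parser = KeyElementParser()
    parser.feed(html)
    return parser.fields


def crawl_to_csv(url_list_path, csv_path):
    """Fetch each URL from an exported Xenu URL list and write key
    elements to a CSV file for loading into Excel."""
    columns = ["url", "title", "h1", "description", "robots", "keywords"]
    with open(url_list_path) as f, open(csv_path, "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=columns)
        writer.writeheader()
        for url in (line.strip() for line in f if line.strip()):
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
            writer.writerow({"url": url, **extract_key_elements(html)})
```

Feeding the parser a page and reading `fields` back is the whole trick; the CSV step is just bookkeeping so Excel can sort and filter the results.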
+1, although the tool I use is a bit of VBA in Excel
I also use Xenu often when link prospecting (particularly if I'm unsure of how legitimate the site is) so I can see who the prospect site/blog links to and how frequently they link out. It's also useful for seeing what kinds of outbound links they have, e.g. are 90-100% of their outbound links affiliate links.
EDIT: One more thing I almost forgot: I'd used Xenu for years before I realized that, in the interface, right-clicking a URL and going to URL properties gives you the complete list of pages that link to the selected URL. SOOOO useful for troubleshooting problematic links.
So far we have Xenu being used for researching what kinds of web pages get linked to, which can help in creating a similar kind of page to suggest.
Someone also suggested using it for finding broken links, for the broken-link-building approach.
It has also been suggested that Xenu can be used for quality control, to check if the site is spammed out with outbound links to bad neighborhoods.
Someone suggested it can be used for extracting data that can then be used by other tools for further processing.
Good stuff. Anyone else?