Forum Moderators: martinibuster

Managing a large link portfolio

How do you do it?

         

anallawalla

10:11 pm on Mar 7, 2005 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



How do you manage link development for a large portfolio (hundreds) of clients? I know that you can do it in spreadsheets or databases, but I'm looking for ideas on how to do it better.

Here are the issues facing link developers (feel free to add):

  • Keeping an up-to-date list of SEs, directories, and compliant web sites
  • Keeping track of successful and unsuccessful manual submissions
  • Managing email addresses, especially where one needs an address on the domain being submitted.
  • Managing email
  • Managing both new clients and "topping up" links for ongoing clients
  • Tracking backlinks seen in major SEs
  • Tracking Google ToolbarPR
  • Producing reports to see the big picture

Is the answer a custom solution? Or some existing product?

deverson

10:36 pm on Mar 7, 2005 (gmt 0)

10+ Year Member



In the past I have used Arelis for tracking links.

[edited by: martinibuster at 10:42 pm (utc) on Mar. 7, 2005]
[edit reason] url. [/edit]

neuron

5:40 pm on Mar 13, 2005 (gmt 0)

10+ Year Member



Is the answer a custom solution? Or some existing product?

There is no existing product that I know of that does all this.

We build custom DBs of potential linking sites for each campaign using ARELIS. We then export the relevant data to a MySQL DB and control it using a custom PHP script that lets the linker manually review every site before initiating a link exchange. Over time we've built up an extensive list of sites that do not engage in linking, and the raw DB is run against this list to remove those sites. We've also built up a large list of sites that will add your link and then delete it after a week or two, apparently thinking we don't monitor the links; the raw DB is run against this list as well. Finally, we have another list of sites that do engage in linking but for one reason or another (too many links per page, un-indexable link pages, etc.) are not good link partners, and we remove these too.
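The filtering step described above can be sketched roughly like this. Everything here is hypothetical: the site names are made up, and plain Python sets stand in for the actual MySQL/PHP setup.

```python
# Sketch of running a raw prospect list against the accumulated
# exclusion lists: non-linkers, link-droppers, and poor-quality partners.

raw_prospects = {
    "example-widgets.com",
    "example-links.net",
    "example-dropper.org",
    "example-directory.info",
}

# Exclusion lists built up over previous campaigns (hypothetical entries).
non_linkers = {"example-widgets.com"}      # never exchange links
link_droppers = {"example-dropper.org"}    # add your link, then delete it
poor_partners = {"example-links.net"}      # too many links per page, etc.

# Remove every excluded site; what survives goes to manual review.
excluded = non_linkers | link_droppers | poor_partners
clean_prospects = sorted(raw_prospects - excluded)

print(clean_prospects)
```

In a real database the same idea would be a `NOT IN` or anti-join query against the exclusion tables rather than in-memory sets.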

The link management script runs on the linking site's webserver and also has a management console so that SQIs (Shift Quality Inspectors) can review random samples of a linker's work. The linkers do not get paid for errors. They are graded on different tasks such as Deleted Sites, Categorized Sites, Email Contacted Sites, and Form Contacted Sites. They also get a bonus for successfully establishing links.

All the emails are managed via an email client called The Bat!, which creates a separate folder for every email account and can handle hundreds of email accounts.

We used to use RLC (Reciprocal Link Checker) to crawl sites looking for links, but because we have to check so many, we now only use it for monitoring links and creating reports. I once calculated that I would need at least 4.2 TB of bandwidth per month to have RLC do its thing properly. Instead, we have a script we call The Crawler that searches various search engines to find the links we've established, since we want those links to be indexed by the engines. We also use this script to document previously existing links before beginning a campaign; the last thing we want to do is charge someone for a link they already had. The crawler can also document links to our clients' competitors' sites, so that we can try to get links from them as well.
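The core check such a crawler performs, once it has fetched a partner page, is simply "does this page still link to the client?". A minimal sketch of that check, with a hypothetical page snippet and client domain (a real monitor would also catch nofollow, commented-out markup, and redirects):

```python
import re

def page_links_to(html: str, client_domain: str) -> bool:
    """Return True if the HTML contains an anchor pointing at the client's
    domain. Rough sketch only -- it matches href attributes with a regex
    rather than parsing the document properly."""
    pattern = re.compile(
        r'<a\s[^>]*href=["\']https?://(?:www\.)?%s' % re.escape(client_domain),
        re.IGNORECASE,
    )
    return bool(pattern.search(html))

# Hypothetical snapshot of a partner's links page:
html = '<p>Partners: <a href="http://www.example-client.com/">Widgets</a></p>'
print(page_links_to(html, "example-client.com"))  # True
print(page_links_to(html, "other-client.com"))    # False
```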

We also have a script that runs on several high-PR pages and delivers the URLs of pages we have gotten links on to the various search engine bots. It displays anchor text identical to the anchor text on the target URL that links to our client's site. This helps ensure that all the links will be found, and found quickly, by the SEs.
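In outline, a feeder script like that just emits anchors mirroring the live links. This sketch uses hypothetical URLs and anchor text; the actual script's selection and rotation logic isn't described in the post.

```python
# Given the links established for a client, render a small block of HTML
# anchors (same anchor text as the live link) for a high-PR page, so
# spiders following that page discover the new link pages quickly.

established_links = [
    ("http://partner-a.example/links.html", "blue widgets"),
    ("http://partner-b.example/resources.html", "widget store"),
]

def render_feed(links):
    return "\n".join(
        '<a href="%s">%s</a>' % (url, anchor) for url, anchor in links
    )

print(render_feed(established_links))
```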

Then we have yet another crawler that retrieves home page PR, link page PR, and also counts the number of links on the page, both internal and external. The API in RLC allows us to integrate those results into its report.

We haven't done so yet, but we will soon also be able to calculate, roughly, the PR being passed to the client site using the above figures.
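A back-of-the-envelope version of that calculation might look like the following. The damping factor and the even split across outbound links are standard textbook PageRank assumptions, not anything from the post, and ToolbarPR is a logarithmic snapshot, so this is only good for comparing link pages against each other.

```python
# Rough "PR passed per link" estimate: a page's PR, damped, divided
# evenly among all links on the page (internal + external).

DAMPING = 0.85  # conventional PageRank damping factor (assumption)

def pr_passed_per_link(page_pr: float, total_links_on_page: int) -> float:
    if total_links_on_page == 0:
        return 0.0
    return DAMPING * page_pr / total_links_on_page

# A PR5 links page with 50 links vs. a PR4 page with only 10 links:
print(round(pr_passed_per_link(5, 50), 4))  # 0.085
print(round(pr_passed_per_link(4, 10), 4))  # 0.34 -- fewer links, more value
```

The comparison illustrates why the crawler counts links on the page: a lower-PR page with few links can be a better partner than a high-PR page stuffed with links.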

The final report is a list of all the URLs, the PR of the linking page and the site's home page, the anchor text used, the descriptive text used, and the URL being linked to.

After a campaign is completed, the DB is downloaded and merged into our master DB, which can be sliced, diced, and searched in several ways.

I'm not a programmer so I can't tell you just how all these things work. It has taken more than a year to put all this together.

anallawalla

10:48 pm on Mar 13, 2005 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Thanks, neuron. Here is an updated list of issues facing large portfolio link developers:
  • Keeping a list of clients and their link campaign status

  • Deciding where to use manual effort and where to use a script

  • Keeping an up-to-date list of SEs, directories, and compliant web sites that offer clean links

  • Categorising this list by topics, if applicable

  • Deciding whether to run a link management script on each client's site or centrally

  • Keeping track of successful and unsuccessful manual submissions and updating the master list

  • Managing email addresses, especially where one needs an address on the domain being submitted.

  • Managing email

  • Managing both new clients and "topping up" links for ongoing clients

  • Tracking backlinks seen in major SEs

  • Tracking Google ToolbarPR

  • Producing reports to see the big picture