It's a difficult question for sure. For many of the projects I work on, the data that you need to do this is stored in different places, and you have to bring it together first and transform it into a common format.
I use a few tools to help me with this.
I use Python as a Swiss army knife to explore data. There are many Python packages for accessing common APIs (Google, your ESP, whatever) that make putting these pieces together easier. RapidMiner
[rapidminer.com] is open source and provides a nice GUI for building a data pipeline, transforming data, and running statistical analysis. You can script Python (and R) inside RapidMiner, or use their drag-and-drop controls for analysis. They have a healthy, friendly community, plus commercial support if you need it. You can also do basic visualization in RapidMiner, and they have a server component you can use to keep data up to date and deploy dashboards.
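To make the "common format" idea concrete, here's a minimal Python sketch. The field names and sample records are hypothetical, just to show the pattern: each source exports different column names, and you map them onto one shared schema before combining anything.

```python
# Hypothetical example: records from two sources (a CRM export and an ESP API)
# arrive with different field names; normalize them into one common schema.

CRM_FIELDS = {"email_address": "email", "created": "date", "lead_source": "source"}
ESP_FIELDS = {"recipient": "email", "sent_at": "date", "campaign": "source"}

def normalize(record, field_map):
    """Rename a raw record's keys to the common schema, dropping extras."""
    return {new: record[old] for old, new in field_map.items() if old in record}

crm_row = {"email_address": "a@example.com", "created": "2023-01-05", "lead_source": "webinar"}
esp_row = {"recipient": "b@example.com", "sent_at": "2023-01-06", "campaign": "jan-promo"}

leads = [normalize(crm_row, CRM_FIELDS), normalize(esp_row, ESP_FIELDS)]
```

Once everything speaks the same schema, the downstream steps (joining, aggregating, reporting) don't need to know which system a record came from.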
The first step in the project is to define all of your data sources - where are your leads coming from, and how can you access that data? You mentioned a CRM, and obviously you're using analytics of some sort. What else? ESP? Advertising?
Next, you need to figure out which of those systems have APIs, and roughly define what you want to get out of them. For example, with your ESP, you may want to aggregate at the campaign level, at the individual email address level, or at some other grain.
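The "choose your grain" decision above can be sketched in a few lines of Python. The send records and field names here are made up; the point is that the same raw events roll up differently depending on which key you group by.

```python
from collections import defaultdict

# Hypothetical ESP data: one record per send event.
sends = [
    {"campaign": "jan-promo", "email": "a@example.com", "opened": True},
    {"campaign": "jan-promo", "email": "b@example.com", "opened": False},
    {"campaign": "feb-news",  "email": "a@example.com", "opened": True},
]

def aggregate(rows, key):
    """Roll up send events at the chosen grain (campaign or email address)."""
    out = defaultdict(lambda: {"sends": 0, "opens": 0})
    for row in rows:
        bucket = out[row[key]]
        bucket["sends"] += 1
        bucket["opens"] += int(row["opened"])
    return dict(out)

by_campaign = aggregate(sends, "campaign")  # campaign-level view
by_email = aggregate(sends, "email")        # per-address view
```

Deciding the grain up front matters because it determines what you ask the API for - campaign-level stats are often available directly, while address-level detail may mean paging through raw event logs.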
Then, I like to define my output - essentially, mock up in Excel what I want the report to look like. After that, I work through the steps of using Python and/or RapidMiner to bring my data together, normalize it, analyze it, and format it however I like.
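The "bring my data together" step usually boils down to a join. Here's a hedged sketch of the idea, using made-up CRM and ESP data keyed by email address; the shape of `report` is meant to mirror the columns you mocked up in Excel.

```python
# Hypothetical join step: enrich CRM leads with ESP engagement by email address.
crm = {"a@example.com": {"source": "webinar"}, "b@example.com": {"source": "ads"}}
esp = {"a@example.com": {"opens": 3}, "c@example.com": {"opens": 1}}

report = []
for email, lead in crm.items():
    row = {
        "email": email,
        "source": lead["source"],
        # Left join: keep every lead, defaulting to 0 opens if the ESP
        # has no record for that address.
        "opens": esp.get(email, {}).get("opens", 0),
    }
    report.append(row)
```

At real-world volumes I'd reach for pandas (`merge`, `groupby`) rather than hand-rolled loops, but the join logic is the same.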