The larger your datasets become, the more heavily GA samples them. My advice is to prototype with a small sample set first, then look into the Data Export API for collating larger datasets.
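One common workaround when collating larger datasets through the Export API is to split the date range into per-day queries so each request stays under the sampling threshold, then stitch the results back together. A minimal sketch of the windowing step (the helper name is an assumption for illustration, not part of any GA client library):

```python
from datetime import date, timedelta

def daily_windows(start, end):
    """Yield (start, end) pairs covering one day each, so every
    API query touches a small enough slice to avoid sampling."""
    current = start
    while current <= end:
        yield current, current  # one-day window
        current += timedelta(days=1)

# Example: query each day of January separately, then collate the rows.
windows = list(daily_windows(date(2012, 1, 1), date(2012, 1, 31)))
```

Each window then becomes one Export API request (via the `start-date`/`end-date` parameters), and you aggregate the per-day rows yourself afterwards.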
I guess it depends on what you want to achieve and what story you want to tell.
With sites whose traffic is in the millions, GA queries bog down or fail so regularly that I'm looking for a real answer.
As soon as I extend the time range significantly, or increase the number of results requested, the failure rate climbs to the point where I'm tempted to give up - and that's without using any of the add-ins or other basic tricks I rely on every day to get more out of GA on smaller sites.
Does anyone have experience with HUGE data sets and GA?
Surely there must be ways to speed this up and/or make it more stable.
Does anyone have any experience with the new PAID Google Analytics tool?