While I'm all for data collection, the task of constantly analyzing these issues is beginning to overtake development and normal maintenance for some businesses. I understand the need for precise data, but that means everyone needs to be using the same analysis tools and methodology. The SEO experts out there no doubt have more experience and knowledge in these methods, compared to those of us thrust into becoming SEO experts overnight. A concise checklist and how-to guide would be a great start, along with a boilerplate form for data collection.
During our previous discussions, some reports talked about major traffic drops, others talked about the same traffic level but conversions dropped like a rock. That sounds like two very different things, so maybe we can differentiate them.
Tedster, you're absolutely correct in that statement...in fact I find myself sometimes reporting "traffic drops" when perhaps I should say "traffic quality drops". Poor quality traffic equates to low sales and sometimes I just report "low sales" or "sales drops".
The only other problem I find with analyzing data is the anonymity we must all maintain on these boards. Talking blue widgets and red widgets doesn't really help in understanding a particular niche or site type.
In the end, comparing general observations is the best we can expect from such a varied group of webmasters. As far as site types, there is obviously a big difference in how G handles a Fortune 500 company with high brand recognition vs. a Mom & Pop venture with or without a trademark, so that should be a top sorting criterion. I honestly doubt any top name-brand sites are seeing negative effects from these algo updates. That begs the question of "collateral damage" and at what level that is most likely to occur.
If Google's algo updates are a hand grenade tossed at a particular problem area, then precision is not practical and may in fact be impossible. Some smaller sites are clearly the ones suffering. The big sites have enough clout to armor themselves against the updates.
Tedster's bold items above are a good starting point for a reporting checklist. Maybe we can put together a master list & guide to get everyone reporting the same data and to perhaps find a common denominator.
Note, I suggest AWStats simply because it is a common reporting tool included with most hosting accounts.
Many may not be using GA. GA data can be used later to drill down on specific cases.
Here's some of my suggestions to classify sites:
1. Approximate daily site traffic - unique visitors. (using AWStats)
2. Hand crafted HTML or CMS?
3. Product / service vs. Adsense only monetization. (P, S or A)
4. Site age. (months, years)
5. Branded - Trademark or Patented product?
6. Geographic location (country)
7. Low traffic only?
8. Same Traffic, Low Quality - Low Sales.
9. Percentage of Google traffic to the site, from server logs. (using AWStats)
10. Total keywords for current month. (using AWStats)
11. Approximate deviation in traffic from the same time last year. (using AWStats)
12. Approximate deviation in sales from the same time last year. (using AWStats)
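As a rough illustration of a "boilerplate form", the checklist could be captured as a small Python structure so every member's report has the same fields. All field names here are my own invention, not any WebmasterWorld standard:

```python
from dataclasses import dataclass

@dataclass
class SiteReport:
    """One standardized site report; fields follow the 12 checklist items."""
    daily_uniques: int            # 1. approximate daily unique visitors (AWStats)
    site_build: str               # 2. "hand-coded HTML" or the CMS name
    monetization: str             # 3. "P" (product), "S" (service) or "A" (AdSense)
    site_age_months: int          # 4. site age in months
    branded: bool                 # 5. trademarked or patented product?
    country: str                  # 6. geographic location
    low_traffic_only: bool        # 7. traffic drop only?
    same_traffic_low_sales: bool  # 8. same traffic, low quality / low sales?
    google_traffic_pct: float     # 9. share of traffic from Google (AWStats)
    keywords_this_month: int      # 10. total keywords, current month (AWStats)
    traffic_yoy_pct: float        # 11. traffic change vs. same period last year, %
    sales_yoy_pct: float          # 12. sales change vs. same period last year, %

def yoy_deviation(current: float, last_year: float) -> float:
    """Percent change vs. the same period last year (items 11 and 12)."""
    return (current - last_year) / last_year * 100.0
```

For example, a site that went from 10,000 uniques last year to 8,000 now would report `yoy_deviation(8000, 10000)`, i.e. -20%. Something this simple (even just a copy-paste text template with the same 12 lines) would remove a lot of ambiguity between reports.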
That's all I have for now...feel free to edit / add to that list if you like.
Moderators - please let me know if sharing this type of info between members violates any WebmasterWorld TOS.
Maybe we can refine this into a standard report format to remove as much ambiguity as possible so we can get to the bottom of our collective site issues. This is a fresh topic, so maybe its common theme can be "more concise, standard reporting". I like to hear about other webmasters' problems, but finding a solution is more important and might help save our struggling online businesses.
Cheers!