
Skimping on Analysis & Reporting

As a full-service marketing research company, when we start a new relationship with a client, we are either starting a new research program or revising an existing one. When revising existing programs, we often have access to the prior analysis and reporting performed by the client or by a prior marketing research vendor/partner. Taking a critical look at past analysis and reporting is a delicate exercise, because you don't know the skill level of the former research project manager, nor what resources that manager was given to execute the program. Even so, too often we come across data analysis and reporting errors that could have been avoided.

A data analysis error occurs when analysis is incorrectly executed. Simple mathematical errors are common, which is why data analysis should be quality-checked by more than one qualified person; if you don't have a standard procedure for doing so, you need one. A more significant data analysis error is stopping at simple frequency reporting (straight percentage reporting) when far more information can be mined from the results, often inexpensively, through additional analysis such as cross-tabulation, multiple regression (driver analysis), cluster analysis, factor analysis, perceptual mapping (multidimensional scaling) or structural equation modeling.
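To make that contrast concrete, here is a minimal sketch in Python (using pandas) of the jump from a straight frequency table to a cross-tabulation. The column names and data are hypothetical, purely for illustration:

    import pandas as pd

    # Toy respondent-level data: overall satisfaction by customer segment.
    # (Invented values, for illustration only.)
    df = pd.DataFrame({
        "segment":   ["new", "new", "new", "new", "loyal", "loyal", "loyal", "loyal"],
        "satisfied": ["no",  "no",  "no",  "yes", "yes",   "yes",   "yes",   "no"],
    })

    # Straight frequency reporting: a single top-line percentage.
    print(df["satisfied"].value_counts(normalize=True))

    # Cross-tabulation: the same data broken out by segment.
    print(pd.crosstab(df["segment"], df["satisfied"], normalize="index"))

The top-line number says half of all customers are satisfied; the cross-tab shows the dissatisfaction is concentrated among new customers, which is the finding management can actually act on.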

When it comes to satisfaction surveys, errors often occur before data analysis begins. For example, in call center satisfaction research, if customers' perceived hold time is not measured and it turns out to have a significant impact on overall satisfaction, that is a critical flaw in the data. The obvious effect of such a mistake is that a critical driver goes unmeasured and a key performance improvement opportunity is potentially lost. A more subtle effect is that the less significant drivers that were included may appear more important than they are.
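That subtler effect is easy to demonstrate with simulated data. The sketch below (Python, using numpy and statsmodels; all variable names and coefficients are invented for illustration) builds satisfaction ratings driven mostly by a hold-time rating, then fits a driver model with and without that rating:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 2000

    # Simulated respondent ratings (higher = better). Hold time is the
    # dominant driver; agent courtesy is a weaker, correlated driver.
    hold_rating = rng.normal(size=n)
    courtesy_rating = 0.6 * hold_rating + rng.normal(size=n)
    satisfaction = 1.0 * hold_rating + 0.3 * courtesy_rating + rng.normal(size=n)

    # Full model: both drivers measured; courtesy's weight comes out near 0.3.
    full = sm.OLS(satisfaction,
                  sm.add_constant(np.column_stack([hold_rating, courtesy_rating]))).fit()
    print(full.params)

    # Flawed model: hold time was never asked about; courtesy absorbs part of
    # its effect and its estimated weight inflates to roughly 0.74.
    flawed = sm.OLS(satisfaction, sm.add_constant(courtesy_rating)).fit()
    print(flawed.params)

The unmeasured driver doesn't simply disappear from the report; it quietly redistributes its importance onto whatever correlated questions did make it into the survey.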

Reporting errors occur because even the best program approach and design, combined with the best analysis, is only as good as the researcher's ability to synthesize and report on the results. The most common reporting error by far is failing to present the significant findings in a format that builds management understanding of, and buy-in to, the survey results. The problem can be as simple as poor language syntax or as complex as choosing the wrong results to report or failing to represent the results graphically in the best way.

More common in the current environment is failing to select the best delivery vehicle. For example, an online reporting system is often preferable when distributing results across a geographically dispersed company. The best approach is to talk with management early on and get their feedback on which results they are interested in, how they want to see them and how they plan on using them ... and then customize the reporting accordingly.


All Content © Copyright 2004-2011. Polaris Marketing Research, Inc. All Rights Reserved.

Send inquiries to newsletter@polarismr.com or call 1-888-816-8700.
