“Put your hand on a hot stove for a minute, and it seems like an hour. Sit with a pretty girl for an hour, and it seems like a minute. That’s relativity.” – Albert Einstein
The human senses are wired to respond to relative differences in stimuli, and this should be a constant reminder to us as we design, execute and interpret surveys. Qualitative researchers have the benefit of “proximity” to the respondent and can perhaps better appreciate context, or relativity, in research; quantitative researchers likely need to make a more overt effort. In this article, ORC International discusses how an appreciation of relativity, or “context”, enables us to gain deeper research insights.
In its simplest form, relativity as we define it manifests as “order bias” – that is, our response to one stimulus is biased by what we have just been exposed to. We apply this argument to all aspects of questionnaire design and highlight how an appreciation of relativity can help us draw deeper insights from the research data. The hypotheses discussed in this article provide only a very brief overview of a larger, in-depth research program that the team at ORC International is planning to undertake.
Aspects of Relativity in Survey Design
How our survey invites provide a relative context for response: We need to be conscious of relativity right from the time a survey invite is sent. In an attempt to increase survey response rates, it may be tempting to let respondents know how important their opinion is to us. So between the two survey invites below, the second may seem the obvious choice:
- May I have 10 minutes of your time for a survey on food products?
- We are interviewing consumers like you in an attempt to develop better-quality food products. May I please request 10 minutes of your time to tell us your opinion?
However, we should note that the second invite may elicit a higher-than-average level of spontaneous feedback, simply because it positions the respondent more like an expert relative to the first.
Response anchored to consumer belief systems: In multi-country studies it becomes particularly important to understand respondents’ relative position or beliefs on various aspects of decision making before trying to understand the choices they make. For instance, in evaluating a price promotion communication, we believe it is prudent to first check the underlying beliefs regarding promotions – irrespective of the brand or specific message being tested. In a study done with GMI, we evaluated how promotions in general were received across markets by asking consumers what first came to their mind when they noticed a food category brand offered at a discount. We found that Filipinos were twice as likely to cite “bad/expired product” as a reason for the promotion versus consumers in Singapore or Australia. This base-level difference across markets provided us the relative context that respondents in each market were basing their response on, and thereby enhanced our interpretation of their response to brand-specific price promotion messages.
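The baseline-belief idea above can be sketched in a few lines of Python. All figures, market names and the adjustment weight below are invented for illustration; this is not the actual ORC International/GMI analysis, just one simple way a market-level baseline could be used to contextualise brand-specific scores.

```python
# Hypothetical illustration: reading a brand's promotion score relative to
# each market's baseline scepticism about promotions (all numbers invented).

# Share of respondents who associate any discount with "bad/expired product",
# measured before showing the brand stimulus (baseline belief per market).
baseline_negative = {"Philippines": 0.40, "Singapore": 0.20, "Australia": 0.20}

# Raw mean purchase-intent scores for the brand's promotion message (0-10).
raw_intent = {"Philippines": 5.5, "Singapore": 6.0, "Australia": 6.2}

def baseline_adjusted(raw, negative_share, weight=2.0):
    """Shift each market's raw score by how far its baseline scepticism sits
    from the cross-market average; `weight` is a tuning assumption."""
    avg = sum(negative_share.values()) / len(negative_share)
    return {m: round(raw[m] + weight * (negative_share[m] - avg), 2)
            for m in raw}

adjusted = baseline_adjusted(raw_intent, baseline_negative)
for market, score in sorted(adjusted.items()):
    print(market, score)
```

Read this way, the Philippine score looks stronger than its raw value suggests, because it was achieved against a more sceptical baseline.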
Beyond rotation bias: The team at ORC International, in partnership with GMI, did an experiment to evaluate how an extreme attribute in an attribute battery could influence respondents’ evaluation of the other attributes in the battery. We asked consumers from a variety of countries their level of concern (on a 10-point scale, where 1 is not at all concerned and 10 is extremely concerned) on various issues such as rising population, global warming, economic uncertainty, and health. We deliberately inserted an “extreme” attribute, concern about “Alien Attacks”, which achieved a mean score of 4. Considered in isolation, we would conclude that residents of these countries consider “Alien Attacks” to be a real concern. However, when compared with the other concerns, it was, as expected, the lowest on the list. More importantly, the extreme attribute had likely distorted responses to all other attributes in the survey. Had this attribute not been included, the rest of the attributes would have been more “discriminating”. We are investigating the impact of breaking up any attribute battery into category hygiene factors and category differentiators (where there are likely to be unique/extreme attributes).
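A quick way to see the “discrimination” point is to look at where each attribute sits within the battery’s own range, with and without the extreme item. The mean scores below are invented for illustration (only the Alien Attacks mean of 4 comes from the article); the min-max positioning is just one rough proxy for how discriminating a battery reads.

```python
# Hypothetical means on a 1-10 concern scale (invented, except alien attacks).
means = {
    "economic uncertainty": 7.8,
    "health": 7.5,
    "global warming": 7.1,
    "rising population": 6.6,
    "alien attacks": 4.0,   # the deliberately extreme anchor attribute
}

def relative_position(scores):
    """Where each mean sits within the battery's min-max range (0 to 1):
    a rough proxy for how 'discriminating' the battery reads."""
    lo, hi = min(scores.values()), max(scores.values())
    return {k: round((v - lo) / (hi - lo), 2) for k, v in scores.items()}

with_extreme = relative_position(means)
without_extreme = relative_position(
    {k: v for k, v in means.items() if k != "alien attacks"})
```

With the extreme item included, the four genuine concerns are bunched into the top of the range; once it is removed, the same means spread across the full range and differences between them become far easier to read.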
Survey format setting the context for extent of feedback: In online surveys we miss the opportunity to effectively “probe” for richer responses to open-ended questions. But have you tried providing larger spaces to fill in answers versus smaller boxes? You may be pleasantly surprised to see that, on average, the amount of respondent feedback is proportionate to the relative size of the box.
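Testing the box-size effect is straightforward once the open-ends are collected: compare average response length across the two layout conditions. The responses below are invented stand-ins for illustration.

```python
# Hypothetical open-ended responses from two layout conditions (invented).
small_box = ["Good taste", "Liked it", "Fine"]
large_box = [
    "Good taste and the packaging felt premium",
    "Liked it, though the price seems high for the size",
    "Fine overall; I would buy again if it were on promotion",
]

def mean_words(responses):
    """Average word count per response, rounded to one decimal place."""
    return round(sum(len(r.split()) for r in responses) / len(responses), 1)

print(mean_words(small_box), mean_words(large_box))
```

In a real study one would randomise respondents between the two layouts and compare the distributions, not just the means, but even this simple word-count comparison makes the relative-context effect visible.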
Interpreting responses: In a retailer study we undertook, our consultant uncovered that retailers who named a brand in second position in their survey responses did not all necessarily mean it was their second choice to “push” or sell. Interpreting the results relative to retailers’ selling strategy was essential: it was common for some retailers to first promote a particular brand (which is what we captured in the survey as the brand recommended first) and then, through the sales process, ensure that a different brand (say, the one recommended second in the survey) was the one they actually sold!
Using the results: When results in a product line optimization test indicated one pack type was perceived as too premium versus the others, a possible recommendation for the optimal value line-up could have been to drop the pack perceived to be at a premium. But we would highlight that having a premium-priced pack type on the shelf may set the relative context that makes the others appear better value, and/or pulls up the perceived average price on the shelf, enticing consumers to trade up. An appreciation of relative context helped us make the recommendation to retain the premium-perceived pack type as part of the line-up.
These are just a few examples of how an appreciation of relative context can make a significant difference in the way data is collected and interpreted, which in turn directly impacts marketing decisions and ROI. At ORC International Singapore, we are continuously trying to sharpen our understanding of the relative context in which respondents answer surveys, and how we can read between the lines to get to the heart of what they are saying. Our passionate team of researchers thrives on innovative, creative thinking to maximize the value a research study can deliver to our clients. And we thank GMI for its support in this journey.