Focus groups suck. They really do. Sure, focus groups are a long-standing qualitative research technique that has shaped millions of decisions made by product designers, advertisers, and policymakers.
And sure, some data is 100% better than no data.
And without doubt, some focus group designs suck more than others. In the end, they all suck. But they don’t have to.
The typical focus group study design calls for a segmented group of participants who act as proxies for the target demographic that uses or consumes the product. You simply ask open-ended questions about the product’s usefulness, appeal, clarity, price, etc.
Sounds useful, doesn’t it? Like it could generate good data, right?
Within the first two minutes of the first session, however, a participant with a strong opinion invariably hijacks the study. Skilled moderators work hard to draw the quiet participants out, but they often fail, or forfeit those participants’ psychological safety within the group along the way.
Reeling in a hijacked focus group is next to impossible because the moderator is fighting a losing battle with something stronger than a client’s directive: human nature.
That’s right: the sociology and psychology of small groups are stronger forces than most realize. Few things matter more to humans than feeling like they belong to a group. People have an inherent desire to belong to something bigger than themselves. This “belongingness” phenomenon is well researched by both psychologists and sociologists, and it’s the primary reason why focus groups suck.
But this article is about neither psychology nor sociology; it’s about research methods.
What if there were a way to run a useful focus group? After all, having direct access to end users and their opinions gives your work an advantage.
So, when our client recently insisted (asked) that we run a focus group (despite my childish objections), we decided to summon the power of mixed research methods, among other things, to triangulate data and “LoJack” insights.
The focus group study was designed to evaluate the strength of four creative concepts for a prospective student recruitment campaign.
To address our biggest concern with the focus group method, we designed the study so that it couldn’t be hijacked. Instead of asking the group for its opinion on each creative theme, we asked each individual for their opinion on each creative. The opinions were scored on standard Likert creative measures. The data collection instrument was good ol’ fashioned pen and paper; each participant quietly and independently rated each creative as they walked around the room where the concepts were on display.
We turned what would normally have been a group qualitative data-gathering process into an individual quantitative data collection process.
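To make the individual-scoring idea concrete, here is a minimal sketch of how such ratings could be aggregated. All participant IDs, concept labels, and scores are invented for illustration; the actual study used more participants and fuller Likert measures.

```python
# Hypothetical sketch: aggregating each participant's independent Likert
# ratings (1-5) of four creative concepts to find a preferred direction.
from statistics import mean

# ratings[participant][concept] -- all values invented for illustration
ratings = {
    "P01": {"A": 4, "B": 2, "C": 5, "D": 3},
    "P02": {"A": 5, "B": 3, "C": 4, "D": 2},
    "P03": {"A": 4, "B": 1, "C": 5, "D": 3},
}

concepts = ["A", "B", "C", "D"]
# Mean score per concept across all participants
averages = {c: mean(p[c] for p in ratings.values()) for c in concepts}
# The concept with the highest mean is the leading candidate
preferred = max(averages, key=averages.get)
print(averages)
print(preferred)
```

Because every participant rates every concept alone, on paper, no single loud voice can drag the group average toward their own opinion.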
I hear you asking across the internet, “Well, neat, what’s the difference between this and a survey, fancy pants?”
But since we had 12 participants in front of us in a room, we could do more than just a survey: we could test independent recall. To collect good recall data, we needed to put time between the first part of the study and the recall part. We filled that gap with a traditional focus-group-style session.
The inquiry during the traditional focus-group-style part of the study couldn’t touch on the creative options; that would have generated unreliable recall data. We still had unanswered questions about psychographics, media consumption, and decision-making processes, so we focused this portion of the inquiry on gathering those data.
In the end, we collected reliable, actionable quantitative data that told a strong story about a preferred creative direction with high recall. Both the preference and recall findings are explained by the qualitative psychographic, media consumption, and decision-making process data we collected.
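The triangulation step can be sketched in the same vein: cross-check unaided recall against the earlier preference scores and see whether one concept leads on both. Every name and number below is invented for illustration.

```python
# Hypothetical sketch: triangulating preference (mean Likert score) with
# unaided recall (share of participants who named the concept later).
mean_rating = {"A": 4.3, "B": 2.0, "C": 4.7, "D": 2.7}  # invented values

# Which concepts each participant spontaneously recalled after the gap
recalled = {
    "P01": {"C", "A"},
    "P02": {"C"},
    "P03": {"C", "D"},
}

n = len(recalled)
recall_rate = {c: sum(c in s for s in recalled.values()) / n
               for c in mean_rating}

# A concept is a strong candidate when it leads on both measures
leader = max(mean_rating, key=lambda c: (mean_rating[c], recall_rate[c]))
print(recall_rate)
print(leader)
```

When the same concept tops both independent measures, each data set corroborates the other, which is exactly the insurance a hijack-prone method needs.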
Mixed research methods and triangulation served to “LoJack” meaning and insight from what could have been another hijacked focus group study. More than that, we found a way to run a useful focus group while still reliably answering the research questions.
And that’s just awesome.