Flipping through the latest Industry Standard, I came across an article, "The Great Pretenders":
Companies that offer free magazine subscriptions in exchange for survey information are sitting on a landfill of garbage data, because in the questionnaire universe, everyone is a senior executive with the power to approve millions of dollars in hardware, software and consulting services. Cubicle workers may not have money, power or prestige, but they’re getting the same junk mail as their bosses.
The article also describes a syndrome endemic to focus groups: not only do people lie to get into the groups, they then lie to win the approval of the other participants. Flirty women adjust their opinions to please the male members of the group, and vice versa. Their opinions simply can't be trusted.
At Carbon IQ we've luckily seen far less of this, partly because we do far more usability testing than focus groups, so we don't have to deal with the vagaries of group dynamics. We have come across the occasional "professional tester": a person who lies about how often they participate in user research studies so they can do more of them and collect the nice little stipend. One fellow not only did that, but was savvy enough to say on the screener that he was a student. He was a student, part-time. We recognized him as a full-time local web developer, a profession we were specifically trying to avoid.
Otherwise, we've caught folks by listening closely: often they grow comfortable over the course of the test and say something like "At the other test..." Oddly enough, they are often willing to confess during the test that they are repeat testers, even though they lied on the screener. Perhaps they feel their money is already guaranteed at that point. We've long taken the precaution of recruiting two more subjects than the minimum needed, as much to be able to discard those who give us useless data as to cover for no-shows.
As budgets tighten, more and more folks are doing guerrilla usability testing. However, areas with high concentrations of web and software companies (such as San Francisco) pose a problem for finding suitable test subjects. Between the preponderance of designers and developers and the fact that almost everyone has been to a test or two, it's hard to find an "average" user. My first piece of advice would be to hire a seasoned usability specialist and recruiter, but if that is out of the question, try these tricks:
- Write your screener carefully. Be very explicit in weeding out anyone who designs or builds products. People who design never seem to just use a product; they are always slipping into design mode: "If you moved this to the upper right, I could find it more easily," or "Red would work far better for this button." Marketing and market research people are also problematic, as they are often thinking as much about how the test should be run as about the product they are supposed to be using.
Also ask whether someone has *ever* held one of these jobs. We once had a test subject who was a market researcher; she had taken some time off to have a baby and cheerfully answered "Homemaker" on the survey.
- Also be explicit in asking about recent testing. Most people prefer a white lie to a full one, so they may happily say no, they haven't participated in usability testing in the last six months, when they've been part of three focus groups. People who do a lot of research studies start considering themselves expert opinion-givers, and will also move out of natural "use" mode and into "design" mode.
- Mail a check to the test subject rather than paying cash at the test. When we've told folks "We'll be mailing you a check," some say "Okay," but some seem surprised, and others will outright say "That's not how it was at the other test!" You can then dig a little deeper to find out when that test was.
- Test somewhere else. Going to Sacramento (easy for us, since we are in San Francisco) can make a huge difference. Users are not only less tech-savvy, they are also less test-savvy.
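If your screener responses end up in a spreadsheet or database, the first two rules above can even be applied automatically before you start calling people. Here is a minimal sketch; the field names (`current_job`, `past_jobs`, `studies_last_6_months`) and the disqualifying-job list are my own assumptions for illustration, not anything from an actual Carbon IQ screener:

```python
# Hypothetical screener filter. Field names and job categories are
# assumptions made for this sketch, not from the original article.

DISQUALIFYING_JOBS = {
    "designer", "developer", "marketer", "market researcher",
}

def passes_screener(candidate: dict) -> bool:
    """Return True only if the candidate looks like an 'average' user."""
    # Rule 1: weed out anyone who designs or builds products --
    # and ask about jobs they have *ever* held, not just the current one.
    held_jobs = {candidate.get("current_job", "")}
    held_jobs |= set(candidate.get("past_jobs", []))
    if held_jobs & DISQUALIFYING_JOBS:
        return False
    # Rule 2: be explicit about recent testing, counting focus groups
    # and other studies, not just usability tests.
    if candidate.get("studies_last_6_months", 0) > 0:
        return False
    return True

# The homemaker who used to be a market researcher is still screened out:
passes_screener({"current_job": "homemaker",
                 "past_jobs": ["market researcher"],
                 "studies_last_6_months": 0})
```

None of this replaces listening closely during the test itself, but it catches the easy cases before anyone is scheduled.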