It was such a good idea…in theory.

How many of those moments have you had in your life? Something seems so smart, so right, so reasonable – then you put it into practice and it flops.

We do this in research and marketing all the time. I was reminded of this recently while sitting at a Phoenix stoplight watching a sign dancer. I don’t know what they’re actually called, but local businesses hire people to stand on the corner of a busy intersection holding a sign that says something like, “Big Sale at Courtesy Buick.” In order to attract attention, the more enterprising among them dance and do acrobatics with the sign.

The one I saw was quite good – that sign was in constant motion, with a series of creative flips and spins. Really caught my attention.

Problem is, I had no idea what the sign said. Even though I was really concentrating and staring at the sign, it was in such constant motion that it was impossible to read. Such a good idea…in theory.

I’m amazed at how much outdoor advertising is like this. I see billboards all the time with print so small there is zero chance I can read it while zipping by at 65. Local churches that meet in school gyms or other rented space put up street-side sandwich boards on Sunday morning announcing their service times – they probably look great to the sign painter standing a foot away, but the print is so small that no motorist will ever be able to read it from a car. On a street. Doing 45 miles per hour.

All of these things sound great in theory, but when placed in a real-world situation, they fail.

There’s a pretty good chance a lot of your questionnaires are exactly like this. Pre-testing is supposed to correct some of these problems, but even pre-testing won’t catch everything (and with today’s budgets and timelines, how many questionnaires actually get pre-tested anymore?).

There’s one defense against this: you. Never, ever, ever put a questionnaire into the field unless you have answered it yourself. As a respondent.

You design the questionnaire as a researcher. You pay careful attention to which scales will allow you to run the statistics you want. You make sure the skip patterns all work. You look for priming issues (questions that could be influenced by previous questions). You make sure the client’s information objectives are addressed.

Respondents care about none of those things. When completing a survey, most of them simply want to give you accurate answers to questions they can reasonably answer, and be done with a minimum of effort. Unfortunately, we often try to ask them things they can’t reasonably answer. We fail to include every possible response option. We ask for information they don’t remember. We write confusing questions. We ask them questions that aren’t truly applicable to their situation.

And that’s true even of people who are really good at questionnaire design. We all make these mistakes, because we’re researchers, not respondents.

So before your questionnaire goes into the field, become a respondent. Answer it honestly, the way you would if you were taking the survey. If it’s not applicable to you, make up a persona and play that role – become Gerald, a 23-year-old, single, Black IT professional, or Ruby, a 64-year-old Latina grandmother who attends Mass every week. Become whomever it takes to qualify and complete the survey. Then do just that. You’ll be amazed how many questions suddenly don’t make as much sense as you thought they did when you were the designer.

If that long grid bores you, it will bore Gerald and Ruby. If you’re not sure what a question really means, Gerald and Ruby will also be unsure. If there’s no response option that applies to you, Gerald and Ruby will have the same problem. If a question is not applicable to you, it will not be applicable to Gerald and Ruby.

Every single questionnaire I put into the field, I complete as a respondent. Usually multiple times, playing different roles. Invariably, I find ways to make it clearer, more concise, more relevant, and easier to answer accurately. I do this again after the questionnaire has been programmed (assuming it’s phone or online), because how the respondent hears it or sees it on screen affects how it gets answered. Sometimes the question itself is fine, but the programming creates problems. If it’s a phone survey, have someone read it to you as the interviewer would, because how you hear things is not the same as how you read them.

We do market research in order to get into the minds of our respondents. But before we get into their minds, we have to get into their shoes, and see the research from their perspective. Only then will we have questionnaires that are relevant for our needs as researchers, and relevant for our respondents to answer accurately and consistently.
