It happened again. Grey Matter Research released survey information to the media, citing a study we did for Russ Reid (editor’s note: now One & All). The study found “rather than being in competition for the donor dollar, charitable organizations and places of worship may actually complement each other in fundraising.” In short, people who give heavily to a place of worship also tend to give heavily to separate non-profit organizations. In a variety of ways, giving begets giving.
In the news release, we were very careful to use phrases such as "this suggests," "the findings may be saying," or "tend to support," rather than definitive statements such as "this proves" or "the findings demonstrate." This was because we could present the actual data, but part of the release was dependent on our analysis of that data.
So the news release hits the media. What do we get? Headlines such as “Churches, Charities Don’t Compete for Dollars.”
Not that this should be a surprise. Our news release was filled with a careful explanation of how we did the analysis and how we arrived at our conclusions. A major news service picked it up and condensed it to eliminate all the “uninteresting” stuff (like the details). Various media outlets then picked up their synopsis and condensed it further, making our research sound like inarguable fact rather than analysis of the available information.
This isn’t a rant about the media. They’re unlikely to give as much attention to all the details as researchers do. But this experience is a good reminder of two things.
First, when you read about research in the media, realize what is often being done to the data. It’s being shortened, restated, spun, and sometimes even blatantly misinterpreted or misapplied (we’ve had that happen, as well). Rather than use any statistics you read about in an article, you’d be wise to go back to the original source and find out what the study really said.
Second, what the media tends to do is no different from what your clients are probably doing with each report. You write a detailed, carefully worded 25-page analysis. The marketing director then shortens that to three key pages of bullet points. Her boss only wants one page, and the CEO will then give it 60 seconds in the monthly marketing meeting, so it gets shortened to a single paragraph. And that’s what the decision-makers see.
It’s true that important nuance and detail are being lost, but at the same time, it’s also a fact that non-researchers generally aren’t going to give the same attention to important details that researchers do.
So what to do? There really isn’t a solution, but there are steps researchers can take to mitigate the problem:
- Realize it’s inevitable that this will happen, and plan for it. Coordinate with the client to write those brief summary conclusions yourself, rather than letting non-researchers control the process (and possibly lose or misstate critical details).
- At the very least, offer to review what’s been written, and provide input.
- Work with the client to learn what type of reporting will be most valuable. If they’re only going to use a one-page summary, provide them with a fantastic one-page summary.
- Explain things in lay terms. As soon as most non-researchers see things such as “probability sampling” or “confidence interval,” they’re likely to skip that paragraph entirely, possibly losing critical detail.
- Remind, remind, remind. When non-researchers are observing focus groups, I usually go through a brief spiel reminding them to concentrate on the “why” responses rather than worrying about how many people held a particular opinion.
- Be a broken record if necessary. After my focus group reminder (from the previous point), if I return to the back room and hear someone saying, “But six of those respondents liked that name,” I gently remind the observers that the number is meaningless, and they need to focus on why the six liked it, and why the other four didn’t.
- I hate to say it, but there’s also a certain amount of CYA necessary. As the researcher, I have only so much control over how the findings are used by the client, but I can certainly put any necessary caveats up front in clear language. If they then misuse the findings, it won’t be because I left any doubt about how the findings should be used.