So here’s why you always have to go back to the original source when you want to use a research statistic you find in the news media:
“An SSI QuickPoll by Shelton, Conn., researcher SSI shows that 96 percent of Americans believe their smartphone is a must-have/must-pack item when going on summer vacation, with 78 percent saying they would turn back to retrieve it, even if it were 30 minutes away.”
Wow, hard to believe that more than 96% of Americans already own a smartphone. I mean, if 96% of Americans feel their smartphone is a summer vacation must-have, that has to mean at least another couple of percentage points of Americans own a smartphone but don’t consider it a must-have, right? So smartphone penetration in the U.S. must be around 98% or higher, right?
Yeah, right.
So is this a bad study or bad reporting? Thankfully for SSI’s reputation, it’s the latter. Their original press release states the following:
“Ninety-six percent of summer vacation travelers in the United States who own a smartphone believe their smartphone is a must-have/must-pack item. Of the 1,001 people surveyed for this report, 611 own a smartphone and plan to take an overnight leisure trip this summer.”
Okay, that sounds a little better.
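The base makes all the difference here. A quick back-of-the-envelope check (a minimal Python sketch, using the 1,001 and 611 figures from the press release; the derived percentages are my arithmetic, not SSI’s):

```python
# Figures from SSI's original press release
total_respondents = 1001       # everyone surveyed
smartphone_travelers = 611     # own a smartphone AND plan an overnight summer trip
must_have_rate = 0.96          # 96% of that subgroup call it a must-have

must_have_count = must_have_rate * smartphone_travelers  # about 587 people

# The statistic as released applies only to the subgroup...
print(f"Share of smartphone-owning travelers: {must_have_count / smartphone_travelers:.0%}")  # 96%

# ...but read against everyone surveyed, it's a much smaller number.
print(f"Share of all respondents: {must_have_count / total_respondents:.0%}")  # about 59%
```

Same finding, two very different headlines, depending entirely on which denominator you report.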
The amazing thing about this mistake is who made it: Quirk’s, a long-time publication of the research industry, in their Daily News Queue e-mail update (from June 3). But certainly Quirk’s is not the only media outlet to mess up reporting on a research study, or give only part of the story (and in all fairness, generally they’re an excellent and well-respected publication).
Grey Matter has had its research findings quoted by hundreds of media outlets, including the Associated Press, USA Today, the Financial Times of London, MSNBC, even The Tonight Show with Jay Leno. It amazes me just how often the media tell only part of the story (often in a way that is misleading), or flat-out misreport the findings. I’m not accusing the media of intentional bias (although I’m sure that does happen), but of having limited time and space, having limited (or no) understanding of research, and often approaching a story from a specific angle and then hunting for statistics or research findings that fit that angle.
(You know, just like the researchers or clients who are given only five minutes in the executive meeting, so they have to cherry-pick the most attention-getting stats and ignore the full story…or who don’t fully understand that “directionally significant” is really just another term for “not statistically significant at all,” or who want to approach the results from a specific angle and so focus only on the findings that fit it…but that’s another rant for another blog post.)
Years ago Grey Matter did a national study among Protestant clergy. Every denomination was represented according to its size, but certain denominational groups (e.g., Baptists, Methodists, Lutherans) were large enough that we could break them out and report on them separately. One particular reporter doing a story on our research asked me why we did a study of just Baptists, Methodists, and Lutherans. I carefully explained that it was done among all Protestant denominations, but that certain groups were large enough to allow us to examine their findings separately (just as the original news release explained).
So how does the resulting newspaper article start the next day? “In a new study of Baptist, Methodist, and Lutheran clergy…”
Reporters and editors are also out to attract readers/listeners/viewers. Consider a hypothetical study that finds 15% of Americans support the Obama Administration’s treaty with Iran, 30% oppose it, and 55% have no opinion. Consider three different headlines about the research:
- Americans Oppose New Treaty 2-to-1
- Only a Minority of Americans Oppose New Treaty
- Most Americans Undecided on New Treaty
All three of those headlines accurately reflect the study findings, even if each one tells just a portion of the total story.
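Each framing really is arithmetically defensible. A trivial sketch, using the hypothetical numbers above:

```python
# Hypothetical poll results from the example above
support, oppose, undecided = 0.15, 0.30, 0.55

# "Americans Oppose New Treaty 2-to-1"
print(f"Oppose-to-support ratio: {oppose / support:.0f}-to-1")

# "Only a Minority of Americans Oppose New Treaty"
print(f"Opposition is a minority: {oppose < 0.50}")

# "Most Americans Undecided on New Treaty"
print(f"'Undecided' is the largest group: {undecided > max(support, oppose)}")
```

Which headline runs depends on the story the outlet wants to tell, not on the data.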
Sometimes it’s not just the media’s fault. I recall one news story about a study claiming that a huge proportion of “southern evangelicals” supported the use of torture in the interests of national security. The article never explained how “evangelical” was defined, nor what “southern” meant. So I found the original study to do a little digging of my own. The original study also never explained how “evangelical” was defined, nor what “southern” meant.
Those definitions are tricky little beggars. Without much effort, I can find news stories quoting research that claims evangelicals represent 35% of the U.S., 22% of the U.S., or 7% of the U.S. It all depends on how the researchers defined “evangelical” (which usually isn’t stated in those news reports).
Too often, the media do very little to vet research that has been conducted before reporting on it. I’ve seen one particular “study” about Millennials covered extensively in the media, and some of the findings struck me as a bit odd. When I dug into it, it turned out the “study” was done mostly through the alumni associations of about ten different Midwestern colleges (which led to their finding that 95% of Millennials held a college degree).
It would take anyone with a basic knowledge of good research about ten seconds to determine this study was a complete crock, but the wide coverage it received in some media outlets suggests that many reporters and editors either don’t have even a basic knowledge of good research, or couldn’t be bothered to take the ten seconds. This is also why I consistently see reports such as, “The proportion who oppose the measure has risen from 67% to 69% over the last five years,” and then read that the sample was 400 people.
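To see why that last example is laughable: with a sample of 400, the margin of error alone swamps a two-point shift. A rough check, using the standard normal-approximation formula for the margin of error of a proportion:

```python
import math

n = 400      # sample size from the example report
p = 0.68     # roughly the midpoint of the 67%-to-69% "shift"
z = 1.96     # z-score for a 95% confidence level

# Margin of error for a proportion (normal approximation)
moe = z * math.sqrt(p * (1 - p) / n)
print(f"Margin of error: +/- {moe:.1%}")  # about +/- 4.6 points

# Each survey's estimate could be off by ~4.6 points in either direction,
# so a 2-point "rise" is comfortably within ordinary sampling noise.
```

In other words, the “rise” from 67% to 69% may be no rise at all.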
Although I would really like to see things improve, I sincerely doubt that will happen (particularly as traditional media outlets continue to cut their budgets, combined with the fact that anyone with a keyboard and an ISP can be “a media outlet”). So the onus is on us.
Before you plug that key finding into your next marketing plan, speech, or background document, do a little research of your own. Go to the original source, find out how the study was really done, and learn what it really found. Save yourself wasted effort and embarrassment by doing some fact-checking.
Or, go ahead and base your sales projections on the research finding that at least 96% of Americans have a smartphone and see what happens.