Online access panels. Love them or hate them – the reality is that if you’re in quantitative research, you’ll probably use them sooner or later.

In 2009, Grey Matter Research ran a little internal test on a few panels we had used or were considering. We arranged for a selection of mystery shoppers to sign up for each panel and act as typical respondents for a month.

What we found encouraged us to expand our test to include 12 major panels, and take the findings public. The result was the report Dirty Little Secrets of Online Panels, which burned up Twitter feeds and LinkedIn comments, and was requested by researchers from as far away as Finland, Japan, and South Africa.

Well, we’re at it again. With a few panel mergers behind us, plus requests about panels we didn’t include the first time, it’s time for More Dirty Little Secrets of Online Panel Research. e-Rewards. Toluna. Clear Voice. Surveyhead. Opinion Outpost. MySurvey. These and six more were evaluated from the perspective of the typical panel member.

Why should researchers care much what panel members are experiencing? We pay a panel provider or a panel broker, get our N size, toss out the obvious cheaters, and use the data. Right? Well…

Imagine you’ve crafted a relatively short, engaging questionnaire that respects my time as a respondent. However, yours is the tenth questionnaire in a row that I’ve completed that morning, and many of the others were long, boring, and irrelevant. I’m tired and inattentive. Now just how reliable is your data?

Or let’s say that I’ve attempted 12 different questionnaires this morning before trying yours. One of them asked me ten minutes’ worth of questions before telling me I wasn’t qualified (and tossing me out with no reward). One of them froze when I was mostly done. Another one told me I wasn’t qualified and kicked me out before I could answer a single question. Two more were actually called “surveys” but were trying to get me to compare car insurance rates. Five of them were already closed by the time I tried to respond, even though the invitations were all sent yesterday or today. I was disqualified from two more because I don’t own a pet, even though I stated in my panel profile that I have no pets. I’m tired, I’m frustrated, I’m annoyed, and now I’m evaluating a new product concept that you really hope I’ll like. Now just how reliable is your data?

These aren’t just hypothetical situations – they’re real situations we encountered in our work with these panels. And we found plenty of other problems:

  • The panel that gave us opportunities to complete 50 to 60 questionnaires in a row, non-stop
  • The panel on which over four out of ten studies closed less than 24 hours after invitations were sent, and which closed some studies in as little as one or two hours
  • The panel that sent two of our panelists 61 survey invitations in just one month
  • The panel that pays its respondents the equivalent of $2.67 per hour
  • The panel that sent one of our panelists 15 survey invitations over a two-day period
  • The panel that carries advertising on its website – are panelists seeing your competitors’ ads before they answer your surveys?

Of course, there were also much better situations, such as the panel that actually prevents panelists from completing more than one questionnaire per week…the panel that actually pays an average of over $8 an hour to respondents as incentives…and panels that invite people to eight or ten surveys a month, rather than 50 or 60. It’s not all bad news.

It’s very easy to gloss over fieldwork or let someone else worry about it. Let’s face it, finding and interviewing respondents is not the most exciting element of research, whether it’s RDD dialing, focus group recruiting, or access panel interviewing.

But always keep in mind that you are depending on these people to give you input that you will use in critical business decisions. Paying them pennies, giving them boring, lengthy, or irrelevant surveys, frustrating them with multiple closed studies, and bombarding them with opportunity after opportunity is most definitely not how you want to treat people upon whom you are depending for your success. And if you or your research vendor isn’t paying attention, this is exactly what may be happening in your research.

This post has addressed some of the problems that exist in panels. In my next blog post, I’ll focus on what we as researchers can do to avoid some of these pitfalls and give our research a better opportunity for success.

And if you’d like a copy of More Dirty Little Secrets of Online Panel Research, shoot me an e-mail at ron@greymatterresearch.com.
