Recently I wrote an article for The Strategist, The Public Relations Society of America’s (PRSA’s) magazine dedicated to executive-level public relations professionals. The article addressed the sometimes sticky question of how to disagree effectively with clients and management. One of the points made in the article was how important it is to back up recommendations—especially when they may be novel, controversial, or unexpected—with data. In this blog, I’ve been invited to follow up on that point a bit.
Sir Winston Churchill is widely—and famously—quoted as having said, “Statistics are like a drunk with a lamppost, used more for support than illumination.” Embedded in that comment is a common—and valid—concern about an overdependence on data and the danger posed when data is used as a substitute for critical thinking.
If Sir Winston were seated across the desk from me at this moment, I’d make a confession to him: guilty as charged. I have used research many times simply as a means of supporting my opinions and conclusions, rather than as a predictive tool. Illumination is great, but I, too, occasionally need something to lean against while I am fumbling for the light switch.
When it comes to measurement and evaluation, I think a lot of us begin there. And when management knows that our opinions are actually supported by something, that’s a fine start. But to the point of the lamppost analogy, if we are going to be effective counselors, we can’t just validate—we have to illuminate.
So, how do we make that transition? Here are a few thoughts:
1. Repetition, repetition, repetition. As we conduct focused research on the same topic over the course of time, the positive outcomes of that research are cumulative. We build on what we know. This also allows us to develop multiple research methodologies to drill deeper into the issues. It is especially helpful in conducting formative research—the research that actually helps shape the strategy behind programs and campaigns rather than simply evaluating their effectiveness.
Here’s an example: Some years ago, a group of public relations students was asked to assist a community blood bank in reversing a trend of declining blood donations. The students began by conducting research to ascertain why donation levels were so low. The blood bank believed (based on current issues and observations) that donors were fearful of contracting a disease, such as AIDS, from the act of donating blood. The blood bank also speculated that a newly opened plasma center that was paying for blood had drawn donors away. At the time the students were called in to assist, the blood bank was planning a massive education, awareness, and motivation campaign.
The students commenced the research process by polling 400 community members with specific questions about their attitudes and concerns regarding blood donation and the blood bank. Fascinatingly, not one respondent said he/she believed it possible to contract a disease by donating blood.
The students then conducted focus groups with various community segments and reinforced the survey findings:
- That the publics who sold or would sell their blood to the plasma center had never considered, and would never consider, giving blood at the blood bank, and, conversely—
- That the publics who gave or would give blood at the blood bank would never consider selling their blood to the plasma center.
Further, the survey and focus groups revealed the true cause of the decline in donations: respondents said there was insufficient information and publicity surrounding mobile blood drives. Understanding this enabled the blood bank to avoid a costly and unnecessary campaign to raise awareness about the safety and importance of blood donation and instead focus on resolving a much simpler publicity problem.
This experience illustrates the bigger point about repetition of effort: Because the students had time to build on that research, and because they developed multiple methodologies, they were able to truly illuminate the deeper issue.
2. Hang out with your data.
A true understanding of data is often a process, not an event. It comes as we examine data over time, and in different ways. It’s also helpful to share data with other people, because they, too, will see that data in different ways. And when others pick up on something we’re not seeing, that can be an important step toward illumination.
3. Give your data a “voice” so that it speaks compellingly to management and clients.
- Make it relevant. Research findings, like people, get more attention when they are saying something interesting. Measure things important to clients and management.
- Tell a story. Well-told stories engage audiences. Lay out your measurement process as a story arc: “Here’s what the problem was, here’s how we tackled it, here’s what we found out, here’s what reassured/surprised/amazed us, and here’s what we think it means.”
- Be clear about what you are trying to accomplish with the data. People don’t want a good ending spoiled in the first 5 minutes of the movie, but they do want to know what the movie is about before they buy the tickets.
Christina Darnowksi, Research Director at PRSA and a former research consultant, notes: “When I was a consultant, people would often come into my office, show me a survey they or someone else had already used, and say, ‘I want to use this to survey a group.’ I’d then say, ‘But what are you trying to accomplish?’ When I heard the answer, I’d know that they really needed a completely different kind of data collection instrument. It’s important to take great care in crafting that instrument.”
The lesson? If we are merely looking for support, just about any lamppost—or any vertical surface—will do. But, if we are really after illumination—data that will point the way to doing things better—we have to be very clear about what we’re trying to accomplish.
Susan Walton is Associate Chair of the Department of Communications at Brigham Young University.