This post summarizes a roundtable discussion by the IPR Measurement Commission.
Recently, members of the IPR Measurement Commission participated in an online discussion about aligning digital and offline data. Led by IPR Measurement Commission Member Katie Paine, the IPR Measurement Commissioners discussed how to integrate offline data into practice. Here is what they had to say:
Katie Paine (Paine Publishing): What is offline data? What data is offline that you’re trying to integrate?
Johna Burke (AMEC): How about all those events going back to in-person? How about those in-person delegates? That’s offline. And now that everyone has a huge appetite to know every blink that a delegate makes in front of the program, I think that is going to be a big shift back to offline.
Katie Paine (Paine Publishing): The interesting question will be how much that happens. IPRRC was offline and in-person. So, I guess it is coming back. The bigger topic here is how do you get insights from this and how do you combine it?
I routinely incorporate recruitment data, employee satisfaction data, employee understanding data, and stock price into reports, and draw conclusions to the best of my ability. But I’m wondering what kinds of data you guys are trying to integrate since I live in my own little bubble?
Alan Chumley (Real Chemistry): The way we describe it to leadership over here and clients is “the quilt versus the weave,” if you will. What do I mean by “quilt”? If we’re really just talking about multiple methods, multiple audiences, and multiple data sets, we put all of these squares together. Some squares are good. Some are bad. But when you pull back and look at the overall quilt, it is telling a story. It’s light on method, heavy on visual, by design.
Now, if we’ve got a client who is ready for something more sophisticated, then we get into the “weave” or the composite where there is a heavy modeling methodology. It really depends on budgets, timelines, client understanding, and the need.
Katie Paine (Paine Publishing): Interesting.
Elizabeth Rector (Cisco): We are still doing a roadshow around integrated PR. So, you know, the traditional PR metrics: we look at social, we look at analyst relations in terms of output, but that’ll come in the outcomes.
We’re doing all kinds of paid, not just digital advertising, but sponsored content across the board, which is a lot to manage because every single publication does things differently. So, we’re trying to get our arms around that.
In our owned media, we manage our newsrooms. So, it’s a lot of the traditional .com metrics that we’re getting into. We do look at search as an input. We don’t have that as a “metric” but that is an insight that we use to inform. And then, we have our employee engagement with the content that we produce for employees.
Then, in terms of outcomes, we have the surveys. We do surveys with our IT and business decision-makers, surveys with the analysts, and surveys with the employees, where we have a consistent set of questions that are aligned around the outcomes that we’re trying to drive so we can see how we’re doing in moving the needle with all those different stakeholders we’re trying to reach.
We do look at correlating all of this data. I’ve tried so many different ways with super-smart data scientists to have true correlation causation. But with B2B, especially our type of B2B, it’s never proven successful from a lot of those metrics. But certainly, the correlation from, “Oh, we did this across all these channels, and here are the outcomes we’re getting, what are those levers?” and then figuring out the levers that we can pull to drive and improve those outcomes.
Katie Paine (Paine Publishing): Is the problem with B2B lack of data, or time, or…?
Elizabeth Rector (Cisco): No. It’s 50 touchpoints, and it’s agreement on attribution. We can do it. It’s just agreeing on what the attribution is: the impact on a sale of their attendance at an event versus reading an article, versus their interaction with an analyst, versus their interaction with an employee.
Katie Paine (Paine Publishing): That goes back to Michael Ziviani’s thesis about how the marketing funnel is now a sieve because there are so many different touchpoints. And at any one of those touchpoints, you can lose all the work that you’ve already done because somebody says something stupid, somebody does something stupid, somebody supports something stupid. It’s all of those different elements that can derail that process.
Elizabeth Rector (Cisco): Well, and it’s not just saying something stupid, that’s huge, but it’s collaborating across the touchpoints, which is what we’re trying to do. So, as communications, especially with the paid and owned content that we control, it’s making sure that we’re linking in with marketing. We need to be totally integrated. It’s making sure it’s a positive interaction, but not so much that we’re all saying the same things, or different flavors of the same thing, and making sure we’re integrated, which is really challenging in a huge organization.
Katie Paine (Paine Publishing): Is attribution related to message alignment? Does anybody have evidence that it’s related to message alignment, other than internally?
Elizabeth Rector (Cisco): I mean, they’re related, but the attribution is just purely on the touchpoints.
Katie Paine (Paine Publishing): I think one of the things people are realizing is that survey research data is key these days, and it’s hard to make it structured, right? I mean, it’s hard to fit it into little categories and really get the answers you want.
Justin Greeves (Education Advisory Board): Yeah, it seems like that’s been a struggle in our world for 30 years, people having homegrown solutions or using natural language engines. It’s never really been cracked. But it’s gotten better in terms of finding ways to take large reams of data and make it quantifiable and then put it into models and things like that.
Katie Paine (Paine Publishing): I think it gets easier if you have more data. That’s at least what I found out in the early days of just automating sentiment was that you could get away with it if you had 10,000 articles a month to deal with. But if you’re dealing with 150, every single one counts.
Elizabeth Rector (Cisco): With financial analysts and the investor mindset at large, messaging is key. We analyzed our messaging to see which words we should be using in terms of moving the needle on investor perception. We rewrote all of our scripts based on this analysis. We tend to go back to these same tenets anytime we’re sharing earnings and speaking to financial analysts.
Alan Chumley (Real Chemistry): Yeah, for clients specific to investor relations (IR), but really for just about any audience, it’s a simple n-gram content gap. For the nerds among us, it’s a chi-square test applied to language, shown as a stacked bar chart: “Here’s what you as an organization are saying, and here’s what your stakeholders are saying about you. Oh, my God, look at that gap. Close that gap.” We will even do that monthly for clients on a scorecard, bottom left of the PowerPoint, quantify that gap, and say, “Your messaging gap is X this month, and it better go down next month.” Clients sometimes respond to it and do something about the gap that we are showing them. The method works.
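The content-gap idea can be made concrete with a small sketch: count how often key message terms appear in the organization’s own content versus in stakeholder conversation, then run a chi-square test of independence on the counts. The terms and frequencies below are invented for illustration, and this is one possible reading of the approach, not Real Chemistry’s actual methodology.

```python
# Invented term frequencies: what the organization says vs. what
# stakeholders say about it. A large chi-square statistic signals that
# the two language profiles diverge, i.e. a messaging gap.
org_counts = {"innovation": 120, "sustainability": 15, "value": 60}
stake_counts = {"innovation": 40, "sustainability": 90, "value": 55}

org_total = sum(org_counts.values())
stake_total = sum(stake_counts.values())
grand = org_total + stake_total

# Chi-square test of independence on the 2 x N contingency table.
chi_sq = 0.0
for term in org_counts:
    col_total = org_counts[term] + stake_counts[term]
    for observed, row_total in ((org_counts[term], org_total),
                                (stake_counts[term], stake_total)):
        expected = row_total * col_total / grand
        chi_sq += (observed - expected) ** 2 / expected

print(f"chi-square = {chi_sq:.1f}")  # bigger means a bigger overall gap

# Per-term gap: the org's share of voice on a term minus the
# stakeholders' share. Negative means stakeholders care more than
# the organization is talking about it.
for term in org_counts:
    gap = org_counts[term] / org_total - stake_counts[term] / stake_total
    print(f"{term}: gap = {gap:+.2f}")
```

The per-term gaps are what would feed the stacked bar chart on the monthly scorecard; the single chi-square number is the “your messaging gap is X this month” figure.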
Katie Paine (Paine Publishing): Well, I think the other thing that is interesting is that the people that some of my B2B clients think they’re talking to are not the people they need to be talking to. The people actually buying this stuff are 20 to 40 years old, and they have a very different set of priorities than somebody of a different age. We found that their number one priority was environmental sustainability. We’ve been trying to work with clients who are trying to be more sustainable so they aren’t losing sales because they are dumping huge amounts of plastic into the ocean.
Does anybody else have surprises or insights from their audiences? Part of this roundtable is about getting insights from your data. What’s the process? How do you guys come up with what you’re going to combine and that aha moment that says, “you need to change X to Y”?
Alan Chumley (Real Chemistry): Well, by design, it’s about the right combination of platforms, processes, and people. And if we focus on the people most importantly, we go looking for folks who have had time in PR and corporate communication trenches before going to the data dark side. So, I intentionally go looking for that left-and-right-brained unicorn or what the recruiters call a “purple squirrel.”
So we hire purple squirrels, and we make darn sure that those are the folks with their finger on the datasets and those are the folks that are then going and working with the data visualization people to help us tell this story. And it’s not so much force-fitting the data to tell a specific story, but what is the story? We will put these recovering-corporate-comm-strategy-turned-analyst types in front of VP or high-level clients, and that’s when the insights come to the fore. It’s about the people and having senior-level credibility with clients to get to those insights that matter.
Katie Paine (Paine Publishing): Jamin, how do you find insights in all of your data?
Jamin Spitzer (Abacus Insights): We throw the visual up in PowerPoint, we all get into a room, and we just start asking questions. We had a phrase on our team: “if you torture the data long enough, it will confess.” And I would say the biggest thing is you can’t rush it. So much of the time, you do so much work on getting the data, conditioning the data, and then making your PowerPoint. But the most important piece of the whole thing actually happens in the middle, when you think about the data and find its meaning, and that’s the piece that gets squeezed.
Katie Paine (Paine Publishing): That’s a very good way. So, does anybody else ever have enough time to look at the data to make insights?
Dr. Sean Williams (Bowling Green State University): I don’t think there’s ever enough time, but that’s one of the axioms of our work. It’s really challenging to teach undergraduates about the very concept of deriving insights from data when they lack the patience to work through processes and examine what it is they find. And that’s a huge part of what I try to teach them – you can’t just race through something and then expect to derive insight from what you’ve read. You’re going to need to spend some time analyzing what you see and thinking about it through different channels and find new prisms to apply. That’s the number one challenge.
Number two, we have a generation of people who are coming up who are going to have trouble with the concept of taking the time. I think this has the potential to lead to a lack of insight and poor numeracy and data literacy. And it’s a real concern because I’m like, “I’m not a data scientist, dude. I’m a PR guy, you know.”
Jamin Spitzer (Abacus Insights): I totally agree with what you’re saying, Sean. I think there’s another element too, where clients go and see a demo for the solution that they chose and they think, “Oh, yeah, this is just click, click, easy,” and it’s not.
Johna Burke (AMEC): Is it our job in analysis to serve as the resident contrarian, if they haven’t hired their own “purple squirrel”? Because so many times we’re talking about their results, we’re trying to talk about their effectiveness, but there’s a quote that says, “We need to stop interrupting what people are interested in and be what people are interested in.” Do we need to use data to help people target those efforts in a more meaningful way? To really help them remove themselves from channels where there are distractions to their brand and their reputation? Isn’t part of working with data serving as that resident contrarian, pointing out the elements that might need to throttle back?
Katie Paine (Paine Publishing): I think you hit the nail on the head, which is the fact that you’ve got to understand who you’re trying to persuade, what they need to hear, and how they’re going to absorb that knowledge. It’s what I call the “walk a mile in their shoes” approach.
I have a client whose CEO is convinced that communications aren’t necessary because he’s got all the money and people he needs. So, I write my reports from the perspective of “what in God’s name can I do to convince him that if nobody knows who you are, nobody’s ever going to give you any money, and nobody wants to work for some company that they’ve never heard of, etc.?” And my process is to pretend to be him for a while and see what I can find in the data that might get me excited.
Justin Greeves (Education Advisory Board): One of my favorite things in the first part of my career was having people who aren’t data people involved in determining what the data means. When you bring in a team of non-data people, talk to them, and present your case, they help you see different things… it’s frustrating, but it makes the analysis so much sharper. The non-data people are good at spotting false insights, or seeing the start of something and then carrying it forward. I really like how the field has evolved in this area – you have working groups, and people who aren’t data specialists are there as part of the audience. I think the insight has to be a deliberate group effort versus something that’s done in the analysis.
Dr. Julie O’Neil (Texas Christian University): I think I interviewed about half of you for a project on this topic, and you’re saying exactly what we heard from the 29 respondents about data analysis in terms of taking the time, building diverse teams, and having data scientists and non-data scientists look at the information. I also heard from people about the importance of taking risks and being the contrarian.
I’m teaching a new class called Social Media Insights and Listening, and my students have been learning how to use Brandwatch this semester. So, they’ll be presenting their projects this Thursday evening. We’re going to have a peer evaluation, where everyone has to ask five questions of every group, to look for better insight and maybe do some further analysis. I think looking at the context is also important, and I think that’s why you’re saying people in communication are particularly good at this area: if you don’t understand public relations, or the context, or the good questions before looking at the data, it’s not going to go very far.
Katie Paine (Paine Publishing): The book “Making Numbers Count” has completely changed how I communicate data, especially to difficult clients. Instead of reporting what we thought was important to tell them, it’s all about how to say it in a way that knocks them upside the head a little bit. For example, I had this CEO client, and I’d been saying for months that he had the lowest share of voice of any CEO in his category. After I read the book, I told him, “For every 1 time you get quoted, your competitors are being quoted 39 times.” He got it. So, sometimes it’s just rephrasing how you present your data.
Stacey Smith (Jackson, Jackson & Wagner): Katie, you make the point. We have to remember that we may be walking in with data, but we are communicators. We have to think about who we’re talking to, what’s important to them, what their priorities are, how we’re going to cut through the morass, and how to put it in front of them in a way that they can hear it, like a baby who has never heard before: the hearing aids go in and all of a sudden it’s, “Oh, my gosh.” That’s what we need to be striving for when we’re bringing all this data in.
Katie Paine (Paine Publishing): Yeah. How do you find that context with a new client?
Stacey Smith (Jackson, Jackson & Wagner): When we do it, we’re always looking at what their experience is, what their background is, what they bring to the table. If we’re going into a healthcare setting, I don’t use examples from construction. I use healthcare examples. We say, “this is the data that we have in hand,” and we need them to think about it, look at it, and discuss it back and forth, because they know their organizations better than we do. As much as we’re there, they still know it better.
Katie Paine (Paine Publishing): Anything else about getting insights from data?
Michael Young (Ford): Yeah, my big focus is on something called “relevancy measures,” so creating metrics for our own work that we send out. I’ve become quite interested in them because it’s a feedback loop in terms of where your stuff is going, who’s using it, and if it’s effective. One of the things we’ve found is that people read the emails, but they don’t necessarily click into the reports. These are very high-powered executives who have an extremely busy schedule. Now, with that, we challenge ourselves to deliver those insights in a very tight package.
One of the things we’re going to be launching soon is a newsletter. I’ve always been very interested in how Axios runs its business and I’ve always thought it would be a great communication channel to incorporate. We’re challenging ourselves to come up with three insights per week that take the industry and general news environment into context. We are going to find what people are talking about, what’s the headspace, and share what that means for our business, industry, and the people reading the newsletter.
The last thing is patience. Today, I played with some data where I was looking at media coverage, social media, market cap, and sales for a certain competitor. The market cap made no sense, so I chucked it and went to sales, and it was a better story there. So, you have to experiment. You have to try different things, and at some point, you find something that sticks.
Elizabeth Rector (Cisco): I have worked with my team on a new template for exactly that point: we can have all the substantiating data, but no more than three buckets, sometimes four. This is the insight, and this is the action. We don’t do any presentation to any leader without having that upfront slide, and usually that’s all they care about.
Pauline Draper-Watts: To build on that, I think nowadays, people who are crunched for time don’t care about the data. It’s the presentation of the data in a way that’s compelling and makes sense to them more than the data itself.
Michael Young (Ford): I say we like to show our work.
Pauline Draper-Watts: I know we do. But not everyone else is as interested in it as we are.
Elizabeth Rector (Cisco): Yeah, so we need to have that flash with substance that will attract people to action.
Katie Paine (Paine Publishing): So, let’s go back to the integration of data because, Michael, you have said that the marketing funnel is actually a sieve.
Michael Ziviani (Precise Value): Yeah. Our philosophy is around getting more steps by bringing together various data sources. So, it’s both the depth of the individual data sources and the synergy of collecting them in one spot, which creates exponential amounts of work because you’re trying to understand relationships, which are often buried or unclear or lagging in time periods.
And the second part of that is the funnel piece, which is around the leakage as you move through the funnel. We’ve done work to examine each layer of the funnel and the gaps between the layers. We do work in brand health tracking, so we’re able to examine the profiles at each level, the movement down the funnel, and understand what’s going on there.
Katie Paine (Paine Publishing): What kind of data are you using for that?
Michael Ziviani (Precise Value): Survey data.
Katie Paine (Paine Publishing): Do you find anything we need to know, like client consideration and preference or something?
Michael Ziviani (Precise Value): I suspect it’s a little different for each client. But you find that unexpected barriers come up, so you might make assumptions about certain messages being stronger than they are or you are missing messages which are very important to get people over a certain hump. Again, it’s person-dependent or segment-dependent.
Katie Paine (Paine Publishing): Yeah, I remember taking all of, like, four years’ worth of data for a client to try to figure out what elements actually resulted in greater traffic to a particular part of their website. And two days later, I’m cross-eyed. It turns out that the messages had absolutely no influence. The quotes from senior leadership had no influence. The only thing that ever had any influence was images.
Johna Burke (AMEC): Michael, do you think that now that AI and machine learning are involved, the data is being confounded within an inch of its original existence, of what it means or how it’s attributed to the original topic?
Michael Ziviani (Precise Value): Yeah, it’s easy to build a machine that’s so complex, you don’t know how it works anymore.
Johna Burke (AMEC): Right. I think part of the problem with AI and machine learning is that it’s overcomplicated. If you put Shakespeare into some of these AI programs, they don’t know if it’s positive, negative, or neutral, because Shakespeare speaks in metaphor: in fall, everything that is alive reads as dead. If it can’t interpret Shakespeare, how is it interpreting all of the slang that exists in human language across all of the different regions?
Katie Paine (Paine Publishing): I mean, you can teach it slang in a particular region, but it becomes, as you said, so complicated that you have to question if it’s worth it. It goes back to the point that people don’t really care about the numbers, they care about what they can do to improve their programs and what they need to hear to make them better.
Mark Weiner (PublicRelay): That’s the way you would say it. That’s not the way they would say it. Most of the time, they’re looking for a big number, and AI and machine learning tools are perfect for that. It’s just that if you’re making important business decisions on that data, you’d run your ship aground.
Dr. Sean Williams (Bowling Green State University): One scholar said if you’re in public relations, marketing, advertising, there is a world that is here that you need to be involved with. I’m by no means an expert on these things. But at our peril, we dismiss the power of technology…
Johna Burke (AMEC): Or is it a lack of trust? Because as much as we’re talking about Web 3.0, I also can’t get a reliable cell signal all the time, and that technology has been around for years without anyone perfecting it. I think it’s the reliability and trust of how it can be applied. Some generations are skeptical of the technology. But on the other hand, for doing the counts and bringing down the statistical computation of all of the big data that we’re trying to work with, I think it absolutely works great.
Cindy Villafranca (Southwest Airlines): We really like AI and machine learning, and we’ve used it and then had to layer people on top of it. I love AI and machine learning, but I think it has its limitations. And as Alan said, you need the smart-people layer on top of the platforms and processes; that’s something we do all the time. I moved out of the communications department, and I’m now doing analytics in the HR department, so seeing the difference in the way the data is used and applied to the workforce, hiring, and training versus communications is absolutely eye-opening.
One of the things our HR department does really well that we didn’t do in communications, and that I found interesting, is they’ve got a whole team dedicated to what they call “just the reporting side of the house.” And what that means is they know the tools, they know how to pull the right data, they know how to slice and dice it for what internal customers may be asking for without even realizing what they’re asking for. Other companies may do this too, but we never did it in communications. And then they have the insights and storytelling side. So, you have someone who’s really good at pulling the data and analyzing that data, and then you have someone like me who takes a look at it and says, “Thank you so much for this. This is exactly what I need. I’m going to take it and run the insights and the storytelling.” Because it goes back to what we said earlier: our leaders really want those executive summaries and those bullet points. They don’t want a 40-page slide deck to go through and pull all that information out of.
Katie Paine (Paine Publishing): How else do we address the issue of getting insights out of this data and helping clients be smarter or helping our companies be smarter?
Dr. Sean Williams (Bowling Green State University): It’s a crossover. I once belonged to SHRM, as well as IABC, PRSA, and all that stuff. The key learning was that a lot of folks in HR really are quite possessive about employee communication, but they’re also rather tactical about it. For example, broadly speaking, they look at Gallup Q12 engagement surveys as just ticking boxes and they focus on getting their scores up. But from the communication side, we argue that the point is not to get scores up, but to improve communication ability and make the organization a great place to live. So, the crossover is there. And when there are good partnerships, then it can be highly effective.
Katie Paine (Paine Publishing): Rowena, do you have thoughts on getting insights from all of your data?
Rowena Ebanks: You know, it’s really interesting. I’m doing consulting, and I was talking to a potential client yesterday, a startup. They haven’t started a measurement program at all yet, and they asked, “What should we be measuring?” And, of course, I go back to, “Well, let’s start small. Pick the metrics that tie to the business goals. Comms goals should tie to business goals. And let’s pick those metrics that you can pull.” And then I told them that in the future they should bring all the data from the different sources together to get at the thing they’re really trying to measure, whether it’s brand awareness or reputation, and find those correlations. Sometimes there won’t be correlations, but a lot of times you have to dig to find them. I think we really need to focus on the things that are important first, the low-hanging fruit, deliver those to the client, and then, with that information, they’re more likely to say, “Okay, we’ll give you more resources. We’ll give you more time to dig more into the analysis.”
Katie Paine (Paine Publishing): Thanks, everyone.