Recently, members of the IPR Measurement Commission joined in an online discussion about education on measurement and evaluation. The team discussed the core ingredients needed to make ideal measurement and evaluation practitioners.
Thomas Stoeckle (Bournemouth University & Dot I/O Health): What are the core ingredients needed to make ideal measurement and evaluation practitioners? We want to talk about the hard and soft skills required and how requirements change over time, but also, of course, how they differ by sector, by function, and by role, and how we align formal education with coaching and training on the job.
Dr. Ana Adi (Quadriga University): What is it that we really need going forward for the communication profession to be a profession where various loyalties are held in balance, and how do we mitigate when they clash? But equally, what expertise and knowledge do we have right now to build on, and how can we bring it in, in a meaningful way, at various levels?
When we’re talking about education, as Thomas has pointed out, we’re not thinking only of accredited academic institutions but rather of the wider variety of lifelong learning opportunities. These can be on the job or through associations, training bodies, and the like.
Thomas Stoeckle (Bournemouth University & Dot I/O Health): We posed a few questions on Menti and here is what you all said (image below). Who wants to start elaborating, telling us what they put on that list, why they put it there, and why they think it’s so central?
Angela Dwyer (Fullintel): I did “framework with objectives” as one of my pieces here. I think it’s important to understand why you’re beginning with research, measurement, and evaluation, and knowing it’s something that comes at the beginning and not at the end.
If we don’t understand the objectives, then we don’t know what or how to measure. We need to ask questions like “why” and “so what” and “what am I trying to accomplish?”
When it comes to teaching and training, we must change that mindset [that measurement is at the end] and underscore that it should be at the beginning, where we’re setting the framework. We’re creating something that we’ll then be using over time. And because it’s set up in the beginning, we have that value, where we can compare over time, get better insights as we go, and, of course, measure what we’re intending to measure.
Thomas Stoeckle (Bournemouth University & Dot I/O Health): I think one key point, Angela, is the ubiquity of data and the necessity to look at data, process data, and draw conclusions from good data at every single point in the entire value chain, from upfront planning to implementation.
It’s more critical now that everybody understands we need data inputs at every stage of these processes. And people need to have that sensitivity and the skill sets and tools to do that. Who else wants to contribute?
Laju Obasaju (Comcast): I also had a thought about objectives. From the beginning you should have some idea of what the plan is and what the story is you’re looking to tell. Then comes curiosity, or questioning: the data isn’t always going to go exactly where you think it will. Be willing to question what you’re seeing, keep on asking why, keep on pushing, and be flexible with what’s happening as you’re going through it.
Alan Chumley (Real Chemistry): Thomas, you mentioned the distinction between hard skills and soft skills. That really resonated with me.
This is a bit specific. But on the hard skill side, yes to data, yes to numbers, but language, language, language. So, content analysis, text analytics, linguistics, bringing that research rigor and that methodological rigor plus the right tools to do that is really what we’re after. Of course we’ll hire someone who’s certified in Google Analytics, but that’s table stakes stuff. I want the language. I want the linguistics piece from a hard skill perspective.
Soft skill-wise, schooling around how to write a headline, the difference between a finding and an insight, and structuring slide two for the C-suite around the “what,” “so what,” and “now what,” or an observation-implication-recommendation structure.
Others hit on this storytelling bit. And as a way to get there, some learning around data visualization. Or if we’re not teaching students to do that, go hire people who do that and pair the two together.
Chelsea Mirkin (Cision Insights): I fully agree with Alan. I think we sometimes underestimate the importance of keeping it simple as we’re telling the story. So making sure that we’re not overwhelming our audience with a ton of data points just because we have access to it, but really thinking about what story we’re trying to tell and focusing on that story and filtering out the noise.
Johna Burke (AMEC): And I think that speaks to understanding the objective and also being able to address the hypothesis of the C-suite with the data that you have. And understanding that that hypothesis is just a hypothesis and not, “Okay. Now, how do we torture the data to get to that answer?” Be open-minded and use data to answer the question, let that tell the story and help guide that journey.
I think that’s one of the largest deficiencies that we see in communication is communicators want to please and satisfy and meet these objectives. But when they are being given a hypothesis by the C-suite, they need to understand how to structure the data, how to come back with answers from the data and what that might mean for the hypothesis. Understanding the difference between the hypothesis and objective is really one of those key differentiators that I think everybody needs to understand.
Anthony Cheevers (Researchscape International): I liked what was said about the headline [or the story we’re telling]. Because we often say, “Okay, start with your hypothesis.” But what if your headline is your hypothesis? And what do you do if the data are not supporting the headline? Do you change the headline? Do you change the survey? Again, if you’ve started out with a solid objective, how do you evolve?
Rob Jekielek (The Harris Poll): I think one of the key things, especially in marketing communications, is making sure to use the right research. Understanding research methodologies, what they can and can’t do, and matching those to what you’re trying to accomplish. There are certain things you can do in survey research that you can’t do in social analytics and vice versa.
And then, which is more important, the question or the context? Is it more about quantitative data or qualitative data? Are you trying to render the picture and show people the trend, or are you trying to set a quantitative KPI that the C-suite wants to track over time?
I would also look at cadence versus methodology and KPI. What you’re doing, and the kinds of KPIs you’re putting in front of your C-suite on an annual basis, is going to be different from what you’re doing on a quarterly or monthly basis, where you’re into scenario planning and trend identification. And that is different again from what you’re doing on a daily or weekly basis, which is figuring out whether there’s something popping that you need to respond to, build on, or get to somebody in the organization ASAP because it’s either a risk or an opportunity.
Dr. Ana Adi (Quadriga University): Some of you have mentioned real data. When you talk about real data, what I hear is you implying that some people are not working with real data. And I wonder, where’s that data coming from if it’s not real? And what do you understand then by “real data”? What makes it real?
Johna Burke (AMEC): I think the misconception is that Twitter can be real data. Twitter is data, but it isn’t real data in the sense of statistical sampling or of what that means on a broader scale.
And the best way I heard this explained was, “A show of hands, how many of you didn’t get enough sleep last night? Okay. How many of you were cranky with your family as a result? Okay. How many of you went out and committed road rage on your way to this location? Okay. You’re Twitter.” That group is Twitter. For everybody else, it’s very different.
Unless you have that full spectrum and an understanding of how the data is sourced, that’s the reality. For data to have real value, you need statistical samples, knowledge of whether it’s sourced from a specific stakeholder group, and so on.
Angela Dwyer (Fullintel): It’s maybe even audience-correct data or audience-focused data. I think sometimes when it comes to measurement, people will say, “Oh, yeah, we’ll just figure out what people think on Twitter,” like Johna’s saying. But that represents a very narrow sample.
So really think: who is the audience I’m trying to influence? And then figure out, where do I access them? How do I get data from that specific audience? In a social conversation analysis, figure out where they are talking. In your survey methodology, make sure you’re reaching the audience where you want to see the shift, the change, or the influence.
Anthony Cheevers (Researchscape International): There is a case study I often refer to. We did a survey for a client and the data showed that John McCain was going to win the presidential election in a landslide. So that’s real data. We collected it. But what we point out is that the survey was conducted, at the client’s request, among AOL users, who tend to be male, white, and over 50. So it’s not valid and reliable data, but it is real data. It’s worth making that distinction, I think.
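(As an aside, Anthony’s AOL example is at bottom a lesson in sampling bias, and it can be made concrete in a few lines of code. The Python sketch below is purely illustrative, with invented group shares and support rates rather than numbers from the case study: a panel that over-represents one demographic yields a “real” but misleading estimate, and post-stratification weighting on known demographics pulls the estimate back toward the truth.)

```python
# Minimal sketch (synthetic data): "real" data from a skewed panel can still
# mislead. Post-stratification weighting partially corrects known skews,
# but only for the demographics you can actually observe and weight on.
import random

random.seed(42)

# Hypothetical population: 30% of people belong to group A, 70% to group B.
# Group A supports the candidate at 65%, group B at 40% -> true support 47.5%.
POP_SHARE = {"A": 0.30, "B": 0.70}
SUPPORT_RATE = {"A": 0.65, "B": 0.40}

# A skewed panel (like the AOL example) over-represents group A: 80% vs. 30%.
panel = ["A" if random.random() < 0.80 else "B" for _ in range(5000)]
votes = [(g, random.random() < SUPPORT_RATE[g]) for g in panel]

raw = sum(v for _, v in votes) / len(votes)

# Post-stratification: weight each respondent by population share / sample share.
sample_share = {g: panel.count(g) / len(panel) for g in POP_SHARE}
weights = {g: POP_SHARE[g] / sample_share[g] for g in POP_SHARE}
weighted = sum(weights[g] * v for g, v in votes) / sum(weights[g] for g, _ in votes)

true_support = sum(POP_SHARE[g] * SUPPORT_RATE[g] for g in POP_SHARE)
print(f"true support:   {true_support:.3f}")  # 0.475
print(f"raw panel read: {raw:.3f}")           # ~0.60, biased toward group A
print(f"weighted read:  {weighted:.3f}")      # close to the truth
```

(Run as written, the raw panel read lands near 60% while the weighted read recovers roughly the true 47.5%. The caveat is that weighting only fixes the skews you can observe, which is exactly Johna’s sourcing point.)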
Thomas Stoeckle (Bournemouth University & Dot I/O Health): That’s a really critical point. When we look at these must-haves that we want for future practitioners and what we want their capabilities and rounded education to really focus on, we heard language earlier. We could add psychology to that to go deeper into application of language.
What we don’t usually talk about so much is culture. And yet culture has become such a critical wedge issue, particularly in Anglo-Saxon cultures like the U.S. and the UK, less so in Germany and other countries. But culture is critical.
It’s not a topic we can hide from because it’s something that we need to constantly have on the radar. When we look at audience focus these days, I need to think about political partisanship and where I need to pick people up in their echo chambers when I want to make sure that I reach them on non-political issues. This is not something that would have occurred to me 10 or 15 years ago.
So where are we with this? How many of you are factoring that into your work?
Jason Forget (Center for Internet Security): I don’t know that I’m necessarily factoring that into how I measure currently. For example, my organization focuses on supporting elections officials across the United States, particularly with the ability to report potential misinformation around election processes and things like that, and then on influencing folks who have trouble believing in the viability of election processes as they exist.
So, one of the things I would look for in folks coming out of school to help solve this challenge is understanding how the algorithms on social media work. One of the things that we’ve discovered is that your ability to influence people who are in their own echo chamber is slim to none, even if you get in their echo chamber, because they’re already persuaded. They’re highly unlikely to come off their position.
We are looking to understand and influence the people who are not necessarily on one side of an issue. In our case, it’s the fact that, yes, elections are secure and election processes are safe. That all comes down to being able to understand how the system works, so that you can get the right folks into the right position.
Sarah Myles (McDonald’s): Going back to the soft skills component, I think it’s critically important (whether someone’s at an agency or in-house) to be able to simply articulate a measurement and evaluation approach and make a case for why it’s important.
In my experience, those are conversations that take place at all levels, agnostic of where you stand within the industry. Some of our communications colleagues aren’t as used to this side of the world, so there is a necessary education component.
Jason Forget (Center for Internet Security): I echo that. Helping practitioners (coming out of school in particular) understand how to meet their clients where they are versus assuming any knowledge is really helpful.
Dr. Ana Adi (Quadriga University): We spoke a little bit about system trust and algorithmic thinking and whatnot. How do we bring the practitioner’s point of view into this? And how do we handle situations where that practitioner’s point of view might clash with other views?
Rob Jekielek (The Harris Poll): Bringing the communications or marketing practitioner perspective on top of data analytics is, in my opinion, very important.
Because it brings in another layer of, how do you think about the problem? What are people trying to accomplish with this? Am I trying to convince a CEO? Am I trying to optimize a campaign? Those are going to be very different perspectives, and you want to bring that lens in at the beginning of addressing the problem.
I would argue it’s equally important for the people who understand the analytics background and can come up with the right methodologies to also have a good sense of what we are trying to accomplish.
Thomas started the conversation around the cultural piece. I think especially if you’re doing a lot of global research and work, it’s easy to lose the cultural aperture if you’re in North America, as an example. Things work and feel very differently. For example, if you’re doing a project on data privacy and you’re asking things the exact same way in the U.S. as you are in Germany, you’re probably going to miss some things and get different answers.
I bring that forward just to reinforce the importance of the complementary nature of different datasets. So looking at Twitter, if you look at Twitter en masse, it’s not really helpful. But if you’re able to isolate large enough quantitative samples of experts, advocates, cultural representatives, etc., it can be extraordinarily telling in terms of what’s happening.
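(To make Rob’s segmentation point concrete with a deliberately toy example: the sketch below uses entirely made-up posts and author labels, since the mechanics rather than the data are the point. The en-masse average washes out the signal that well-defined segments of experts and advocates carry.)

```python
# Hedged sketch (invented posts and labels): en-masse social data vs.
# isolated author segments. Each post is (author_segment, sentiment in -1..1).
from collections import defaultdict
from statistics import mean

posts = [
    ("general", -0.2), ("general", 0.1), ("general", -0.5), ("general", 0.0),
    ("expert", 0.6), ("expert", 0.4), ("advocate", 0.8), ("advocate", 0.7),
]

by_segment = defaultdict(list)
for segment, score in posts:
    by_segment[segment].append(score)

# The blended average hides the strongly positive expert/advocate signal.
print(f"all posts: {mean(s for _, s in posts):+.2f}")  # +0.24
for segment, scores in sorted(by_segment.items()):
    print(f"{segment:>9}: {mean(scores):+.2f}")        # general reads -0.15
```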
Dr. Ana Adi (Quadriga University): You can’t do research anymore that assumes that all people in a given geographical location are equal and/or alike. Ethnicity, descent, and the context in which people live have quite a lot of influence on values, political orientation, and the understanding of central issues around health care or, Jason, as you mentioned, voting and election processes. And very often when we carry out research, or when we see research being carried out, we look at markets or geographical regions but we forget these nuances.
Rob Jekielek (The Harris Poll): Or even gender. Gender splits are dramatic if you look across the globe. You have big differences in the U.S., but they become even more pronounced if you go to the Middle East, Sub-Saharan Africa, Asia, or Latin America.
Thomas Stoeckle (Bournemouth University & Dot I/O Health): And that’s before we look at generational differences and at the extreme, disruptive changes in moving from one generation to the next. Generation has become much more relevant to how we look at the world from the perspective of marketing and digital marketing. There’s a generation that has been completely primed on Facebook, followed five years later by one completely primed on Instagram, followed three years later by one completely primed on TikTok.
Most marketers that I know don’t really have the natural sensitivity to grasp that and deal with that on a day-to-day basis. They do when they consciously put themselves in a position where they reflect on, “Now, I need to think about segmentation. And now I need to think about stuff from a media planning perspective,” but it doesn’t come naturally to them in their work environment.
Take, for example, that level of awareness of how fast things are changing and of what captures people’s attention. I think so much has changed in such a short period of time that, in many ways, we’re lacking the tools to deal with it in the best possible way on a day-to-day basis. Am I an outlier in feeling that?
Anthony Cheevers (Researchscape International): I can add that we have client conversations all the time [about conducting research in different countries]. The gist of marketers’ thinking is that highly educated people in finance, legal, commerce, and tech in Germany, France, Italy, or Japan can take a survey in English. They can read English. They understand. And we have to say: it’s bad manners. Why don’t you spend a little extra money to serve them something that shows you respect their native heritage, culture, and language?
Rob Jekielek (The Harris Poll): And it’s going to be representative. If you’re not choosing native language in those markets, it’s not going to be representative of the population.
Thomas Stoeckle (Bournemouth University & Dot I/O Health): So our second topic is focused on the skill sets that we want people to bring as they enter into tertiary education, university education, or the professional space. Is there anything that one of you wants to highlight and put emphasis on?
Justin Greeves: I feel like in the dark art of analytics, oftentimes the qualitative is lost, and the qualitative is the meaning and the personal relevance that we’re seeking.
I just started a new job, and my new workplace is very quantitative-heavy. The marketing and research function is all about getting people interested in our products in the funnel. It makes me think how sometimes all the data and the numbers and the importance of all that really overwhelm the recipient. They don’t always understand numbers. They know the numbers are there and that they give some validation, but connecting with people and helping them to see themselves in the research is the most important thing.
And I feel like in our world of quantitative, we’ve sometimes lost the importance of the qualitative. I worked for Richard Wirthlin, Reagan’s pollster, and he would say, “Persuade by reason, motivate through emotion.” You have to have both of those. And if you don’t, you’re failing.
And we even had a system that evaluated communication on the basis of the rational and the emotional. The qualitative is emotional. The quantitative is not often emotional unless it’s really bad news.
I think students and new employees who are new to the field are so hung up on the quantitative that they don’t learn the qualitative. And then they perpetuate the challenge of more quant, more quant, more sample, more data. That shouldn’t be the only objective.
Dr. Julie O’Neil (Texas Christian University): As an educator who’s teaching students research and data and social media analytics, I think one of the biggest challenges we face with our young entering professionals and our students is critical thinking. And I think you’ve all talked about it today. It’s really asking the question of “why?”
I think as educators and practitioners, we can remind our employees and students of the context, of looking at data from multiple points of view, and of the idea that there’s some creativity in looking at data. As Justin just said, it’s not all quantitative-oriented. There is some creativity in taking time to think about how A relates to B and B relates to C.
Chelsea Mirkin (Cision Insights): I agree. [When responding to a client request] I am constantly reminding our people to be more than “order-takers”; to always ask the next question and dig a little bit deeper. I think it’s sometimes more difficult to teach than the hard skills, but you can get there.
Jason Forget (Center for Internet Security): One thing I would add here is an understanding of how business works. One of the things I’m big on with my team as we’re starting our measurement journey is to understand the business and organizational outcomes we’re driving toward, why we’re heading in that direction, and then work backwards to understand how communications can contribute. And then, consequently, how do we measure outputs towards those outcomes that we’re trying to achieve as an organization?
If you don’t understand how the business works and standard business processes, you’re not going to be able to speak the language of the C-suite, period. It was a hard lesson learned on my part when I was younger.
Sarah Myles (McDonald’s): Yeah, and building on that, I think we should be teaching how they can help others make their objectives measurable. Because at the end of the day, if we’re designing our frameworks around the objectives, and the objectives aren’t true objectives, we’re in trouble.
In my experience, it is making sure everyone’s involved in that initial upfront planning process, at the least. It’s also making sure they’re comfortable acting as strategic counsel or an advisor in that setting, and helping them get where they need to be.
Thomas Stoeckle (Bournemouth University & Dot I/O Health): Thank you. Ana, do you want to share some of your experiences?
Dr. Ana Adi (Quadriga University): On teaching measurement and evaluation? Where do I start? My observation is that measurement is considered alien for various reasons, but I don’t want to generalize. We do have communications practitioners where I am, but in the context in which they work, it is quite likely that measurement and evaluation is externalized, or that it simply isn’t part of whatever they’re given to do.
I do find that the mid-career students I’ve worked with have a very precarious understanding of research. And I see measurement and evaluation as research methods. All the conversation that we’ve had earlier about real data and transparency, about acknowledging the source and the value of data, about the differences, benefits, and disadvantages of qualitative and quantitative: these points are completely lost on them.
So we go back to teaching basic research methods, really. I also bring pieces of research that are industry-driven into the classroom and I challenge the research process by saying, “Well, look at this, this is supposed to be leading research, what do you see?” And “what is it missing?” “Where do you start asking questions and doubting the source of data?” Or “where is it coming from?” Or “how did they get it?” and whatnot.
That would be my insight from the mid-career German, Swiss, Austrian point of view. For those who are particularly in communications and public relations roles, less so maybe in internal comms or public affairs, there’s a precarious understanding of research. And there’s a heavy reliance on vendors, which, in my perspective, makes them vulnerable as practitioners.
Thomas Stoeckle (Bournemouth University & Dot I/O Health): As we continue the conversation, let’s move forward to that third question we had, which looks at technology and big data and asks, “Which data analytics skills should professional development focus on?”
Storytelling comes out at the heart of it all, which is quite interesting given that the question asks about big data and data analytics. And then people turn this on its head, jujitsu-wise, and say, “No, no, no, this is really about storytelling.” Nice one. So who wants to comment on that first?
Jason Forget (Center for Internet Security): I would say storytelling, simply because if I’m in the C-suite and I don’t necessarily understand measuring communication, and you come to me with a story that’s well-crafted, compelling, and succinct, what that tells me (whether I understand what’s behind it or not) is that I can feel confident you’re really well-versed in what you’re doing. You’re competent and you’ve put a lot of detailed thought into it if you can come to me with one slide that tells a very compelling story with the right data points I need to make a decision.
Rob Jekielek (The Harris Poll): And with this, data visualization becomes critical. Part of telling a good story is how we visualize data.
And then data analysis has increasingly gotten chopped up into different things. I don’t know if you need one person who is excellent at Python, knows how to run complex regressions, and all of it. But having at least some educational background in a lot of those things will be helpful in terms of being able to guide and decipher what’s going on.
If you have somebody who has a lot of those pure data science skills, that can be extraordinarily helpful, especially as you’re moving across different datasets, like running a regression analysis on social data versus survey data.
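(For readers who want to see what the simplest version of those regressions looks like, here is a hedged Python sketch on synthetic, survey-style data. The variable names are hypothetical, invented for illustration; the point is only that an ordinary least-squares fit recovers the relationships built into the data, the same mechanic whether the rows come from a survey or a social dataset.)

```python
# Minimal OLS sketch on hypothetical survey-style data (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic respondent-level variables.
awareness = rng.uniform(0, 10, n)   # hypothetical brand-awareness score
exposure = rng.poisson(3, n)        # hypothetical media exposures recalled
favorability = 1.5 + 0.4 * awareness + 0.6 * exposure + rng.normal(0, 1, n)

# Design matrix with an intercept column; solve via least squares.
X = np.column_stack([np.ones(n), awareness, exposure])
coefs, *_ = np.linalg.lstsq(X, favorability, rcond=None)

print(f"intercept: {coefs[0]:.2f}")  # approx. 1.5
print(f"awareness: {coefs[1]:.2f}")  # approx. 0.4
print(f"exposure:  {coefs[2]:.2f}")  # approx. 0.6
```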
Anthony Cheevers (Researchscape International): What Ana and Rob were saying about storytelling reminds me: I was presenting years ago to a PR and communication audience. We were talking through crosstabs and somebody raised their hand and said, “You do realize we’re all English majors.” They were well-skilled in narrative design and everything like that, but we were coming to them with Excel, and they don’t work in Excel. I don’t know if you see that same thing, Ana. It’s not native to their lifestyle. They’re not programmers or anything like that.
Dr. Ana Adi (Quadriga University): Yeah, I do see that. Though I’m happy to report that I have been teaching my students to become robots. Meaning quantitative content analysis at the early master’s thesis level, where you still need to do the coding on your own. That has the benefit of making them think thoroughly about how we operationalize, what we put into categories, what we leave out, and then go through saying “yes” or “no.”
That seems to be a very useful exercise to them. But yes, you’re very right. The much bigger group would be uncomfortable. I mean, they get completely lost when I talk about dependent and independent variables, let alone talking about composite variables and other things. But I think there’s a way to get there. Whether they’re German majors or philosophy majors or English majors or wherever they are, I think doing small exercises within the classroom, trying to play with apps that are out there and trying to take them apart pretty much like Legos, we do get to that point where we overcome the worry and fear that others have of an Excel sheet with numbers on it.
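(Ana’s manual-coding exercise has a standard quantitative payoff: checking whether two coders apply the same categories consistently. The sketch below, with hypothetical coders and invented labels, computes the two reliability figures most content-analysis courses start with, percent agreement and Cohen’s kappa.)

```python
# Hedged sketch: intercoder reliability for two hypothetical coders who
# applied a yes/no category to the same 12 items.
from collections import Counter

coder_a = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes", "no", "yes"]
coder_b = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes", "no", "yes", "no", "no"]

n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

# Chance agreement: probability both coders pick the same label at random,
# given each coder's own label frequencies.
freq_a, freq_b = Counter(coder_a), Counter(coder_b)
labels = set(coder_a) | set(coder_b)
expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)

kappa = (observed - expected) / (1 - expected)
print(f"percent agreement: {observed:.2f}")  # 0.75
print(f"Cohen's kappa:     {kappa:.2f}")     # 0.50
```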
The thing that we haven’t touched upon is ethics. We went into the practitioner point of view, we mentioned conflicts and loyalties earlier on, and we have spoken so much about data and cultural background. So how do we deal with ethics, and what exactly are we dealing with?
And if I may, I’ll just put this briefly in context. A few years ago, we ran a Delphi method study here in Berlin with communication practitioners, asking them about the skill set of the future, and ethics was one of the skills.
Now, one of the other questions that I asked was: who do you think should teach this, or whose responsibility is it to learn this and help you through it? Your own as a communicator, your organization’s, a training body’s, or an academic institution’s? And ethics was left to the individual, which was my biggest concern coming out of the study.
So, thinking of what we have covered here in terms of what we want for the future: what do we need so that measurement and evaluation are understood? What do you think we need to do to support the discussion about ethics and the implications of research, its advantages and disadvantages?
Justin, I go back to quantitative and qualitative data. They complement each other. They don’t need to dismiss each other, and one is not less valuable than the other.
So a very long question, but at the core of it is: what is your recommendation for approaching ethics in a way that has impact, visible impact, on how we communicate, measure, and evaluate?
Angela Dwyer (Fullintel): I want to chime in here on ethics. I did a lot of my research in the area of ethics, especially looking into different countries and seeing where there might be a global code of ethics, where there isn’t a code of ethics, and where practitioners are learning their ethics. What I found in my research was that having those codes didn’t necessarily make practitioners more ethical.
The thing that came to my mind around ethics is the role of mentorship in our industry. However that looks, it might be an avenue for ethics training: maybe practitioners aren’t getting it from their association or their classroom as students, but they can be taught by a mentor early in their career and throughout it.
Rob Jekielek (The Harris Poll): I think the ethics piece is becoming increasingly important as you start combining more and more datasets and you start looking at things like data privacy. We need to look at the unintended consequences of combining different datasets.
Thomas Stoeckle (Bournemouth University & Dot I/O Health): With my AMEC course leader hat on, one thing that surprised me is that all of the wonderful tools and frameworks that AMEC provides were represented implicitly in some of the answers, through objective setting and the like, but none of you mentioned the AMEC Integrated Evaluation Framework. None of you mentioned the Barcelona Principles. I find this interesting and I’m curious: is this stuff just not particularly high on your various agendas? Is it a nice-to-have that, in day-to-day practice, doesn’t really cross your minds?
Alan Chumley (Real Chemistry): We use funnels and frameworks ten times a day, all darn day, and love them. It may not look exactly like AMEC’s, but we’re honoring the theme, spirit, or intent of it.
I’m all about the funnel and the framework; I think we exist smartly in the land of frameworks. In fact, they help get folks who don’t do what we do for a living organized, corralled, and guided.
Candidly, maybe a bit provocatively, even though I voted on Twitter to help ratify the very first Barcelona Principles, I think they’ve come up once in my world in the last six years. So that’s a proxy for how “current and relevant” they are, at least to my clients. I think we all operate with those guiding principles innately.
Dr. Ana Adi (Quadriga University): Though on advertising value equivalency (AVEs), I think the fight is bigger in some parts. The smaller the client, and the farther away they are from the U.S., the more we’re still at “long live AVEs.” That battle is still an uphill one the farther you move from the U.S.
I’ve noticed a couple of things. I think we go back to the elements that we’ve seen before: curiosity, research in terms of hard research procedures, methods, advantages, disadvantages. But we also go back to the soft skills a bit: the writing, the storytelling, the visualizing, the interpreting, what Justin called the “heart and mind.”
And with that added layer of complexity, whether it’s ethics or culture, I think mentoring, which some of you have alluded to, might be an interesting solution.
Allow me to thank you all for a very lively conversation, a very pleasant hour from wherever you’re joining us. And thank you, Thomas.
Thomas Stoeckle (Bournemouth University & Dot I/O Health): Thank you all, it’s been a real pleasure.