This post appears as part of AMEC’s Measurement Month.

Last week, the Small Data Forum convened – this time to explore the opportunities and challenges of data in business, the rational and emotional sides of decision-making, and the continuing erosion of trust and confidence in the truthfulness of information.

Stephen Fry’s fabulous narration of the complete works of Sherlock Holmes formed the backdrop to musings about data sleuthing and the skill sets required for successful forensic analytics.

Big Data’s Business Value

A recent Harvard Business School article discussed how companies love big data but lack the strategy to use it effectively. Neville sums this up succinctly: “You don’t need a big data strategy; you need a strategy that understands where big data fits in.”

Such a business strategy needs to look at the end-to-end process of generating better and faster data inputs for business decision-making. From data generation to storage and processing, through to producing the required insights, big data and artificial intelligence capabilities need to form a seamless loop, linked to both operational and executive decision-making.

And therein lies a problem. Companies have the talent to do the data analytics, but they have not yet figured out how to connect it with big and small business decisions. There is a generational and cultural disconnect between the (often young and nerdy) data experts and the (often older and less data-savvy) C-suite executives. Sam calls this the “boardroom – coalface dichotomy”.

As businesses figure out how to turn data into value, they are also confronted with a talent challenge: how to acquire people with the right skill set to solve near- and medium-term problems. The knowledge economy is already here, and Sam reiterates the fundamental skills it demands: interrogating and making sense of data, and using the resulting insight for storytelling – because influence is a function of analytic insight and storytelling.

Humans and Machines

This leads us to the human vs machine dynamic. The prevailing paradigm of automation and optimisation skews toward rational decision-making based on statistical predictive analytics, even though we know that human decisions have an emotional core and are shaped by our heuristics and biases.

We are in the territory of Kahneman’s System 1 and System 2 – fast and slow thinking. It matters, and increasingly so, because the evolution of technology and the evolution of humans are happening at vastly different speeds. We thus experience a widening capability gap between technology (in data collection, processing and output generation) and human application in operational and strategic decision-making.

Simply relying on technology is not the solution. Rather, we need to learn to be more comfortable in a fundamentally complex (i.e. uncertain) environment. This is being addressed by the CRUISSE network – Challenging Radical Uncertainty in Science, Society and the Environment – led by academics from the London School of Economics and University College London, and funded by three UK research councils: ESRC, NERC and EPSRC. It is an attempt to factor uncertainty into our models, rather than ‘rationalising it away’.

The Vs that Matter – Veracity and Validity of Big Data

An early definition of big data (ca. 2001) focused on the ‘3 Vs’ of volume, variety and velocity. Things have moved on since then, and more recent definitions list as many as 10 Vs (volume, velocity, variety, variability, veracity, visualisation, value, volatility, vulnerability, validity).

For Neville, veracity and validity stand out, as they determine trust and confidence in the data. The SDF concurs that countering fake news, fake facts and misinformation is key. Neville reports on recent work to tackle the problem by the Society for New Communications Research of The Conference Board (SNCR), of which he is a founding member.

The Future of Fake News

To map out the challenge, and to start building a better understanding of the impact on business and marketing, SNCR are currently carrying out a fake news survey. They have also come up with a comprehensive definition of the issue:

Content that lacks trusted sources and often uses sensational headlines to encourage the consumption and spread of unverified or false information. It can be left or right leaning. Such content often masquerades as legitimate news reports and is often supported by ads, sometimes without knowledge of the advertiser.

We discuss a number of fact-checking initiatives, including Storyzy, FactMata, and FullFact. There are many more (see, for example, FirstDraft, or the Poynter Institute’s list), and addressing this globally will require collaboration on a broad scale – perhaps also a level of central coordination that has yet to be established.

If anything, this is becoming even more of an issue, as recent research reveals a significant number of brands advertising on known fake news sites, and online review sites being flooded with bot-generated and human ‘crowdturfing’ reviews. Increasingly, this is turning into an arms race, the outcome of which will be determined by humans, not by algorithms.

Can Satire Help?

Ending on a lighter note, Sam refers to Marnie Shure, managing editor of the satirical publication The Onion. In a recent interview, she argued that in a farcical world, satirical publications become even more necessary. It is no coincidence that the British satirical news magazine Private Eye saw its highest-ever print circulation in 2017.

At the same time, it is worth remembering that fake news is not a new phenomenon: Le Canard Enchaîné, “the most feared newspaper in France” (not least because it broke the story that derailed François Fillon’s presidential campaign), has its origins on the streets of 18th-century Paris.

Nieman Lab, another critical source in the fight against fake news, recently reported on a study identifying ‘bullshit receptivity’ as a key factor in the impact of fake news. Our resident psychologist Sam points to false memory, dissociation and ‘spaciness’ as related factors. He recommends a course by two US academics, Carl T. Bergstrom and Jevin West – Calling Bullshit: Data Reasoning for the Digital Age – which is meant to hone our critical faculties so that we make better judgments.

And better judgments are a necessity in this time of uncertainty and unintended consequences. Everybody is responsible for their own judgments and choices. It is not enough to look to others for solutions, like Goethe’s Sorcerer’s Apprentice: “From the spirits that I called, Sir, deliver me!”

Take a listen to the 11th episode of the Small Data Forum podcast.


Thomas Stoeckle is the Head of Strategic Business Development at LexisNexis Business Insight Solutions in London, and Associate Chair of the Institute for Public Relations (IPR) Measurement Commission. Follow him on Twitter @ThomasStoeckle1.

