Who knew? I didn’t and I suspect neither did you. Who could possibly have known at the start of 2016 that Britain would vote to leave the European Union AND that Obama would be replaced by Trump in the White House? Or indeed that Ed Balls would emerge as the most popular Labour politician in the UK thanks to a TV dancing contest? That last one still seems especially improbable.
2016 was full of political shocks. Had anyone the foresight to accurately predict last year’s events, they could have made a considerable amount of money at the bookmakers’ expense. If you’re the proud owner of a winning betting slip, congratulations – only nine national polls predicted a Trump win in the final months of the election.
For everyone else, those of us who only learned of the latest political earthquake over the breakfast news, there is one big, burning question: how do almost all pollsters keep getting it wrong? How could they fail to see a mere 24 hours into the future and call the winner of a two-horse race?
Why are we so bad at predicting the results of elections?
The simple answer? Well, the simple answer is that most people rarely give a simple answer to a simple question. In fact, many people mislead pollsters who ask about voting intention, especially if they are hesitant to admit supporting President Trump. So any pollster who faithfully records and reports their raw results is likely to fail when it comes to predicting a winner – unless, of course, they can also accurately estimate how many people have been misleading them.
Most people provide consciously (and unconsciously) biased answers. Some deliberately lie to confuse the analysis; others lie for a range of subtler reasons. This is a tricky problem for any pollster, including those working in HR, because, let’s face it, an employee survey is a poll of sorts.
Employee surveying is also an increasingly important way for companies to gather information about their people, culture and organisation. However, given the perils that surveying presents to pollsters during a national election, should we trust the results?
Yes. Employee survey data is still an important analytical asset. It provides important context to the masses of system data we analyse when building an analytical view of an organisation. But we shouldn’t take every survey answer at face value or give every question equal weighting. We too can apply some analytical discretion.
If a liar tells a lie, can we find the truth?
There are many statistical techniques for testing whether a respondent is giving a faithful answer. Conjoint analysis (i.e., asking about the same underlying preference multiple times in different formats until a consistent answer emerges) is a well-established technique, but it relies on a large number of questions.
A simpler approach is preferable: one that uses very few questions but gathers rich context on respondents’ opinions and views. It uses free-text data from open questions, analysed to provide insight into themes (i.e., what people are talking about) and sentiment (i.e., their tone of voice). In other words, we prefer a more human approach to surveying.
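To make the theme-and-sentiment idea concrete, here is a minimal sketch in Python. The theme keywords and sentiment word lists are illustrative assumptions invented for this example, not anyone’s production lexicon; real systems use far richer language models.

```python
# Toy sketch: extract themes and sentiment from a free-text survey answer.
# The keyword sets below are illustrative assumptions, not a real lexicon.

THEMES = {
    "management": {"manager", "management", "boss", "leadership"},
    "pay": {"pay", "salary", "bonus", "compensation"},
}
POSITIVE = {"great", "supportive", "fair", "good"}
NEGATIVE = {"poor", "unfair", "bad", "dismissive"}

def analyse(answer: str) -> dict:
    """Return the themes mentioned in one answer and a crude sentiment label."""
    words = set(answer.lower().replace(".", "").replace(",", "").split())
    themes = [name for name, keys in THEMES.items() if words & keys]
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    sentiment = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    return {"themes": themes, "sentiment": sentiment}

print(analyse("My manager is supportive but the pay is unfair."))
# Mentions both themes; one positive and one negative word cancel out.
```

Even this crude version shows the shape of the approach: a single open question yields both the topic and the tone, rather than a bare yes/no.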
Humans are quite good at spotting lies, especially when they apply a level of scepticism to an answer. We rarely ask a simple yes/no question and accept the first answer we receive. Adults have learned to be relatively cynical and to judge whether an answer is truthful. If your son has chocolate on his face, do you really believe him when he denies eating the cookies?
So we know context is important, as is the consistency of the answer. Computers can be taught to apply a similar approach. Let’s return to the US election to see how this works in practice: one computer avoided the pitfalls of traditional polling and successfully predicted the result of the US election by applying some big-data innovation. The computer was called Eagle AI.
Eagle AI (developed by Havas) analysed billions of social media posts. It read the text of these posts and found people who expressed clear voting intentions (i.e., clear Clinton / Trump supporters). It then examined the patterns of language and themes for the Trump and Clinton supporters. In other words, the computer now knew what a Trump supporter typically ‘sounds like’. It could then identify other people who consistently sounded similar but who had not declared their voting intentions.
Eureka. The computer could now do what no pollster was able to do: identify hidden Trump supporters – people who knew they would vote for Trump but wouldn’t declare their support in answer to a traditional polling question. This is a golden, powerful insight.
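The Eagle AI approach described above can be sketched as a simple text classifier: learn the vocabulary of people with declared intentions, then score undeclared posts by which vocabulary they more closely resemble. The hand-rolled naive Bayes and the tiny training posts below are hypothetical illustrations, not Havas’s actual system or data.

```python
# Sketch of the Eagle AI idea (hypothetical code and data, not Havas's system):
# learn word frequencies from posts with declared voting intentions, then
# score undeclared posts against each side's vocabulary.
from collections import Counter
import math

def train(posts):
    """Build per-class word counts from (text, label) pairs."""
    counts = {}
    for text, label in posts:
        counts.setdefault(label, Counter()).update(text.lower().split())
    return counts

def classify(counts, text):
    """Naive Bayes with add-one smoothing over the training vocabulary."""
    vocab = set().union(*counts.values())
    best, best_score = None, float("-inf")
    for label, class_counts in counts.items():
        total = sum(class_counts.values())
        score = sum(math.log((class_counts[w] + 1) / (total + len(vocab)))
                    for w in text.lower().split())
        if score > best_score:
            best, best_score = label, score
    return best

declared = [
    ("build the wall make america great", "trump"),
    ("drain the swamp crooked media", "trump"),
    ("stronger together i am with her", "clinton"),
    ("love trumps hate stronger together", "clinton"),
]
model = train(declared)
print(classify(model, "time to drain the swamp"))  # prints "trump"
```

The undeclared post never states a voting intention, yet its language pattern matches one camp far better than the other – which is exactly the hidden-supporter signal traditional polling questions miss.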
What do you really think?
Now let’s apply this experience to the world of HR and employee surveys, which, like election polls, can produce strange results that are sometimes hard to trust.
Employee surveys that tell us 80%+ of the workforce is engaged … while attrition remains stubbornly high and productivity persistently low. Exit surveys that tell us most of our people leave for ‘personal reasons’, without scratching the surface of what’s really going on.
HR faces the same challenges as a traditional polling company, but it can also access the same techniques to overcome them. Intelligent machines can use increasingly advanced text-analysis techniques to surface the underlying sentiment and themes within employee surveys (1).
Trump’s win was a shock that rocked the US political establishment. Brexit caught David Cameron unawares and ended his political career. Out-of-date polling techniques created a false sense of confidence among our political leaders. They were out of touch with the sentiment of their people.
Let’s make sure HR supports business leaders by providing real insight, not a sense of false confidence about the views of their people. After all, bad information is worse than no information.
Pete Clark and Alex Borekull, Qlearsite.
Qlearsite provides People Analytics technology and services, applying the latest big data and machine learning technologies to give organisations insights that deliver value. We call our work ‘Organisational Science’, whether that’s predictive toolkits, visual data discovery or survey text analytics using machine-intelligence techniques.
For the original article, more resources and information about our work, go to www.qlearsite.com.
Notes: (1) Techniques include thematic indices that track what employees talk about over time. For example, are people currently talking about management? If so, are they being positive or negative? How does that change over time and how does that compare to other similar companies? Intelligent machines capable of text analysis can provide sophisticated views of underlying employee feeling.