How to avoid biased questions in your next employee check-in or survey
Updated 1st April 2022
Whether it’s a little pop questionnaire, a regular check-in, or a full-blown survey with 100 questions, surveys play a key role in understanding the state of the workplace and its staff. But biased questions in employee surveys can throw off your people analytics.
What makes a question biased?
Biased questions skew people towards certain answers, or otherwise make it difficult to answer that question clearly or with nuance. They often present assumptions about the person answering that can prevent certain attitudes or responses from coming to light.
‘How often do problems with your manager impact your productivity?’
On its face, this question seems harmless. You might even think it cuts through to the heart of the issue. But the question assumes two things: that there are problems with your manager, and that those problems affect your productivity. Simply answering it confirms both assumptions. Again, this might not seem significant. After all, if there haven’t been problems, then it shouldn’t be an issue, right?
Well, the aim of any good employee check-in or survey is to get honest responses from employees that you can turn into actionable insight. So, although biased questions can be easy to overlook, they can have a serious knock-on effect on your ability to actually gauge and respond to employee sentiment.
Why biased questions are problematic
We tend to think of memory as being more objective than it is. After all, people are regularly sent to prison based at least partially on the testimony of others. But some evidence suggests that memory is more malleable than we give it credit for.
At the extreme end of how our minds play tricks on us, we have the idea of fully fabricated memories. In the mid-1990s, Elizabeth F. Loftus performed a number of studies on the implantation of false memories. In one of the best-known, she was able to make 25% of participants falsely believe they had gotten lost in a shopping mall as a child.
However, a 2017 literature review found that only 15% of cases like these resulted in a level of recollection defined as a ‘full memory’. But the concept still has some merit, as an average of 47% of participants in each study experienced some level of false recollection.
But Loftus' work wasn't just limited to falsifying memories. Biased questions don't have to incept fake memories into someone's head to throw off their results. Loftus and Palmer's 1974 research has been the defining influence on how we talk about the impact leading questions can have.
A 2020 study from the University of Gloucestershire supports Loftus and Palmer’s findings. It found that directive questioning (questions that assert a correct response) significantly reduced witness accuracy compared to non-directive questions.
When it comes to engaging employees, you don’t just get points for trying. Biased questions in surveys are another barrier to communication that prevents you giving employees the support they want. And if you’re trying to boost engagement based on faulty feedback, you’re liable to just end up wasting money and alienating your team.
Different types of biased questions in surveys and check-ins
Keeping your questions clear of bias isn’t as easy as knowing one simple trick. Unfortunately, there are quite a few ways bias can creep into a carelessly crafted survey, like these examples of bad survey questions:
Leading and loaded questions
These two types of biased question cause pretty much the same problem, but they are technically different. Leading questions use language that favours certain types of response to manoeuvre the respondent towards the conclusion you want. With a loaded question, simply answering it tacitly confirms an implied assumption.
So, a leading or loaded question might be, ‘Why do you enjoy working in this organisation?’
Negative questions
No, we don’t mean questions that ask you to be harsh or critical. Sometimes, biased questions in employee surveys are confusing rather than manipulative. They’re a prime example of how sloppy communication prevents employees from engaging.
Obviously, we tend to associate ‘Yes’ and ‘True’ with affirmative responses, and ‘No’ and ‘False’ with negative responses. So, a negative question inverts this paradigm. This can trip people up by messing with their reading comprehension.
For example, ‘I don’t have any major barriers to productivity. True/False.’
Instead, try ‘Are you experiencing any barriers to productivity? Yes/No.’
Ambiguously phrased questions
Sometimes, the instinct to avoid a leading question is so strong that you just end up being ridiculously vague. If the question has no clear scope, you can’t expect solid answers. Questions like ‘Do you think workplace culture could improve?’ aren’t helpful.
Firstly, everything can always improve, and secondly, words like ‘think’ that invite subjectivity can get different reactions out of people. You’ll have more luck with something like, ‘On a scale of 1 to 10, rate our workplace culture.’
Absolute or dichotomous questions
Sometimes, the most biased questions in employee surveys are those that don’t leave room for nuance, forcing people to over-simplify their beliefs and tick one of two opposing boxes. So, asking ‘Do you want to work from home? Yes/No’ turns it into a binary proposition.
Someone might tick ‘Yes’ because they want to work remotely a couple of days a week, or tick ‘No’ because they’re worried about being pulled out of the office entirely. As well as including the necessary follow-up questions, you could take a leaf from Buffer’s book and try something along the lines of ‘Would you like to work remotely, at least some of the time?’.
How to avoid biased questions in employee surveys
Now that we’ve made our case, it’s time to go over the steps you can take to make sure you’re achieving employee survey best practice:
Run your questions by others
It’s always good to have a proof-reader. Getting someone else to look at your survey doesn’t just eliminate pesky typos. They can also help you avoid leading questions and confusing language.
Avoid confusing language
Speaking of which, it’s important to phrase your questions precisely. Vague language leaves room for people to fill the gaps with their own assumptions, and double negatives make it impossible to tell which answer corresponds to what.
Use the right scale for each question
There are all sorts of metrics you can use for questions, from the binary ‘Yes/No’ and scales like ‘Strongly agree–Strongly disagree’ to an empty box for qualitative responses. Using the wrong scale can make a question effectively useless, like reducing a question about the quality of workplace culture to a Yes/No response.
Machine learning can mitigate biased phrasing
AI has revolutionised HR’s ability to gauge employee sentiment. Not only can machine learning help you avoid biased questions in employee surveys, it can even pinpoint the most effective questions for your type of business to be asking.