
How to account for bias in design research to create better products and services

14 Feb, 2019


Bias is an issue that affects all humans both consciously and unconsciously. Bias is often described as the act of unfairly favouring one thing, person or group over another. Worryingly, we are often completely unaware of the biases we have (Nosek & Greenwald, 2014). Yet these biases show up in our actions or lack of action.

For designers, who create the world's future products and services through empathy with the user, accounting for bias is not only an essential skill for doing 'good' design but also a moral imperative. Failing to account for bias in your research or design activities, consciously or otherwise, can result in a design that systematically discriminates against a group of people.

"In my work, I am aware that the people I research are often from very different backgrounds to me and this can lead to biases in my work. Statistically I am not at all representative of a typical Australian and, often, nor are my colleagues. So, accounting for bias is an essential skill to improve the credibility and quality of my research and designs."

It's unlikely that you will be able to eliminate bias from your work and life. However, by being aware of your biases and a few ways to account for them, you can reduce the impact they have on your research and design work.

#1 Accounting for bias in research recruitment

Designers of all disciplines often use qualitative research techniques to understand the needs of users and inform their designs. These techniques vary from vox pops in the street to interviews and diary studies. What all qualitative research methods have in common is the need to (1) find people, (2) conduct research with them, (3) document their perspectives and (4) report on the findings.

One of the most common types of unintentional bias I see in designers is in the recruitment choices they make to determine who they talk to (and therefore, of course, who they do not talk to). In most research projects the aim is to explain the perspectives of a 'representative' group of people, that is, a subset of the population that reflects the demographic and behavioural attributes of the overall population. In academic research the population is often the country or the world; for companies it is often their customer base.

In design projects there are many reasons not to recruit a representative sample. For example, you might want to talk to the extremes (research and then design for your advocates and detractors, on the basis that everyone in between will be served too), or you might want to find and solve known problems first by understanding complaints. However, many research and design projects use as-if-random approaches to recruitment, which can lead to odd or poorly supported findings. One popular approach is to use the company's existing marketing segments. While this is sometimes fine, marketing segments are often more aspirational than real, with most customers falling outside of them (Kennedy, 2000). This is especially true for mass-market products because they have to represent more people.

To minimise recruitment bias I suggest being deliberate about recruitment. Start by defining what a representative group of the customer base or larger population would be, and then edit the criteria from there based on the research intent.
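To make the idea concrete, the sketch below shows one hypothetical way to draw a recruitment shortlist whose mix mirrors the customer base before any deliberate edits to the criteria are made. It uses Python and pandas; the customer data, column names and proportions are illustrative assumptions, not figures from any real project.

```python
import pandas as pd

# Hypothetical customer base; column names and proportions are
# illustrative assumptions only.
customers = pd.DataFrame({
    "customer_id": range(1, 1001),
    "age_band": ["18-34"] * 300 + ["35-54"] * 450 + ["55+"] * 250,
})

SHORTLIST_SIZE = 12  # number of participants to recruit

# Work out each age band's share of the customer base, then draw
# candidates from each band in roughly that proportion.
shares = customers["age_band"].value_counts(normalize=True)
shortlist_parts = []
for band, share in shares.items():
    quota = max(1, round(SHORTLIST_SIZE * share))
    band_customers = customers[customers["age_band"] == band]
    shortlist_parts.append(band_customers.sample(quota, random_state=0))

shortlist = pd.concat(shortlist_parts)
print(shortlist["age_band"].value_counts())
```

Starting from a representative baseline like this, any departure, such as oversampling detractors or people who have complained, becomes an explicit, documented decision rather than an accident of recruitment.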

In short, recruit deliberately and representatively of the customer base or population by default.

#2 Accounting for bias in research and synthesis

When conducting research and synthesis, bias is not only very common but also, in qualitative research, quite difficult to detect. While there are many common biases in research, such as asking leading questions, the two that I find most prevalent and most difficult to fix are (1) confirmation bias and (2) social desirability bias.

When working on a project, it is easy for researchers and designers to rapidly develop opinions about what will and will not work. For designers, these opinions can be based on experience, heuristics or research, which improves their credibility; often, however, they are not evidence-based. This becomes problematic when running research and synthesis, because it can lead the researcher or designer to disproportionately report on ideas and themes that support their preconceptions.

In addition, when conducting research on themes that carry more or less socially desirable behaviours, it can be easy to guide the conversation, or synthesise the research, in a way that supports the more socially desirable perspective (and, therefore, to suppress findings that are less socially desirable). As an example, I once did a research project for a bank that explored how people managed debt and built savings. We recruited a representative sample of customers for one-on-one interviews. When interviewing people who had difficulty managing debt and savings, I noticed how easy it was to skip over important parts of the conversation that related to socially unacceptable ways of managing money. Not only was it difficult for me to explore as a researcher, but my participants would often try to avoid the socially unacceptable discussions or, worse, lie about their behaviours when prompted.

One effective way to account for confirmation and social desirability bias is, essentially, to keep a journal: actively document your biases at the start of a project and your ideas throughout it. This act of journaling, or 'writing memos', is borrowed from qualitative research in the social sciences (Glaser & Strauss, 1967). By being aware of your biases at the start of a project, you are better able to identify ways to account for them. For example, you might get support from a colleague with different perspectives, or you might deliberately write questions into your moderation guide that address the areas where you have a bias. Documenting your biases and ideas also tells you what to watch out for in synthesis. You can ask yourself: "did they really say that, or is that what I wanted them to say?"

In addition to documenting your ideas and biases, be aware that note-taking during research can itself be full of bias. One way around this is to write notes verbatim so that the original content of a comment is preserved, or, better yet, to record and transcribe the entire conversation (this is also a great tool for listening back and identifying your own biases in the conversation: "Did I just ask a leading question again?!").

In short, be aware of your biases and ideas by documenting them, and capture research data verbatim.

#3 Accounting for unconscious prejudices

Prejudice is an awkward thing to talk about because most people would hate to think of themselves as racist, ageist, misogynistic or homophobic. Yet, more often than not (Nosek & Greenwald, 2014), we harbour implicit biases that can affect the way we facilitate research and report on the findings.

As an example, the Harvard-hosted Implicit Association Test shows that 70% of people who complete the test have a slight to strong automatic preference for white people over black people (Nosek & Greenwald, 2014). If you are one of the majority with this unconscious automatic preference, there is a good chance that the way you conduct your research and report on the findings could be skewed.

The challenge with unconscious bias is that it can be difficult to know that you have a bias. And if you do not know you're biased, then you cannot account for it in the research. Online tools such as Harvard's Project Implicit help to expose some common biases. Once you are aware of an implicit bias, you can account for it. For example, you could get a colleague with different biases to support you in research as a note-taker, or start synthesis with the participants towards whom you are most likely to hold a bias.

In short, be aware of your implicit biases by using online tools, and account for them in your research planning.

Summary 

To summarise, all humans are affected by biases, both conscious and unconscious. For designers, it's imperative to account for bias to create future products and services that meaningfully serve their customers. To minimise bias, remember:

1. Recruit deliberately and representatively of the customer base or population by default.
2. Be aware of your conscious biases and ideas by documenting them, and capture research data verbatim.
3. Be aware of your implicit biases by using online tools, and account for them in your research planning.

Armed with a better understanding of your biases, get creative with ways to account for them and inspire your colleagues to do the same.

References

1. Nosek & Greenwald (2014). Data from the Race Implicit Association Test on the Project Implicit Demonstration website. Journal of Open Psychology Data.
2. Kennedy (2000). Competitive Brands' User-Profiles Hardly Differ. Market Research Society Conferences.
3. Glaser & Strauss (1967). The Discovery of Grounded Theory: Strategies for Qualitative Research.