Cultural Blindness

Hidden in plain sight!

One aspect of cultural blindness is the concept that something can be hidden in plain sight and that once you become aware of it, it’s so blindingly obvious that you wonder how you missed it in the past.

A line of ten women queuing.


Earlier this year (2019) I attended a Ludovico Einaudi concert at the Barbican Centre. It was a fabulous concert and the sound quality was exquisite. The Barbican Centre, now a Grade II listed building, opened in 1982 and is one of London’s best examples of brutalist architecture. The facilities at the centre are, on the whole, excellent, but for the first time that I can remember, my attention was drawn to a fundamental customer design flaw, and a great example of Cultural Blindness.

In this article I would like to explore Cultural Blindness in the context of the Barbican Centre customer design flaw, together with some other examples which I hope will assist businesses and individuals in thinking outside the box. By doing so, I will show how opportunities that might be in plain sight can be easily missed.

I’m also using a slightly wider definition for Cultural Blindness which is as follows:

  • Cultural Blindness is the inability to see something, or to understand how particular matters might be viewed, because your culture and/or personal experience either blocks it out or is sufficiently limiting to make it difficult to see alternatives and other perspectives.

This applies not only to people with different cultural backgrounds, but also to those whose place within a society is different due to gender, disability, wealth, religion, or social, ethnic or historical background.

We think that we pay attention, but we are wrong!

The Barbican Centre design flaw concerns the simple provision of toilet facilities. On initial inspection the issue may not be visible. The architects have allocated a reasonable number of toilet facilities and the size of each one is probably adequate. Both male and female toilet blocks are positioned next to each other and, as far as I could tell, there were an equal number of each. There is also provision of toilets for those with a disability. No doubt some modelling was done to decide how many toilets would be needed based on the numbers of people using the venue at peak times.

But what we see is not all that there is!

The price of concert tickets for men and women is the same, so there is no reason to suspect that the concert experience would be any different. You would, however, be wrong!

Unfortunately, women do not have the same access to toilet facilities as men. At times, long queues formed outside the women’s toilets, whilst there were no such queues for the men. As Caroline Criado Perez reminds us in her book ‘Invisible Women: Exposing Data Bias in a World Designed for Men’ 1, the everyday act of going to the toilet is not the same for men and women. This is primarily because men are provided with urinals, which take up less space than cubicles. As a result, the number of facilities for men in a given space is greater than for women, which produces a higher throughput for men, hence no queues outside the door at peak times.

This design error could either have resulted from jumping to the false conclusion that equal space equates to equal facilities, or from simply not considering the throughput for customers of different genders at all. A simple model would have provided an estimate for the ratio of space required to achieve an equal throughput and customer experience. It’s easy to imagine that male architects might not automatically think of this and so would be culturally blind to this insight. Sadly, in some parts of the world, poorly designed toilet facilities can mean the difference between life and death for women 1.
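The kind of simple model I have in mind can be sketched in a few lines. Every number below is an illustrative assumption (floor space per fixture, average visit time), not a measurement from the Barbican Centre; the point is only that equal floor area does not mean equal throughput.

```python
# Minimal throughput model for toilet provision.
# All numbers are illustrative assumptions, not real measurements.

def fixtures_per_area(area_m2, space_per_fixture_m2):
    """How many fixtures fit in a given floor area."""
    return area_m2 // space_per_fixture_m2

def throughput_per_hour(fixtures, avg_visit_minutes):
    """People served per hour by a block of fixtures."""
    return fixtures * (60 / avg_visit_minutes)

area = 30  # m^2 per block; equal floor area for both blocks

# Assumptions: urinals need less space and involve shorter visits.
male_fixtures = fixtures_per_area(area, 1.5)    # mostly urinals
female_fixtures = fixtures_per_area(area, 3.0)  # cubicles only

male_rate = throughput_per_hour(male_fixtures, 1.0)    # people/hour
female_rate = throughput_per_hour(female_fixtures, 2.0)

# Ratio of extra space women's facilities need for equal throughput.
ratio = male_rate / female_rate
print(f"Male throughput:   {male_rate:.0f} people/hour")
print(f"Female throughput: {female_rate:.0f} people/hour")
print(f"Space ratio for parity: {ratio:.1f}x")
```

Under these particular assumptions the model suggests the women’s block would need several times the floor area to match the men’s throughput; the exact ratio matters less than the fact that the question was never asked.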

There are some easy ways to improve your customer experience by designing venues, products and services for all of your customers, particularly groups of customers who may have previously been poorly or under served.

Stereotyping and gender bias

In August 2019 the UK Advertising Standards Authority made a ruling against Volkswagen Group UK for their advert which had images of men in extraordinary environments carrying out adventurous activities, and women appearing passive or engaged in a stereotypical care-giving role. New advertising rules, which came into effect in June 2019, state that adverts that directly contrast male and female stereotypical roles or characteristics need to be handled with care.

News, social media and advertisements tend to reinforce and amplify our cultural biases by making them seem normal and expected, so that they fall outside our awareness. In Radio 4’s Today programme on 14 August 2019, John Humphrys, a very experienced journalist and interviewer, struggled to grasp and discuss stereotypical gender biases in advertising. This was despite the best efforts of Jess Tye, an investigation manager at the UK Advertising Standards Authority.

I’m sure that in John’s 75 years he will have been exposed to many personal and cultural biases that are no longer considered appropriate, acceptable or relevant. The important point here is not the furore that followed this particular interview, but that we all suffer from these biases to a greater or lesser extent, and this limits our ability to see outside the box. Imagine a business that has great products or services but completely misses opportunities because it has unknowingly limited itself to only one gender, social or cultural group.

Human bias and discrimination in Artificial Intelligence (AI) systems

In a blog post, Reuben Binns and Valeria Gallo at the Information Commissioner’s Office (ICO) 2 discussed how AI and Machine Learning (ML) can play a part in maintaining or amplifying human biases and discrimination. This is a very important issue for businesses, as the AI systems they use have to conform to the UK Equality Act 2010, which offers individuals protection from discrimination whether generated by a human or an automated decision-making system, and the General Data Protection Regulation (GDPR), which covers the processing of personal data, including the right to non-discrimination.

Biases can result from imbalanced training data, where subsets of the population are under- or over-represented, or where the training data used is itself a product of past discrimination and cultural bias. In particular, the authors acknowledge that “a diverse workforce is a powerful tool in identifying and managing bias and discrimination in AI systems …”.

The important point here for businesses in the context of this article is that these invisible biases often equate to a missed or reduced opportunity.

The example given by the authors 2 relates to finance:

A bank has developed a ML system to calculate the credit risk of potential customers and to approve or reject loan applications. The system is trained using a large set of historical data containing a range of information about previous borrowers, such as occupation, income, age, and whether or not they repaid their loan.
When the bank checks for possible bias, it finds that the ML system is giving women lower credit scores, which leads to fewer loans being approved for them.

Why it can be hard to think ‘outside the box’

Thinking outside the box is a metaphor for thinking differently, unconventionally, or from a new perspective. It refers to novel or creative thinking. In many ways it’s the opposite of the control, processes and transactional nature of our daily work.

Why does something hidden in plain sight stay hidden? One reason is that we see what we expect to see: it can be difficult to think outside of our knowledge, training, experiences, gender and culture, particularly when something is a very familiar, accepted practice or custom, or a subject that we believe we are knowledgeable about. Another reason is that much of our modern work tends to be procedural and transactional and doesn’t easily lend itself to more lateral thinking.

Thinking outside the box requires time to think and explore ideas with a diverse set of colleagues and, very importantly, the confidence to question our own thinking and assumptions.


In this article I set out to explore cultural blindness using the example of different customer experiences based on gender at the Barbican Centre, together with other examples, to assist businesses and individuals in thinking outside the box and, by doing so, to show how opportunities that might be in plain sight can be easily missed.

At Technology Wellbeing we have designed an onsite pre-innovation workshop to help companies and individuals think differently and open up their minds to see more possibilities.

For further information email:

Bibliography / Acknowledgements

  1. ‘Invisible Women: Exposing Data Bias in a World Designed for Men’, (2019), Caroline Criado Perez, Chatto & Windus. 

  2. ‘Human bias and discrimination in AI systems’, (25 Jun 2019), Reuben Binns, Valeria Gallo, Information Commissioner’s Office (ICO) blog. 
