I’m often fascinated by embarrassing design mistakes. This isn’t because they’re a chance to point and laugh; rather, they’re an opportunity to learn how to avoid the same thing happening again. To do that, it helps to understand the process that led to the mistake in the first place.
When the refurbished Birmingham New Street station was launched last year, one striking design feature was a reflective surface showing the tracks below to people at street level. The builders were keen to point out that they had modelled the path of the sun at every hour of the day, for every day of the year, to avoid dazzling the train drivers below with solar glare.
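That kind of exhaustive sweep is simple to sketch. Here’s a minimal Python sketch of what it might look like, using Cooper’s simple declination approximation and solar (not clock) time; the latitude and the hourly sweep are illustrative assumptions, and a real glare study would use a validated solar ephemeris and the actual panel geometry.

```python
import math

LATITUDE = math.radians(52.48)  # Birmingham New Street, approximately


def solar_position(day_of_year, solar_hour):
    """Return the sun's (altitude, azimuth) in degrees for a given moment."""
    # Solar declination: how far north or south the sun sits on this day
    # (Cooper's approximation).
    decl = math.radians(23.45) * math.sin(
        math.radians(360 * (284 + day_of_year) / 365))
    # Hour angle: the sun moves 15 degrees per hour either side of solar noon.
    hour_angle = math.radians(15 * (solar_hour - 12))
    # Altitude of the sun above the horizon.
    alt = math.asin(math.sin(LATITUDE) * math.sin(decl)
                    + math.cos(LATITUDE) * math.cos(decl) * math.cos(hour_angle))
    # Azimuth, measured clockwise from north; mirror it for the afternoon.
    cos_az = ((math.sin(decl) - math.sin(alt) * math.sin(LATITUDE))
              / (math.cos(alt) * math.cos(LATITUDE)))
    azimuth = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    if solar_hour > 12:
        azimuth = 360 - azimuth
    return math.degrees(alt), azimuth


# Sweep every hour of every day; a real study would reflect each sun
# position off the actual panel geometry and test it against the
# drivers' sight lines below.
(alt, az), day, hour = max((solar_position(d, h), d, h)
                           for d in range(1, 366) for h in range(24))
print(f"Highest sun: {alt:.1f} degrees (azimuth {az:.0f}) "
      f"on day {day} at {hour:02d}:00 solar time")
```

The point is not the astronomy: it’s that the designers enumerated every case up front, rather than waiting for a sunny day to find out.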
I couldn’t help wondering if this was an indirect reference to 20 Fenchurch Street in London, also known as the Walkie Talkie building, which during the summer of 2013 was alleged to be melting parts of cars in the street below. While some reports may have been a little dramatic, it can get sunny in London occasionally, and concave surfaces can concentrate the sun’s rays on specific days of the year.
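A back-of-envelope calculation shows why a concave facade is worth worrying about. The sketch below uses idealised-mirror reasoning with entirely made-up geometry (a 20 m reflective section focusing 100 m away); real glass is a far poorer mirror, so actual concentrations are much lower, but the shape of the numbers explains the headlines.

```python
import math

SUN_ANGULAR_DIAMETER = math.radians(0.53)  # the solar disc is ~0.53 degrees wide


def concentration(aperture_m, focus_distance_m):
    """Idealised flux concentration of a concave mirror.

    Even a perfect mirror can't focus the sun to a point: the solar disc
    has a finite angular size, so the image is a spot roughly
    focus_distance * angular_diameter across. The concentration factor is
    the collecting area divided by the spot area.
    """
    spot_diameter = focus_distance_m * SUN_ANGULAR_DIAMETER
    return (aperture_m / spot_diameter) ** 2


# Illustrative numbers only, not the real building's geometry: a 20 m
# curved section of facade focusing roughly 100 m away at street level.
print(f"Spot diameter: {100 * SUN_ANGULAR_DIAMETER:.2f} m")
print(f"Concentration: {concentration(20, 100):.0f}x direct sunlight")
```

Even a small fraction of that idealised figure, on the right day, is enough to produce an uncomfortably hot spot at street level.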
Ultimately this was an avoidable mistake and one that contributed further to the bad PR of an already unloved London landmark building.
Mistakes are valuable
In hindsight it is easy to see the flaws in the Walkie Talkie design, but at the time of development, tracking the sun’s rays was probably not a priority.
Realise your mistakes early
The longer a mistake goes unrecognised, the bigger its negative impact and the bigger the future cost. The problem is that we are human and prone to psychological bias. Conservatism can make organisations slow to react, because you’re less likely to face criticism for doing what worked before. Even in innovative, agile environments I’ve found there is a tendency to trust the de facto way of doing things until strong evidence emerges to the contrary.
Share often, even if it causes discomfort
Groups tend to talk about information everyone is already familiar with, a phenomenon known as Shared Information Bias. We like discussing the things we and our co-workers understand best, because it avoids the effort of educating others about unfamiliar concepts. The problem is that project risks often live on the fringes of our understanding, so talking about what’s familiar is unlikely to get you any closer to unearthing them.
Be honest about failure
With something like the melting-cars incident, it’s easy to see a single effect and say it should have been spotted all along. This is hindsight bias, also known as “I told you so” syndrome. Surely it’s better to say the process could have been improved, and to learn from it for next time.
Be self-aware
Organisations and people can get stale. That’s why staying just outside the comfort zone forces us to constantly re-evaluate and improve. Fortunately, re-evaluating is easier in software than in architecture, since we can keep iterating after launch, but ultimately we’re all human and can easily be misled by our own cognitive biases.
A little bit of self-awareness goes a long way. Instead of just seeing a problem and searching for a matching solution, consider “Why am I thinking this?” and “Is this how I always solve this type of problem?”. Evaluating our thinking processes is always valuable, even if it only confirms that our reasoning was sound all along.
Sometimes I find the sheer number of cognitive biases overwhelming, but as with anything that isn’t immediately tangible, giving a label to a thought process helps us understand it more clearly and objectively.