James's Blog

Sharing random thoughts, stories and ideas.

Nuance

Posted: Apr 17, 2020
◷ 2 minute read

People who work directly with objective reality (instead of abstractions) need to know the nuances and intricacies of what they are working with. For example, an engineering manager may not have to work with objective reality, and instead deals with things at a higher, more abstract level through interactions with the developers. A developer, on the other hand, who is actually making changes to the code, must know all the details of how the code works. Knowing the details is extremely useful, as it can help us understand why something is the way it is and how we got here. But there is also the common saying that new people can offer valuable fresh perspectives on things, precisely because of their lack of knowledge of the nuances. I think this is because knowledge of nuance can make things appear more reasonable than they should.

When working with nuance (in the weeds, so to speak), each subsequent action we take is generally based on the previous action alone, and makes perfect logical sense in that context. The trouble is, when this is repeated, it can lead to absurd outcomes, similar to how greedy algorithms can get stuck in a local minimum despite taking the seemingly best course of action at every move. Too much awareness of the details often blinds us to this problem, and traps us in a biased state of mind, unable to see the bigger picture. This is a very difficult bias to overcome for someone who has worked through all the details, as they would’ve experienced firsthand how “perfectly reasonable” each of the actions taken was. New people, and by extension novices in a field, who do not have knowledge of the nuances, don’t have this baggage, and as a result can more easily spot the unreasonable things that are not obvious to people who have walked through the weeds.
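The greedy-algorithm analogy can be made concrete with a small sketch (an illustrative example of my own, not anything from a specific codebase): a search that always moves to the best immediate neighbor, where every single step is locally reasonable, yet the walk still ends up stuck short of the global minimum.

```python
def greedy_descent(values, start):
    """Repeatedly move to the lower-valued immediate neighbor.

    Each step is locally optimal -- "perfectly reasonable" in context --
    but the walk can still end in a local minimum.
    """
    pos = start
    while True:
        # Only the immediate neighbors are considered, never the big picture.
        neighbors = [p for p in (pos - 1, pos + 1) if 0 <= p < len(values)]
        best = min(neighbors, key=lambda p: values[p])
        if values[best] >= values[pos]:
            return pos  # no neighbor improves on where we are: stuck
        pos = best

# A toy landscape: the deepest point (value 1) sits at the far end,
# behind a "hill" that a greedy walk will never climb.
landscape = [5, 3, 2, 4, 6, 1]

print(greedy_descent(landscape, 0))  # stops at index 2 (value 2), a local minimum
print(min(range(len(landscape)), key=lambda p: landscape[p]))  # global minimum is at index 5
```

Every individual move in the trace (5 → 3 → 2) is the best available choice at the time; the absurd outcome only becomes visible from a zoomed-out view of the whole landscape.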

But being new or a novice is a temporary state. As time goes by, the person will gradually accrue knowledge of the nuances, and begin to suffer from the same bias. One more sustainable way to overcome this is to develop the habit of periodically forcing ourselves to take a few steps back and look at things from a more zoomed-out view. But this, like overcoming many other cognitive biases, is much easier said than done. Another way to help is to periodically get feedback from people who work at a different abstraction level (e.g. the manager in the opening example). It may not seem natural for an engineer to consult a less technical manager on technical decisions, since the manager lacks knowledge of the nuances. But that missing knowledge is precisely what gives their perspective its unique value.