Circular Wonderings is an exploration of the role of digital, software and technology in the Circular Economy. Exploration is the key word here. I write regularly, reflecting on my current thoughts and research. Expect typos, incomplete thoughts, varied rambling topics and (hopefully) a journey towards clearer understanding and insight. Subscribe here to join my journey.
A cautionary tail
There is a really great (partly apocryphal) story about unintended consequences and the need for systems thinking.
Set on an island in Borneo in the early 1950s, the story starts with the very laudable goal of reducing deaths from malaria. However, the results include people's roofs falling in, depletion of food stores and plague! And then, cats have to be parachuted in to fix the problems caused by the original well-intentioned actions.
It's a great story. And one of its key morals - beware of unintended consequences - is made really clear by the almost comedic extremes needed to tackle them.
The Circular Economy encourages us to think in systems - in fact to be successful it requires systems thinking.
Interestingly, there are two other lessons here.
One of the details of the story is probably not true. (It is probably not an example of mammalian deaths due to the biomagnification of DDT).
In this case the incorrect detail doesn't actually change the main moral of the story. But the fact that part of the story is inaccurate puts the whole thing at risk. One can imagine the incorrect facts leading future attempts to avoid similar unintended consequences to actually make things worse. Or the true lessons being ignored. Our third lesson, therefore, is that accuracy in details matters.
Details don't matter much to telling a good story. But they matter a lot if we are genuinely attempting to learn from past experience in order to achieve a purpose.
In software engineering, just as in science, the truth matters - especially if we want to limit the unintended consequences of our work.