I wanted to jot down a quick thought I've had after learning about an interesting idea in my History and Theory of Planning class that I think is particularly applicable to those in tech: "wicked problems."
If you have the time, I'd highly recommend reading (or at least skimming) the original paper, Dilemmas in a General Theory of Planning by Horst Rittel and Melvin Webber. In it, they describe wicked problems using ten characteristics:
- There is no definite formulation for the problem.
- There is no clear point at which the problem can be declared solved; it has no stopping rule.
- Its solutions are not true or false, but rather good or bad.
- There is no immediate or ultimate test of a solution to the problem.
- Every solution is a one-shot operation; there is no opportunity to learn by trial and error.
- There is no enumerable list of possible solutions: there are infinite ways of addressing the problem.
- Every problem is essentially unique.
- Every problem can be considered a symptom of another wicked problem.
- The existence of a wicked problem can be described in many ways, and the choice of how it is explained can bias how it is addressed.
- Unlike scientists, the planner has no right to be wrong.
Many societal-level problems can be described as wicked problems: big issues like poverty, limited access to education, and institutional racism. These wicked problems stand in contrast to the many "tame" problems of the natural sciences, which are much easier to define in their entirety and come to a clear conclusion on. Software engineers see many examples of tame problems in their line of work: which database should be used for a given application, or how the performance of a given algorithm can be improved. These problems can be reasoned about and solved in ways that wicked problems cannot.
While the concept of wicked problems was originally devised within the context of urban planning, I think it has interesting implications in the world of technology. Modern technology companies are filled with people who are highly skilled in the act of solving tame problems, yet many of the problems they face are in fact wicked problems. Treating wicked problems as if they were tame predictably yields disastrous outcomes. Think of all the societal-level problems Facebook or Twitter see, yet address with features that "roll out to a limited number of users," much as a change to an algorithm is rolled out to a limited number of servers before full release to ensure there are no regressions. The strategy reeks of engineering techniques being applied to social issues as if they're just another bug to be fixed.
As technology increasingly weaves itself into the fabric of our society, I believe it is critically important for those who work in tech to take time to understand that the societal problems that crop up in their products are fundamentally different from the problems that occur on their server racks. The tame problems of our algorithms should be treated very differently than the wicked problems of the humans using our products. Perhaps acknowledging these differences will make us think twice before adopting a company culture of "move fast and break things."