Thinking in Systems - and Products
If you have followed enough of my writing (here or on my LinkedIn page), or just know me well enough, you will know how often I like to use the systems lens (with a fractal kind of structure, extending from the product itself, to the way we go about building and maintaining it, to how organizations work, and more…). Part of what led me to that way of thinking comes from my university training, as I explained in this previous post.
Maybe you already know this, but for the sake of completeness, let me briefly explain what I mean by a system and the related lens (through which I can see and reflect on the world and what happens around me, or my problems and issues…). Some of the key insights of systems theory are:
Only when we see the relationships between structure (what things are made of) and behavior (what happens) can we begin to understand how a system works.
From that follows what Russell Ackoff famously said to summarize systems thinking: "A system is never the sum of its parts; it's the product of their interaction".
And implied in that realization is an embrace of complexity, hence Ackoff's reasoning that whenever we are trying to manage a system, we ultimately "do not solve problems, [but] manage messes".
Speaking of key systems thinkers, I recently finished Donella Meadows' book, aptly called "Thinking in Systems". Its insights could well be worth a whole series of posts, but the very last chapter ("Living in a World of Systems") particularly caught my attention: there she attempts to summarize the most general "system wisdom" (to the best of her knowledge and experience). I thought that would be not only a good starting point (in case I ever decide to share further insights from the book) but possibly the best way to connect her work with what it can mean to work on (software) products. To be clear, what follows is a subset of the insights shared in that chapter.
(Note: For the rest of the text, every time I put a phrase in quotes, it is a direct quote from the book itself, so those are Donella's words.)
Get the Beat of the System
"Before you disturb the system in any way, watch how it behaves." "Starting with the behavior of the system forces you to focus on facts, not theories. It keeps you from falling too quickly into your own beliefs and misconceptions, or those of others".
Here, let's stick to the basics of what this means for (software) products: the primacy of gaining insight into what actually happens in the product, and how people interact with it, and following an evidence-based path toward improvements and new ideas. That means moving away from the "bright" (untested) ideas, which are just too cheap.
Honor, Respect, and Distribute Information
Information holds systems together, and "... delayed, biased, scattered, or missing information can make feedback loops malfunction". There are many directions I could take this insight; at the most fundamental level, it speaks to the importance of keeping information and authority aligned. Every time there's a delay, or a chance of biased or incomplete information (think of how, in an organization, information may need to flow up and down before a decision can be made), there's a chance of a malfunctioning feedback loop.
If that doesn't make an organization obsessive and relentless about ensuring a suitable flow of information, so that it is readily available to anyone at the moment a decision is made, I'm not entirely sure what would. And never forget this basic reality:
An engineer working on software code makes trade-offs (some sort of decision) possibly many times a day; an engineering manager less so; and the further up you go, the fewer decisions are made over the same period of time. (Now, it is true that a decision's potential impact tends to go the other way around, but a system's performance does not depend linearly on that potential impact. That is to say: a "death by a thousand cuts" scenario (many small interventions in a downward spiral) is at least as relevant as a "live-or-die moment" (a one-off big intervention), arguably more so.)
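The effect of delayed information on a feedback loop can even be seen in a few lines of code. As a toy sketch (my own illustration, not an example from the book), here is a minimal simulation of a decision-maker correcting some level toward a target: when corrections are based on fresh readings the system settles, but when the same corrections are based on stale readings the system overshoots and oscillates.

```python
def simulate(delay_steps, n_steps=40, target=100.0, gain=0.5, start=50.0):
    """Simulate a feedback loop where each correction is based on a
    reading that is `delay_steps` old. Returns the level over time."""
    history = [start]
    for _ in range(n_steps):
        # The decision-maker reacts to an outdated observation of the level.
        observed = history[max(0, len(history) - 1 - delay_steps)]
        correction = gain * (target - observed)
        history.append(history[-1] + correction)
    return history

fresh = simulate(delay_steps=0)   # settles smoothly at the target
stale = simulate(delay_steps=4)   # overshoots and oscillates around it
```

With no delay, each step closes half of the remaining gap and the level converges to the target; with a four-step delay, the exact same decision rule keeps acting on old information, so it overshoots well past the target and swings back and forth, precisely the "malfunctioning feedback loop" the quote warns about.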
Pay Attention to What Is Relevant, Not Only What Is Quantifiable
Our obsession with numbers "... has given us the idea that what we can measure is more important than what we can't measure. Think about that for a minute. It means that we make quantity more important than quality". The implication here is that we are better off (from a systems-lens perspective) finding some way to consider something hard to measure, if it is important enough to be considered.
That made me think of something John Cutler wrote a while back on his LinkedIn page:
Powerful Ideas Imperfectly Measured > Perfect Measures For Less Powerful Ideas
This is also a good reminder of the risk of focusing on the wrong measurement, just because it's easy to measure perfectly, and making it the goal. By the way, in another chapter of the book ("System Traps… and Opportunities"), Donella identifies "seeking the wrong goal" as a kind of system trap, one that can lead to a system obediently working to produce a result that is not actually intended or wanted.
Go for the Good of the Whole
"Remember that hierarchies exist to serve the bottom layers, not the top. Don't maximize part of systems or subsystems while ignoring the whole." I think there's quite some depth to that statement, with different possible framings. One way I think about it is:
Your (product) strategy is only as good as your ability to execute it; at the same time, your execution is only as good as how well it serves that higher-level purpose (which a strategy is meant to articulate).
This means you should be clear that execution matters more, and your strategy (the higher layer) should exist to serve the execution, for example by enabling decisions and progress to be made faster, better, and more easily.
Locate Responsibility in the System
"'Intrinsic responsibility' means that the system is designed to send feedback about the consequences of decision making directly and quickly and compellingly to the decision-makers". A good way to think of it is along the lines of ensuring "skin in the game" for those who decide, while also making sure information (feedback) is direct, timely, and compellingly presented.
Another way to put it:
Those who have to live with the consequences should also be empowered (by information/feedback loops) to make decisions that land in the right direction often enough.
It's interesting to reflect on this dynamic against, for instance, the so-called "empowered product team". Is it fair to say that the teams themselves ultimately live with all the consequences of the decisions they make? And does that still hold in the context of the previous idea, "go for the good of the whole"?
I think that makes quite clear the need for deep involvement from leadership as well, but only as long as it is done in the spirit of serving the lower layers, while also modeling "honor, respect, and distribute information" as a general behavior.
Stay Humble–Stay a Learner
"Systems thinking has taught me to trust my intuition more and my figuring-out rationality less, to lean on both as much as I can, but still to be prepared for surprises". This is a "slam dunk" for me… I equate the flow of delivering software/products with cycles of learning:
The faster the cycle of learning, the greater the chance of figuring out how best to influence the underlying system (or problem) that the product is supposed to tackle (or help with).
***
If this isn't useful "system wisdom", I don't know what is… right!?
By Rodrigo Sperb. Feel free to connect; I'm happy to engage and interact. If I can be of further use to you or your organization in getting better at product development, I am available for part-time advisory, consulting, or contract-based engagements.