The Curse of Systems Thinkers (Part 1)
Somewhere between 15 and 20 years ago, I worked for a company. It was a very prestigious company, and it was a glorious and frustrating time. The company did amazing things. Literally unbelievable achievements - from my point of view anyway. But this was coupled with levels of chaos that led to inefficiency, wasted opportunity, and needless headaches.
The contrast grew so large that I had to reconcile it somehow, if only in my own head. So I went to the countryside and wrote about the situation, which is my go-to technique for processing Stuff In My Life.
By coincidence I came across that piece recently, and was struck by how absolutely relevant it was, all these years on! (Indeed, slightly more relevant today, in one way I'll talk about in part 2.)
But let me start off by reproducing some key passages below.
The company I work for is a classic example of what I like to call "smart people, stupid organisation vs. stupid people, smart organisation" syndrome. Despite many of my colleagues being highly dedicated and very emotionally attached to their company, the organisation surrounding them is incredibly chaotic. Most of the things my department does are very badly planned, if they're planned at all. Documentation is either non-existent or of appalling quality - woe betide the new hire attempting to understand the technical architecture of what is surely one of the most complex environments on the planet - they are simply left to sink or swim. The department itself is woefully understaffed, and struggles to catch up with the immense number of projects dropped on its shoulders, and therefore cuts corners with implementation whenever possible.
(Emphasis mine - those were the bits that struck me in the act of copying this out again.)
The American half of our department barely talks to itself, let alone across the water to us. It's a little demoralising, and we've had significant turnover as a consequence.

If you read this, you'll know I am a great believer in having the system be intelligent as well as the people. Smartness or otherwise means very little, I believe, if you cannot reproduce behaviour reliably.
I turned some of that personal memo into an email to various folks, trying to improve things. Here's what I wrote about predictability and reliability in delivery:
Predictability is great. It allows engineers working on projects to expect that when they're asked to do something, whatever that something is will be reasonably well defined, the timeframe for doing it well-understood, and the end result can't be signed off until X, Y and Z are done. This protects the engineer, which is good.
It also protects the organisation, since the organisation comes to expect that when it wants to do something, it should have designs/documents/inputs in well-known format Q, and that last time we did this it took time W, so it'll probably take around that long this time too. More importantly, the organisation won't get something which is called "done", but isn't. Last time we tried to behave as if X, Y and Z didn't have to be done, severe problems were caused N months down the line, when we discovered we did in fact have to do those - like we knew all along.
Protecting the organisation means not causing those severe problems, and consequently means not pretending.
By protecting both the organisation and the engineer, we define a stable interface between them. By defining a stable interface, planning is easier, day-to-day jobs are less interrupted by crises, management is easier, credibility increases throughout the organisation as deadlines are met, projects work when delivered, and your adrenaline glands can begin to sink back to normal levels of stimulation. Win all around.
(Today, I would probably be a bit more nuanced about the tradeoffs between speed and stability, but I stand by the claim that organisations as a whole generally benefit from driving predictability. This is probably worth the slight drop in excitement; there's always skydiving if you disagree.)
Unsurprisingly, I got a fair bit of pushback, and I responded in turn:
[My colleague] would appear to see time spent in planning and writing documents as essentially wasted time, since he asserts that without process we are faster. We are not faster in reality. We are merely faster to say we're finished. That is not the same thing as actually being finished: we can all think of examples of that. And the uncertainty induced by the unknown magnitude of correction required is, IMHO, the biggest contributor to our inefficiency and ineffectiveness.
The art of good engineering is the art of saying no, and we must begin to say no to things to protect the organisation and to protect ourselves.
The strange fact was, however, that my little bit of the group was probably less affected by the insanity than others. We were probably the best planners and the best documenters. I suspected we had the most predictable schedules, and we were very cognisant of the physical limits of work: we said no to things numerous times.
But there were huge cultural and business imperatives that continued to create random stuff for us to fix, no matter how many impassioned emails we sent.
I wrote that I found it hard to give my best in these conditions:
It's very hard to do good work in this atmosphere, and particularly I - who have always had an emotional relationship with work - find it hard to be engaged with something so obviously crazy. I'd love to fix the chaos myself, or try to, but it seems unlikely I'd be allowed, since my previous emails produced only well-worded refutations. They explained quite factually why the setup is the way it is, and implicitly therefore why it could not change.
I'd be more generous today about people documenting the constraints they suffer under, but I hope I'd be as insistent that it's appropriate and good to think about the system, the team, and the goals as a whole.
I understood, even at the time, that focusing on the narrowest components of execution can be a huge obstacle to broader success. Given I've worked across networks, software, and systems, I'm probably one of those people who are inclined to think in a holistic way anyway.
But back then, I saw it in terms of generalists versus specialists:
This feeling has become coupled with another realisation I've had recently. I'm a generalist, B-grade at a bunch of stuff. But the organisation does not want, or reward, generalists. The organisation wants specialists that it can slot into specific pieces of the hierarchy, who will then do their job with a minimum of complications. I've been thinking about this in a career context - I don't want to specialise to get a promotion. I have no interest in (for example) vendor certifications - I am wondering if I have painted myself into a corner.
Then I spoke to my friend Steve. Steve recast the problem entirely, and that was very helpful.
Actually, from reading this, it's clear you already have a specialty: Systems Engineering. Not systems in the limited sense of Network Jockey or Server King, but Systems, with the capital S and everything, where it is all about interfaces and trade-offs.
Systems Engineers often get the short end of the stick, because they have to be generalists. But without them, any project that involves more than one roomful of people is probably doomed. I've seen a lot of engineering projects fail: failing for technical reasons is way less common than failing for lack of Systems Engineering. It really is a constant theme.
It sounds like you're in an org where management hasn't understood the need for Systems Engineering yet. Systems Engineering is nearly always something that must be imposed, at least at first, because engineers will never happily demand that someone who knows less than them about a particular subsystem should be making the final technical decisions.
Steve recommended investigating Systems Engineering as a distinct subject. Specifically: reading the engineering histories of the Gemini and Apollo projects, and especially about the culture clash between the experimental aircraft guys who built Mercury and the ICBM teams; thinking about joining a professional organisation like the IEEE, since a community with practical experience of dealing with these issues is always useful; and finally, coming back to my situation at the time, using references to Systems Engineering to prod the organisation into trying to do some of it:
If you can't get the ball rolling on even a small scale because no-one can see the need or will free up the required resources, then you're free: they're fucked. Give yourself permission to let the organisation fail - it's not your fault, and in your attempt to introduce a Systems approach you will have discharged your responsibilities as a professional, so now just do what's reasonably asked of you, keep saying no to the absurd requests and cash those paychecks till something better comes along.
The curse of Cassandra was to be correct, but never believed; the curse of systems thinkers is to be correct, but never valued.
In part 2, we'll see if this is, in fact, the whole truth, or if perhaps there is an upside for systems thinkers in organisations.