The flip side is that "accountability" often results in people redirecting a disproportionate amount of effort towards covering their own asses, to the point of producing worse outcomes.
An accountable person isn't encouraged to make the best decisions. He's encouraged to make the most defensible decisions. And Goodhart's law is in full force there: "defensible" and "right" end up at odds quite easily.
Which is why certain systems introduce a lack of accountability on purpose, ranging from Google's "blameless postmortems" to the way the accountability of the police or a jury is reduced while they are carrying out their duties.
Systems that don't have this engineered in? When things go wrong, and the most "defensible" course of action leads to something terrible, they can only hope for someone with the balls to "take responsibility" - to put himself at great risk and do the right thing, consequences be damned.
The Double Bind surfaces in tech/security hierarchies where the CTO manages the Head of Security and is officially accountable both for delivering on growth opportunities and for managing security risks.
While there are great CTOs out there who are conscientious and thoughtful about this double bind, most aren’t.
It’s good to have open discussions about upside opportunity versus downside risk, and that generally happens best when your boss’s bonus doesn’t primarily depend on them maximising upside.
Is there any better way you could set this up? Just asking for a friend.
Get the downside-risk people in tech to report to somebody who is accountable for managing downside risk at the same level as the CTO.
Typically an intelligent and tech-literate CFO or Chief Risk Officer.
If the Head of Security and the CTO can’t come to a deal, it reaches the ExCo or board for a decision.
I call this “creative tension” and it works better than the alternative.
Sounds reasonable enough - thank you!