15 Comments
HBD

Another terrific post!

If you want to see an error corrector in action, check out DataRepublican on Substack or @DataRepublican on X. (That’s a small “r”, not a party affiliation. As befits an error corrector.)

It’s not just institutions. The vast majority of people don’t understand error correctors. They frame the situation as involving someone who “always thinks they are right.”* There are such people, of course, but a bunch of us just want to GET things right. We are the people who go back and put the apostrophe on “dont”. In addition to the autistics, there’s an overlap with OCD.

* If I didn’t think I was right, why would I say it?

Mark Atwood

I actually had DataRepublican, as well as several others like them, in mind as I wrote this.

HBD

Who are the others you would recommend?

Stevo

Excellent piece. This has been my lived experience in several institutions over these last many years. The extension I’d make is that the institutional immune response does not merely suppress correction; it isolates the corrector. That isolation is psychologically powerful because it makes the target experience the conflict as private and singular: maybe it’s just me; maybe I’m the problem; maybe I’m the only one seeing this. But that same isolation also has a social function, because it demonstrates to others what happens when someone keeps pressing on what is wrong - and, social animals that humans are, most get the message. Getting shunned by your peers out of their fear of association with an institutional contaminant, and getting caught in the blast radius, is much harder than being sidelined by management.

Wes

Institutions only continue to be effective at their mission when they have the proper ratio of autists to normies

Richard Kringle

This excellent piece would have been stronger had it included one or more real-world examples.

Jacqueline W

Open source was a real-world example. You can even get corrected on Stack Overflow for asking a question the wrong way.

Mark Atwood

I have personally called out Eric Lippert for doing that.

Steven C.

That's one of the main reasons the Soviet Union collapsed. It wasn't only the near impossibility of successful central planning, but that the information available to the central planners was usually incorrect, for the very reasons described in this article.

Siezmo

Accuracy is a “social construct.” What fresh hell is this? Fantastic article.

M Yao

The ultimate goal for an institution would be to create a system by which errors are corrected in such a way that the institution's immune system is unable to come into play. Protecting themselves against themselves, essentially.

From a healthcare standpoint, one of the ways they do that is reporting systems. These are anonymous, and any employee can file one. Reports can cover minor things or large things: problems in processes, organizations, or individuals. A system like this is present in virtually every hospital and large outpatient medical group in America.

The benefit, beyond having designated employees follow the reports up and discuss them at committees (which of course are always subject to institutional bias), is that it creates a written record. Medical institutions are very, very concerned about liability, so that written record is a good motivation to make sure they're not ignoring problems that have been noted in the queue, because those could be discoverable during a lawsuit. In that way, the institution can allow reporters to report and help block its own immune system from interfering. Other institutions could consider doing something similar to maximize the benefit of the reporters and minimize institutional resistance.

M Yao

A major focus of error reduction in medicine, borrowed from the airline industry, is to shift attention from the individuals making mistakes to whatever in the system is causing those mistakes. This tends to promote preventative thinking and reduce defensiveness in individuals and organizations, which ultimately helps reduce institutional resistance to correcting mistakes.

M Yao

I should note that errors in medical systems are more likely physical than written. For example: a reporter might report that people in their department keep not closing the curtain separating dirty laundry receptacles from clean packaged supplies. Institutional members tend to naturally want to respond that that’s some petty bullshit with no real risk of causing harm. The reason the institution would still want a way to correct it is that ignoring minor errors in protocol can lead to ignoring more serious errors down the road. Kind of like broken-window policing.

Joseph

“Can’t sleep, somebody is wrong on the Internet”

HBD

But not “somebody.” Something.