Lots of people will get in touch with you about improvements to your developer-oriented product or library. This feedback is great, but take it with a grain of salt: there is a much quieter group with much bigger struggles.
The people who are getting in touch with you are the ones who aren't having the biggest problems. They understand your docs well enough not to fear the embarrassment of asking the wrong question. They overcame the missing context on their own, and the questions they can now ask represent only their group.
Survivorship bias or survival bias is the logical error of concentrating on entities that passed a selection process while overlooking those that did not. This can lead to incorrect conclusions because of incomplete data.
They did not get tripped up by implied dependency-installation steps. They did not get lost because of undocumented assumptions about typical CLI behavior. They survived to give you feedback.
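To make that concrete, here is a sketch of the kind of README install section that spells those steps out instead of implying them. The tool name `mytool`, the version requirement, and the prerequisites are invented for illustration; the point is the shape, not the specifics:

```markdown
## Installation

Prerequisites (easy to leave implied, and exactly where the quiet group gets lost):

- Node.js 18 or later (check with `node --version`)
- Git, because the install step clones a template repository

Install the CLI globally (`mytool` is a placeholder name):

    npm install --global mytool

Verify the install; this should print a version number and nothing else:

    mytool --version
```

Every step a newcomer must take is written down rather than assumed, so the people who would otherwise silently give up have a path through.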
During World War II, the statistician Abraham Wald took survivorship bias into his calculations when considering how to minimize bomber losses to enemy fire. The Statistical Research Group (SRG) at Columbia University, which Wald was a part of, examined the damage done to aircraft that had returned from missions and recommended adding armor to the areas that showed the least damage, based on his reasoning. This contradicted the US military's conclusions that the most-hit areas of the plane needed additional armor. Wald noted that the military only considered the aircraft that had survived their missions; any bombers that had been shot down or otherwise lost had logically also been rendered unavailable for assessment. The bullet holes in the returning aircraft, then, represented areas where a bomber could take damage and still fly well enough to return safely to base. Thus, Wald proposed that the Navy reinforce areas where the returning aircraft were unscathed, inferring that planes hit in those areas were lost. His work is considered seminal in the then-nascent discipline of operational research.
These are the planes which came home for you. Please find a way to account for the ones which never made it.
Top comments (21)
... Not that you shouldn't listen to the feedback you get, but run it through the lens of this understanding. At the end of the day, there are a lot of UX practices devoted to figuring out why people fall off; we don't get the same work done for DX.
If you are part of a project with the resources to bulk up the docs and their UX, please devote time and people here. If you don't have those resources, be on the lookout for great README experiences and try to be empathetic to the cohorts who would fall off early.
This is why exit interviews are so important! I've left a few jobs for the reasons Sarah mentioned, and I basically had to go out of my way to give them feedback.
I'm in the other, feedback-less exit group. I never, ever give the true picture at any exit interview, in line with the survivorship hypothesis. First, I have found that doing so harms my chances at the new organization; everyone knows everyone. Second, it harms my chances of re-entry at a higher level. Third, HR is most probably in on the dirty secrets I may have uncovered at that organization, and without HR remaining complicit or silent, it's very hard to suppress what others leak onto the grapevine and during their own exit interviews. If my opinion could change anything, I would have spoken sooner and still be in the organization. Given these factors, whatever difference I could make by voicing my opinion at exit is outweighed by the benefits of Omerta (silence is golden).
It depends. For me, my mentor's advice was to stay neutral and leave on a good note, rather than give any advice that might help the company, especially since you won't be part of the company anymore, so it doesn't really matter to you.
That's good advice, by the way. Probably for the best: if you are not there, the advice is more likely to be misinterpreted and taken out of context. Since you are not there to temper it with good sense, it's better not to start at all.
After all, good advice does not come free of responsibility.
All very true. It is the exact argument for supporting IE ("nobody visits our site with it anymore") or accessibility ("we have no disabled customers")!
The problem we have, compared to Wald, is that we can't easily know how many planes we lost.
tbh, the IE argument is valid, but it's still extra work you will spend on support. I'd rather support its extinction.
On the a11y side, we can't offer people easy or cheap alternatives, so supporting those practices is a huge benefit.
Extinction is welcome for IE, but I would always support a locked-down browser such as Brave or Tor Browser, as well as low-bandwidth, screen-space-challenged, ... users, which means I should have standards for supporting plain text, non-JavaScript, and accessible use (visually impaired, screen readers) and so on, which as a side effect would also render well on IE.
It sucks to land on an interesting project and have no idea where to get started because of a complicated or assumption-laden README.
Writing a good README is hard.
If you struggle with it, have a look at these articles:
- How to write a good README? #discuss, by Jean-Michel Fayard 🇫🇷🇩🇪🇬🇧🇪🇸🇨🇴 (Nov 26 '18)
- How to Write an Awesome GitHub README, by Andrew Healey (Apr 14 '19)
Thanks for sharing
There are even more biases to consider. Most developers have a bias toward the languages, patterns, and paradigms they are most familiar with.
That doesn't mean those are always best suited to whatever problem there is to solve. A recurring example is JS WTFs, which mostly just show standardized JS behavior that deviates from most other languages and thus conflicts with developers' expectations.
Someone who is familiar with jQuery might not immediately get the point of an MVC framework. A lot of developers consider CSS's cascade broken and prefer styled components.
They're not exactly wrong, but they're not exactly right either - they're opinionated. And thinking one's opinions are justified might just be confirmation bias.
My least favourite kind of OSS homepage is for *nix CLI tools that always say something suspiciously close to "If you're here, you already know what you're looking at and what it's for."
Working on the docs for StimulusReflex and CableReady could be a full-time job if I let it be... but it's also so worth it. Our documentation and support is driving our adoption. It's the best marketing we have.
I try to take inspiration from the Diataxis project; we're working towards it as an ideal. Our docs can still occasionally feel like all things to all people.
One thing I believe is that video is an under-appreciated mechanism for both onboarding and explaining. Frankly, there's a lot of folks who just won't read... but they'll watch a 90 second video.
The other thing I can say for sure is that running a really attentive and welcoming Discord server gives us an opportunity to get to know the folks who might otherwise bounce. Every issue that comes up makes it into the docs in some form or another. It's a loop/cycle of support->documentation that makes the whole thing take on a life of its own.
Great post! I've always been fascinated by the different types of biases and this one in particular is one of the most overlooked ones.
But there is something important to point out. Survivorship bias is only applicable when there is a strong correlation between the failure and the components of the system you're analyzing. For example, when it comes to aircraft, damage to the different components is strictly correlated with the crash. Another example: when the Brodie helmet was introduced, there was a dramatic rise in field-hospital admissions for severe head injuries, but that was because soldiers were now surviving the same hits and making it to the field hospital. The correlation between death and where the shots hit is very strong too.
I don't think that's the case for most feedback, though, because everything is more subjective, and each problem is a whole different world for each person, depending on their background, situation, platform, context, etc. The correlation between the failure of a library or an application and its features or issues, while relevant, is not that strong or objective. For some people it may be the platform, for others the design, for others the speed, and for the most part it's actually external factors (like time, location, context, etc.).
That being said, it may still be applicable for specific cases, like a particular objectively killer feature or a tangible issue, and we should always keep it in mind!
This is a large part of why I frequently hold calls with stakeholders (from developers through to IT directors and VIP end users) with a general premise of "tell me what we, as a company, are either doing WRONG, or you think we could improve on. I don't care if you have the solution or not, if we know there's a problem, we can find a way forwards."
Sure, I still have a bias, because my audience for those calls is limited. I just don't have the time to talk to millions of people during a sprint...
The latest thing to come from a developer was: "Standups feel like I'm wasting my time, and they're in the middle of my workflow, causing an interruption." He was right: 99% of the time his report was "I'm working on this ticket and should be done soon; I'll pick up the next ticket when I'm free."
So effective immediately, I've banned standups for that project... the PMs can see the project status on the board, and I'll deal with project risks/blockers.
If I comment to say this is a good post I’m boosting a post that was published, but what about all the posts that never got published/finished/written? Survivorship bias tells us there are many posts that didn’t survive. I should be boosting those!
…but it’s a good post, thanks :)
This is a useful thinking tool that forces us to question our sense-making. That's why it's so important to question our assumptions and our information when drawing conclusions: make sure you're making decisions on the right information.
Great callout. I struggle with this in building an internal developer platform. In many ways, the audience finds it hard to give feedback because we (and they) are so used to using poor tools and being told to read the docs/manual. The little feedback I do get is GOLD. It won't always be written down, and sometimes it will only be given verbally in a group setting. I'm still thinking through ways to show I am open to receiving feedback. Thanks for sharing this.