The employee who accidentally triggered the missile alert on Saturday pushed the wrong button. It was a disastrous mistake that sent Americans panicking and running for shelter. Needless to say, this is the kind of accident that should be avoided at all costs. Check out this passage from the linked Washington Post article. The emphasis is mine.
Around 8:05 a.m., the Hawaii emergency employee initiated the internal test, according to a timeline released by the state. From a drop-down menu on a computer program, he saw two options: “Test missile alert” and “Missile alert.” He was supposed to choose the former; as much of the world now knows, he chose the latter, an initiation of a real-life missile alert.
That's right. The person who triggered the alert did so by selecting from a dropdown menu and picking the wrong option. That is not a human error; that is a software design error. Failing to account for this possibility is absolutely negligent. I'm not sure what other people do, but when I'm designing interfaces with potentially destructive consequences, I do so with a lot of fucking care.
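One common pattern for guarding a destructive action is "type to confirm": instead of a one-click menu item, the operator must retype the name of the action (or resource) before anything happens. Here's a minimal sketch of that idea; the function and names are illustrative, not from any real codebase:

```typescript
// A "type-to-confirm" guard for destructive actions. A dropdown entry can be
// mis-clicked in an instant; a deliberately typed phrase cannot.
// All names here (confirmDestructiveAction, etc.) are hypothetical.

type Result = { allowed: boolean; reason: string };

function confirmDestructiveAction(
  actionName: string, // e.g. "ban-user" — the exact phrase required
  typedPhrase: string, // what the operator actually typed
): Result {
  // Require an exact match of the action name before proceeding.
  if (typedPhrase.trim() !== actionName) {
    return { allowed: false, reason: `Type "${actionName}" to confirm.` };
  }
  return { allowed: true, reason: "Confirmed." };
}

// Example: the admin must retype the action name before the ban proceeds.
console.log(confirmDestructiveAction("ban-user", "ban-user").allowed); // true
console.log(confirmDestructiveAction("ban-user", "ban-usr").allowed); // false
```

The point isn't this exact mechanism; it's that a destructive path should require a deliberate act that a slip of the mouse cannot reproduce.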
On this website, dev.to, a community platform where software developers share posts like the one you are currently reading, admins have a few internal interfaces for performing fairly destructive actions. One action we might take is banning someone for spam. In doing so, we take down all the bad actor's posts and comments and disallow them from signing up to do so again. This is what that button looks like in our backend:
It's not even that destructive. We can recover this data if needed.
Furthermore, that screen is only accessible when the user qualifies to have this action taken on them. If I visit the same page for most users, I get this message where that button would be:
This means the admin has to perform all the actions manually. A time-consuming activity nobody could possibly do by accident.
The consequences of us making mistakes are minimal. The worst case is a bit of lost data: something we very much don't want, but ultimately not that big a deal. Alerting a nation that it is under attack is a huge deal. Software developers cannot ship interfaces that make this sort of human error possible. If the mistake can be made, it will eventually be made. We call this type of error "fat-fingering" and it happens all the time.
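One way to make the mistake structurally impossible is to keep the test path and the live path separate, so that fat-fingering one field can never turn a drill into a real alert. Here's a sketch of that idea under an assumed design where a live alert additionally requires a second operator's sign-off; the types and field names are hypothetical:

```typescript
// Sketch: test and live alerts travel different code paths, and the live path
// demands an extra, independently supplied credential. A single mis-selected
// value ("test" vs "live") is therefore not enough to broadcast for real.
// This API (AlertRequest, dispatchAlert, secondSignoff) is invented for illustration.

type AlertKind = "test" | "live";

interface AlertRequest {
  kind: AlertKind;
  message: string;
  // Required for live alerts only, e.g. a second operator's confirmation code.
  secondSignoff?: string;
}

function dispatchAlert(req: AlertRequest): string {
  if (req.kind === "live") {
    if (!req.secondSignoff) {
      // Selecting "live" by accident fails loudly instead of broadcasting.
      throw new Error("Live alerts require a second operator's sign-off.");
    }
    return `LIVE ALERT: ${req.message}`;
  }
  return `[TEST ONLY] ${req.message}`;
}
```

The design choice here is that the dangerous branch demands information the operator does not have at hand during a routine drill, so the "wrong menu option" failure mode dead-ends instead of escalating.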
Sure, the designers and developers who created the button weren't the ones who pressed it, but this disaster was their fault. It was not the fault of the person who pushed the button, nor of their training. Mistakes happen, I make them all the time and so do you, but let's learn from this one and write better software.
Shameless plug because this post is getting a lot of fly-by social traffic:
If you're still a lurker on our platform, I'd recommend taking a minute to create your account right now. There's a lot to be gained from being part of our dev community and we'd love to have you. 🙏❤️
Latest comments (44)
Please let me add that all that was discussed here is possible and probable. However, in this particular situation, a later explanation was similar to what Pee-wee Herman said when he rode and "popped wheelies" on his bicycle too fast, flipped, flew up in the air, and landed back on the seat. When his audience looked amazed he said "I meant to do that," in his Pee-wee voice. So according to The New York Times article of Jan 30th, 2018, the person who pushed the button "meant to do that." Why? He really thought there was a missile heading to Hawai'i.
I'm just going to assume the team(s) that designed and developed this had reservations about the design but were powerless to change anything. That doesn't absolve them of anything, but this is definitely a lesson learned for me since I'm new to coding and development.
And let's not forget the other half of the problem: there was no cancel or take-back, or any way to send an "Oops, it was a drill" message using the same broadcast method, so they had to resort to Twitter and highway signs to try to get the follow-on message out.
The discussion here seems focused on the responsibilities of developers and designers, so I feel obliged to make a different point:
The fault for causing panic and fear over an incoming nuclear attack lies with the existence of a system in which one must fear an incoming nuclear attack. It's technology's biggest mistake as an industry and a collection of people that we look at this situation and conclude that, in an ideal world, the design of the nuclear warning system would prevent accidental false alarms. In an ideal world there would be no nuclear warning system and no nuclear weapons. Let's stop asking how we can make terrible, monstrous things more user-friendly and instead ask why we are building terrible, monstrous things.
I struggle with two sentences, Ben.
First one: "that is not a human error, that is a software design error."
I think it's best to say "it's not a user error" (software still designed by humans ;-) ).
Second one, the title: "The Hawaii Missile Alert Was the Software Developer's Fault". This is pure speculation on my part, but here it goes: I believe the software developer would have done it differently if he/she had a choice. But in these days of Product Managers, Product Owners, and Project Managers contradicting each other, of budget restrictions and bitter discussions between customer and provider about the cost estimation of each Change Request... in such an environment, with such flawed and poisonous work processes, common sense cannot be exercised by the ones ultimately doing the job, i.e. the software developer in this case. Customer pays for a drop-down menu, customer gets a drop-down menu. End of story. Dare you not put in a few more hours to design a better UI, or else next time the customer asks for something, he/she will expect the "drop-down menu" price for something that requires more effort.
I bet my hat something like this actually happened.
This is kinda standard in my city. I worked in some places where we had buttons for very critical processes, but they didn't have a warning message or anything like that. It's pretty funny how the end user really blames the devs for that, but when we suggest a confirmation, in most cases the client or project manager says "NO, we don't need that." It's kind of a loop.
I can't really blame the dev. Maybe the client said "****" the confirmation, or maybe it was the UI designer; we don't really know. But one thing can be said: that was a freaking bad mistake.
This GIF is (likely) an accurate representation of what happened. (twitter.com/okvro/status/953397954...)
To be fair, I'd rather that the UX design err on the side of getting the alert out than not issuing the alert at all.
A false positive in this case is much better than a false negative. A few minutes/hours of temporary fear is better than thousands dead because they didn't get to shelter.
As such I'm grateful for the current design.
Putting a dev (and only the dev) to blame is quite arrogant and maybe also a bit ignorant of all the circumstances.
Especially in such a scenario, it's counterproductive to try to pin the blame on someone. There was not only the dev, or the people who created the specification; there was also an uncountable number of people who already used it and did not insist on a change.
There is never a way to create an application that prevents all user errors. And the safeguards in the examples you gave were not there from the beginning; they were created as a consequence of a user who did something wrong at some point.
It's just that their software is probably 20 or 30 years old. Do you remember the state of blog and forum software back then?
Also, you completely ignore the main use case, which is firing off a warning in the fastest possible way.
Having a false positive from time to time is a lot less of a problem than failing to send out the notice because of too many safety guards.
I agree, +1 design error. Lotta repercussions if the wrong switch is thrown. If there wasn't a lead or manager overseeing the development of this, that's bad. If there wasn't any QA involved, that's bad. Short story: the individual atop this software development effort gets 10 lashes with a wet noodle.